Novais, J L; Titchener-Hooker, N J; Hoare, M
2001-10-20
Time to market, cost effectiveness, and flexibility are key issues in today's biopharmaceutical market. Bioprocessing plants based on fully disposable, presterilized, and prevalidated components appear as an attractive alternative to conventional stainless steel plants, potentially allowing for shorter implementation times, smaller initial investments, and increased flexibility. To evaluate the economic case for such an alternative, it was necessary to develop an appropriate costing model which allows an economic comparison between conventional and disposables-based engineering to be made. The production of an antibody fragment from an E. coli fermentation was used to provide a case study for both routes. The conventional bioprocessing option was costed through available models, which were then modified to account for the intrinsic differences observed in a disposables-based option. The outcome of the analysis indicates that the capital investment required for a disposables-based option is substantially reduced, at less than 60% of that for a conventional option. The disposables-based running costs were evaluated as being 70% higher than those of the conventional equivalent. Despite this higher value, the net present value (NPV) of the disposables-based plant is positive and within 25% of that for the conventional plant. Sensitivity analysis performed on key variables indicated the robustness of the economic analysis presented. In particular, a 9-month reduction in time to market arising from the adoption of a disposables-based approach results in an NPV identical to that of the conventional option. Finally, the effect of any possible loss in yield resulting from the use of disposables was also examined. This had only a limited impact on the NPV: for example, a 50% lower yield in the disposable chromatography step results in a 10% reduction of the disposable NPV.
The results provide the necessary framework for the economic comparison of disposables and conventional bioprocessing technologies. Copyright 2001 John Wiley & Sons, Inc.
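The NPV comparison described above can be sketched with a simple discounted cash flow calculation. All figures below (10% discount rate, capital costs, yearly net revenues) are hypothetical placeholders chosen only to mimic the qualitative pattern reported (lower capital cost, higher running costs, NPV within about 25%); they are not the study's data.

```python
# Illustrative NPV comparison of a conventional vs. a disposables-based plant.
# All figures are hypothetical placeholders, not values from the study.

def npv(rate, cashflows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

rate = 0.10  # assumed discount rate

# Year 0: capital investment (negative); years 1-10: net operating revenue.
# Disposables: lower capital cost, but higher running costs reduce net revenue.
conventional = [-100.0] + [30.0] * 10
disposable = [-60.0] + [22.0] * 10

print(f"conventional NPV: {npv(rate, conventional):.1f}")
print(f"disposable NPV:   {npv(rate, disposable):.1f}")
```

Sensitivity analysis of the kind reported amounts to re-running such a calculation while perturbing one input (e.g. launch delay, yield) at a time.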
Pieterman, Elise D; Budde, Ricardo P J; Robbers-Visser, Daniëlle; van Domburg, Ron T; Helbing, Willem A
2017-09-01
Follow-up of right ventricular performance is important for patients with congenital heart disease. Cardiac magnetic resonance imaging is optimal for this purpose. However, the observer-dependency of manual analysis of right ventricular volumes limits its use. Knowledge-based reconstruction is a new semiautomatic analysis tool that uses a database including knowledge of right ventricular shape in various congenital heart diseases. We evaluated whether knowledge-based reconstruction is a good alternative to conventional analysis. To assess the inter- and intra-observer variability and agreement of knowledge-based versus conventional analysis of magnetic resonance right ventricular volumes, analysis was done by two observers in a mixed group of 22 patients with congenital heart disease affecting right ventricular loading conditions (dextro-transposition of the great arteries and right ventricle to pulmonary artery conduit) and a group of 17 healthy children. We used Bland-Altman analysis and coefficient of variation. Comparison between the conventional method and the knowledge-based method showed a systematically higher volume for the latter. We found an overestimation of end-diastolic volume (bias -40 ± 24 mL, r = .956), end-systolic volume (bias -34 ± 24 mL, r = .943), and stroke volume (bias -6 ± 17 mL, r = .735), and an underestimation of ejection fraction (bias 7 ± 7%, r = .671) by knowledge-based reconstruction. The intra-observer variability of knowledge-based reconstruction varied, with a coefficient of variation of 9% for end-diastolic volume and 22% for stroke volume. The same trend was noted for inter-observer variability. A systematic difference (overestimation) was noted for right ventricular size as assessed with knowledge-based reconstruction compared with conventional methods for analysis.
Observer variability for the new method was comparable to what has been reported for the right ventricle in children and congenital heart disease with conventional analysis. © 2017 Wiley Periodicals, Inc.
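The Bland-Altman analysis named above reduces to the bias (mean of paired differences) and 95% limits of agreement. A minimal sketch follows; the paired volumes are hypothetical, chosen only to mimic the reported end-diastolic-volume bias of about -40 mL, and are not study data.

```python
# Sketch of a Bland-Altman agreement analysis between two measurement methods.
# The paired volumes below are hypothetical, not the study's data.
import statistics

def bland_altman(method_a, method_b):
    """Return (bias, (lower, upper)) with 95% limits of agreement."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

conventional = [120.0, 95.0, 150.0, 110.0, 130.0]      # EDV in mL (hypothetical)
knowledge_based = [158.0, 140.0, 185.0, 152.0, 171.0]

bias, (lo, hi) = bland_altman(conventional, knowledge_based)
print(f"bias = {bias:.1f} mL, limits of agreement = ({lo:.1f}, {hi:.1f}) mL")
```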
Bonini-Rocha, Ana Clara; de Andrade, Anderson Lúcio Souza; Moraes, André Marques; Gomide Matheus, Liana Barbaresco; Diniz, Leonardo Rios; Martins, Wagner Rodrigues
2018-04-01
Several interventions have been proposed to rehabilitate patients with neurologic dysfunctions due to stroke. However, the effectiveness of circuit-based exercises according to its actual definition, i.e., an overall program to improve strength, stamina, balance, or functioning, had not been established. To examine the effectiveness of circuit-based exercise in the treatment of people affected by stroke. A search through the PubMed, Embase, Cochrane Library, and Physiotherapy Evidence Database databases was performed to identify controlled clinical trials without language or date restriction. The overall mean difference with 95% confidence interval was calculated for all outcomes. Two independent reviewers assessed the risk of bias. Eleven studies met the inclusion criteria, and 8 presented suitable data to perform a meta-analysis. Quantitative analysis showed that circuit-based exercise was more effective than conventional intervention on gait speed (mean difference of 0.11 m/s), whereas circuit-based exercise was not significantly more effective than conventional intervention on balance and functional mobility. Our results demonstrated that circuit-based exercise presents better effects on gait when compared with conventional intervention and that its effects on balance and functional mobility were not better than conventional interventions. Level of Evidence: I. Copyright © 2018 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
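The pooled mean difference reported in such a meta-analysis is typically an inverse-variance weighted average of per-study mean differences. The sketch below shows that standard fixed-effect calculation; the four study-level values are hypothetical, not the review's extracted data.

```python
# Minimal inverse-variance (fixed-effect) pooling of per-study mean
# differences, the standard quantitative step of a meta-analysis.
# The study-level values below are hypothetical.
import math

def pool_mean_differences(mds, ses):
    """Inverse-variance weighted pooled mean difference with 95% CI."""
    weights = [1.0 / se ** 2 for se in ses]
    md = sum(w * m for w, m in zip(weights, mds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return md, (md - 1.96 * se, md + 1.96 * se)

# Hypothetical gait-speed mean differences (m/s) and their standard errors
mds = [0.12, 0.08, 0.15, 0.10]
ses = [0.05, 0.04, 0.06, 0.05]

md, ci = pool_mean_differences(mds, ses)
print(f"pooled MD = {md:.2f} m/s, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```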
Marolf, Angela; Blaik, Margaret; Ackerman, Norman; Watson, Elizabeth; Gibson, Nicole; Thompson, Margret
2008-01-01
The role of digital imaging is increasing as these systems are becoming more affordable and accessible. Advantages of computed radiography compared with conventional film/screen combinations include improved contrast resolution and postprocessing capabilities. Computed radiography's spatial resolution is inferior to that of conventional radiography; however, this limitation is considered clinically insignificant. This study prospectively compared digital imaging and conventional radiography in detecting small-volume pneumoperitoneum. Twenty cadaver dogs (15-30 kg) were injected intra-abdominally with 0.25, 0.25, and 0.5 ml of air, for 1 ml total, and radiographed sequentially using computed and conventional radiographic technologies. Three radiologists independently evaluated the images, and receiver operating characteristic (ROC) analysis compared the two imaging modalities. There was no statistical difference between computed and conventional radiography in detecting free abdominal air, but overall computed radiography was relatively more sensitive based on ROC analysis. Computed radiographic images consistently and significantly demonstrated a minimal amount of 0.5 ml of free air based on ROC analysis. However, no minimal air amount was consistently or significantly detected with conventional film. Readers were more likely to detect free air on lateral computed images than on the other projections, with no significant increased sensitivity between film/screen projections. Further studies are indicated to determine the differences, or lack thereof, between various digital imaging systems and conventional film/screen systems.
Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong
2018-03-01
Descriptive analysis with a trained sensory panel has thus far been the most well defined methodology to characterize various products. However, in practical terms, the intensive training required for descriptive analysis has been recognized as a serious drawback. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d′A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d′A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested in terms of d′A for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d′A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Stein, M.; Housner, J. D.
1978-01-01
A numerical analysis developed for the buckling of rectangular orthotropic layered panels under combined shear and compression is described. This analysis uses a central finite difference procedure based on trigonometric functions instead of using the conventional finite differences which are based on polynomial functions. Inasmuch as the buckle mode shape is usually trigonometric in nature, the analysis using trigonometric finite differences can be made to exhibit a much faster convergence rate than that using conventional differences. Also, the trigonometric finite difference procedure leads to difference equations having the same form as conventional finite differences; thereby allowing available conventional finite difference formulations to be converted readily to trigonometric form. For two-dimensional problems, the procedure introduces two numerical parameters into the analysis. Engineering approaches for the selection of these parameters are presented and the analysis procedure is demonstrated by application to several isotropic and orthotropic panel buckling problems. Among these problems is the shear buckling of stiffened isotropic and filamentary composite panels in which the stiffener is broken. Results indicate that a break may degrade the effect of the stiffener to the extent that the panel will not carry much more load than if the stiffener were absent.
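The convergence advantage described above can be illustrated on the second-derivative approximation. The trigonometric form below is an illustrative analogue built from an assumed mode wavenumber β (playing the role of one of the numerical parameters mentioned), not necessarily the report's exact difference formula:

```latex
% Conventional central difference, exact for polynomials through cubic order:
f''(x_i) \approx \frac{f_{i+1} - 2 f_i + f_{i-1}}{h^2}
% Trigonometric analogue, exact for modes f = e^{\pm i \beta x}:
f''(x_i) \approx \frac{-\beta^2 \, (f_{i+1} - 2 f_i + f_{i-1})}{2\cos(\beta h) - 2}
% As \beta h \to 0, 2\cos(\beta h) - 2 \to -\beta^2 h^2, recovering the
% conventional formula; tuning \beta toward the expected buckle-mode
% wavelength is what accelerates convergence.
```

Because both forms act on the same stencil \((f_{i-1}, f_i, f_{i+1})\), an existing conventional finite difference formulation can be converted to trigonometric form by rescaling the difference coefficients, consistent with the statement above.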
Kim, Hyunji; Cha, Joo Hee; Oh, Ha-Yeun; Kim, Hak Hee; Shin, Hee Jung; Chae, Eun Young
2014-07-01
To compare the performance of radiologists in the use of conventional ultrasound (US) and automated breast volume ultrasound (ABVU) for the characterization of benign and malignant solid breast masses based on Breast Imaging Reporting and Data System (BI-RADS) criteria. Conventional US and ABVU images were obtained in 87 patients with 106 solid breast masses (52 cancers, 54 benign lesions). Three experienced radiologists who were blinded to all examination results independently characterized the lesions and reported a BI-RADS assessment category and a level of suspicion of malignancy. The results were analyzed by calculation of Cohen's κ coefficient and by receiver operating characteristic (ROC) analysis. Assessment of the agreement of conventional US and ABVU indicated that the posterior echo feature was the most discordant of the seven features (κ = 0.371 ± 0.225) and that orientation had the greatest agreement (κ = 0.608 ± 0.210). The final assessment showed substantial agreement (κ = 0.773 ± 0.104). The areas under the ROC curves (Az) for conventional US and ABVU were not statistically significantly different for each reader, but the mean Az values of conventional US and ABVU by multi-reader multi-case analysis were significantly different (conventional US 0.991, ABVU 0.963; 95% CI -0.0471 to -0.0097). The means for sensitivity, specificity, positive predictive value, and negative predictive value of conventional US and ABVU did not differ significantly. There was substantial inter-observer agreement in the final assessment of solid breast masses by conventional US and ABVU. ROC analysis comparing the performance of conventional US and ABVU indicated a marginally significant difference in mean Az, but not in mean sensitivity, specificity, positive predictive value, or negative predictive value.
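The Cohen's κ agreement statistic used above is chance-corrected agreement between two sets of categorical ratings. A minimal sketch follows; the two rating lists are hypothetical, not the study's BI-RADS assessments.

```python
# Sketch of Cohen's kappa for agreement on categorical assessments such as
# BI-RADS categories; the two rating lists below are hypothetical.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters/methods."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(count_a[k] * count_b[k]
                   for k in count_a.keys() | count_b.keys()) / n ** 2
    return (observed - expected) / (1 - expected)

us_categories = [3, 4, 4, 2, 5, 4, 3, 2, 5, 4]     # hypothetical categories
abvu_categories = [3, 4, 3, 2, 5, 4, 4, 2, 5, 4]

print(f"kappa = {cohens_kappa(us_categories, abvu_categories):.3f}")
```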
Lab-on-a-chip based total-phosphorus analysis device utilizing a photocatalytic reaction
NASA Astrophysics Data System (ADS)
Jung, Dong Geon; Jung, Daewoong; Kong, Seong Ho
2018-02-01
A lab-on-a-chip (LOC) device for total phosphorus (TP) analysis was fabricated for water quality monitoring. Many commercially available TP analysis systems used to estimate water quality have good sensitivity and accuracy. However, these systems also have many disadvantages such as bulky size, complex pretreatment processes, and high cost, which limit their application. In particular, conventional TP analysis systems require an indispensable pretreatment step, in which the fluidic analyte is heated to 120 °C for 30 min to release the dissolved phosphate, because many phosphates are soluble in water at a standard temperature and pressure. In addition, this pretreatment process requires elevated pressures of up to 1.1 kg cm-2 in order to prevent the evaporation of the heated analyte. Because of these limiting conditions required by the pretreatment processes used in conventional systems, it is difficult to miniaturize TP analysis systems. In this study, we employed a photocatalytic reaction in the pretreatment process. The reaction was carried out by illuminating a photocatalytic titanium dioxide (TiO2) surface formed in a microfluidic channel with ultraviolet (UV) light. This pretreatment process does not require elevated temperatures and pressures. By applying this simplified, photocatalytic-reaction-based pretreatment process to a TP analysis system, greater degrees of freedom are conferred to the design and fabrication of LOC devices for TP monitoring. The fabricated LOC device presented in this paper was characterized by measuring the TP concentration of an unknown sample, and comparing the results with those measured by a conventional TP analysis system. The TP concentrations of the unknown sample measured by the proposed LOC device and the conventional TP analysis system were 0.018 mgP/25 mL and 0.019 mgP/25 mL, respectively. 
The experimental results revealed that the proposed LOC device had a performance comparable to the conventional bulky TP analysis system. Therefore, our device could be directly employed in water quality monitoring as an alternative to conventional TP analysis systems.
Paper-Plastic Hybrid Microfluidic Device for Smartphone-Based Colorimetric Analysis of Urine.
Jalal, Uddin M; Jin, Gyeong Jun; Shim, Joon S
2017-12-19
In this work, a disposable paper-plastic hybrid microfluidic lab-on-a-chip (LOC) has been developed and successfully applied to the colorimetric measurement of urine on a smartphone-based optical platform using a "UrineAnalysis" Android app. The developed device was cost-effectively implemented as a stand-alone hybrid LOC by incorporating a conventional paper-based reagent test strip inside the plastic-based LOC microchannel. The LOC device quantitatively investigated a small volume (40 μL) of urine analyte for the colorimetric reactions of glucose, protein, pH, and red blood cells (RBC), in integration with a finger-actuated micropump. In our experiments, the conventional urine strip showed large deviations as the reaction time elapsed, because dipping the strip sensor in a bottle of urine cannot control the reaction volume. By integrating the strip sensor in the LOC device for urine analysis, our device significantly improves the time-dependent inconstancy of the conventional dipstick-based urine strip, and the smartphone app used for image analysis enhances the visual assessment of the test strip, which is a major user concern for colorimetric analysis in point-of-care (POC) applications. As a result, the user-friendly LOC, successfully implemented in a disposable format with the smartphone-based optical platform, may be applicable as an effective tool for rapid and qualitative POC urinalysis.
Tracking B-Cell Repertoires and Clonal Histories in Normal and Malignant Lymphocytes.
Weston-Bell, Nicola J; Cowan, Graeme; Sahota, Surinder S
2017-01-01
Methods for tracking B-cell repertoires and clonal history in normal and malignant B-cells based on immunoglobulin variable region (IGV) gene analysis have developed rapidly with the advent of massive parallel next-generation sequencing (mpNGS) protocols. mpNGS permits a depth of analysis of IGV genes not hitherto feasible, and presents challenges of bioinformatics analysis, which can be readily met by current pipelines. This strategy offers a potential resolution of B-cell usage at a depth that may capture fully the natural state, in a given biological setting. Conventional methods based on RT-PCR amplification and Sanger sequencing are also available where mpNGS is not accessible. Each method offers distinct advantages. Conventional methods for IGV gene sequencing are readily adaptable to most laboratories and provide an ease of analysis to capture salient features of B-cell use. This chapter describes two methods in detail for analysis of IGV genes, mpNGS and conventional RT-PCR with Sanger sequencing.
A Stirling engine analysis method based upon moving gas nodes
NASA Technical Reports Server (NTRS)
Martini, W. R.
1986-01-01
A Lagrangian nodal analysis method for Stirling engines (SEs) is described, validated, and applied to a conventional SE and an isothermalized SE (with fins in the hot and cold spaces). The analysis employs a constant-mass gas node (which moves with respect to the solid nodes during each time step) instead of the fixed gas nodes of Eulerian analysis. The isothermalized SE is found to have efficiency only slightly greater than that of a conventional SE.
NASA Astrophysics Data System (ADS)
Yusa, Yasunori; Okada, Hiroshi; Yamada, Tomonori; Yoshimura, Shinobu
2018-04-01
A domain decomposition method for large-scale elastic-plastic problems is proposed. The proposed method is based on a quasi-Newton method in conjunction with a balancing domain decomposition preconditioner. The use of a quasi-Newton method overcomes two problems associated with the conventional domain decomposition method based on the Newton-Raphson method: (1) it avoids a double-loop iteration algorithm, which generally has large computational complexity, and (2) it accounts for the local concentration of nonlinear deformation observed in elastic-plastic problems with stress concentration. Moreover, the application of a balancing domain decomposition preconditioner ensures scalability. Using the conventional and proposed domain decomposition methods, several numerical tests, including weak scaling tests, were performed. The convergence performance of the proposed method is comparable to that of the conventional method. In particular, in elastic-plastic analysis, the proposed method exhibits better convergence performance than the conventional method.
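The core quasi-Newton idea exploited above can be sketched generically: update an approximate inverse Jacobian from secant pairs instead of re-assembling and re-factorizing a tangent matrix at every iteration. The toy BFGS solver below illustrates only that idea on a small nonlinear residual; it is not the paper's preconditioned domain-decomposition code, and the test function is an assumption.

```python
# Generic BFGS quasi-Newton sketch (no line search, small dense problem).
import numpy as np

def quasi_newton(grad, x0, iters=50, tol=1e-10):
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        x_new = x - H @ g              # quasi-Newton step
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        s, y = x_new - x, g_new - g
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)     # BFGS update of the inverse Hessian
        x, g = x_new, g_new
    return x

# Mildly nonlinear residual: gradient of f(x) = ||x - 1||^2 + 0.1 * sum(x^4)
grad = lambda x: 2.0 * (x - 1.0) + 0.4 * x ** 3
x_star = quasi_newton(grad, np.zeros(3))
print(x_star)   # all components converge to the same root of the residual
```

The Newton-Raphson alternative would rebuild and solve with the exact tangent (here, the Hessian) each step; the secant update sidesteps that cost, which is the motivation stated above.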
Otanicar, Todd P; Golden, Jay S
2009-08-01
This study compares the environmental and economic impacts of using nanofluids to enhance solar collector efficiency, as compared to conventional solar collectors, for domestic hot water systems. Results show that for the current cost of nanoparticles, the nanofluid-based solar collector has a slightly longer payback period, but at the end of its useful life it has the same economic savings as a conventional solar collector. The nanofluid-based collector has a lower embodied energy (approximately 9%) and approximately 3% higher levels of pollution offsets than a conventional collector. In addition, if 50% penetration of residential nanofluid-based solar collector systems for hot water heating could be achieved in Phoenix, Arizona, over 1 million metric tons of CO2 would be offset per year.
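The payback comparison above amounts to dividing up-front cost by annual savings. The sketch below uses hypothetical placeholder costs that only mimic the qualitative finding (slightly longer payback for the nanofluid collector); they are not the study's figures.

```python
# Simple payback comparison for two collector options (hypothetical costs).

def payback_years(capital_cost, annual_savings):
    """Years of operation needed for savings to recover the up-front cost."""
    return capital_cost / annual_savings

options = {
    "conventional": {"capital": 2000.0, "annual_savings": 300.0},
    "nanofluid": {"capital": 2200.0, "annual_savings": 300.0},  # dearer up front
}

for name, o in options.items():
    years = payback_years(o["capital"], o["annual_savings"])
    print(f"{name}: payback = {years:.1f} years")
```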
Laursen, K H; Mihailova, A; Kelly, S D; Epov, V N; Bérail, S; Schjoerring, J K; Donard, O F X; Larsen, E H; Pedentchouk, N; Marca-Bell, A D; Halekoh, U; Olesen, J E; Husted, S
2013-12-01
Novel procedures for analytical authentication of organic plant products are urgently needed. Here we present the first study encompassing stable isotopes of hydrogen, carbon, nitrogen, oxygen, magnesium and sulphur, as well as compound-specific nitrogen and oxygen isotope analysis of nitrate, for discrimination of organically and conventionally grown plants. The study was based on wheat, barley, faba bean and potato produced in rigorously controlled long-term field trials comprising 144 experimental plots. Nitrogen isotope analysis revealed the use of animal manure, but was unable to discriminate between plants that were fertilised with synthetic nitrogen fertilisers or green manures from atmospheric nitrogen-fixing legumes. This limitation was bypassed using oxygen isotope analysis of nitrate in potato tubers, while hydrogen isotope analysis allowed complete discrimination of organic and conventional wheat and barley grains. It is concluded that multi-isotopic analysis has the potential to disclose fraudulent substitution of organic with conventionally cultivated plants. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Yunhua; Jones, Susanne B.; Biddy, Mary J.
2012-08-01
This study reports the comparison of biomass gasification based syngas-to-distillate (S2D) systems using techno-economic analysis (TEA). Three cases, a state of technology (SOT) case, a goal case, and a conventional case, were compared in terms of performance and cost. The SOT case and goal case represent technology being developed at Pacific Northwest National Laboratory for a process starting with syngas using a single-step dual-catalyst reactor for distillate generation (S2D process). The conventional case mirrors the two-step S2D process previously utilized and reported by Mobil using natural gas feedstock and consisting of separate syngas-to-methanol and methanol-to-gasoline (MTG) processes. Analysis of the three cases revealed that the goal case could indeed reduce fuel production cost over the conventional case, but that the SOT case was still more expensive than the conventional one. The SOT case suffers from low one-pass yield and high selectivity to light hydrocarbons, both of which drive up production cost. Sensitivity analysis indicated that light hydrocarbon yield, single-pass conversion efficiency, and reactor space velocity are the key factors driving the high cost of the SOT case.
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, Von Mises stress, fatigue, etc., following different design bases and approaches, to provide some guidance for further improvements in ship structural design.
The 2005 meta-analysis of homeopathy: the importance of post-publication data.
Rutten, A L B; Stolper, C F
2008-10-01
There is a discrepancy between the outcome of a meta-analysis of 89 trials of homeopathy published in 1997 by Linde et al and an analysis of 110 trials by Shang et al published in 2005; the two reached opposite conclusions. Important data were not mentioned in Shang et al's paper, but only provided subsequently. What was the outcome of Shang et al's predefined hypotheses? Were the homeopathic and conventional trials comparable? Was subgroup selection justified? What was the possible role of ineffective treatments? Was the conclusion about effect justified? Were essential data missing in the original article? Analysis of post-publication data. Re-extraction and analysis of 21 higher quality trials selected by Shang et al, with sensitivity analysis for the influence of single indications. Analysis of comparability. Sensitivity analysis of the influence of subjective choices, such as quality of single indications and cut-off values for 'larger samples'. The quality of trials of homeopathy was better than that of conventional trials. Regarding smaller trials, homeopathy accounted for 14 out of 83 and conventional medicine for 2 out of 78 good quality trials with n<100. There was selective inclusion of unpublished trials only for homeopathy. Quality was assessed differently from previous analyses. Selecting subgroups on sample size and quality caused incomplete matching of homeopathy and conventional trials. Cut-off values for larger trials differed between homeopathy and conventional medicine without plausible reason. Sensitivity analyses for the influence of heterogeneity and the cut-off value for 'larger higher quality studies' were missing. Homeopathy is not effective for muscle soreness after long distance running, OR=1.30 (95% CI 0.96-1.76). The subset of homeopathy trials on which the conclusion was based was heterogeneous, comprising 8 trials on 8 different indications, and was not matched on indication with those of conventional medicine.
Essential data were missing in the original paper. Re-analysis of Shang's post-publication data did not support the conclusion that homeopathy is a placebo effect. The conclusion that homeopathy is, and that conventional medicine is not, a placebo effect was not based on comparative analysis and was not justified because of heterogeneity and lack of sensitivity analysis. If we confine ourselves to the predefined hypotheses and the part of the analysis that is indeed comparative, the conclusion should be that the quality of homeopathic trials is better than that of conventional trials, for all trials (p=0.03) as well as for smaller trials (p=0.003).
Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L
2016-02-10
Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure. Copyright © 2015 John Wiley & Sons, Ltd.
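The two-dimensional Monte Carlo idea described above can be sketched as follows: each outer realization draws one shared (systematic) error applied to all subjects plus an independent unshared error per subject, producing one full dose vector. The geometric standard deviations, doses, and lognormal error model below are illustrative assumptions, not parameters of the Kazakhstan cohort.

```python
# Sketch of two-dimensional Monte Carlo dose simulation separating shared
# and unshared uncertainty; all parameters are illustrative assumptions.
import math
import random

def simulate_dose_vectors(best_doses, n_realizations,
                          shared_gsd=1.3, unshared_gsd=1.5, seed=42):
    rng = random.Random(seed)
    vectors = []
    for _ in range(n_realizations):
        # One shared multiplicative error per realization (outer loop) ...
        shared = rng.lognormvariate(0.0, math.log(shared_gsd))
        # ... times an independent unshared error per subject (inner loop).
        vectors.append([d * shared * rng.lognormvariate(0.0, math.log(unshared_gsd))
                        for d in best_doses])
    return vectors

best_doses = [0.1, 0.5, 1.0, 2.0]     # best-estimate doses (Gy), hypothetical
dose_vectors = simulate_dose_vectors(best_doses, n_realizations=1000)
# Each vector is one jointly plausible dose assignment; the risk model is
# then fitted across vectors and the fits combined (model averaging).
```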
Savić, Snezana; Tamburić, Slobodanka; Savić, Miroslav M
2010-03-01
Surfactants play an important role in the development of both conventional and advanced (colloidal) drug delivery systems. There are several commercial surfactants, but a proportionally small group of them is approved as pharmaceutical excipients, recognized in various pharmacopoeias and therefore widely accepted by the pharmaceutical industry. The review covers some of the main categories of natural, sugar-based surfactants (alkyl polyglucosides and sugar esters) as prospective pharmaceutical excipients. It provides analysis of the physicochemical characteristics of sugar-based surfactants and their possible roles in the design of conventional or advanced drug delivery systems for different routes of administration. Summary and analysis of recent data on functionality, applied concentrations and formulation improvements produced by alkyl polyglucosides and sugar esters in different conventional and advanced delivery systems could be of interest to researchers dealing with drug formulation. Recent FDA certification of an alkyl polyglucoside surfactant for topical formulation presents a significant step in the process of recognition of this relatively new group of surfactants. This could trigger further research into the potential benefits of naturally derived materials in both conventional and new drug delivery systems.
Thermal Energy Storage using PCM for Solar Domestic Hot Water Systems: A Review
NASA Astrophysics Data System (ADS)
Khot, S. A.; Sane, N. K.; Gawali, B. S.
2012-06-01
Thermal energy storage using phase change materials (PCM) has received considerable attention in the past two decades for time-dependent energy sources such as solar energy. Several experimental and theoretical analyses made to assess the performance of thermal energy storage systems have demonstrated that PCM-based systems are reliable and viable options. This paper covers such information on PCMs and PCM-based systems developed for the application of solar domestic hot water systems. In addition, economic analysis of a thermal storage system using PCM in comparison with a conventional storage system helps to validate its commercial possibility. From the economic analysis, it is found that a PCM-based solar water heating system (SWHS) provides 23% more cumulative and life cycle savings than a conventional SWHS and will continue to perform efficiently even after 15 years due to the application of a non-metallic tank. The payback period of the PCM-based system is also less than that of the conventional system. In conclusion, PCM-based solar water heating systems can meet the requirements of the Indian climatic situation in a cost-effective and reliable manner.
Porwal, Anand; Khandelwal, Meenakshi; Punia, Vikas; Sharma, Vivek
2017-01-01
Aim: The purpose of this study was to evaluate the effect of different denture cleansers on the color stability, surface hardness, and roughness of different denture base resins. Materials and Methods: Three denture base resin materials (conventional heat cure resin, high impact resin, and polyamide denture base resin) were immersed for 180 days in two commercially available denture cleansers (sodium perborate and sodium hypochlorite). Color, surface roughness, and hardness were measured for each sample before and after the immersion procedure. Statistical Analysis: One-way analysis of variance and Tukey's post hoc honestly significant difference test were used to evaluate color, surface roughness, and hardness data before and after immersion in denture cleanser (α = 0.05). Results: All denture base resins tested exhibited a change in color, surface roughness, and hardness to some degree in both denture cleansers. Polyamide resin immersed in sodium perborate showed the maximum change in color after immersion for 180 days. Conventional heat cure resin immersed in sodium hypochlorite showed the maximum change in surface roughness, and conventional heat cure resin immersed in sodium perborate showed the maximum change in hardness. Conclusion: Color changes of all denture base resins were within the clinically accepted range for color difference. The surface roughness change of conventional heat cure resin was not within the clinically accepted range. The choice of denture cleanser for different denture base resins should be based on the chemistry of the resin and cleanser, the denture cleanser concentration, and the duration of immersion. PMID:28216847
Sanyal, Parikshit; Ganguli, Prosenjit; Barui, Sanghita; Deb, Prabal
2018-01-01
The Pap stained cervical smear is a screening tool for cervical cancer. Commercial systems are used for automated screening of liquid-based cervical smears; however, there is no image analysis software used for conventional cervical smears. The aim of this study was to develop and test the diagnostic accuracy of a software for the analysis of conventional smears. The software was developed using the Python programming language and open source libraries, and was standardized with images from the Bethesda Interobserver Reproducibility Project. One hundred and thirty images from smears reported as Negative for Intraepithelial Lesion or Malignancy (NILM), and 45 images in which some abnormality had been reported, were collected from the archives of the hospital, and the software was then tested on these images. The software was able to segregate images based on overall nuclear:cytoplasmic ratio, coefficient of variation (CV) in nuclear size, nuclear membrane irregularity, and clustering. It flagged 68.88% of abnormal images, as well as 19.23% of NILM images. The major difficulties faced were segmentation of overlapping cell clusters and separation of neutrophils. The software shows potential as a screening tool for conventional cervical smears; however, further refinement of the technique is required.
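Two of the flagging criteria named above, overall nuclear:cytoplasmic ratio and CV in nuclear size, can be sketched directly from per-cell segmentation areas. The segmentation itself (the hard part, per the abstract) is omitted, and the thresholds and helper names below are illustrative assumptions, not the actual software's values:

```python
# Sketch of morphometric flagging from per-cell nuclear and cytoplasmic
# pixel areas. Segmentation is assumed already done; thresholds are
# illustrative placeholders, not those of the software in the abstract.
import statistics

def nc_ratio(nuclear_areas, cytoplasm_areas):
    """Overall nuclear:cytoplasmic area ratio for one image."""
    return sum(nuclear_areas) / sum(cytoplasm_areas)

def nuclear_size_cv(nuclear_areas):
    """Coefficient of variation of nuclear size (spread / mean)."""
    return statistics.stdev(nuclear_areas) / statistics.mean(nuclear_areas)

def flag_image(nuclear_areas, cytoplasm_areas,
               nc_threshold=0.25, cv_threshold=0.40):
    """Flag an image whose N:C ratio or nuclear-size CV is abnormally high."""
    return (nc_ratio(nuclear_areas, cytoplasm_areas) > nc_threshold
            or nuclear_size_cv(nuclear_areas) > cv_threshold)
```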
Linear discriminant analysis based on L1-norm maximization.
Zhong, Fujin; Zhang, Jiashu
2013-08-01
Linear discriminant analysis (LDA) is a well-known dimensionality reduction technique that is widely used for many purposes. However, conventional LDA is sensitive to outliers because its objective function is based on an L2-norm distance criterion. This paper proposes a simple but effective robust LDA variant based on L1-norm maximization, which learns a set of locally optimal projection vectors by maximizing the ratio of the L1-norm-based between-class dispersion to the L1-norm-based within-class dispersion. The proposed method is theoretically proved to be feasible and robust to outliers while overcoming the singularity problem of the within-class scatter matrix in conventional LDA. Experiments on artificial datasets, standard classification datasets, and three popular image databases demonstrate the efficacy of the proposed method.
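The objective described, maximizing the ratio of L1-norm between-class to within-class dispersion, can be sketched as gradient ascent for a single projection vector. This is an illustrative reconstruction under that objective, not the authors' exact algorithm:

```python
# Sketch: learn one projection vector w maximizing the ratio of
# L1 between-class to L1 within-class dispersion by gradient ascent.
# Illustrative reconstruction of the stated objective, not the paper's
# exact procedure.
import numpy as np

def _dispersions(w, X, y):
    """L1 between/within dispersions and their gradients w.r.t. w."""
    between, within = 0.0, 0.0
    g_between, g_within = np.zeros_like(w), np.zeros_like(w)
    m = X.mean(axis=0)
    for c in np.unique(y):
        Xc = X[y == c]
        d = Xc.mean(axis=0) - m                 # class-mean offset
        between += len(Xc) * abs(w @ d)
        g_between += len(Xc) * np.sign(w @ d) * d
        centered = Xc - Xc.mean(axis=0)
        proj = centered @ w                     # within-class projections
        within += np.sum(np.abs(proj))
        g_within += np.sign(proj) @ centered
    return between, within, g_between, g_within

def l1_lda_fit(X, y, n_iter=200, lr=0.05, seed=0):
    """Gradient ascent on between/within ratio; w kept unit-norm."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        b, wn, gb, gw = _dispersions(w, X, y)
        w = w + lr * (gb * wn - b * gw) / wn ** 2   # quotient rule on b/wn
        w /= np.linalg.norm(w)
    return w
```

On a two-class problem separated along one axis, the learned vector aligns with the separating direction; because the dispersions use absolute values rather than squares, a few extreme outliers contribute linearly rather than quadratically, which is the robustness argument in the abstract.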
Assessing a novel polymer-wick based electrode for EEG neurophysiological research.
Pasion, Rita; Paiva, Tiago O; Pedrosa, Paulo; Gaspar, Hugo; Vasconcelos, Beatriz; Martins, Ana C; Amaral, Maria H; Nóbrega, João M; Páscoa, Ricardo; Fonseca, Carlos; Barbosa, Fernando
2016-07-15
The EEG technique has decades of valid applications in clinical and experimental neurophysiology. EEG equipment and data analysis methods have seen remarkable developments, but the skin-to-electrode signal transfer remains a challenge for EEG recording. A novel quasi-dry system, the polymer wick-based electrode, was developed to overcome the limitations of conventional dry and wet silver/silver-chloride (Ag/AgCl) electrodes for EEG recording. Nine participants completed an auditory oddball protocol with simultaneous EEG acquisition using both the conventional Ag/AgCl and the wick electrodes. The wick system successfully recorded the expected P300 modulation. Standard ERP analysis, residual random noise analysis, and single-trial analysis of the P300 wave were performed in order to compare the signals acquired by both electrodes. The novel wick electrode was found to perform similarly to the conventional Ag/AgCl electrodes and appears to be a reliable alternative for EEG research, representing a promising halfway point between wet and dry electrodes. Copyright © 2016 Elsevier B.V. All rights reserved.
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
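A self-organizing map of the kind this workflow is built on can be sketched in a few lines. The grid size, Gaussian neighborhood, and linear decay schedules below are illustrative assumptions rather than the workflow's actual settings:

```python
# Minimal self-organizing map sketch: a grid of units whose weight vectors
# are pulled toward training samples, so correlated features land on
# nearby units. Grid size, neighborhood, and decay rates are illustrative.
import numpy as np

def train_som(X, rows=5, cols=5, n_iter=500, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    weights = rng.normal(size=(rows, cols, X.shape[1]))
    # (row, col) coordinates of every unit on the grid
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * (1 - frac)                  # learning rate decays to 0
        sigma = sigma0 * (1 - frac) + 0.5      # neighborhood shrinks to 0.5
        x = X[rng.integers(len(X))]
        # best-matching unit: node whose weight vector is closest to x
        bmu = np.unravel_index(
            np.argmin(((weights - x) ** 2).sum(axis=-1)), (rows, cols))
        # Gaussian neighborhood pull toward x, strongest at the BMU
        dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * h * (x - weights)
    return weights

def bmu_of(weights, x):
    """Grid coordinates of the best-matching unit for a sample x."""
    rows, cols, _ = weights.shape
    return np.unravel_index(
        np.argmin(((weights - x) ** 2).sum(axis=-1)), (rows, cols))
```

After training, samples from distinct metabolic states map to different regions of the grid, which is what makes the "Gestalt" heat-map comparison described above possible.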
Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Roberts, Larry W.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Gajjar, Ketan; Ahmadzai, Abdullah A.; Valasoulis, George; Trevisan, Júlio; Founta, Christina; Nasioutziki, Maria; Loufopoulos, Aristotelis; Kyrgiou, Maria; Stasinou, Sofia Melina; Karakitsos, Petros; Paraskevaidis, Evangelos; Da Gama-Rose, Bianca; Martin-Hirsch, Pierre L.; Martin, Francis L.
2014-01-01
Background: Subjective visual assessment of cervical cytology is flawed, and this can manifest itself in inter- and intra-observer variability, resulting ultimately in discordance between the grading categorisation of samples in screening and the representative histology. Biospectroscopy methods have been suggested as sensor-based tools that can deliver objective assessments of cytology. However, studies to date have apparently been flawed by a corresponding lack of diagnostic efficiency when samples had previously been classed using cytology screening. This raises the question as to whether categorisation of cervical cytology based on imperfect conventional screening reduces the apparent diagnostic accuracy of biospectroscopy approaches; might these latter methods in fact be more accurate and diagnose underlying disease? The purpose of this study was to compare the objective accuracy of infrared (IR) spectroscopy of cervical cytology samples using conventional cytology vs. histology-based categorisation. Methods: Within a typical clinical setting, a total of n = 322 liquid-based cytology samples were collected immediately before biopsy. Of these, it was possible to acquire subsequent histology for n = 154. Cytology samples were categorised according to conventional screening methods and subsequently interrogated employing attenuated total reflection Fourier-transform IR (ATR-FTIR) spectroscopy. IR spectra were pre-processed and analysed using linear discriminant analysis. Dunn's test was applied to identify the differences in spectra. Within the diagnostic categories, histology allowed us to determine the comparative efficiency of conventional screening vs. biospectroscopy to correctly identify either true atypia or underlying disease. Results: Conventional cytology-based screening results in poor sensitivity and specificity. IR spectra derived from cervical cytology do not appear to discriminate in a diagnostic fashion when categories are based on conventional screening.
Scores plots of IR spectra exhibit marked crossover of spectral points between different cytological categories. Although significant differences between spectral bands in different categories are noted, crossover samples point to the potential for poor specificity and hamper the development of biospectroscopy as a diagnostic tool. However, when histology-based categories are used to conduct the analyses, the scores plots of IR spectra exhibit markedly better segregation. Conclusions: Histology demonstrates that ATR-FTIR spectroscopy of liquid-based cytology identifies the presence of underlying atypia or disease missed in conventional cytology screening. This study points to an urgent need for a future biospectroscopy study in which categories are based on such histology, allowing the validation of this approach as a screening tool. PMID:24404130
Yang, S; Liu, D G
2014-01-01
Objectives: The purposes of the study were to investigate the consistency of linear measurements between CBCT orthogonally synthesized cephalograms and conventional cephalograms, and to evaluate the influence of different magnifications on these comparisons based on a simulation algorithm. Methods: Conventional cephalograms and CBCT scans were taken of 12 dry skulls with spherical metal markers. Orthogonally synthesized cephalograms were created from the CBCT data. Linear parameters on both cephalograms were measured via Photoshop CS v. 5.0 (Adobe® Systems, San Jose, CA), termed the measurement group (MG). Bland–Altman analysis was utilized to assess the agreement between the two imaging modalities, and reproducibility was investigated using a paired t-test. With a specific mathematical programme, “cepha”, corresponding linear parameters [mandibular corpus length (Go-Me), mandibular ramus length (Co-Go), posterior facial height (Go-S)] on these two types of cephalograms were calculated, termed the simulation group (SG). Bland–Altman analysis was used to assess the agreement between MG and SG. Simulated linear measurements with varying magnifications were generated based on “cepha” as well, and Bland–Altman analysis was used to assess the agreement of the simulated measurements between the two modalities. Results: Bland–Altman analysis suggested agreement between measurements on conventional cephalograms and orthogonally synthesized cephalograms, with a mean bias of 0.47 mm. Comparison between MG and SG showed that the difference did not reach clinical significance. The consistency between simulated measurements of both modalities at four different magnifications was demonstrated. Conclusions: Normative data from conventional cephalograms could be used for CBCT orthogonally synthesized cephalograms during this transitional period. PMID:25029593
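The Bland–Altman agreement analysis used throughout this study reduces to the mean bias (such as the 0.47 mm reported) and the 95% limits of agreement of the paired differences. A minimal sketch:

```python
# Minimal Bland–Altman sketch: mean bias and 95% limits of agreement
# between paired measurements from two modalities.
import statistics

def bland_altman(a, b):
    """Return (bias, (lower_limit, upper_limit)) for paired lists a, b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)                  # mean difference
    sd = statistics.stdev(diffs)                   # spread of differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A small bias with narrow limits of agreement, as here, supports treating the two modalities as interchangeable for these measurements.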
NASA Astrophysics Data System (ADS)
Nizamuddin, M.; Loan, Sajad A.; Alamoud, Abdul R.; Abbassi, Shuja A.
2015-10-01
In this work, the design and calibrated simulation of carbon nanotube field-effect transistor (CNTFET)-based cascode operational transconductance amplifiers (COTAs) have been performed. Three structures of CNTFET-based COTAs have been designed using HSPICE and compared with conventional CMOS-based COTAs. The proposed COTAs include one using pure CNTFETs and two others that employ CNTFETs as well as conventional MOSFETs. The simulation study revealed that the CNTFET-based COTAs significantly outperform the conventional MOSFET-based COTAs. Significant increases in dc gain, output resistance, and slew rate of 81.4%, 25%, and 13.2%, respectively, have been achieved in the proposed pure CNT-based COTA in comparison to the conventional CMOS-based COTA. The power consumption of the pure CNT COTA is 324 times lower than that of the conventional CMOS COTA. Further, the phase margin (PM), gain margin (GM), and common-mode and power-supply rejection ratios have been significantly increased in the proposed CNT-based COTAs in comparison to the conventional CMOS-based COTAs. Furthermore, to see the advantage of cascoding, the proposed CNT-based cascode OTAs have been compared with CNT-based OTAs; by incorporating cascoding in the CNTFET-based OTAs, significant increases in gain (12.5%) and output resistance (13.07%) have been achieved. The performance of the proposed COTAs has been further observed by changing the number of CNTs (N), CNT pitch (S), and CNT diameter (DCNT) in the CNTFETs used, and it can be significantly improved by using optimum values of N, S, and DCNT.
Retention of denture bases fabricated by three different processing techniques – An in vivo study
Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen
2016-01-01
Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the amount of retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them using a retention apparatus. Readings were subjected to the nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g), with significant differences between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, with the least retention seen in the conventional molding technique. PMID:27382542
A Progressive Approach to Discrete Trial Teaching: Some Current Guidelines
ERIC Educational Resources Information Center
Leaf, Justin B.; Cihon, Joseph H.; Leaf, Ronald; McEachin, John; Taubman, Mitchell
2016-01-01
Discrete trial teaching (DTT) is one of the cornerstones of applied behavior analysis (ABA) based interventions. Conventionally, DTT is commonly implemented within a prescribed, fixed manner in which the therapist is governed by a strict set of rules. In contrast to conventional DTT, a progressive approach to DTT allows the therapist to remain…
NASA Astrophysics Data System (ADS)
Chauhan, H.; Krishna Mohan, B.
2014-11-01
The present study was undertaken with the objective of checking the effectiveness of spectral similarity measures for developing precise crop spectra from collected hyperspectral field spectra. In multispectral and hyperspectral remote sensing, classification of pixels is obtained by statistical comparison (by means of spectral similarity) of known field or library spectra to unknown image spectra. Though these algorithms are readily used, little emphasis has been placed on using spectral similarity measures to select precise crop spectra from a set of field spectra. Conventionally, crop spectra are developed after rejecting outliers based only on broad-spectrum analysis. Here a successful attempt has been made to develop precise crop spectra based on spectral similarity. Because the use of unevaluated data leads to uncertainty in image classification, it is crucial to evaluate the data; hence, in contrast to the conventional method, precise selection of field spectra was performed to serve the purpose of the present research. The effectiveness of the precisely developed field spectra was evaluated by spectral discrimination measures, which yielded higher discrimination values than for spectra developed conventionally. Overall classification accuracy is 51.89% for the image classified by conventionally selected field spectra and 75.47% for the image classified by field spectra selected precisely based on spectral similarity; KHAT values are 0.37 and 0.62, and Z values are 2.77 and 9.59, respectively. The considerably higher classification accuracy, KHAT, and Z values show the promise of a new approach to field spectra selection based on spectral similarity measures.
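One widely used spectral similarity measure is the spectral angle. A sketch of scoring field spectra against their mean and rejecting outliers before averaging; the choice of measure and the threshold are illustrative assumptions, not necessarily those used in the study:

```python
# Sketch: reject outlier field spectra by spectral angle to the mean,
# then average the survivors into a "precise" crop spectrum.
# The spectral-angle measure and threshold are illustrative choices.
import math

def spectral_angle(a, b):
    """Angle (radians) between two spectra; 0 means identical shape."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def precise_mean_spectrum(spectra, threshold=0.1):
    """Mean of spectra whose angle to the overall mean is below threshold."""
    n = len(spectra[0])
    mean = [sum(s[i] for s in spectra) / len(spectra) for i in range(n)]
    kept = [s for s in spectra if spectral_angle(s, mean) < threshold]
    return [sum(s[i] for s in kept) / len(kept) for i in range(n)]
```

Because the spectral angle ignores overall magnitude, spectra that differ only by an illumination scale factor score as identical, which is why angle-based measures are popular for field spectra.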
Hosono, Satoyo; Terasawa, Teruhiko; Katayama, Takafumi; Sasaki, Seiju; Hoshi, Keika; Hamashima, Chisato
2018-04-01
The Bethesda system (TBS) has been used for cervical cytological diagnosis in Japan since 2008. Evaluation of specimen adequacy is the most important aspect of quality assurance and precise diagnosis in TBS. A systematic review and meta-analysis were carried out to assess the unsatisfactory specimen rate in the primary cervical cancer screening setting in Japan. Ovid Medline and Ichushi-Web databases were searched from inception through to May 2017. Prospective and retrospective studies that reported the proportion of unsatisfactory specimens in healthy asymptomatic Japanese women in a cervical cancer screening program were eligible for inclusion; 17 studies were included in the meta-analysis. The random-effects model meta-analysis calculated summary estimates of the unsatisfactory rate of 0.60% (95% confidence interval [CI], 0.18–1.96%; I² = 99%) for conventional cytology and 0.04% (95% CI, 0.00–0.35%; I² = 99%) for liquid-based cytology (LBC). However, comparative results between conventional and liquid-based cytology, based on four direct and nine comparative studies, showed no significant difference (summary odds ratio = 3.5 × 10⁻² favoring LBC [95% CI, 6.9 × 10⁻⁴ to 1.7]; I² = 98%). In the subgroup analyses and meta-regressions, use of non-cotton devices for conventional cytology and use of a particular platform for LBC were associated with lower unsatisfactory rates. Meta-regression also suggested chronological improvement in unsatisfactory rates for both tests. In Japanese cervical cancer screening programs, conventional cytology remains prevalent. Future research needs to focus on evaluating the impact of screening programs using LBC by comparing the accuracy, performance, and cost-effectiveness with conventional cytology in the Japanese population. © 2018 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.
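Random-effects summary estimates like those above are commonly obtained with the DerSimonian–Laird method: pool per-study effects with weights that include a between-study variance τ². The abstract does not state which estimator was used, so the following is a generic sketch, not this study's exact computation:

```python
# Hedged sketch of DerSimonian–Laird random-effects pooling of per-study
# effects (e.g. log-odds of unsatisfactory specimens) with known
# within-study variances. Generic method, not this study's exact analysis.

def dersimonian_laird(effects, variances):
    """Return (pooled_effect, tau_squared) for per-study effects."""
    w = [1 / v for v in variances]                         # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_star = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2
```

The I² values of 98–99% reported above indicate that nearly all observed variation between studies reflects genuine heterogeneity rather than sampling error, which is precisely the situation where τ² dominates the weights.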
NASA Astrophysics Data System (ADS)
Shao, Xupeng
2017-04-01
Glutenite bodies are widely developed in the northern Minfeng zone of the Dongying Sag, but their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage of dividing sequence stratigraphy quantitatively compared with conventional methods. On the basis of the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequence of the upper Es4 in the northern Minfeng zone of the Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent, both dividing sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has a high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but a low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of the Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with the depositional characteristics of a fining-upward sequence in granularity. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
A robust approach for ECG-based analysis of cardiopulmonary coupling.
Zheng, Jiewen; Wang, Weidong; Zhang, Zhengbo; Wu, Dalei; Wu, Hao; Peng, Chung-Kang
2016-07-01
Deriving a respiratory signal from a surface electrocardiogram (ECG) measurement has the advantage of simultaneously monitoring cardiac and respiratory activities. ECG-based cardiopulmonary coupling (CPC) analysis, estimated from heart period variability and ECG-derived respiration (EDR), shows promising applications in the medical field. The aim of this paper is to provide a quantitative analysis of ECG-based CPC and further improve its performance. Two conventional strategies were tested to obtain the EDR signal: R-S wave amplitude and area of the QRS complex. An adaptive filter was utilized to extract the common component of the inter-beat interval (RRI) and EDR series, generating enhanced versions of the EDR signal. CPC is assessed by probing the nonlinear phase interactions between the RRI series and the respiratory signal. Respiratory oscillations present in both the RRI series and respiratory signals were extracted by ensemble empirical mode decomposition for coupling analysis via the phase synchronization index. The results demonstrated that CPC estimated from conventional EDR series exhibits constant and proportional biases, while that estimated from enhanced EDR series is more reliable. Adaptive filtering can significantly improve the accuracy of ECG-based CPC estimation and achieve robust CPC analysis. The improved ECG-based CPC estimation may provide additional prognostic information for both sleep medicine and autonomic function analysis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
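The phase synchronization index used for the coupling analysis can be sketched from first principles: instantaneous phases via an FFT-based analytic signal, then the magnitude of the mean phase-difference vector. The EEMD band-extraction step from the pipeline above is omitted here:

```python
# Sketch: phase synchronization index (PSI) between two oscillatory
# series via an FFT-based analytic (Hilbert) signal. The EEMD step used
# in the paper's pipeline to isolate respiratory bands is omitted.
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert construction."""
    n = len(x)
    Xf = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2          # double positive frequencies
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(Xf * h)

def phase_sync_index(x, y):
    """PSI in [0, 1]: 1 = perfectly locked phases, ~0 = independent."""
    px = np.angle(analytic_signal(x))
    py = np.angle(analytic_signal(y))
    return abs(np.mean(np.exp(1j * (px - py))))
```

Two series oscillating at the same frequency give a PSI near 1 regardless of a fixed phase lag, while a drifting phase difference averages the unit vectors out toward 0, which is why the PSI probes coupling rather than mere correlation.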
Tabatabaee, Reza M; Rasouli, Mohammad R; Maltenfort, Mitchell G; Fuino, Robert; Restrepo, Camilo; Oliashirazi, Ali
2018-04-01
Image-based and imageless computer-assisted total knee arthroplasty (CATKA) has become increasingly popular. This study aims to compare outcomes, including perioperative complications and transfusion rate, between CATKA and conventional total knee arthroplasty (TKA), as well as between image-based and imageless CATKA. Using the 9th revision of the International Classification of Diseases codes, we queried the Nationwide Inpatient Sample database from 2005 to 2011 to identify unilateral conventional TKAs, image-based, and imageless CATKAs as well as in-hospital complications and transfusion rates. A total of 787,809 conventional TKAs and 13,246 CATKAs (1055 image-based and 12,191 imageless) were identified. The rate of CATKA increased 23.13% per year from 2005 to 2011. Transfusion rates in conventional TKA and CATKA cases were 11.73% and 8.20%, respectively (P < .001), and 6.92% in image-based vs 8.27% in imageless cases (P = .023). Perioperative complications occurred in 4.50%, 3.47%, and 3.41% of cases after conventional TKA, imageless, and image-based CATKAs, respectively. Using multivariate analysis, perioperative complications were significantly higher in conventional TKA compared to CATKA (odds ratio = 1.17, 95% confidence interval 1.03-1.33, P = .01). There was no significant difference between imageless and image-based CATKA (P = .34). Length of hospital stay and hospital charges were not significantly different between groups (P > .05). CATKA has low complication rates and may improve patient outcomes after TKA. CATKA, especially the image-based technique, may reduce in-hospital complications and transfusion without significantly increasing hospital charges or length of hospital stay. Large prospective studies with long follow-up are required to verify the potential benefits of CATKA. Copyright © 2017 Elsevier Inc. All rights reserved.
Chu, Y; Wu, D; Hou, Q F; Huo, X D; Gao, Y; Wang, T; Wang, H D; Yang, Y L; Liao, S X
2016-08-25
To investigate the value of the array-based comparative genomic hybridization (array-CGH) technique for chromosomal analysis of miscarried embryos, and to provide genetic counseling for couples with spontaneous abortion. A total of 382 patients who underwent miscarriage were enrolled in this study. All aborted tissues were analyzed with conventional cytogenetic karyotyping and array-CGH, respectively. All of the 382 specimens were successfully analyzed by array-CGH (100.0%, 382/382), and the detection rate of chromosomal aberrations was 46.6% (178/382). However, conventional karyotype analysis was successfully performed in only 281 cases (73.6%, 281/382), of which 113 (40.2%, 113/281) were found to have chromosomal aberrations. Of the 178 samples identified by array-CGH, 163 (91.6%, 163/178) were aneuploid and 15 (8.4%, 15/178) were segmental deletion and/or duplication cases. Four of 10 cases with small segmental deletions and duplications were validated as transmitted from fathers or mothers who were carriers of submicroscopic reciprocal translocations. Of the 113 abnormal karyotypes found by conventional karyotyping, 108 (95.6%, 108/113) were aneuploid and 5 (4.4%, 5/113) had chromosomal structural aberrations. Most array-CGH results were consistent with conventional karyotyping, with 3 cases of discrepancy: 2 cases of triploidy and 1 case of low-level mosaicism that were undetected by array-CGH. Compared with conventional karyotyping, there is an increased detection rate of chromosomal abnormalities when array-CGH is used to analyse the products of conception, primarily because of its success with nonviable tissues. It could be a first-line method to determine the cause of miscarriage with higher accuracy and sensitivity.
Thalanany, Mariamma M; Mugford, Miranda; Hibbert, Clare; Cooper, Nicola J; Truesdale, Ann; Robinson, Steven; Tiruvoipati, Ravindranath; Elbourne, Diana R; Peek, Giles J; Clemens, Felicity; Hardy, Polly; Wilson, Andrew
2008-01-01
Background Extracorporeal Membrane Oxygenation (ECMO) is a technology used in treatment of patients with severe but potentially reversible respiratory failure. A multi-centre randomised controlled trial (CESAR) was funded in the UK to compare care including ECMO with conventional intensive care management. The protocol and funding for the CESAR trial included plans for economic data collection and analysis. Given the high cost of treatment, ECMO is considered an expensive technology for many funding systems. However, conventional treatment for severe respiratory failure is also one of the more costly forms of care in any health system. Methods/Design The objectives of the economic evaluation are to compare the costs of a policy of referral for ECMO with those of conventional treatment; to assess cost-effectiveness and the cost-utility at 6 months follow-up; and to assess the cost-utility over a predicted lifetime. Resources used by patients in the trial are identified. Resource use data are collected from clinical report forms and through follow up interviews with patients. Unit costs of hospital intensive care resources are based on parallel research on cost functions in UK NHS intensive care units. Other unit costs are based on published NHS tariffs. Cost effectiveness analysis uses the outcome: survival without severe disability. Cost utility analysis is based on quality adjusted life years gained based on the Euroqol EQ-5D at 6 months. Sensitivity analysis is planned to vary assumptions about transport costs and method of costing intensive care. Uncertainty will also be expressed in analysis of individual patient data. Probabilities of cost effectiveness given different funding thresholds will be estimated. Discussion In our view it is important to record our methods in detail and present them before publication of the results of the trial so that a record of detail not normally found in the final trial reports can be made available in the public domain. 
Trial Registrations The CESAR trial registration number is ISRCTN47279827. PMID:18447931
Tani, Kazuki; Mio, Motohira; Toyofuku, Tatsuo; Kato, Shinichi; Masumoto, Tomoya; Ijichi, Tetsuya; Matsushima, Masatoshi; Morimoto, Shoichi; Hirata, Takumi
2017-01-01
Spatial normalization is a significant image pre-processing operation in statistical parametric mapping (SPM) analysis. The purpose of this study was to clarify the optimal method of spatial normalization for improving diagnostic accuracy in SPM analysis of arterial spin-labeling (ASL) perfusion images. We evaluated the SPM results of five spatial normalization methods obtained by comparing patients with Alzheimer's disease or normal pressure hydrocephalus complicated with dementia and cognitively healthy subjects. We used the following methods: 3DT1-conventional, based on spatial normalization using anatomical images; 3DT1-DARTEL, based on spatial normalization with DARTEL using anatomical images; the 3DT1-conventional template and 3DT1-DARTEL template, created by averaging cognitively healthy subjects spatially normalized using the above methods; and the ASL-DARTEL template, created by averaging cognitively healthy subjects spatially normalized with DARTEL using ASL images only. Our results showed that the ASL-DARTEL template was smaller than the other two templates, and the SPM results obtained with the ASL-DARTEL template method were inaccurate. There were no significant differences between the 3DT1-conventional and 3DT1-DARTEL template methods. In contrast, the 3DT1-DARTEL method showed higher detection sensitivity and precise anatomical location. Our SPM results suggest that spatial normalization should be performed with DARTEL using anatomical images.
Baad-Hansen, Thomas; Kold, Søren; Kaptein, Bart L; Søballe, Kjeld
2007-08-01
In RSA, tantalum markers attached to metal-backed acetabular cups are often difficult to detect on stereo radiographs due to the high density of the metal shell. This results in occlusion of the prosthesis markers and may lead to inconclusive migration results. Within the last few years, new software systems have been developed to solve this problem. We compared the precision of 3 RSA systems in migration analysis of the acetabular component. A hemispherical and a non-hemispherical acetabular component were mounted in a phantom. Both acetabular components underwent migration analyses with 3 different RSA systems: conventional RSA using tantalum markers, an RSA system using a hemispherical cup algorithm, and a novel model-based RSA system. We found narrow confidence intervals, indicating high precision of the conventional marker system and model-based RSA with regard to migration and rotation. The confidence intervals of conventional RSA and model-based RSA were narrower than those of the hemispherical cup algorithm-based system regarding cup migration and rotation. The model-based RSA software combines the precision of the conventional RSA software with the convenience of the hemispherical cup algorithm-based system. Based on our findings, we believe that these new tools offer an improvement in the measurement of acetabular component migration.
Biolik, A; Heide, S; Lessig, R; Hachmann, V; Stoevesandt, D; Kellner, J; Jäschke, C; Watzke, S
2018-04-01
One option for improving the quality of medical post mortem examinations is through intensified training of medical students, especially in countries where such a requirement exists regardless of the area of specialisation. For this reason, new teaching and learning methods on this topic have recently been introduced. These new approaches include e-learning modules or SkillsLab stations; one way to objectify the resultant learning outcomes is by means of the OSCE process. However, despite offering several advantages, this examination format also requires considerable resources, particularly with regard to medical examiners. For this reason, many clinical disciplines have already implemented computer-based OSCE examination formats. This study investigates whether the conventional exam format for the OSCE forensic "Death Certificate" station could be replaced with a computer-based approach in future. For this study, 123 students completed the OSCE "Death Certificate" station using both a computer-based and a conventional format, half starting with the computer-based format and the other half with the conventional format in their OSCE rotation. Assignment of examination cases was random. The examination results for the two stations were compared, and both overall results and the individual items of the exam checklist were analysed by means of inferential statistics. Following statistical analysis of examination cases of varying difficulty levels and correction of the repeated measures effect, the results of both examination formats appear to be comparable. Thus, in the descriptive item analysis, while there were some significant differences between the computer-based and conventional OSCE stations, these differences were not reflected in the overall results after a correction factor was applied (e.g. point deductions for assistance from the medical examiner were possible only at the conventional station). 
Thus, we demonstrate that the computer-based OSCE "Death Certificate" station is a cost-efficient and standardised format for examination that yields results comparable to those from a conventional format exam. Moreover, the examination results also indicate the need to optimize both the test itself (adjusting the degree of difficulty of the case vignettes) and the corresponding instructional and learning methods (including, for example, the use of computer programmes to complete the death certificate in small group formats in the SkillsLab). Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Cochand-Priollet, Béatrix; Cartier, Isabelle; de Cremoux, Patricia; Le Galès, Catherine; Ziol, Marianne; Molinié, Vincent; Petitjean, Alain; Dosda, Anne; Merea, Estelle; Biaggi, Annonciade; Gouget, Isabelle; Arkwright, Sylviane; Vacher-Lavenu, Marie-Cécile; Vielh, Philippe; Coste, Joël
2005-11-01
Many articles concerning conventional Pap smears, ThinPrep liquid-based cytology (LBC) and the Hybrid Capture II HPV test (HC II) have been published. This study, carried out by the French Society of Clinical Cytology, is noteworthy for several reasons: it was financially independent; it compared the efficiency of the conventional Pap smear with that of LBC, and of the conventional Pap smear with HC II, and included an economic study based on real costs; for all the women, a "gold standard" reference method, colposcopy, was available and biopsies were performed whenever a lesion was detected; and the conventional Pap smear, the LBC (split-sample technique), the colposcopy, and the biopsies were done at the same time. This study included 2,585 women divided into two groups: group A, a high-risk population, and group B, a screening population. The statistical analysis of the results showed that conventional Pap smears consistently had sensitivity and specificity superior or equivalent to those of LBC for lesions at the CIN-I (Cervical Intraepithelial Neoplasia) threshold or at CIN-II or higher. It also underlined the low specificity of the HC II. Finally, the LBC mean cost was never covered by the Social Security tariff.
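With colposcopy plus biopsy as the gold standard, the sensitivity and specificity compared above follow directly from 2×2 classification counts. A minimal sketch with hypothetical counts (not the study's data):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity = TP/(TP+FN): fraction of true lesions detected.
    Specificity = TN/(TN+FP): fraction of lesion-free women correctly negative."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one screening test against the colposcopy/biopsy standard
sens, spec = sensitivity_specificity(tp=85, fp=40, fn=15, tn=360)
```

Running the same computation for each test (conventional smear, LBC, HC II) at each lesion threshold (CIN-I, CIN-II+) yields the comparisons reported above; the low specificity of HC II corresponds to a relatively large `fp` count.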
Giannaki, Christoforos D; Aphamis, George; Sakkis, Panikos; Hadjicharalambous, Marios
2016-04-01
High intensity interval training (HIIT) has recently been promoted as an effective, low-volume and time-efficient training method for improving fitness and health related parameters. The aim of the current study was to examine the effect of a combination of group-based HIIT and conventional gym training on physical fitness and body composition parameters in healthy adults. Thirty-nine healthy adults volunteered to participate in this eight-week intervention study. Twenty-three participants performed regular gym training 4 days a week (C group), whereas the remaining 16 participants engaged in HIIT twice a week and in regular gym training twice a week (HIIT-C group). Total body fat and visceral adiposity levels were calculated using bioelectrical impedance analysis. Physical fitness parameters such as cardiorespiratory fitness, speed, lower limb explosiveness, flexibility and isometric arm strength were assessed through a battery of field tests. Both exercise programs were effective in reducing total body fat and visceral adiposity (P<0.05) and improving handgrip strength, sprint time, jumping ability and flexibility (P<0.05), whilst only the combination of HIIT and conventional training improved cardiorespiratory fitness levels (P<0.05). A between-group analysis of changes revealed that HIIT-C resulted in a significantly greater reduction in both abdominal girth and visceral adiposity compared with conventional training (P<0.05). Eight weeks of combined group-based HIIT and conventional training improved various physical fitness parameters and reduced both total and visceral fat levels. This type of training was also superior to conventional exercise training alone in reducing visceral adiposity. Group-based HIIT may be considered a good method for individuals who exercise in gyms and wish to acquire significant fitness benefits in a relatively short period of time.
Bhateria, Manisha; Rachumallu, Ramakrishna; Singh, Rajbir; Bhatta, Rabi Sankar
2014-08-01
Erythrocytes (red blood cells [RBCs]) and artificial or synthetic delivery systems such as liposomes and nanoparticles (NPs) are the most investigated carrier systems. Herein, progress made from the conventional approach of using RBCs as delivery systems to the novel approach of using synthetic delivery systems based on RBC properties is reviewed. We aim to highlight both conventional and novel approaches to using RBCs as a potential carrier system. Conventional approaches include two main strategies: i) directly loading therapeutic moieties into RBCs; and ii) coupling them with RBCs; novel approaches exploit the structural, mechanical and biological properties of RBCs to design synthetic delivery systems through various engineering strategies. Initial attempts included coupling antibodies to liposomes to specifically target RBCs. Knowledge obtained from several studies led to the development of RBC membrane derived liposomes (nanoerythrosomes), inspiring future application of RBCs or their structural features in other attractive delivery systems (hydrogels, filomicelles, microcapsules, micro- and NPs) with even greater potential. In conclusion, this review offers a comparative analysis of various conventional and novel engineering strategies for developing RBC based drug delivery systems, diversifying their applications in the arena of drug delivery. Regardless of the challenges in front of us, RBC based delivery systems offer an exciting approach to exploiting biological entities in a multitude of medical applications.
Using the Kaldor-Hicks Tableau Format for Cost-Benefit Analysis and Policy Evaluation
ERIC Educational Resources Information Center
Krutilla, Kerry
2005-01-01
This note describes the Kaldor-Hicks (KH) tableau format as a framework for distributional accounting in cost-benefit analysis and policy evaluation. The KH tableau format can serve as a heuristic aid for teaching microeconomics-based policy analysis and offer insight to policy analysts and decision-makers beyond conventional efficiency analysis.
NASA Astrophysics Data System (ADS)
Darma, I. K.
2018-01-01
This research is aimed at determining: 1) the differences in mathematical problem solving ability between students facilitated with a problem-based learning model and a conventional learning model, 2) the differences in mathematical problem solving ability between students facilitated with an authentic and a conventional assessment model, and 3) the interaction effect between learning and assessment model on mathematical problem solving. The research was conducted in Bali State Polytechnic, using a 2×2 factorial experimental design. The samples of this research were 110 students. The data were collected using a theoretically and empirically-validated test. Instruments were validated using Aiken's content validity technique and item analysis, and the data were then analyzed using analysis of variance (ANOVA). The result of the analysis shows that the students facilitated with problem-based learning and authentic assessment models obtained the highest average scores compared to the other students, both in concept understanding and mathematical problem solving. The result of the hypothesis test shows that, significantly: 1) there is a difference in mathematical problem solving ability between students facilitated with the problem-based learning model and the conventional learning model, 2) there is a difference in mathematical problem solving ability between students facilitated with the authentic assessment model and the conventional assessment model, and 3) there is an interaction effect between learning model and assessment model on mathematical problem solving. In order to improve the effectiveness of mathematics learning, the combination of a problem-based learning model and an authentic assessment model can be considered as one of the learning models in class.
Turning the Page: Advancing Paper-Based Microfluidics for Broad Diagnostic Application.
Gong, Max M; Sinton, David
2017-06-28
Infectious diseases are a major global health issue. Diagnosis is a critical first step in effectively managing their spread. Paper-based microfluidic diagnostics first emerged in 2007 as a low-cost alternative to conventional laboratory testing, with the goal of improving accessibility to medical diagnostics in developing countries. In this review, we examine the advances in paper-based microfluidic diagnostics for medical diagnosis in the context of global health from 2007 to 2016. The theory of fluid transport in paper is first presented. The next section examines the strategies that have been employed to control fluid and analyte transport in paper-based assays. Tasks such as mixing, timing, and sequential fluid delivery have been achieved in paper and have enabled analytical capabilities comparable to those of conventional laboratory methods. The following section examines paper-based sample processing and analysis. The most impactful advancement here has been the translation of nucleic acid analysis to a paper-based format. Smartphone-based analysis is another exciting development with potential for wide dissemination. The last core section of the review highlights emerging health applications, such as male fertility testing and wearable diagnostics. We conclude the review with the future outlook, remaining challenges, and emerging opportunities.
Masood, Athar; Stark, Ken D; Salem, Norman
2005-10-01
Conventional sample preparation for fatty acid analysis is a complicated, multiple-step process, and gas chromatography (GC) analysis alone can require >1 h per sample to resolve fatty acid methyl esters (FAMEs). Fast GC analysis was adapted to human plasma FAME analysis using a modified polyethylene glycol column with smaller internal diameters, thinner stationary phase films, increased carrier gas linear velocity, and faster temperature ramping. Our results indicated that fast GC analyses were comparable to conventional GC in peak resolution. A conventional transesterification method based on Lepage and Roy was simplified to a one-step method with the elimination of the neutralization and centrifugation steps. A robotics-amenable method was also developed, with lower methylation temperatures and in an open-tube format using multiple reagent additions. The simplified methods produced results that were quantitatively similar and with similar coefficients of variation as compared with the original Lepage and Roy method. The present streamlined methodology is suitable for the direct fatty acid analysis of human plasma, is appropriate for research studies, and will facilitate large clinical trials and make possible population studies.
A Shot Number Based Approach to Performance Analysis in Table Tennis
Yoshida, Kazuto; Yamada, Koshi
2017-01-01
Abstract The current study proposes a novel approach that improves the conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, of each shot number. The improvements over the conventional method are as follows: better accuracy of the evaluation of skills and tactics of players, additional insights into scoring and returning skills and ease of understanding the results with a single criterion. The performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of the shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method. PMID:28210334
Sun, F; Chen, J; Tong, Q; Zeng, S
2007-01-01
Management of drinking water safety is changing towards an integrated risk assessment and risk management approach that includes all processes in a water supply system from catchment to consumers. However, given the large number of water supply systems in China and the cost of implementing such a risk assessment procedure, there is a necessity to first conduct a strategic screening analysis at a national level. An integrated methodology of risk assessment and screening analysis is thus proposed to evaluate drinking water safety of a conventional water supply system. The violation probability, indicating drinking water safety, is estimated at different locations of a water supply system in terms of permanganate index, ammonia nitrogen, turbidity, residual chlorine and trihalomethanes. Critical parameters with respect to drinking water safety are then identified, based on which an index system is developed to prioritize conventional water supply systems in implementing a detailed risk assessment procedure. The evaluation results are represented as graphic check matrices for the concerned hazards in drinking water, from which the vulnerability of a conventional water supply system is characterized.
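The violation probability described here can be estimated empirically per parameter as the fraction of monitored samples exceeding that parameter's limit. A minimal sketch with made-up limits and monitoring values (the limits below are placeholders, not actual Chinese drinking water standards):

```python
def violation_probability(samples, limit):
    """Fraction of monitored samples exceeding the regulatory limit
    for one water quality parameter (higher = less safe)."""
    return sum(1 for v in samples if v > limit) / len(samples)

# Illustrative limits (mg/L or NTU) for two of the parameters named above
limits = {"turbidity_NTU": 1.0, "permanganate_index_mg_L": 3.0}

# Hypothetical turbidity measurements at one location in the supply system
turbidity = [0.4, 0.8, 1.2, 0.6, 1.5, 0.9, 0.7, 1.1]
p_violation = violation_probability(turbidity, limits["turbidity_NTU"])
```

Computing this at different locations from catchment to tap, for each of the five parameters, yields the check matrix of violation probabilities used to prioritize systems for detailed risk assessment.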
Autonomous control systems - Architecture and fundamental issues
NASA Technical Reports Server (NTRS)
Antsaklis, P. J.; Passino, K. M.; Wang, S. J.
1988-01-01
A hierarchical functional autonomous controller architecture is introduced. In particular, the architecture for the control of future space vehicles is described in detail; it is designed to ensure the autonomous operation of the control system, and it allows interaction with the pilot and crew/ground station and the systems on board the autonomous vehicle. The fundamental issues in autonomous control system modeling and analysis are discussed. It is proposed to utilize a hybrid approach to modeling and analysis of autonomous systems. This will incorporate conventional control methods based on differential equations and techniques for the analysis of systems described with a symbolic formalism. In this way, the theory of conventional control can be fully utilized. It is stressed that autonomy is the design requirement and that intelligent control methods appear, at present, to offer some of the necessary tools to achieve autonomy. A conventional approach may evolve and replace some or all of the `intelligent' functions. It is shown that in addition to conventional controllers, the autonomous control system incorporates planning, learning, and FDI (fault detection and identification).
Ishihara, Masaru; Onoguchi, Masahisa; Taniguchi, Yasuyo; Shibutani, Takayuki
2017-12-01
The aim of this study was to clarify the differences in thallium-201-chloride (thallium-201) myocardial perfusion imaging (MPI) scans evaluated by conventional anger-type single-photon emission computed tomography (conventional SPECT) versus cadmium-zinc-telluride SPECT (CZT SPECT) imaging in normal databases for different ethnic groups. MPI scans from 81 consecutive Japanese patients were examined using conventional SPECT and CZT SPECT and analyzed with the pre-installed quantitative perfusion SPECT (QPS) software. We compared the summed stress score (SSS), summed rest score (SRS), and summed difference score (SDS) for the two SPECT devices. For a normal MPI reference, we usually use Japanese databases for MPI created by the Japanese Society of Nuclear Medicine, which can be used with conventional SPECT but not with CZT SPECT. In this study, we used new Japanese normal databases constructed in our institution to compare conventional and CZT SPECT. Compared with conventional SPECT, CZT SPECT showed lower SSS (p < 0.001), SRS (p = 0.001), and SDS (p = 0.189) using the pre-installed SPECT database. In contrast, CZT SPECT showed no significant difference from conventional SPECT in QPS analysis using the normal databases from our institution. Myocardial perfusion analyses by CZT SPECT should be evaluated using normal databases based on the ethnic group being evaluated.
Barnhoorn, Karlijn; Staal, J Bart; van Dongen, Robert Tm; Frölke, Jan Paul M; Klomp, Frank P; van de Meent, Henk; Adang, Eddy; Nijhuis-van der Sanden, Maria Wg
2018-06-01
To analyze the cost-effectiveness of Pain Exposure Physical Therapy compared to conventional treatment alongside a randomized controlled trial (NCT00817128) in patients with complex regional pain syndrome type 1, in which no clinical difference was shown between the two groups in an intention-to-treat analysis. Randomized controlled trial with 9 months follow-up. Patients were recruited from hospitals and general practitioners in the region around a university hospital. A total of 56 patients, 45 (80.4%) female, were randomized. Four patients in the intervention group and 11 patients in the conventional group switched groups. The mean (SD) age was 44.3 (16.6) years, and in 37 (66.1%) patients the upper extremity was affected. Patients received either Pain Exposure Physical Therapy (maximum of five sessions) or conventional treatment conforming with the Dutch multidisciplinary guideline. For the economic evaluation, differences between the groups in health-related quality of life (quality-adjusted life years (QALYs)) and in the clinical outcomes Impairment level Sum Score-Restricted Version and Pain Disability were determined based on the intention-to-treat analysis, as well as differences in both healthcare-related costs and travel expenses. Cost-effectiveness planes were constructed using bootstrapping to compare effects and costs. No significant effects were found for QALYs (mean difference = -0.02; 95% confidence interval (CI) -0.10 to 0.04) or clinical outcomes. A cost minimization analysis showed a significant difference in costs between groups. The conventional treatment was 64% more expensive than Pain Exposure Physical Therapy. This economic analysis shows that Pain Exposure Physical Therapy is cost-effective compared to conventional treatment.
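A cost-effectiveness plane of the kind described above is built by bootstrapping the between-group differences in mean costs and mean effects. A minimal sketch under assumed data (all numbers are invented; the trial's actual costs and QALYs are not reproduced here):

```python
import random

def bootstrap_ce_plane(costs_a, effects_a, costs_b, effects_b, n_boot=1000, seed=7):
    """Resample each group with replacement and return (delta_cost, delta_effect)
    pairs; plotted against each other these form the cost-effectiveness plane."""
    rng = random.Random(seed)

    def mean(xs):
        return sum(xs) / len(xs)

    points = []
    for _ in range(n_boot):
        idx_a = [rng.randrange(len(costs_a)) for _ in costs_a]
        idx_b = [rng.randrange(len(costs_b)) for _ in costs_b]
        d_cost = mean([costs_a[i] for i in idx_a]) - mean([costs_b[i] for i in idx_b])
        d_eff = mean([effects_a[i] for i in idx_a]) - mean([effects_b[i] for i in idx_b])
        points.append((d_cost, d_eff))
    return points

# Invented per-patient costs (euros) and QALYs: group A cheap, group B expensive
pts = bootstrap_ce_plane([900, 1100, 800], [0.70, 0.65, 0.72],
                         [2400, 2600, 2500], [0.68, 0.71, 0.66])
```

If, as in the trial above, effect differences cluster around zero while cost differences are clearly negative for one arm, the analysis reduces to cost minimization.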
Students concept understanding of fluid static based on the types of teaching
NASA Astrophysics Data System (ADS)
Rahmawati, I. D.; Suparmi; Sunarno, W.
2018-03-01
This research aims to compare the concept understanding of students taught with guided inquiry based learning and with conventional learning. The subjects in this study were two classes of high school students, each consisting of 32 students; both classes were homogeneous. The data were collected using a conceptual test in multiple choice form, with the students giving arguments for their answers. The data analysis used a qualitative descriptive method. The results of the study showed that the average score of the class taught with guided inquiry based learning was 78.44, while that of the class taught with conventional learning was 65.16. Based on these data, the guided inquiry model is an effective learning model for improving students' concept understanding.
Life-cycle analysis of shale gas and natural gas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, C.E.; Han, J.; Burnham, A.
2012-01-27
The technologies and practices that have enabled the recent boom in shale gas production have also brought attention to the environmental impacts of its use. Using the current state of knowledge of the recovery, processing, and distribution of shale gas and conventional natural gas, we have estimated up-to-date, life-cycle greenhouse gas emissions. In addition, we have developed distribution functions for key parameters in each pathway to examine uncertainty and identify data gaps - such as methane emissions from shale gas well completions and conventional natural gas liquid unloadings - that need to be addressed further. Our base case results show that shale gas life-cycle emissions are 6% lower than those of conventional natural gas. However, the ranges in values for shale and conventional gas overlap, so there is statistical uncertainty regarding whether shale gas emissions are indeed lower than conventional gas emissions. This life-cycle analysis provides insight into the critical stages in the natural gas industry where emissions occur and where opportunities exist to reduce the greenhouse gas footprint of natural gas.
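The distribution functions described above feed a Monte Carlo propagation: each pathway's emissions are sampled repeatedly and the resulting distributions compared. A minimal sketch with invented triangular distributions (the parameter values below are placeholders, not the study's estimates):

```python
import random

def simulate_pathways(n=5000, seed=42):
    """Sample hypothetical life-cycle emissions (g CO2e/MJ) for shale and
    conventional gas; the pathways share common stages and differ only in
    completion vs. liquid-unloading emissions."""
    rng = random.Random(seed)
    shale, conv = [], []
    for _ in range(n):
        common = rng.triangular(60.0, 75.0, 66.0)    # shared recovery/processing/distribution
        completion = rng.triangular(0.1, 2.5, 0.6)   # shale well completion (illustrative)
        unloading = rng.triangular(0.1, 2.0, 0.5)    # conventional liquid unloading (illustrative)
        shale.append(common + completion)
        conv.append(common + unloading)
    return shale, conv

shale, conv = simulate_pathways()
# Overlapping ranges are what make the 6% base-case difference statistically uncertain
ranges_overlap = min(max(shale), max(conv)) > max(min(shale), min(conv))
```

When the sampled distributions overlap this heavily, a point estimate difference between pathways cannot be taken as statistically conclusive, which is the study's caveat.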
Chen, Minghao; Wei, Shiyou; Hu, Junyan; Yuan, Jing; Liu, Fenghua
2017-01-01
The present study aimed to undertake a review of available evidence assessing whether time-lapse imaging (TLI) has favorable outcomes for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization (IVF). We searched PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov up to February 2017 for randomized controlled trials (RCTs) comparing TLI versus conventional methods. Studies that randomized either women or oocytes were included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Ten RCTs were included, four randomizing oocytes and six randomizing women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group for blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94-1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study provided a live birth rate (RR 1.23, 95% CI 1.06-1.44, I2 N/A, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80-1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Currently there is insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. In consideration of the limitations and flaws of the included studies, more well designed RCTs are needed to comprehensively evaluate the effectiveness of clinical TLI use.
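Pooled relative risks like those quoted above come from standard meta-analytic weighting. A minimal sketch of fixed-effect, inverse-variance pooling on the log-RR scale (the trial counts below are invented, not the trials in this review):

```python
import math

def pooled_rr_fixed(studies):
    """studies: list of (events_tli, n_tli, events_ctrl, n_ctrl) per trial.
    Inverse-variance pooling of log relative risks (fixed-effect model)."""
    w_sum = 0.0
    wlog_sum = 0.0
    for et, nt, ec, nc in studies:
        log_rr = math.log((et / nt) / (ec / nc))
        var = 1 / et - 1 / nt + 1 / ec - 1 / nc  # approximate variance of log RR
        w = 1 / var
        w_sum += w
        wlog_sum += w * log_rr
    log_pooled = wlog_sum / w_sum
    se = math.sqrt(1 / w_sum)
    ci = (math.exp(log_pooled - 1.96 * se), math.exp(log_pooled + 1.96 * se))
    return math.exp(log_pooled), ci

# Invented counts for two hypothetical trials (events/total in TLI vs control arms)
rr, (lo, hi) = pooled_rr_fixed([(60, 150, 55, 150), (80, 200, 75, 200)])
```

A pooled 95% CI that straddles 1.0, as in the blastocyst-rate and ongoing-pregnancy results above, is read as no significant difference between arms. (Given the I2 = 59% reported for ongoing pregnancy, a random-effects model would also be a reasonable choice; the fixed-effect version is shown only for brevity.)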
Lee, Ki Song; Choe, Young Chan; Park, Sung Hee
2015-10-01
This study examined the structural variables affecting the environmental effects of organic farming compared to those of conventional farming. A meta-analysis based on 107 studies and 360 observations published from 1977 to 2012 compared energy efficiency (EE) and greenhouse gas emissions (GHGE) for organic and conventional farming. The meta-analysis systematically analyzed the results of earlier comparative studies and used logistic regression to identify the structural variables that contributed to differences in the effects of organic and conventional farming on the environment. The statistical evidence identified characteristics that differentiated the environmental effects of organic and conventional farming, which is controversial. The results indicated that data sources, sample size and product type significantly affected EE, whereas product type, cropping pattern and measurement unit significantly affected the GHGE of organic farming compared to conventional farming. Superior effects of organic farming on the environment were more likely to appear for larger samples, primary data rather than secondary data, monocropping rather than multicropping, and crops other than fruits and vegetables. The environmental effects of organic farming were not affected by the study period, geographic location, farm size, cropping pattern, or measurement method. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Popper's experiment and communication
NASA Astrophysics Data System (ADS)
Gerjuoy, Edward; Sessler, Andrew M.
2006-07-01
We comment on an analysis by Qureshi of an experiment proposed by Popper and show that an analysis based solely on conventional nonrelativistic quantum mechanics is sufficient to exclude the possibility of subluminal or superluminal communication. That is, local operations cannot be employed to transmit information.
Cortés, J; Ribera, J M; Cardellach, F; Selva-O’Callaghan, A; Kostov, B; García, L; Cirugeda, L; Altman, D G; González, J A; Sànchez, J A; Miras, F; Urrutia, A; Fonollosa, V; Rey-Joly, C; Vilardell, M
2011-01-01
Objective To investigate the effect of an additional review based on reporting guidelines such as STROBE and CONSORT on quality of manuscripts. Design Masked randomised trial. Population Original research manuscripts submitted to the Medicina Clínica journal from May 2008 to April 2009 and considered suitable for publication. Intervention Control group: conventional peer reviews alone. Intervention group: conventional review plus an additional review looking for missing items from reporting guidelines. Outcomes Manuscript quality, assessed with a 5 point Likert scale (primary: overall quality; secondary: average quality of specific items in paper). Main analysis compared groups as allocated, after adjustment for baseline factors (analysis of covariance); sensitivity analysis compared groups as reviewed. Adherence to reviewer suggestions assessed with Likert scale. Results Of 126 consecutive papers receiving conventional review, 34 were not suitable for publication. The remaining 92 papers were allocated to receive conventional reviews alone (n=41) or additional reviews (n=51). Four papers assigned to the conventional review group deviated from protocol; they received an additional review based on reporting guidelines. We saw an improvement in manuscript quality in favour of the additional review group (comparison as allocated, 0.25, 95% confidence interval –0.05 to 0.54; as reviewed, 0.33, 0.03 to 0.63). More papers with additional reviews than with conventional reviews alone improved from baseline (22 (43%) v eight (20%), difference 23.6% (3.2% to 44.0%), number needed to treat 4.2 (from 2.3 to 31.2), relative risk 2.21 (1.10 to 4.44)). Authors in the additional review group adhered more to suggestions from conventional reviews than to those from additional reviews (average increase 0.43 Likert points (0.19 to 0.67)). 
Conclusions Additional reviews based on reporting guidelines improve manuscript quality, although the observed effect was smaller than hypothesised and not definitively demonstrated. Authors adhere more to suggestions from conventional reviews than to those from additional reviews, showing difficulties in adhering to high methodological standards at the latest research phases. To boost paper quality and impact, authors should be aware of future requirements of reporting guidelines at the very beginning of their study. Trial registration and protocol Although registries do not include trials of peer review, the protocol design was submitted to sponsored research projects (Instituto de Salud Carlos III, PI081903). PMID:22108262
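The effect measures reported above (risk difference 23.6%, number needed to treat 4.2, relative risk 2.21) follow directly from the raw counts of improved papers, 22/51 versus 8/41. A quick check:

```python
def risk_stats(events_t, n_t, events_c, n_c):
    """Absolute risk difference, number needed to treat, and relative risk
    from event counts in a treatment and a control group."""
    r_t, r_c = events_t / n_t, events_c / n_c
    ard = r_t - r_c
    return ard, 1 / ard, r_t / r_c

# Counts from the trial: 22 of 51 papers improved with additional reviews,
# 8 of 41 with conventional reviews alone
ard, nnt, rr = risk_stats(22, 51, 8, 41)
# ard ≈ 0.236 (23.6%), nnt ≈ 4.2, rr ≈ 2.21, matching the reported figures
```

So roughly one extra manuscript improves for every four or five sent for an additional guideline-based review.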
Diagnostic value of highly-sensitive chimerism analysis after allogeneic stem cell transplantation.
Sellmann, Lea; Rabe, Kim; Bünting, Ivonne; Dammann, Elke; Göhring, Gudrun; Ganser, Arnold; Stadler, Michael; Weissinger, Eva M; Hambach, Lothar
2018-05-02
Conventional analysis of host chimerism (HC) frequently fails to detect relapse before its clinical manifestation in patients with hematological malignancies after allogeneic stem cell transplantation (allo-SCT). Quantitative PCR (qPCR)-based highly-sensitive chimerism analysis extends the detection limit of conventional (short tandem repeat-based) chimerism analysis from 1% to 0.01% host cells in whole blood. To date, the diagnostic value of highly-sensitive chimerism analysis is hardly defined. Here, we applied qPCR-based chimerism analysis to 901 blood samples of 71 out-patients with hematological malignancies after allo-SCT. Receiver operating characteristic (ROC) curves were calculated for absolute HC values and for the increments of HC before relapse. Using the best cut-offs, relapse was detected with sensitivities of 74 or 85% and specificities of 69 or 75%, respectively. Positive predictive values (PPVs) were only 12 or 18%, but the respective negative predictive values were 98 or 99%. Relapse was detected a median of 38 or 45 days prior to clinical diagnosis, respectively. Considering also durations of steadily increasing HC of more than 28 days improved PPVs to more than 28 or 59%, respectively. Overall, highly-sensitive chimerism analysis excludes relapses with high certainty and predicts relapses with high sensitivity and specificity more than a month prior to clinical diagnosis.
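Choosing the "best cut-off" on a ROC curve, as described above, is typically done by sweeping candidate thresholds and scoring each by sensitivity and specificity, for example via Youden's J = sensitivity + specificity - 1. A minimal sketch with fabricated host-chimerism values (not the study's data):

```python
def best_cutoff(values_relapse, values_no_relapse):
    """Sweep observed values as cut-offs; return (J, cutoff, sens, spec)
    for the threshold maximizing Youden's J."""
    best = None
    for c in sorted(set(values_relapse + values_no_relapse)):
        sens = sum(v >= c for v in values_relapse) / len(values_relapse)
        spec = sum(v < c for v in values_no_relapse) / len(values_no_relapse)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

# Fabricated % host chimerism values in samples from relapsing vs non-relapsing patients
j, cutoff, sens, spec = best_cutoff([0.5, 0.9, 1.8, 2.5], [0.0, 0.01, 0.05, 0.2])
```

With real, overlapping data the maximal J is well below 1, and the resulting sensitivity/specificity pair is what the abstract reports for each ROC analysis; PPV and NPV then additionally depend on relapse prevalence, which is why PPVs above are low despite good sensitivity.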
NASA Astrophysics Data System (ADS)
Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi
2018-02-01
In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. The ICA based channel equalization after both single-mode fiber and few-mode fiber transmission for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats are investigated, respectively. The performance comparisons with conventional channel equalization techniques are discussed.
Neck pain assessment in a virtual environment.
Sarig-Bahat, Hilla; Weiss, Patrice L Tamar; Laufer, Yocheved
2010-02-15
Neck-pain and control group comparative analysis of conventional and virtual reality (VR)-based assessment of cervical range of motion (CROM). To use a tracker-based VR system to compare CROM of individuals suffering from chronic neck pain with CROM of asymptomatic individuals; to compare VR system results with those obtained during conventional assessment; to present the diagnostic value of CROM measures obtained by both assessments; and to demonstrate the effect of a single VR session on CROM. Neck pain is a common musculoskeletal complaint with a reported annual prevalence of 30% to 50%. In the absence of a gold standard for CROM assessment, a variety of assessment devices and methodologies exist. Common to these methodologies, assessment of CROM is carried out by instructing subjects to move their head as far as possible. However, these elicited movements do not necessarily replicate functional movements which occur spontaneously in response to multiple stimuli. To achieve a more functional approach to cervical motion assessment, we have recently developed a VR environment in which electromagnetic tracking is used to monitor cervical motion while participants are involved in a simple yet engaging gaming scenario. CROM measures were collected from 25 symptomatic and 42 asymptomatic individuals using VR and conventional assessments. Analysis of variance was used to determine differences between groups and assessment methods. Logistic regression analysis, using a single predictor, compared the diagnostic ability of both methods. Results obtained by both methods demonstrated significant CROM limitations in the symptomatic group. The VR measures showed greater CROM and sensitivity while conventional measures showed greater specificity. A single session exposure to VR resulted in a significant increase in CROM. Neck pain is significantly associated with reduced CROM as demonstrated by both VR and conventional assessment methods. 
The VR method provides assessment of functional CROM and can be used for CROM enhancement. Assessment by VR has greater sensitivity than conventional assessment and can be used for the detection of truly symptomatic individuals.
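The single-predictor logistic regression used to compare diagnostic ability can be sketched as follows. The CROM values, group sizes, and 0.5 cutoff below are illustrative assumptions, not the study's data; the point is how sensitivity and specificity fall out of one fitted predictor.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(x, y, lr=0.1, epochs=3000):
    """Single-predictor logistic regression fitted by gradient ascent
    on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            r = yi - sigmoid(b0 + b1 * xi)   # residual drives both gradients
            g0 += r
            g1 += r * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def sensitivity_specificity(x, y, b0, b1, cutoff=0.5):
    tp = tn = fp = fn = 0
    for xi, yi in zip(x, y):
        pred = 1 if sigmoid(b0 + b1 * xi) >= cutoff else 0
        tp += pred and yi
        tn += (not pred) and (not yi)
        fp += pred and (not yi)
        fn += (not pred) and yi
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical CROM data (degrees): 42 asymptomatic (0), 25 symptomatic (1).
random.seed(0)
crom = [random.gauss(140, 15) for _ in range(42)] + \
       [random.gauss(100, 15) for _ in range(25)]
label = [0] * 42 + [1] * 25

# Standardize the predictor so plain gradient ascent behaves well.
m = sum(crom) / len(crom)
s = (sum((c - m) ** 2 for c in crom) / len(crom)) ** 0.5
z = [(c - m) / s for c in crom]

b0, b1 = fit_logistic(z, label)
sens, spec = sensitivity_specificity(z, label, b0, b1)
```

With well-separated groups, both sensitivity and specificity should comfortably exceed chance at the 0.5 cutoff.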
Baksi, B Güniz; Ermis, R Banu
2007-10-01
To compare the efficacy of conventional radiometry and indirect digital image analysis in assessing the relative radiopacity of dental cements used as liners or bases, compared to human enamel and dentin. Disks of 15 different dental cements, 5 mm in diameter and 2 mm thick, were exposed to radiation together with 2-mm-thick disks of enamel and dentin and an aluminum step wedge. Density was evaluated by digital transmission densitometry and with the histogram function of an image analysis program following digitization of the radiographs with a flatbed scanner. A higher number of dental cements were discriminated from both dentin and enamel with the conventional radiographic densitometer. All the cements examined, except Ionoseal (Voco) and Ionobond (Voco), were more radiopaque than dentin. With both methods, Chelon-Silver (3M ESPE) had the highest radiopacity and the glass-ionomer cements the lowest. Radiodensity of dental cements can be differentiated with high probability by the conventional radiometric method.
Wavelet-based spectral finite element dynamic analysis for an axially moving Timoshenko beam
NASA Astrophysics Data System (ADS)
Mokhtari, Ali; Mirdamadi, Hamid Reza; Ghayour, Mostafa
2017-08-01
In this article, a wavelet-based spectral finite element (WSFE) model is formulated for time-domain and wave-domain dynamic analysis of an axially moving Timoshenko beam subjected to axial pretension. The formulation is similar to the conventional FFT-based spectral finite element (SFE) model, except that Daubechies wavelet basis functions are used for temporal discretization of the governing partial differential equations into a set of ordinary differential equations. The localized nature of the Daubechies wavelet basis functions helps to rule out the problems the SFE model suffers from its periodicity assumption, especially during the inverse Fourier transformation back to the time domain. The high accuracy of the WSFE model is then evaluated by comparing its results with conventional finite element and SFE results. The effects of moving-beam speed and axial tensile force on vibration and wave characteristics, and on the static and dynamic stabilities of the moving beam, are investigated.
Forecast skill impact of drifting buoys in the Southern Hemisphere
NASA Technical Reports Server (NTRS)
Kalnay, E.; Atlas, R.; Baker, W.; Halem, M.
1984-01-01
Two analyses are performed to evaluate the effect of drifting buoys and the FGGE special observing system (SOS) on forecasting. The FGGE analysis utilizes all level II-b conventional and special data, and the Nosat analysis employs only surface and conventional upper-air data. Twelve five-day forecasts are produced from these data. An additional experiment utilizing the FGGE data base minus buoy data, and the Nosat data base including buoy data, is being conducted. The forecasts are compared and a synoptic evaluation of the effect of buoy data is described. The results reveal that the FGGE data base with the SOS significantly improves forecasting in the Southern Hemisphere, and the loss of buoy data does not have a great effect on forecasting. The Nosat data have less impact on forecasting; however, the addition of buoy data provides an improvement in forecast skill.
Confined detection volume of fluorescence correlation spectroscopy by bare fiber probes.
Lu, Guowei; Lei, Franck H; Angiboust, Jean-François; Manfait, Michel
2010-04-01
A fiber-tip-based near-field fluorescence correlation spectroscopy (FCS) technique has been developed for confining the detection volume to sub-diffraction-limited dimensions. This near-field FCS is based on near-field illumination, achieved by coupling a scanning near-field optical microscope (SNOM) to a conventional confocal FCS setup. Single-molecule FCS analysis at 100 nM Rhodamine 6G has been achieved by using bare, chemically etched, tapered fiber tips. The detection volume under control of the SNOM system has been reduced by over one order of magnitude compared to that of the conventional confocal FCS. Related factors influencing the near-field FCS performance are investigated and discussed in detail. In this proof-of-principle study, the preliminary experimental results suggest that the fiber-tip-based near-field FCS might be a good alternative for realizing localized analysis at the single-molecule level.
NASA Astrophysics Data System (ADS)
Patsariya, Ajay; Rai, Shiwani; Kumar, Yogendra, Dr.; Kirar, Mukesh, Dr.
2017-08-01
The energy crisis, particularly in countries with developing GDPs, has opened up a new panorama for sustainable power sources such as solar energy, which has experienced huge growth. Increasingly high penetration levels of photovoltaic (PV) generation are arising in the smart grid. Solar power is intermittent and variable, as the solar resource at ground level is highly dependent on cloud cover variability, atmospheric aerosol levels, and other weather parameters. The inherent variability of large-scale solar generation introduces significant challenges for smart grid energy management, and accurate forecasting of solar power/irradiance is essential for securing economic operation of the smart grid. In this paper a novel TLBO-based MPPT technique is proposed for extracting maximum power from the PV system. A comparative analysis is presented between the conventional P&O and IC methods and the proposed MPPT technique. The research was carried out in Matlab Simulink, software version 2013.
Energy Optimization for a Weak Hybrid Power System of an Automobile Exhaust Thermoelectric Generator
NASA Astrophysics Data System (ADS)
Fang, Wei; Quan, Shuhai; Xie, Changjun; Tang, Xinfeng; Ran, Bin; Jiao, Yatian
2017-11-01
An integrated starter generator (ISG)-type hybrid electric vehicle (HEV) scheme is proposed based on the automobile exhaust thermoelectric generator (AETEG). An eddy current dynamometer is used to simulate the vehicle's dynamic cycle. A weak ISG hybrid bench test system is constructed to test the 48 V output from the power supply system, which is based on engine exhaust-based heat power generation. The thermoelectric power generation-based system must ultimately be tested when integrated into the ISG weak hybrid mixed power system. The test process is divided into two steps: comprehensive simulation and vehicle-based testing. The system's dynamic process is simulated for both conventional and thermoelectric powers, and the dynamic running process comprises four stages: starting, acceleration, cruising and braking. The quantity of fuel available and battery pack energy, which are used as target vehicle energy functions for comparison with conventional systems, are simplified into a single energy target function, and the battery pack's output current is used as the control variable in the thermoelectric hybrid energy optimization model. The system's optimal battery pack output current function is resolved when its dynamic operating process is considered as part of the hybrid thermoelectric power generation system. In the experiments, the system bench is tested using conventional power and hybrid thermoelectric power for the four dynamic operation stages. The optimal battery pack curve is calculated by functional analysis. In the vehicle, a power control unit is used to control the battery pack's output current and minimize energy consumption. Data analysis shows that the fuel economy of the hybrid power system under European Driving Cycle conditions is improved by 14.7% when compared with conventional systems.
Design sensitivity analysis using EAL. Part 1: Conventional design parameters
NASA Technical Reports Server (NTRS)
Dopker, B.; Choi, Kyung K.; Lee, J.
1986-01-01
A numerical implementation of design sensitivity analysis of built-up structures is presented, using the versatility and convenience of an existing finite element structural analysis code and its database management system. The finite element code used in the implementation presented is the Engineering Analysis Language (EAL), which is based on a hybrid method of analysis. It was shown that design sensitivity computations can be carried out using the database management system of EAL, without writing a separate program or a separate database. Conventional (sizing) design parameters such as the cross-sectional area of beams or the thickness of plates and plane elastic solid components are considered. Compliance, displacement, and stress functionals are considered as performance criteria. The method presented is being extended to implement shape design sensitivity analysis using a domain method and a design component method.
Zhong, Z W; Wu, R G; Wang, Z P; Tan, H L
2015-09-01
Conventional microfluidic devices are typically complex and expensive, requiring pneumatic control systems or highly precise pumps to control the flow in the devices. This work investigates an alternative: paper-based microfluidic devices to replace conventional microfluidic devices. Size-based separation and extraction experiments were able to separate free dye from a mixed protein and dye solution. Experimental results showed that pure fluorescein isothiocyanate could be separated from a solution of mixed fluorescein isothiocyanate and fluorescein isothiocyanate-labeled bovine serum albumin. The analysis readings obtained from a spectrophotometer clearly show that the extracted tartrazine sample did not contain any Blue-BSA, because its absorbance value was 0.000 measured at a wavelength of 590 nm, which correlates to Blue-BSA. These results demonstrate that paper-based microfluidic devices, which are inexpensive and easy to implement, can potentially replace their conventional counterparts through simple geometric designs and capillary action. These findings will potentially help future developments of paper-based microfluidic devices. Copyright © 2015 Elsevier B.V. All rights reserved.
Analysis of Aluminum-Nitride SOI for High-Temperature Electronics
NASA Technical Reports Server (NTRS)
Biegel, Bryan A.; Osman, Mohamed A.; Yu, Zhiping
2000-01-01
We use numerical simulation to investigate the high-temperature (up to 500 K) operation of SOI MOSFETs with aluminum-nitride (AlN) buried insulators, rather than the conventional silicon dioxide (SiO2). Because the thermal conductivity of AlN is about 100 times that of SiO2, AlN SOI should greatly reduce the often severe self-heating problem of conventional SOI, making SOI potentially suitable for high-temperature applications. A detailed electrothermal transport model is used in the simulations and solved with a PDE solver called PROPHET. In this work, we compare the performance of AlN-based SOI with that of SiO2-based SOI and conventional MOSFETs. We find that AlN SOI does indeed remove the self-heating penalty of SOI. However, several device design trade-offs remain, which our simulations highlight.
NASA Astrophysics Data System (ADS)
Lisitsa, Y. V.; Yatskou, M. M.; Apanasovich, V. V.; Apanasovich, T. V.
2015-09-01
We have developed an algorithm for segmentation of cancer cell nuclei in three-channel luminescent images of microbiological specimens. The algorithm is based on using a correlation between fluorescence signals in the detection channels for object segmentation, which permits complete automation of the data analysis procedure. We have carried out a comparative analysis of the proposed method and conventional algorithms implemented in the CellProfiler and ImageJ software packages. Our algorithm has an object localization uncertainty which is 2-3 times smaller than for the conventional algorithms, with comparable segmentation accuracy.
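The core idea described above, using the correlation between fluorescence detection channels to segment objects, can be sketched minimally as follows. The window size, threshold, and synthetic two-channel image are assumptions for illustration only, not the published algorithm's parameters.

```python
import numpy as np

def local_correlation(ch1, ch2, w=3):
    """Pearson correlation of two image channels over a (2w+1) x (2w+1)
    sliding window, computed at each interior pixel."""
    H, W = ch1.shape
    corr = np.zeros((H, W))
    for i in range(w, H - w):
        for j in range(w, W - w):
            a = ch1[i - w:i + w + 1, j - w:j + w + 1].ravel()
            b = ch2[i - w:i + w + 1, j - w:j + w + 1].ravel()
            sa, sb = a.std(), b.std()
            if sa > 0 and sb > 0:
                corr[i, j] = ((a - a.mean()) * (b - b.mean())).mean() / (sa * sb)
    return corr

# Synthetic two-channel image: one object that co-fluoresces in both channels.
rng = np.random.default_rng(1)
ch1 = rng.normal(10.0, 1.0, (40, 40))
ch2 = rng.normal(10.0, 1.0, (40, 40))
signal = np.zeros((40, 40))
signal[15:25, 15:25] = rng.normal(50.0, 5.0, (10, 10))
ch1 = ch1 + signal
ch2 = ch2 + 0.8 * signal        # the same structure appears in both channels
mask = local_correlation(ch1, ch2) > 0.7   # segment where channels co-vary
```

Background noise is independent across channels, so its local correlation stays near zero, while the shared object structure drives the correlation toward one; thresholding the correlation map yields the segmentation mask.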
NASA Astrophysics Data System (ADS)
Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar
2017-09-01
The present work compares the dissimilarity and covariance based unsupervised chemometric classification approaches by taking the total synchronous fluorescence spectroscopy data sets acquired for the cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix.
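The two decomposition routes compared above can be sketched side by side: eigenvalue-eigenvector analysis of the covariance matrix (the conventional route) versus eigen-decomposition of the double-centered pairwise dissimilarity matrix (classical multidimensional scaling). The synthetic two-group data below stand in for the fluorescence data sets and are an assumption for illustration.

```python
import numpy as np

def pca_scores(X, k=2):
    """Conventional route: eigen-decomposition of the covariance matrix."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]          # eigh returns ascending order
    return Xc @ vecs[:, order[:k]]

def mds_scores(D, k=2):
    """Dissimilarity route (classical MDS): double-center the squared
    pairwise dissimilarity matrix and eigen-decompose it."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:k]] * np.sqrt(np.maximum(vals[order[:k]], 0))

# Hypothetical "spectra": two groups with a small mean offset, 50 variables.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (10, 50)),
               rng.normal(0.5, 1.0, (10, 50))])
D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))  # Euclidean dissimilarities
scores_cov = pca_scores(X)
scores_dis = mds_scores(D)
```

With a plain Euclidean dissimilarity the two routes coincide up to reflection; the paper's point is that a pair-wise dissimilarity chosen to emphasize between-group differences can improve class separation where the covariance route cannot.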
Charpentier, R.R.; Klett, T.R.
2005-01-01
During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the U.S. Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration-history data as possible. Third, the perils of methods that rely solely on statistics without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase its utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing the numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving.
Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
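The Monte Carlo framework of assessing numbers and sizes of undiscovered accumulations can be sketched roughly as below. The distributions, parameter values, and fractile bookkeeping are illustrative assumptions, not those used by Emc2 or ASSESS.

```python
import random

def assess_undiscovered(n_trials=20000, seed=42):
    """Monte Carlo aggregation in the spirit of the USGS framework: draw a
    number of undiscovered accumulations, draw a size for each, and sum.
    The distributions and parameters below are illustrative assumptions."""
    random.seed(seed)
    totals = []
    for _ in range(n_trials):
        n_accum = random.randint(5, 30)                 # accumulations per unit
        total = sum(random.lognormvariate(3.0, 1.0)     # accumulation size, MMBO
                    for _ in range(n_accum))
        totals.append(total)
    totals.sort()
    # Fractiles as conventionally reported: F95 = 95% chance of at least this much.
    return {"F95": totals[int(0.05 * n_trials)],
            "F50": totals[int(0.50 * n_trials)],
            "F5":  totals[int(0.95 * n_trials)]}

fractiles = assess_undiscovered()
```

Sensitivity to the size-distribution assumption, noted in the abstract, shows up directly here: changing the lognormal parameters shifts the upper fractiles far more than the median.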
Piqué, Ester; Vargas-Murga, Liliana; Gómez-Catalán, Jesús; Lapuente, Joaquin de; Llobet, Joan Maria
2013-10-01
In recent years, consumption of organic foods has become increasingly popular. Nevertheless, the safety of organic foods is still unclear and needs to be thoroughly evaluated. Patulin is a mycotoxin mainly present in rotten apples and apple-based products. The aim of this study is to analyse the content of patulin in apple juices and purees derived from organic and conventional production systems, in order to assess the risk to consumers, particularly children. A total of 93 apple-based products marketed in Catalonia were analysed, 49 of which were derived from conventional and 44 from organic farming. The results showed a higher incidence of positive samples and a higher concentration of patulin in organic apple purees compared with conventional ones. In the case of juices, significant differences were found between conventional and organic samples, but on applying a multivariate analysis the type of agriculture did not seem to make a relevant contribution to patulin occurrence, with cloudiness being the main factor involved. The estimated daily intakes of patulin for infants and young children (0-3 years old), children (4-18 years old) and adults (19-66 years old) were below the provisional maximum tolerable daily intake (PMTDI) of 0.4 μg/kg bw in all scenarios considered. Copyright © 2013 Elsevier Ltd. All rights reserved.
Dimensional changes of acrylic resin denture bases: conventional versus injection-molding technique.
Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam
2014-07-01
Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. SR-Ivocap Triplex Hot resin was used for conventional pressure-packed and SR-Ivocap High Impact was used for injection-molding techniques. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. After each water storage period, the acrylic specimens produced by injection exhibited less dimensional changes compared to those produced by the conventional technique. Curing shrinkage was compensated by water sorption with an increase in water storage time decreasing dimensional changes. Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used and SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding.
Arslan, Mustafa; Murat, Sema; Alp, Gulce; Zaimoglu, Ali
2018-01-01
The objectives of this in vitro study were to evaluate the flexural strength (FS), surface roughness (Ra), and hydrophobicity of polymethylmethacrylate (PMMA)-based computer-aided design/computer-aided manufacturing (CAD/CAM) polymers and to compare the properties of different CAD/CAM PMMA-based polymers with conventional heat-polymerized PMMA following thermal cycling. Twenty rectangular-shaped specimens (64 × 10 × 3.3 mm) were fabricated from three CAD/CAM PMMA-based polymers (M-PM Disc [M], AvaDent Puck Disc [A], and Pink CAD/CAM Disc Polident [P], and one conventional heat-polymerized PMMA (Promolux [C]), according to ISO 20795-1:2013 standards. The specimens were divided into two subgroups (n = 10), a control and a thermocycled group. The specimens in the thermocycled group were subjected to 5000 thermal cycling procedures (5 to 55°C; 30 s dwell times). The Ra value was measured using a profilometer. Contact angle (CA) was assessed using the sessile drop method to evaluate surface hydrophobicity. In addition, the FS of the specimens was tested in a universal testing machine at a crosshead speed of 1.0 mm/min. Surface texture of the materials was assessed using scanning electron microscope (SEM). The data were analyzed using two-way analysis of variance (ANOVA), followed by Tukey's HSD post-hoc test (α < 0.05). CAD/CAM PMMA-based polymers showed significantly higher FS than conventional heat-polymerized PMMA for each group (P < 0.001). CAD/CAM PMMA-based polymer [P] showed the highest FS, whereas conventional PMMA [C] showed the lowest FS before and after thermal cycling (P < 0.001). There were no significant differences among the Ra values of the tested denture base polymers in the control group (P > 0.05). In the thermocycled group, the lowest Ra value was observed for CAD/CAM PMMA-based polymer [M] (P < 0.001), whereas CAD/CAM PMMA-based polymers [A] and [P], and conventional PMMA [C] had similar Ra values (P > 0.05). 
Conventional PMMA [C] had a significantly lower CA and consequently lower hydrophobicity compared to the CAD/CAM polymers in the control group (P < 0.001). In the thermocycled group, CAD/CAM PMMA-based polymer [A] and conventional PMMA [C] had significantly higher CA, and consequently higher hydrophobicity when compared to CAD/CAM polymers [M] and [P] (P < 0.001). However, no significant differences were found among the other materials (P > 0.05). The FS and hydrophobicity of the CAD/CAM PMMA-based polymers were higher than the conventional heat-polymerized PMMA, whereas the CAD/CAM PMMA-based polymers had similar Ra values to the conventional PMMA. Thermocycling had a significant effect on FS and hydrophobicity except for the Ra of denture base materials.
Development of HAN-based Liquid Propellant Thruster
NASA Astrophysics Data System (ADS)
Hisatsune, K.; Izumi, J.; Tsutaya, H.; Furukawa, K.
2004-10-01
Many of the propellants used in conventional spacecraft propulsion systems are toxic. Given this toxicity, and considering environmental pollution and handling safety, it will be necessary to apply "green" propellants to spacecraft propulsion systems. The purpose of this study is to apply a HAN-based liquid propellant (LP1846) to a monopropellant thruster. Compared to the hydrazine used in conventional monopropellant thrusters, HAN-based propellant is not only less toxic but also delivers higher specific impulse. Moreover, HAN-based propellant can be decomposed by a catalyst, which opens the possibility of applying it to monopropellant thrusters and can lead to high reliability of the propulsion system.[1],[2] However, there are two technical obstacles to applying HAN-based propellant to a monopropellant thruster. One is the high combustion temperature: the catalyst will be damaged under high-temperature conditions. The other is the low catalytic activity, which is a serious problem for applying HAN-based propellant to the monopropellant thrusters used for spacecraft attitude control. To improve the catalytic activity of HAN-based propellant, it is necessary to screen for the best catalyst. An adsorption analysis was conducted by Monte Carlo simulation to screen candidate catalyst metals for HAN and TEAN. The analysis shows that iridium is the best catalyst metal for HAN and TEAN; iridium is the metal used in Shell 405, the catalyst of conventional monopropellant thrusters. A catalytic reaction test was then conducted to confirm the analysis, and its result agreed with the adsorption analysis, indicating that adsorption analysis is effective for screening catalyst metals. In the evaluation tests, various types of catalyst carrier were also compared with Shell 405 to improve catalytic activity. 
The test results show that the inorganic porous media are superior to Shell 405 in catalytic activity. Next, the catalyst life with the HAN-based propellant (LP1846) was evaluated: Shell 405 and the inorganic porous media catalyst were compared in a life test, which showed the inorganic porous media catalyst to be superior to Shell 405 in catalyst life. In this paper, the details of the adsorption analysis and the evaluation tests are reported.
Alternative Strategies in Assessing Special Education Needs
ERIC Educational Resources Information Center
Dykeman, Bruce F.
2006-01-01
The conventional use of standardized testing within a discrepancy analysis model is reviewed. The Response-to-Intervention (RTI) process is explained, along with descriptions of assessment procedures within RTI: functional assessment, authentic assessment, curriculum-based measurement, and play-based assessment. Psychometric issues relevant to RTI…
NASA Astrophysics Data System (ADS)
Ahn, Junkeon; Noh, Yeelyong; Park, Sung Ho; Choi, Byung Il; Chang, Daejun
2017-10-01
This study proposes a fuzzy-based FMEA (failure mode and effect analysis) for a hybrid molten carbonate fuel cell and gas turbine system for liquefied hydrogen tankers. An FMEA-based regulatory framework is adopted to analyze the non-conventional propulsion system and to understand the risk picture of the system. Since the participants in an FMEA rely on their subjective and qualitative experience, the conventional FMEA used for identifying failures that affect system performance inevitably involves inherent uncertainties. A fuzzy-based FMEA is introduced to express such uncertainties appropriately and to provide flexible access to a risk picture for a new system using fuzzy modeling. The hybrid system has 35 components and 70 potential failure modes. Significant failure modes occur in the fuel cell stack and rotary machines. The fuzzy risk priority number is used to validate the crisp risk priority number in the FMEA.
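The contrast between the crisp risk priority number (severity x occurrence x detection) and a fuzzy RPN can be sketched as a toy Mamdani-style system with triangular membership functions and centroid defuzzification. The rule base and fuzzy sets below are invented for illustration and do not reproduce the paper's model.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def crisp_rpn(s, o, d):
    """Conventional FMEA risk priority number: severity x occurrence x detection."""
    return s * o * d

def fuzzy_rpn(s, o, d, steps=1000):
    """Illustrative fuzzy RPN on a 0-1000 risk scale: fuzzify the 1-10
    ratings, take min (AND) rule strengths, clip the output sets, and
    defuzzify by centroid. The three-rule base is a toy assumption."""
    out_sets = {"low": (0, 100, 300), "med": (200, 500, 800), "high": (700, 900, 1000)}
    strength = {
        "low":  min(tri(s, 0, 1, 5),  tri(o, 0, 1, 5),  tri(d, 0, 1, 5)),
        "med":  min(tri(s, 3, 5, 8),  tri(o, 3, 5, 8),  tri(d, 3, 5, 8)),
        "high": min(tri(s, 5, 9, 13), tri(o, 5, 9, 13), tri(d, 5, 9, 13)),
    }
    num = den = 0.0
    for i in range(steps + 1):
        x = 1000.0 * i / steps
        mu = max(min(strength[k], tri(x, *out_sets[k])) for k in out_sets)
        num += mu * x
        den += mu
    return num / den if den else 0.0
```

Unlike the crisp product, which jumps discontinuously between adjacent integer ratings, the fuzzy output changes gradually, which is the usual argument for using it to express subjective rating uncertainty.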
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variations. This analysis revealed why the conventional MED and LPF methods fail to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
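A toy version of a correlation-plus-distance detection score in the spirit of the PE method is sketched below: a sample that both decorrelates from the baseline pattern and moves far from it scores high. The water-quality parameters, the relative normalization, and the equal weighting of the two terms are assumptions, not the method's actual formulation.

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def pe_score(sample, baseline):
    """Toy PE-style score: decorrelation from the baseline pattern plus the
    relative Euclidean distance to it. Equal weighting is an assumption."""
    rel = [(x - y) / y for x, y in zip(sample, baseline)]
    dist = math.sqrt(sum(r * r for r in rel))
    return (1.0 - pearson(sample, baseline)) + dist

# Hypothetical parameters: chlorine (mg/L), pH, conductivity (uS/cm), turbidity (NTU).
baseline = [0.80, 7.20, 250.0, 0.15]
normal   = [0.79, 7.19, 251.0, 0.16]
event    = [0.30, 6.50, 400.0, 0.90]
```

A pure distance detector (like MED) responds only to magnitude, so a slow multi-parameter drift that stays individually small can pass unnoticed; adding the correlation term flags samples whose parameter pattern no longer matches the baseline.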
Modified Involute Helical Gears: Computerized Design, Simulation of Meshing, and Stress Analysis
NASA Technical Reports Server (NTRS)
Handschuh, Robert (Technical Monitor); Litvin, Faydor L.; Gonzalez-Perez, Ignacio; Carnevali, Luca; Kawasaki, Kazumasa; Fuentes-Aznar, Alfonso
2003-01-01
The computerized design, methods for generation, simulation of meshing, and enhanced stress analysis of modified involute helical gears is presented. The approaches proposed for modification of conventional involute helical gears are based on conjugation of double-crowned pinion with a conventional helical involute gear. Double-crowning of the pinion means deviation of cross-profile from an involute one and deviation in longitudinal direction from a helicoid surface. Using the method developed, the pinion-gear tooth surfaces are in point-contact, the bearing contact is localized and oriented longitudinally, and edge contact is avoided. Also, the influence of errors of alignment on the shift of bearing contact, vibration, and noise are reduced substantially. The theory developed is illustrated with numerical examples that confirm the advantages of the gear drives of the modified geometry in comparison with conventional helical involute gears.
Composite Structure Modeling and Analysis of Advanced Aircraft Fuselage Concepts
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek; Sorokach, Michael R.
2015-01-01
NASA's Environmentally Responsible Aviation (ERA) project and the Boeing Company are collaborating to advance the unitized damage-arresting composite airframe technology with application to the Hybrid-Wing-Body (HWB) aircraft. The testing of a HWB fuselage section with Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) construction is presently being conducted at NASA Langley. Based on lessons learned from previous HWB structural design studies, improved finite-element models (FEM) of the HWB multi-bay and bulkhead assembly are developed to evaluate the performance of the PRSEUS construction. In order to assess the comparative weight-reduction benefits of the PRSEUS technology, conventional cylindrical skin-stringer-frame models of a cylindrical and a double-bubble section fuselage concept are developed. Stress analysis with the design cabin-pressure load and scenario-based case studies are conducted for design improvement in each case. Alternate analyses with stitched composite hat-stringers and C-frames are also presented, in addition to the foam-core sandwich frame and pultruded rod-stringer construction. The FEM structural stresses, strains and weights are computed and compared for relative weight/strength benefit assessment. The structural analysis and specific weight comparison of these stitched composite advanced aircraft fuselage concepts demonstrated that the pressurized HWB fuselage section assembly can be structurally as efficient as the conventional cylindrical fuselage section with composite stringer-frame and PRSEUS construction, and significantly better than the conventional aluminum construction and the double-bubble section concept.
Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.
Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse
2018-05-01
Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of the velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of the power spectral Capon estimator for quantitative Doppler analysis of data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations on in vivo and string phantom acquisitions. The in vivo results demonstrated that the Capon estimator can provide spectral estimates of sufficient quality for quantitative analysis of packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated from spectrograms estimated on a commercial ultrasound scanner.
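The core of the Capon (minimum-variance) estimator can be illustrated in a few lines. This is a minimal numerical sketch, not the authors' implementation: it assumes a packet of complex slow-time samples per spatial location, estimates the covariance by spatial averaging only (as the abstract describes), and adds diagonal loading for invertibility; the function name and parameters are hypothetical.

```python
import numpy as np

def capon_spectrum(ensemble, n_freq=128, loading=1e-3):
    """Capon power spectrum from a packet of slow-time Doppler samples.
    `ensemble` has shape (n_spatial, n_samples): the covariance is
    estimated by spatial averaging only, so all temporal samples stay
    available for spectral estimation."""
    n_spatial, n = ensemble.shape
    # spatially averaged sample covariance, with diagonal loading
    R = np.zeros((n, n), dtype=complex)
    for x in ensemble:
        R += np.outer(x, x.conj())
    R /= n_spatial
    R += loading * np.real(np.trace(R)) / n * np.eye(n)
    Rinv = np.linalg.inv(R)
    freqs = np.linspace(-0.5, 0.5, n_freq, endpoint=False)
    t = np.arange(n)
    power = np.empty(n_freq)
    for k, f in enumerate(freqs):
        a = np.exp(2j * np.pi * f * t)           # steering vector
        power[k] = 1.0 / np.real(np.conj(a) @ Rinv @ a)
    return freqs, power
```

A narrowband flow signal buried in noise should then produce a spectral peak at its normalized Doppler frequency.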
Flight display dynamics and compensatory head movements in pilots.
Beer, Jeremy; Freeman, David
2007-06-01
Experiments measured the optokinetic cervical reflex (OKCR), wherein the banking pilot aligns the head with the horizon. In a synthetic cockpit, the flight display was manipulated to test whether changing the visual reference frame would alter OKCR. Eight subjects (five rated pilots) flew a route in simulated visual meteorological conditions that required them to bank the aircraft frequently. Pilots' head tilt was characterized using both the conventional method of regressing against simultaneous aircraft bank, and also an event-based analysis, which identified head movements before, during, and after each turn. Three display configurations were compared to determine whether pilots' orientation would ever migrate from the horizon to the aircraft symbol. The first was a conventional "Inside-Out" condition. A "Frequency-Separated" condition combined Inside-Out horizon geometry with Outside-In dynamics for the aircraft symbol, which depicted joystick bank inputs. In the "Outside-In" condition, the aircraft symbol rolled against a static horizon. Regressions identified an interaction (p < 0.001) between display condition and aircraft bank: head tilt followed horizon tilt in Inside-Out and Frequency-Separated conditions, while remaining mostly level in the Outside-In condition. The event-based analysis identified anticipatory head movements in Inside-Out and Frequency-Separated conditions: 95% CI indicated that before each turn, head tilt favored the direction of the imminent bank. While the conventional analysis confirmed that the horizon comprises a primary spatial reference, the finer-grained event-based analysis indicated that pilots' reference can migrate at least temporarily to the vehicle, and that OKCR can be preceded by anticipatory head movements in the opposite direction.
ERIC Educational Resources Information Center
Lundy, Laura
2012-01-01
This article aims to shed light on the impact of the United Nations Convention on the Rights of the Child (CRC) on education policy in Europe. The findings are based on a documentary analysis of the published reports of the Committee on the Rights of the Child (the Committee) on the implementation of the education rights in the CRC in every EU…
We advocate an approach to reduce the anticipated increase in stormwater runoff from conventional development by demonstrating a low-impact development that incorporates hydrologic factors into an expanded land suitability analysis. This methodology was applied to a 3 hectare exp...
The German Passive: Analysis and Teaching Technique.
ERIC Educational Resources Information Center
Griffen, T. D.
1981-01-01
Proposes an analysis of German passive based upon internal structure rather than translation conventions from Latin and Greek. Claims that this approach leads to a description of the perfect participle as an adjectival complement, which eliminates the classification of a passive voice for German and simplifies the learning task. (MES)
Competition and Curriculum Diversity in Local Schooling Markets: Theory and Evidence.
ERIC Educational Resources Information Center
Adnett, Nick; Davies, Peter
2000-01-01
Based on conventional economic analysis, increasing competitive pressures on schools should promote greater curricular innovation and diversity. The United Kingdom's experience suggests that market-based reforms can initially create pressures to increase curriculum conformity in local schooling markets. Innovation incentives are greatest for…
Keystroke dynamics in the pre-touchscreen era
Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.
2013-01-01
Biometric authentication seeks to measure an individual's unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints, signatures, or iris patterns. However, whilst such methods offer a security protocol superior to password-based approaches, for example, their substantial infrastructure costs and intrusive nature make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis of the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
In this study, we introduced an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, which accumulate a given flow amount in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
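The inter-amount sampling idea can be sketched directly: accumulate flow over time and record the time needed for each successive fixed amount. The snippet below is a minimal illustration under simplifying assumptions (regularly sampled input, linear interpolation between samples); the function name and arguments are illustrative, not from the paper.

```python
import numpy as np

def inter_amount_times(flow, dt, amount):
    """Adaptive, amount-based sampling: return the time taken to
    accumulate each successive fixed flow `amount` from a regularly
    sampled flow series (flow units: amount per unit time)."""
    cum = np.concatenate(([0.0], np.cumsum(flow, dtype=float) * dt))
    t = np.arange(len(cum)) * dt
    levels = np.arange(amount, cum[-1], amount)   # accumulation targets
    crossings = np.interp(levels, cum, t)         # cum is nondecreasing
    return np.diff(np.concatenate(([0.0], crossings)))
```

High-flow periods accumulate a given amount quickly and so yield short IATs, which is exactly how this sampling gives relatively more weight to the high-flow tail.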
A Protein Domain and Family Based Approach to Rare Variant Association Analysis.
Richardson, Tom G; Shihab, Hashem A; Rivas, Manuel A; McCarthy, Mark I; Campbell, Colin; Timpson, Nicholas J; Gaunt, Tom R
2016-01-01
It has become common practice to analyse large-scale sequencing data with statistical approaches based around the aggregation of rare variants within the same gene. We applied a novel approach to rare variant analysis by collapsing variants together using protein domain and family coordinates, regarded as a more discrete definition of a biologically functional unit. Using Pfam definitions, we collapsed rare variants (Minor Allele Frequency ≤ 1%) together in three different ways: 1) variants within single genomic regions which map to individual protein domains; 2) variants within two individual protein domain regions which are predicted to be responsible for a protein-protein interaction; and 3) all variants within combined regions from multiple genes responsible for coding the same protein domain (i.e. protein families). A conventional collapsing analysis using gene coordinates was also undertaken for comparison. We used UK10K sequence data and investigated associations between regions of variants and lipid traits using the sequence kernel association test (SKAT). We observed no strong evidence of association between regions of variants based on Pfam domain definitions and lipid traits. Quantile-quantile plots illustrated that the overall distributions of p-values from the protein domain analyses were comparable to that of a conventional gene-based approach. Deviations from this distribution suggested that collapsing by either protein domain or gene definitions may be favourable depending on the trait analysed. We have collapsed rare variants together using protein domain and family coordinates to present an alternative to collapsing across conventionally used gene-based regions. Although no strong evidence of association was detected in these analyses, future studies may still find value in adopting these approaches to detect previously unidentified association signals.
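The collapsing step itself is straightforward to sketch. The snippet below is a hypothetical burden-style illustration, not the UK10K/SKAT pipeline: it counts rare minor alleles per sample inside each coordinate interval, where an interval may equally describe a gene, a single Pfam domain, or a pooled protein-family region; all names and thresholds are assumptions for illustration.

```python
import numpy as np

def collapse_by_region(genotypes, positions, mafs, regions, maf_cut=0.01):
    """Burden-style collapsing: per-sample count of rare minor alleles
    (MAF <= maf_cut) inside each region.  `regions` maps a unit name
    (gene, Pfam domain, or pooled family region) to a (start, end)
    interval on the same coordinates as `positions`; `genotypes` is
    samples x variants with 0/1/2 minor-allele counts."""
    rare = mafs <= maf_cut
    burdens = {}
    for name, (start, end) in regions.items():
        keep = rare & (positions >= start) & (positions <= end)
        burdens[name] = genotypes[:, keep].sum(axis=1)
    return burdens
```

Swapping the `regions` dictionary from gene intervals to domain or family intervals is the whole difference between the conventional and the domain-based collapsing units.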
Research on ionospheric tomography based on variable pixel height
NASA Astrophysics Data System (ADS)
Zheng, Dunyong; Li, Peiqing; He, Jie; Hu, Wusheng; Li, Chaokui
2016-05-01
A novel ionospheric tomography technique based on variable pixel height was developed for the tomographic reconstruction of the ionospheric electron density distribution. The method considers the height of each pixel as an unknown variable, which is retrieved during the inversion process together with the electron density values. In contrast to conventional computerized ionospheric tomography (CIT), which parameterizes the model with a fixed pixel height, the variable-pixel-height computerized ionospheric tomography (VHCIT) model applies a disturbance to the height of each pixel. In comparison with conventional CIT models, the VHCIT technique achieved superior results in a numerical simulation. A careful validation of the reliability and superiority of VHCIT was performed. According to the statistical analysis of the average root mean square errors, the proposed model offers an improvement of 15% compared with conventional CIT models.
Cost effectiveness of conventional versus LANDSAT land use data for hydrologic modeling
NASA Technical Reports Server (NTRS)
George, T. S.; Taylor, R. S.
1982-01-01
Six case studies were analyzed to investigate the cost effectiveness of using land use data obtained from LANDSAT as opposed to conventionally obtained data. A procedure was developed to determine the relative effectiveness of the two alternative means of acquiring data for hydrological modelling. The cost of conventionally acquired data ranged between $3,000 and $16,000 for the six test basins; information based on LANDSAT imagery cost between $2,000 and $5,000. Results of the effectiveness analysis show that the differences between the two methods are insignificant. Given the cost comparison and the fact that each method, conventional and LANDSAT, is equally effective in developing land use data for hydrologic studies, cost effectiveness is found to be a function of basin size for the six test watersheds analyzed. The LANDSAT approach is cost effective for areas larger than 10 square miles.
Memristor-based cellular nonlinear/neural network: design, analysis, and applications.
Duan, Shukai; Hu, Xiaofang; Dong, Zhekang; Wang, Lidan; Mazumder, Pinaki
2015-06-01
Cellular nonlinear/neural network (CNN) has been recognized as a powerful massively parallel architecture capable of solving complex engineering problems by performing trillions of analog operations per second. The memristor was theoretically predicted in the late seventies, but it garnered nascent research interest due to the recent much-acclaimed discovery of nanocrossbar memories by engineers at the Hewlett-Packard Laboratory. The memristor is expected to be co-integrated with nanoscale CMOS technology to revolutionize conventional von Neumann as well as neuromorphic computing. In this paper, a compact CNN model based on memristors is presented along with its performance analysis and applications. In the new CNN design, the memristor bridge circuit acts as the synaptic circuit element and substitutes the complex multiplication circuit used in traditional CNN architectures. In addition, the negative differential resistance and nonlinear current-voltage characteristics of the memristor have been leveraged to replace the linear resistor in conventional CNNs. The proposed CNN design has several merits, for example, high density, nonvolatility, and programmability of synaptic weights. The proposed memristor-based CNN design operations for implementing several image processing functions are illustrated through simulation and contrasted with conventional CNNs. Monte-Carlo simulation has been used to demonstrate the behavior of the proposed CNN due to the variations in memristor synaptic weights.
Tarai, Madhumita; Kumar, Keshav; Divya, O; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar
2017-09-05
The present work compares dissimilarity-based and covariance-based unsupervised chemometric classification approaches using total synchronous fluorescence spectroscopy data sets acquired for cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance matrix of the data set and finds the factors that explain the overall major sources of variation present in the data. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups, and hence leads to poor class separation. The present work shows that classification of such samples can be improved by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix instead. Copyright © 2017 Elsevier B.V. All rights reserved.
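The contrast between the two decomposition routes can be illustrated generically. The sketch below is not the authors' spectroscopy pipeline: it places the conventional eigendecomposition of the covariance matrix (PCA scores) next to eigendecomposition of a double-centred pairwise dissimilarity matrix (classical multidimensional scaling), one standard way to decompose a dissimilarity matrix; function names are illustrative.

```python
import numpy as np

def covariance_scores(X, k=2):
    """Conventional route: eigenvalue-eigenvector analysis of the
    covariance matrix of the data (PCA scores)."""
    Xc = X - X.mean(axis=0)
    w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(w)[::-1][:k]
    return Xc @ V[:, order]

def dissimilarity_scores(D, k=2):
    """Alternative route: eigendecomposition of the double-centred
    pairwise dissimilarity matrix (classical multidimensional scaling)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J        # Gram matrix from dissimilarities
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:k]
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))
```

With Euclidean dissimilarities the two routes are equivalent; the dissimilarity route becomes interesting precisely when a non-Euclidean, class-sensitive dissimilarity is used, as in the paper.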
NASA Technical Reports Server (NTRS)
Nichols, J. D.; Gialdini, M.; Jaakkola, S.
1974-01-01
A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. Costs for the inventory procedure, at 1.1 cents/acre, compared favorably with the costs of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in the overall classification accuracies.
Experimental study of geotextile as plinth beam in a pile group-supported modeled building frame
NASA Astrophysics Data System (ADS)
Ravi Kumar Reddy, C.; Gunneswara Rao, T. D.
2017-12-01
This paper presents the experimental results of static vertical load tests on a model building frame with geotextile as plinth beam supported by pile groups embedded in cohesionless soil (sand). The experimental results have been compared with those obtained from nonlinear FEA and the conventional method of analysis. The results revealed that, for the frame with geotextile as plinth beam, the conventional method of analysis gives a shear force about 53% higher, a bending moment at the top of the column about 17% higher, and a bending moment at the base of the column about 50-98% higher than the nonlinear FEA.
ERIC Educational Resources Information Center
Umeasiegbu, Veronica I.; Bishop, Malachy; Mpofu, Elias
2013-01-01
This article presents an analysis of the United Nations Convention on the Rights of Persons with Disabilities (CRPD) in relation to prior United Nations conventions on disability and U.S. disability policy law with a view to identifying the conventional and also the incremental advances of the CRPD. Previous United Nations conventions related to…
Commercial Crop Yields Reveal Strengths and Weaknesses for Organic Agriculture in the United States.
Kniss, Andrew R; Savage, Steven D; Jabbour, Randa
2016-01-01
Land area devoted to organic agriculture has increased steadily over the last 20 years in the United States, and elsewhere around the world. A primary criticism of organic agriculture is lower yield compared to non-organic systems. Previous analyses documenting the yield deficiency in organic production have relied mostly on data generated under experimental conditions, but these studies do not necessarily reflect the full range of innovation or practical limitations that are part of commercial agriculture. The analysis we present here offers a new perspective, based on organic yield data collected from over 10,000 organic farmers representing nearly 800,000 hectares of organic farmland. We used publicly available data from the United States Department of Agriculture to estimate yield differences between organic and conventional production methods for the 2014 production year. Similar to previous work, organic crop yields in our analysis were lower than conventional crop yields for most crops. Averaged across all crops, organic yields were 67% of conventional yields [corrected]. However, several crops had no significant difference in yields between organic and conventional production, and organic yields surpassed conventional yields for some hay crops. The organic to conventional yield ratio varied widely among crops, and in some cases, among locations within a crop. For soybean (Glycine max) and potato (Solanum tuberosum), organic yield was more similar to conventional yield in states where conventional yield was greatest. The opposite trend was observed for barley (Hordeum vulgare), wheat (Triticum aestivum), and hay crops, however, suggesting that geographical yield potential has an inconsistent effect on the organic yield gap.
Yuan, Jing; Liu, Fenghua
2017-01-01
Objective: The present study aimed to review the available evidence on whether time-lapse imaging (TLI) has favorable outcomes for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization (IVF). Methods: PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI with conventional methods. Studies that randomized either women or oocytes were included. For studies randomizing women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies randomizing oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Results: Ten RCTs were included, four randomizing oocytes and six randomizing women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group in blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94–1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study provided a live birth rate (RR 1.23, 95% CI 1.06–1.44, I2 N/A, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80–1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Conclusions: Currently there is insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. In view of the limitations and flaws of the included studies, more well-designed RCTs are needed to comprehensively evaluate the effectiveness of clinical TLI use. PMID:28570713
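Pooled relative risks like those reported here come from inverse-variance weighting of log risk ratios. The sketch below is a minimal fixed-effect version (a review with heterogeneous outcomes would normally also consider a random-effects model), and the study counts fed to it are invented for illustration.

```python
import math

def pooled_rr(studies, z=1.96):
    """Fixed-effect inverse-variance pooled relative risk with 95% CI.
    Each study is a tuple (events_treat, n_treat, events_ctrl, n_ctrl).
    Returns (pooled RR, lower CI bound, upper CI bound)."""
    num = den = 0.0
    for et, nt, ec, nc in studies:
        log_rr = math.log((et / nt) / (ec / nc))
        var = 1 / et - 1 / nt + 1 / ec - 1 / nc   # variance of log RR
        w = 1.0 / var                              # inverse-variance weight
        num += w * log_rr
        den += w
    mean, se = num / den, math.sqrt(1.0 / den)
    return tuple(math.exp(mean + d * se) for d in (0.0, -z, z))

# two hypothetical trials, each with RR = 1.5; numbers are illustrative only
rr, lo, hi = pooled_rr([(30, 100, 20, 100), (45, 150, 30, 150)])
```

A confidence interval spanning 1.0, as in the ongoing-pregnancy result above, is what "no significant difference" refers to.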
NASA Astrophysics Data System (ADS)
Yang, Lei; Yan, Hongyong; Liu, Hong
2017-03-01
Implicit staggered-grid finite-difference (ISFD) schemes are competitive for their great accuracy and stability, but their coefficients are conventionally determined by the Taylor-series expansion (TE) method, leading to a loss in numerical precision. In this paper, we modify the TE method using the minimax approximation (MA), and propose a new optimal ISFD scheme based on the modified TE (MTE) with MA method. The new ISFD scheme takes advantage of the TE method, which guarantees great accuracy at small wavenumbers, while retaining the property of the MA method that the numerical errors stay within a limited bound. Thus, it leads to great accuracy in the numerical solution of the wave equations. We derive the optimal ISFD coefficients by applying the new method to the construction of the objective function, and using a Remez algorithm to minimize its maximum. Numerical analysis made in comparison with the conventional TE-based ISFD scheme indicates that the MTE-based ISFD scheme with appropriate parameters can widen the wavenumber range with high accuracy, and achieve greater precision than the conventional ISFD scheme. The numerical modeling results also demonstrate that the MTE-based ISFD scheme performs well in elastic wave simulation, and is more efficient than the conventional ISFD scheme for elastic modeling.
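Only the conventional TE baseline is compact enough to sketch here; the paper's contribution replaces this fit with a minimax-constrained one solved by a Remez algorithm. Assuming the standard centered explicit staggered-grid stencil (a simplification of the implicit scheme the paper treats), the TE coefficients follow from a small linear system:

```python
import numpy as np

def te_staggered_coeffs(M):
    """Conventional Taylor-expansion (TE) coefficients c_1..c_M for the
    staggered-grid first derivative
        f'(x) ~ (1/h) * sum_m c_m [f(x+(m-1/2)h) - f(x-(m-1/2)h)].
    Matching the first M odd Taylor terms yields an M x M linear system:
        2 * sum_m c_m (m-1/2)       = 1      (first-derivative term)
            sum_m c_m (m-1/2)^(2j-1) = 0,  j = 2..M  (cancel higher terms)."""
    b = np.arange(1, M + 1) - 0.5
    A = np.array([b ** (2 * j - 1) for j in range(1, M + 1)])
    rhs = np.zeros(M)
    rhs[0] = 0.5
    return np.linalg.solve(A, rhs)
```

For M = 2 this reproduces the familiar fourth-order staggered coefficients 9/8 and -1/24; the MTE approach would instead trade some of this small-wavenumber exactness for a bounded error over a wider wavenumber band.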
Optimization and Validation of Rotating Current Excitation with GMR Array Sensors for Riveted
2016-09-16
distribution. Simulation results, using both an optimized coil and a conventional coil, are generated using the finite element method (FEM) model. The signal magnitude for an optimized coil is seen to be … 4. Model Based Performance Analysis. A 3D finite element model (FEM) is used to analyze the performance of the optimized coil and
Sahm, Maik; Otto, Ronny; Pross, Matthias; Mantke, Rene
2018-06-25
Approximately 90,000 thyroid operations are performed in Germany each year. Minimally invasive video-assisted thyroidectomy (MIVAT) accounts for 5-10% of these operations. There are few data comparing long-term cosmetic results after MIVAT with those after conventional surgery, and current systematic reviews show no advantage for MIVAT. The goal of this study was to analyse the long-term postoperative results of both procedures and to evaluate the relevant factors. The analysis of the long-term results is based on follow-up examinations using a validated method for scar appraisal (POSAS). Cohort analysis was performed on MIVAT operations carried out in our hospital between 2004 and 2011 and on conventional thyroid operations in 2011. Follow-up examination data were analysed from 117 patients in the MIVAT group and 102 patients in the conventional group. The follow-up examination was performed at a mean of 23.1 vs. 23.6 months postoperatively (MIVAT vs. conventional). The Friedman test showed that scar pigmentation (mean rank 4.79) and scar surface structure (mean rank 3.62) were the deciding factors influencing the long-term cosmetic results. Both MIVAT and conventional surgery gave very good long-term cosmetic results. From the patient's perspective, there is no significant advantage with conventional surgery. The evaluation of the long-term results largely depends on factors such as scar pigmentation and surface structure that can be influenced only to a limited extent by the surgical procedure. Georg Thieme Verlag KG Stuttgart · New York.
[An improved medical image fusion algorithm and quality evaluation].
Chen, Meiling; Tao, Ling; Qian, Zhiyu
2009-08-01
Medical image fusion is of great value in medical image analysis and diagnosis. In this paper, the conventional method of wavelet fusion is improved and a new algorithm for medical image fusion is presented, in which the high-frequency and low-frequency coefficients are treated separately. When high-frequency coefficients are chosen, the regional edge intensities of each sub-image are calculated to realize adaptive fusion. The choice of low-frequency coefficients is based on the edges of the images, so that the fused image preserves all useful information and appears more distinct. We apply the conventional and the improved fusion algorithms based on the wavelet transform to fuse two images of the human body, and evaluate the fusion results with a quality evaluation method. Experimental results show that this algorithm can effectively retain the detailed information of the original images and enhance their edge and texture features. The new algorithm is better than the conventional fusion algorithm based on the wavelet transform.
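The wavelet-domain fusion scheme can be sketched with a one-level Haar transform. This is a simplified stand-in, not the paper's algorithm: it averages the low-frequency band and keeps the larger-magnitude high-frequency coefficient, whereas the paper adapts both choices using regional edge intensities and image edges; all function names are illustrative.

```python
import numpy as np

def haar2(img):
    """One-level 2-D Haar transform -> (LL, (LH, HL, HH))."""
    a = (img[0::2] + img[1::2]) / 2.0     # row averages
    d = (img[0::2] - img[1::2]) / 2.0     # row differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, (LH, HL, HH)

def ihaar2(LL, bands):
    """Inverse of haar2 (perfect reconstruction)."""
    LH, HL, HH = bands
    h, w = LL.shape
    a = np.empty((h, 2 * w)); d = np.empty((h, 2 * w))
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    img = np.empty((2 * h, 2 * w))
    img[0::2], img[1::2] = a + d, a - d
    return img

def fuse(img1, img2):
    """Simplified wavelet fusion: average the low-frequency band and
    keep the larger-magnitude high-frequency coefficient."""
    LL1, B1 = haar2(img1)
    LL2, B2 = haar2(img2)
    LL = (LL1 + LL2) / 2.0
    bands = tuple(np.where(np.abs(b1) >= np.abs(b2), b1, b2)
                  for b1, b2 in zip(B1, B2))
    return ihaar2(LL, bands)
```

The paper's refinement replaces the per-coefficient max rule with a comparison of regional edge intensities, so that coefficients are selected per neighbourhood rather than per pixel.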
Biscotti, C V; Hollow, J A; Toddy, S M; Easley, K A
1995-08-01
Paired fine-needle aspiration specimens were analyzed from 41 surgically resected thyroid nodules, to compare diagnostic accuracy, amount (absent, mild, moderate, or marked) and pattern (diffuse, droplets, or both) of colloid, nuclear detail (poor, satisfactory, or excellent), and cytoplasmic detail (intact or disrupted) in ThinPrep (TP) (Cytyc, Marlborough, MA) versus conventional smear (CS) cytologic preparations. The 41 surgical specimens included 25 colloid nodules, 6 papillary carcinomas, 4 follicular adenomas, 2 minimally invasive (encapsulated) follicular carcinomas, 3 Hashimoto's thyroiditis, and 1 Graves' disease. Both techniques identified seven of the eight carcinomas, with the minimally invasive follicular carcinomas categorized as hypercellular follicular nodule, possibly malignant (HCFN). One papillary carcinoma was classified as a HCFN by both TP and CS techniques. The four follicular adenomas were classified as HCFN based on the TP slides. One oxyphilic follicular adenoma, associated with focal lymphocytic thyroiditis, was misinterpreted as Hashimoto's thyroiditis on a conventional smear. Three colloid nodules were interpreted as HCFN based on the TP slides; two of these were similarly classified based on the conventional smear. ThinPrep slides contained less colloid, and the colloid occurred as droplets rather than in a diffuse pattern. TP slides had better nuclear detail but more often disrupted cytoplasm. In conclusion, the TP process does alter some cellular features; however, we experienced similar diagnostic accuracy with the TP and conventional smear preparations.
CFD analysis of heat transfer performance of graphene based hybrid nanofluid in radiators
NASA Astrophysics Data System (ADS)
Bharadwaj, Bharath R.; Sanketh Mogeraya, K.; Manjunath, D. M.; Rao Ponangi, Babu; Rajendra Prasad, K. S.; Krishna, V.
2018-04-01
Cooling systems are among the critical systems that need attention for improved performance of an automobile engine: with increased capacity to carry away large amounts of waste heat, the performance of an engine is increased. Current research on nanofluids suggests that they offer higher heat transfer rates compared with conventional coolants. Hence this project investigates the use of hybrid nanofluids in radiators so as to increase heat transfer performance. Carboxyl graphene and graphene oxide based nanoparticles were selected due to the very high thermal conductivity of graphene. System analysis of the radiator was performed by considering a small part of the whole automobile radiator, modelled using Siemens NX. CFD analysis was conducted using ANSYS FLUENT® for the defined nanofluid, and the increase in effectiveness was compared with that of conventional coolants. Usage of such nanofluids for a fixed cooling requirement can in the future lead to significant downsizing of the radiator.
Adjoint Sensitivity Analysis for Scale-Resolving Turbulent Flow Solvers
NASA Astrophysics Data System (ADS)
Blonigan, Patrick; Garai, Anirban; Diosady, Laslo; Murman, Scott
2017-11-01
Adjoint-based sensitivity analysis methods are powerful design tools for engineers who use computational fluid dynamics. In recent years, these engineers have started to use scale-resolving simulations like large-eddy simulations (LES) and direct numerical simulations (DNS), which resolve more scales in complex flows with unsteady separation and jets than the widely-used Reynolds-averaged Navier-Stokes (RANS) methods. However, the conventional adjoint method computes large, unusable sensitivities for scale-resolving simulations, which unlike RANS simulations exhibit the chaotic dynamics inherent in turbulent flows. Sensitivity analysis based on least-squares shadowing (LSS) avoids the issues encountered by conventional adjoint methods, but has a high computational cost even for relatively small simulations. The following talk discusses a more computationally efficient formulation of LSS, ``non-intrusive'' LSS, and its application to turbulent flows simulated with a discontinuous-Galerkin spectral-element-method LES/DNS solver. Results are presented for the minimal flow unit, a turbulent channel flow with a limited streamwise and spanwise domain.
Amidžić Klarić, Daniela; Klarić, Ilija; Mornar, Ana; Velić, Darko; Velić, Natalija
2015-08-01
This study presents data on the content of 21 minerals and heavy metals in 15 blackberry wines made from conventionally and organically grown blackberries. The objective of this study was to classify the blackberry wine samples based on their mineral composition and the cultivation method of the starting raw material by using chemometric analysis. The metal content of the Croatian blackberry wine samples was determined by AAS after dry ashing. The comparison between the organic and conventional groups of investigated blackberry wines showed a statistically significant difference in the concentrations of Si and Li, with the organic group containing higher concentrations of these elements. According to multivariate data analysis, the model based on the original metal content data set finally included seven original variables (K, Fe, Mn, Cu, Ba, Cd and Cr) and gave a satisfactory separation of the two cultivation methods of the starting raw material.
Vision-Based UAV Flight Control and Obstacle Avoidance
2006-01-01
denoted it by Vb = (Vb1, Vb2, Vb3). Fig. 2 shows the block diagram of the proposed vision-based motion analysis and obstacle avoidance system. We denote...structure analysis often involve computation-intensive computer vision tasks, such as feature extraction and geometric modeling. Computation-intensive...First, we extract a set of features from each block. 2) Second, we compute the distance between these two sets of features. In conventional motion
ERIC Educational Resources Information Center
Pulz, Michael; Lusti, Markus
PROJECTTUTOR is an intelligent tutoring system that enhances conventional classroom instruction by teaching problem solving in project planning. The domain knowledge covered by the expert module is divided into three functions: structural analysis identifies the activities that make up the project; time analysis computes the earliest and latest…
Acoustic analysis of the propfan
NASA Technical Reports Server (NTRS)
Farassat, F.; Succi, G. P.
1979-01-01
A review of propeller noise prediction technology is presented. Two methods for the prediction of the noise from conventional and advanced propellers in forward flight are described. These methods are based on different time domain formulations. Brief descriptions of the computer algorithms based on these formulations are given. The output of the programs (the acoustic pressure signature) was Fourier analyzed to get the acoustic pressure spectrum. The main difference between the two programs is that one can handle propellers with supersonic tip speed while the other is for subsonic tip speed propellers. Comparisons of the calculated and measured acoustic data for a conventional and an advanced propeller show good agreement in general.
Taguchi, Y-h; Iwadate, Mitsuo; Umeyama, Hideaki
2015-04-30
Feature extraction (FE) is difficult, particularly if there are more features than samples, as small sample numbers often result in biased outcomes or overfitting. Furthermore, multiple sample classes often complicate FE because evaluating performance, as is usual in supervised FE, is generally harder than in the two-class problem. Developing unsupervised methods that are independent of sample classification would solve many of these problems. Two principal component analysis (PCA)-based FE methods were tested as sample-classification-independent unsupervised FE: variational Bayes PCA (VBPCA), which was extended here to perform unsupervised FE, and conventional PCA (CPCA)-based unsupervised FE. VBPCA- and CPCA-based unsupervised FE both performed well when applied to simulated data and to a posttraumatic stress disorder (PTSD)-mediated heart disease data set that had multiple categorical class observations in mRNA/microRNA expression of stressed mouse heart. A critical set of PTSD miRNAs/mRNAs was identified that shows aberrant expression between treatment and control samples and significant negative correlation with one another. Moreover, greater stability and biological feasibility than conventional supervised FE were also demonstrated. Based on the results obtained, in silico drug discovery was performed as translational validation of the methods. Our two proposed unsupervised FE methods (CPCA- and VBPCA-based) worked well on simulated data and outperformed two conventional supervised FE methods on a real data set. Thus, the two methods appear equivalent for FE on categorical multiclass data sets, with potential translational utility for in silico drug discovery.
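As a rough illustration of the CPCA-based unsupervised FE idea described above, the sketch below runs PCA on a feature-by-sample matrix and ranks features by their squared loadings on the leading components. The ranking criterion (squared loadings over the first two components), component count, and synthetic data are all illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch of PCA-based unsupervised feature extraction: rank
# features by the magnitude of their loadings on the leading principal
# components, without using any class labels.
import numpy as np

rng = np.random.default_rng(0)
# 100 features x 20 samples; features 0-4 carry a shared signal
X = rng.normal(size=(100, 20))
signal = rng.normal(size=20)
X[:5] += 3.0 * signal

# Centre each feature across samples, then PCA via SVD
Xc = X - X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Score features by squared loadings on the first two components
loadings = U[:, :2] * s[:2]
score = (loadings ** 2).sum(axis=1)
top10 = set(np.argsort(score)[::-1][:10])
print(sorted(top10))
```

The selected set should contain the five signal-carrying features; in practice a threshold on the score (or an outlier test on the loadings) would replace the fixed top-10 cut used here.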
Evaluation of a HDR image sensor with logarithmic response for mobile video-based applications
NASA Astrophysics Data System (ADS)
Tektonidis, Marco; Pietrzak, Mateusz; Monnin, David
2017-10-01
The performance of mobile video-based applications using conventional LDR (Low Dynamic Range) image sensors highly depends on the illumination conditions. As an alternative, HDR (High Dynamic Range) image sensors with logarithmic response are capable to acquire illumination-invariant HDR images in a single shot. We have implemented a complete image processing framework for a HDR sensor, including preprocessing methods (nonuniformity correction (NUC), cross-talk correction (CTC), and demosaicing) as well as tone mapping (TM). We have evaluated the HDR sensor for video-based applications w.r.t. the display of images and w.r.t. image analysis techniques. Regarding the display we have investigated the image intensity statistics over time, and regarding image analysis we assessed the number of feature correspondences between consecutive frames of temporal image sequences. For the evaluation we used HDR image data recorded from a vehicle on outdoor or combined outdoor/indoor itineraries, and we performed a comparison with corresponding conventional LDR image data.
Lazzaro, Carlo; Bordonaro, Roberto; Cognetti, Francesco; Fabi, Alessandra; De Placido, Sabino; Arpino, Grazia; Marchetti, Paolo; Botticelli, Andrea; Pronzato, Paolo; Martelli, Elisa
2013-01-01
Purpose: Paclitaxel albumin (nab-paclitaxel) is a nanoparticle albumin-bound paclitaxel formulation aimed at increasing the therapeutic index in metastatic breast cancer. When compared to conventional paclitaxel, nab-paclitaxel has a reported longer time to progression, higher response, lower incidence of neutropenia, no need for premedication, shorter time of administration, and, in pretreated metastatic breast cancer patients, extended overall survival. This study investigates the cost-effectiveness of nab-paclitaxel versus conventional paclitaxel for pretreated metastatic breast cancer patients in Italy. Materials and methods: A Markov model with progression-free, progressed, and dead states was developed to estimate costs, outcomes, and quality-adjusted life years over 5 years from the Italian National Health Service viewpoint. Patients were assumed to receive nab-paclitaxel 260 mg/m2 or conventional paclitaxel 175 mg/m2, each every three weeks. Data on health care resource consumption were collected from a convenience sample of five Italian centers. Resources were valued in 2011 Euros (€). Published utility weights were applied to health states to estimate the impact of response, disease progression, and adverse events on quality-adjusted life years. Three sensitivity analyses tested the robustness of the base case incremental cost-effectiveness ratio (ICER). Results and conclusion: Compared to conventional paclitaxel, nab-paclitaxel gains an extra 0.165 quality-adjusted life years (0.265 life years saved) and incurs additional costs of €2506 per patient treated. This translates to an ICER of €15,189 (95% confidence interval: €11,891–€28,415). One-way sensitivity analysis underscores that the ICER for nab-paclitaxel remains stable despite varying taxane costs. Threshold analysis shows that the ICER for nab-paclitaxel exceeds €40,000 only if the cost per mg of conventional paclitaxel is set to zero. Probabilistic sensitivity analysis highlights that nab-paclitaxel has a 0.99 probability of being cost-effective at a threshold value of €40,000 and is the optimal alternative from a threshold value of €16,316 onwards. Based on these findings, nab-paclitaxel can be considered highly cost-effective when compared to the acceptability range for ICERs proposed by the Italian Health Economics Association (€25,000–€40,000). PMID:23610525
Saam, Tobias; Herzen, Julia; Hetterich, Holger; Fill, Sandra; Willner, Marian; Stockmar, Marco; Achterhold, Klaus; Zanette, Irene; Weitkamp, Timm; Schüller, Ulrich; Auweter, Sigrid; Adam-Neumair, Silvia; Nikolaou, Konstantin; Reiser, Maximilian F.; Pfeiffer, Franz; Bamberg, Fabian
2013-01-01
Objectives: Phase-contrast imaging is a novel X-ray-based technique that provides enhanced soft tissue contrast. The aim of this study was to evaluate the feasibility of visualizing human carotid arteries by grating-based phase-contrast tomography (PC-CT) at two different experimental set-ups: (i) applying synchrotron radiation and (ii) using a conventional X-ray tube. Materials and Methods: Five ex-vivo carotid artery specimens were examined with PC-CT either at the European Synchrotron Radiation Facility using a monochromatic X-ray beam (2 specimens; 23 keV; pixel size 5.4 µm) or at a laboratory set-up with a conventional X-ray tube (3 specimens; 35-40 kVp; 70 mA; pixel size 100 µm). Tomographic images were reconstructed and compared to histopathology. Two independent readers determined vessel dimensions and one reader determined signal-to-noise ratios (SNR) between PC-CT and absorption images. Results: In total, 51 sections were included in the analysis. Images from both set-ups provided sufficient contrast to differentiate individual vessel layers. All PCI-based measurements strongly predicted but significantly overestimated lumen, intima, and vessel wall area for both the synchrotron and the laboratory-based measurements as compared with histology (all p<0.001 with slope >0.53 per mm2; 95%-CI: 0.35 to 0.70). Although synchrotron-based images were characterized by higher SNRs than laboratory-based images, both PC-CT set-ups had superior SNRs compared to corresponding conventional absorption-based images (p<0.001). Inter-reader reproducibility was excellent (ICCs >0.98 and >0.84 for synchrotron and laboratory-based measurements, respectively). Conclusion: Experimental PC-CT of carotid specimens is feasible with both synchrotron and conventional X-ray sources, producing high-resolution images suitable for vessel characterization and atherosclerosis research. PMID:24039969
Effects of 99mTc-TRODAT-1 drug template on image quantitative analysis
Yang, Bang-Hung; Chou, Yuan-Hwa; Wang, Shyh-Jen; Chen, Jyh-Cheng
2018-01-01
99mTc-TRODAT-1 is a drug that binds to dopamine transporters in living organisms and is often used in SPECT imaging to observe changes in dopamine activity uptake in the striatum. Therefore, it is currently widely used in studies on the clinical diagnosis of Parkinson's disease (PD) and movement-related disorders. In conventional 99mTc-TRODAT-1 SPECT image evaluation, visual inspection or manual selection of ROIs for semiquantitative analysis is mainly used to observe and evaluate the degree of striatal defects. However, these methods depend on the subjective opinions of observers, which leads to human error, and they have shortcomings such as long duration, increased effort, and low reproducibility. To solve this problem, this study aimed to establish an automatic semiquantitative analytical method for 99mTc-TRODAT-1. This method combines three drug templates (one built-in SPECT template in the SPM software and two self-generated MRI-based and HMPAO-based TRODAT-1 templates) for the semiquantitative analysis of striatal phantom and clinical images. At the same time, the results of automatic analysis with the three templates were compared with results from a conventional manual analysis to examine the feasibility of automatic analysis and the effects of the drug templates on automatic semiquantitative analysis results. After comparison, it was found that the MRI-based TRODAT-1 template generated from MRI images is the most suitable template for 99mTc-TRODAT-1 automatic semiquantitative analysis. PMID:29543874
Liu, Tao; Thibos, Larry; Marin, Gildas; Hernandez, Martha
2014-01-01
Conventional aberration analysis by a Shack-Hartmann aberrometer is based on the implicit assumption that an injected probe beam reflects from a single fundus layer. In fact, the biological fundus is a thick reflector and therefore conventional analysis may produce errors of unknown magnitude. We developed a novel computational method to investigate this potential failure of conventional analysis. The Shack-Hartmann wavefront sensor was simulated by computer software and used to recover by two methods the known wavefront aberrations expected from a population of normally-aberrated human eyes and bi-layer fundus reflection. The conventional method determines the centroid of each spot in the SH data image, from which wavefront slopes are computed for least-squares fitting with derivatives of Zernike polynomials. The novel 'global' method iteratively adjusted the aberration coefficients derived from conventional centroid analysis until the SH image, when treated as a unitary picture, optimally matched the original data image. Both methods recovered higher order aberrations accurately and precisely, but only the global algorithm correctly recovered the defocus coefficients associated with each layer of fundus reflection. The global algorithm accurately recovered Zernike coefficients for mean defocus and bi-layer separation with maximum error <0.1%. The global algorithm was robust for bi-layer separation up to 2 dioptres for a typical SH wavefront sensor design. For 100 randomly generated test wavefronts with 0.7 D axial separation, the retrieved mean axial separation was 0.70 D with standard deviations (S.D.) of 0.002 D. Sufficient information is contained in SH data images to measure the dioptric thickness of dual-layer fundus reflection. 
The global algorithm is superior since it successfully recovered the focus value associated with both fundus layers even when their separation was too small to produce clearly separated spots, while the conventional analysis misrepresents the defocus component of the wavefront aberration as the mean defocus for the two reflectors. Our novel global algorithm is a promising method for SH data image analysis in clinical and visual optics research for human and animal eyes. © 2013 The Authors Ophthalmic & Physiological Optics © 2013 The College of Optometrists.
Droplet Microarray Based on Superhydrophobic-Superhydrophilic Patterns for Single Cell Analysis.
Jogia, Gabriella E; Tronser, Tina; Popova, Anna A; Levkin, Pavel A
2016-12-09
Single-cell analysis provides fundamental information on individual cell responses to different environmental cues and is of growing interest in cancer and stem cell research. However, existing methods still face challenges in performing such analyses in a manner that is both high-throughput and cost-effective. Here we established the Droplet Microarray (DMA) as a miniaturized screening platform for high-throughput single-cell analysis. Using the method of limiting dilution and varying cell density and seeding time, we optimized the distribution of single cells on the DMA. We established culturing conditions for single cells in individual droplets on the DMA, obtaining survival of nearly 100% of single cells and doubling times comparable with those of cells cultured in bulk using conventional methods. Our results demonstrate that the DMA is a suitable platform for single-cell analysis, which carries a number of advantages compared with existing technologies, allowing for treatment, staining, and spot-to-spot analysis of single cells over time using conventional analysis methods such as microscopy.
Computer-Based Instruction and Health Professions Education: A Meta-Analysis of Outcomes.
ERIC Educational Resources Information Center
Cohen, Peter A.; Dacanay, Lakshmi S.
1992-01-01
The meta-analytic techniques of G. V. Glass were used to statistically integrate findings from 47 comparative studies on computer-based instruction (CBI) in health professions education. A clear majority of the studies favored CBI over conventional methods of instruction. Results show higher-order applications of computers to be especially…
Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms
ERIC Educational Resources Information Center
Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy
2005-01-01
Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…
Nonlinear constitutive theory for turbine engine structural analysis
NASA Technical Reports Server (NTRS)
Thompson, R. L.
1982-01-01
A number of viscoplastic constitutive theories and a conventional constitutive theory are evaluated and compared in their ability to predict nonlinear stress-strain behavior in gas turbine engine components at elevated temperatures. Specific application of these theories is directed towards the structural analysis of combustor liners undergoing transient, cyclic, thermomechanical load histories. The combustor liner material considered in this study is Hastelloy X. The material constants for each of the theories (as a function of temperature) are obtained from existing, published experimental data. The viscoplastic theories and a conventional theory are incorporated into a general purpose, nonlinear, finite element computer program. Several numerical examples of combustor liner structural analysis using these theories are given to demonstrate their capabilities. Based on the numerical stress-strain results, the theories are evaluated and compared.
Basic relationships for LTA economic analysis
NASA Technical Reports Server (NTRS)
Ausrotas, R. A.
1975-01-01
Operating costs for conventional lighter-than-air (LTA) craft, based on data from actual and proposed airships, are presented. An economic comparison of LTA with the B-47F is included, and possible LTA economic trends are discussed.
Simulation-based sensitivity analysis for non-ignorably missing data.
Yin, Peng; Shi, Jian Q
2017-01-01
Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) depend on assumptions or parameters (input) about the missing data, i.e. the missing-data mechanism; we call models subject to this uncertainty sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define simple and interpretable statistical quantities to assess the sensitivity models and enable evidence-based analysis. In this paper we propose a novel approach that investigates the plausibility of each missing-data mechanism model assumption by comparing the datasets simulated from various MNAR models with the observed data non-parametrically, using K-nearest-neighbour distances. Some asymptotic theory is also provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, to select plausible values and reject unlikely ones, instead of considering all proposed values of the sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
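The core idea above can be sketched as follows: simulate a dataset under each candidate sensitivity-parameter value and score its similarity to the observed data by an average nearest-neighbour distance, treating smaller distances as marking more plausible parameter values. The Gaussian location model and the exact distance definition below are illustrative assumptions, not the authors' formulation.

```python
# Sketch: rank candidate sensitivity-parameter values by how closely
# data simulated under each value resembles the observed data,
# measured by the mean nearest-neighbour distance.
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(loc=0.0, scale=1.0, size=200)

def knn_distance(sim, obs, k=1):
    """Mean distance from each observed point to its k-th nearest simulated point."""
    d = np.abs(obs[:, None] - sim[None, :])   # pairwise distances
    return np.sort(d, axis=1)[:, k - 1].mean()

# Candidate sensitivity-parameter values (here: the simulated mean)
candidates = [-2.0, 0.0, 2.0]
scores = {m: knn_distance(rng.normal(loc=m, size=200), observed)
          for m in candidates}
best = min(scores, key=scores.get)
print(best)   # the most plausible candidate value
```

In the paper's setting the simulation step would draw from a fitted MNAR model for each sensitivity-parameter value rather than from a simple Gaussian, but the comparison-by-distance step is the same in spirit.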
VHH antibodies: Emerging reagents for the analysis of environmental chemicals
Bever, Candace S.; Dong, Jie-Xian; Vasylieva, Natalia; Barnych, Bogdan; Cui, Yongliang; Xu, Zhen-Lin; Hammock, Bruce D.; Gee, Shirley J.
2016-01-01
A VHH antibody (or nanobody) is the antigen-binding fragment of a heavy-chain-only antibody. Discovered nearly 25 years ago, VHHs have been investigated for use in clinical therapeutics and immunodiagnostics, and more recently for environmental monitoring applications. As a new and valuable immunoreagent for the analysis of low-molecular-weight environmental chemicals, VHHs can overcome many pitfalls encountered with conventional reagents. In the work so far, VHH antibodies often perform comparably to conventional antibodies for small molecule analysis, are amenable to numerous genetic engineering techniques, and are easily adapted to other immunodiagnostic platforms for use in environmental monitoring. Recent reviews cover the structure and production of VHH antibodies as well as their use in clinical settings. However, no report focuses on the use of VHH antibodies for small environmental chemicals (MW <1,500 Da). This review article summarizes the efforts made to produce VHHs against various environmental targets, compares VHH-based assays with conventional antibody assays, and discusses the advantages and limitations of developing these new antibody reagents, particularly for small molecule targets. PMID:27209591
NASA Astrophysics Data System (ADS)
Kusumo, B. H.; Sukartono, S.; Bustan, B.
2018-02-01
Measuring soil organic carbon (C) using conventional analysis is a tedious, time-consuming, and expensive procedure; a simple procedure that is cheap and saves time is needed. Near-infrared technology offers a rapid alternative, as it works from the soil spectral reflectance and without any chemicals. The aim of this research was to test whether this technology is able to rapidly measure soil organic C in rice paddy fields. Soil samples were collected from rice paddy fields of Lombok Island, Indonesia, and the coordinates of the samples were recorded. Part of each sample was analysed using conventional analysis (Walkley and Black) and another part was scanned using near-infrared spectroscopy (NIRS) to collect soil spectra. Partial least squares regression (PLSR) models were developed using the soil C data from the conventional analysis and the soil spectral reflectance data. The models were moderately successful at measuring soil C in the rice paddy fields of Lombok Island. This shows that NIR technology can be further used to monitor C change in rice paddy soil.
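A PLSR calibration of the kind described can be sketched with a compact NIPALS-style PLS1 in NumPy: reference soil C values (here standing in for the Walkley-Black results) are regressed on spectra through a small number of latent components. The synthetic "spectra", band count, and component number below are assumptions for illustration, not the study's data or settings.

```python
# Minimal PLS1 (NIPALS deflation) calibrating a response against
# many collinear predictor bands, as in NIR spectroscopy.
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS1 by NIPALS deflation; returns the regression vector."""
    X, y = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = X.T @ y
        w /= np.linalg.norm(w)
        t = X @ w
        tt = t @ t
        p = X.T @ t / tt
        c = (y @ t) / tt
        X -= np.outer(t, p)   # deflate the spectra
        y = y - t * c         # deflate the response
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q

rng = np.random.default_rng(2)
n, bands = 200, 50
spectra = rng.normal(size=(n, bands))        # stand-in "NIR spectra"
true_b = np.zeros(bands)
true_b[[5, 20, 35]] = [1.0, -0.5, 0.8]       # three informative bands
soil_c = spectra @ true_b + 0.01 * rng.normal(size=n)

Xc = spectra - spectra.mean(axis=0)
B = pls1_fit(Xc, soil_c - soil_c.mean(), n_comp=8)
pred = Xc @ B + soil_c.mean()
r2 = 1 - ((soil_c - pred) ** 2).sum() / ((soil_c - soil_c.mean()) ** 2).sum()
print(round(r2, 3))
```

With real spectra the number of components would be chosen by cross-validation, and model quality reported via R² and RMSE on held-out samples, which is roughly what "moderately successful" summarizes in the abstract.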
Dutta, Sunil W; Bauer-Nilsen, Kristine; Sanders, Jason C; Trifiletti, Daniel M; Libby, Bruce; Lash, Donna H; Lain, Melody; Christodoulou, Deborah; Hodge, Constance; Showalter, Timothy N
To evaluate the delivery cost of frequently used radiotherapy options offered to patients with intermediate- to high-risk prostate cancer using time-driven activity-based costing, and to compare the results with Medicare reimbursement and relative value units (RVUs). Process maps were created to represent each step of prostate radiotherapy treatment at our institution. Salary data, equipment purchase costs, and consumable costs were factored into the cost analysis. The capacity cost rate was determined for each resource and calculated for each treatment option from initial consultation to completion. Treatment options included low-dose-rate brachytherapy (LDR-BT), combined high-dose-rate brachytherapy single-fraction boost with 25-fraction intensity-modulated radiotherapy (HDR-BT-IMRT), moderately hypofractionated 28-fraction IMRT, conventionally fractionated 39-fraction IMRT, and conventionally fractionated (2 Gy/fraction) 23-fraction pelvis irradiation with a 16-fraction prostate boost. The total cost to deliver LDR-BT, HDR-BT-IMRT, moderately hypofractionated 28-fraction IMRT, conventionally fractionated 39-fraction IMRT, and conventionally fractionated (2 Gy/fraction) 23-fraction pelvis irradiation with a 16-fraction prostate boost was $2719, $6517, $4173, $5507, and $5663, respectively. Total reimbursement for each course was $3123, $10,156, $7862, $9725, and $10,377, respectively. Radiation oncology attending time was 1.5-2 times higher for treatment courses incorporating BT. Attending radiation oncologist time consumed per RVU was higher with BT (4.83 and 2.56 minutes per RVU generated for LDR-BT and HDR-BT-IMRT, respectively) than without BT (1.41-1.62 minutes per RVU). Time-driven activity-based costing analysis identified higher delivery costs associated with prostate BT compared with IMRT alone.
In light of recent guidelines promoting BT for intermediate- to high-risk disease, re-evaluation of payment policies is warranted to encourage BT delivery. Copyright © 2018 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
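The time-driven activity-based costing arithmetic described above reduces to computing a capacity cost rate (resource cost per available minute) for each resource and summing rate × minutes over the process-map steps of a treatment course. All salary, capacity, and time figures below are invented for illustration and are not the study's data.

```python
# Time-driven activity-based costing in miniature:
# capacity cost rate = resource cost / available minutes,
# course cost = sum of (rate x minutes) over process-map steps.

# Resource: (annual cost in $, available minutes per year) -- hypothetical
resources = {
    "attending": (400_000, 100_000),
    "therapist": (90_000, 100_000),
    "linac":     (300_000, 150_000),
}
rate = {r: cost / minutes for r, (cost, minutes) in resources.items()}

# Process map for one hypothetical course: (resource, minutes) steps
course = [("attending", 60), ("therapist", 300), ("linac", 400)]
total = sum(rate[r] * m for r, m in course)
print(round(total, 2))   # → 1310.0
```

The study applies exactly this rate-times-time logic, but with process maps spanning every step from initial consultation to course completion for each of the five treatment options.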
Géczi, Gábor; Horváth, Márk; Kaszab, Tímea; Alemany, Gonzalo Garnacho
2013-01-01
Extension of shelf life and preservation of products are both very important for the food industry. However, just as with other processes, speed and higher manufacturing performance are also beneficial. Although microwave heating is utilized in a number of industrial processes, there are many unanswered questions about its effects on foods. Here we analyze whether the effects of continuous flow microwave heating are equivalent to those of traditional heat transfer methods. In our study, the effects of heating liquid foods by conventional and continuous flow microwave heating were compared, including the stability of the liquid foods under the two heat treatments. Our goal was to determine whether continuous flow microwave heating and conventional heating methods have the same effects on liquid foods and, therefore, whether microwave heat treatment can effectively replace conventional heat treatments. We compared the colour and the separation phenomena of the samples treated by the different methods. For milk we also monitored the total viable cell count, and for orange juice the vitamin C content, in addition to evaluating the taste of the products by sensory analysis. The majority of the results indicate that the circulating-coil microwave method used here is equivalent to the conventional heating method based on thermal conduction and convection. However, some results from the analysis of the milk samples show clear differences between the heat transfer methods. According to our results, the colour parameters (lightness, red-green and blue-yellow values) of the microwave-treated samples differed not only from the untreated control but also from the traditionally heat-treated samples. The differences are visually undetectable; however, they become evident through analytical measurement with a spectrophotometer. This finding suggests that besides thermal effects, microwave-based food treatment can alter product properties in other ways as well.
PMID:23341982
Dispersion properties of plasma cladded annular optical fiber
NASA Astrophysics Data System (ADS)
KianiMajd, M.; Hasanbeigi, A.; Mehdian, H.; Hajisharifi, K.
2018-05-01
One of the considerable problems in a conventional image-transferring fiber optic system is the two-fold coupling of propagating hybrid modes. In this paper, using a simple and practical analytical approach based on exact modal vectorial analysis together with Maxwell's equations, we show that applying plasma as the cladding medium of an annular optical fiber can remove this defect of conventional fiber optics automatically, without any external instrument such as a polarization beam splitter. Moreover, the analysis indicates that the presence of plasma in the proposed optical fiber could extend the possibilities for controlling the propagation properties. The proposed structure presents itself as a promising route to advanced optical processing and opens new avenues in applied optics and photonics.
Googling DNA sequences on the World Wide Web.
Hajibabaei, Mehrdad; Singer, Gregory A C
2009-11-10
New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm, and implemented it for searching species-specific genomic sequences (DNA barcodes), using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages, and developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data and provides a robust and efficient solution for sequence search on the web. The integration of our search method with large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
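The word-based search idea can be sketched as follows: split each library barcode and the query into fixed-length words, then rank library sequences by shared-word count. A plain Python dictionary stands in for the web search engine here (the authors delegate the actual word lookup to tools such as Google Desktop Search), and the word length and toy sequences are assumptions for illustration.

```python
# Alignment-independent, word-based sequence matching: sequences are
# reduced to sets of fixed-length words, and candidates are ranked by
# the number of words shared with the query.

def words(seq, k=8):
    """Non-overlapping k-letter words of a DNA sequence."""
    return {seq[i:i + k] for i in range(0, len(seq) - k + 1, k)}

library = {
    "speciesA": "ACGTACGTGGGTCCAATTGGCCAATTGG",
    "speciesB": "TTTTAAAACCCCGGGGACGTACGTAAAA",
}
index = {name: words(seq) for name, seq in library.items()}

query = "ACGTACGTGGGTCCAA"   # fragment matching speciesA
hits = {name: len(words(query) & w) for name, w in index.items()}
best = max(hits, key=hits.get)
print(best)   # → speciesA
```

Because matching is on exact words rather than alignments, the lookup maps directly onto ordinary text-search engines, which is what lets a general-purpose tool like a desktop search index serve as the back end.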
Iorio, Alfonso; Krishnan, Sangeeta; Myrén, Karl-Johan; Lethagen, Stefan; McCormick, Nora; Yermakov, Sander; Karner, Paul
2017-04-01
Continuous prophylaxis for patients with hemophilia B requires frequent injections that are burdensome and that may lead to suboptimal adherence and outcomes. Hence, therapies requiring less-frequent injections are needed. In the absence of head-to-head comparisons, this study compared the first extended-half-life recombinant factor IX (rFIX) product, recombinant factor IX Fc fusion protein (rFIXFc), with conventional rFIX products based on annualized bleed rates (ABRs) and factor consumption reported in studies of continuous prophylaxis. This study compared ABRs and weekly factor consumption rates in clinical studies of continuous prophylaxis with rFIXFc and conventional rFIX products (identified by systematic literature review) in previously treated adolescents and adults with moderate-to-severe hemophilia B. Meta-analysis was used to pool ABRs reported for conventional rFIX products for comparison. Comparisons of weekly factor consumption were based on the mean, either reported directly or estimated from the mean dose per injection. Five conventional rFIX studies (injections 1 to >3 times/week) met the criteria for comparison with once-weekly rFIXFc as reported by the B-LONG study. The pooled mean ABR for conventional rFIX was slightly higher than, but comparable to, that for rFIXFc (difference = 0.71; p = 0.210). Weekly factor consumption was significantly lower with rFIXFc than in conventional rFIX studies (difference in means = 42.8-74.5 IU/kg/week [93-161%], p < 0.001). These comparisons of clinical study results suggest that weekly injections with rFIXFc result in similar bleeding rates and significantly lower weekly factor consumption compared with more frequently injected conventional rFIX products. The real-world effectiveness of rFIXFc may be higher, based on results from a model of the impact of simulated differences in adherence.
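The pooling step can be illustrated with a standard inverse-variance weighted mean; the per-study values below are hypothetical placeholders, and the authors' actual pooling model may differ.

```python
# A minimal sketch of inverse-variance pooling of study means, one
# standard way to combine annualized bleed rates (ABRs) across studies.
# The per-study means and standard errors below are hypothetical
# placeholders, not data from the review, and the authors' exact
# pooling model may differ.

def pooled_mean(means, std_errors):
    """Inverse-variance weighted pooled mean and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    total_w = sum(weights)
    mean = sum(w * m for w, m in zip(weights, means)) / total_w
    return mean, (1.0 / total_w) ** 0.5

# Hypothetical per-study mean ABRs and standard errors:
m, se = pooled_mean([3.1, 2.6, 3.4], [0.4, 0.5, 0.6])
```

The weighting gives more precise studies (smaller standard errors) more influence on the pooled estimate.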
Organic dairy farmers put more emphasis on production traits than conventional farmers.
Slagboom, M; Kargo, M; Edwards, D; Sørensen, A C; Thomasen, J R; Hjortø, L
2016-12-01
The overall aim of this research was to characterize the preferences of Danish dairy farmers for improvements in breeding goal traits. The specific aims were (1) to investigate the presence of heterogeneity in farmers' preferences by means of cluster analysis, and (2) to associate these clusters with herd characteristics and production systems (organic or conventional). We established a web-based survey to characterize the preferences of farmers for improvements in 10 traits, by means of pairwise rankings. We also collected a considerable number of herd characteristics. Overall, 106 organic farmers and 290 conventional farmers answered the survey, all with Holstein cows. The most preferred trait improvement was cow fertility, and the least preferred was calving difficulty. By means of cluster analysis, we identified 4 distinct clusters of farmers and named them according to the trait improvements that were most preferred: Health and Fertility, Production and Udder Health, Survival, and Fertility and Production. Some herd characteristics differed between clusters; for example, farmers in the Survival cluster had twice the percentage of dead cows in their herds compared with the other clusters, and farmers that gave the highest ranking to cow and heifer fertility had the lowest conception rate in their herds. This finding suggests that farmers prefer to improve traits that are more problematic in their herd. The proportion of organic and conventional farmers also differed between clusters; we found a higher proportion of organic farmers in the production-based clusters. When we analyzed organic and conventional data separately, we found that organic farmers ranked production traits higher than conventional farmers. The herds of organic farmers had lower milk yields and lower disease incidences, which might explain the high ranking of milk production and the low ranking of disease traits. 
This study shows that heterogeneity exists in farmers' preferences for improvements in breeding goal traits, that organic and conventional farmers differ in their preferences, and that herd characteristics can be linked to different farmer clusters. The results of this study could be used for the future development of breeding goals in Danish Holstein cows and for the development of customized total merit indices based on farmer preferences. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Ronco, Guglielmo; Giorgi-Rossi, Paolo; Carozzi, Francesca; Dalla Palma, Paolo; Del Mistro, Annarosa; De Marco, Laura; De Lillo, Margherita; Naldoni, Carlo; Pierotti, Paola; Rizzolo, Raffaella; Segnan, Nereo; Schincaglia, Patrizia; Zorzi, Manuel; Confortini, Massimo; Cuzick, Jack
2006-07-01
Testing for human papillomavirus (HPV) DNA is more sensitive but less specific than cytological analysis. Loss in specificity is most relevant in women younger than 35 years because of increased HPV prevalence. We aimed to compare conventional screening with an experimental strategy in women aged 25-34 years, and investigate the effect of different criteria of referral to define the best methods of HPV screening. Women were randomly assigned to the conventional procedure (standard cytology, with referral to colposcopy if cytology showed atypical squamous cells of undetermined significance or more [ASCUS+]) or an experimental procedure (liquid-based cytology and testing for high-risk HPV types, with referral to colposcopy with ASCUS+ cytology). Women positive for HPV (cutoff ≥1 pg/mL) but with normal cytology were retested after 1 year. The main endpoint was the presence of cervical intraepithelial neoplasia at grade 2 or more (CIN2+) in reviewed histology. The main analysis was by intention to screen. This trial is registered as an International Standard Randomised Controlled Trial, number ISRCTN81678807. We randomly assigned 5808 women aged 25-34 years to the conventional group and 6002 to the experimental group. The experimental procedure was significantly more sensitive than the conventional procedure (55 vs 33 CIN2+ lesions detected; relative sensitivity 1.61 [95% CI 1.05-2.48]), but had a lower positive predictive value (PPV; relative PPV 0.55 [0.37-0.82]). HPV testing (≥1 pg/mL) with cytology triage was also more sensitive than conventional cytology (relative sensitivity 1.58 [1.03-2.44], relative PPV 0.78 [0.52-1.16]). Relative PPV could be improved, with minimum loss in sensitivity, by use of a 2 pg/mL cutoff for HPV testing. Compared with conventional cytology, liquid-based cytology had a relative sensitivity of 1.32 [0.84-2.06] and a relative PPV of 0.58 [0.38-0.89].
HPV testing alone with cytology triage could be a feasible alternative to conventional cytology for screening women younger than 35 years. Follow-up will provide data on possible overdiagnosis and on the feasibility of extended intervals.
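Under the intention-to-screen analysis, the reported relative sensitivity of 1.61 can be reproduced directly from the arm sizes and detected CIN2+ counts:

```python
# Reproducing the reported relative sensitivity from the trial's
# counts under intention to screen: the CIN2+ detection rate in the
# experimental arm divided by that in the conventional arm. Only the
# point estimate is reproducible here; the confidence interval needs
# the full trial data.

def detection_rate(cases, randomized):
    return cases / randomized

experimental = detection_rate(55, 6002)   # experimental arm
conventional = detection_rate(33, 5808)   # conventional arm
relative_sensitivity = experimental / conventional
print(round(relative_sensitivity, 2))     # → 1.61
```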
The impact of chimerism in DNA-based forensic sex determination analysis.
George, Renjith; Donald, Preethy Mary; Nagraj, Sumanth Kumbargere; Idiculla, Jose Joy; Hj Ismail, Rashid
2013-01-01
Sex determination is the most important step in personal identification in forensic investigations. DNA-based sex determination analysis is comparatively more reliable than conventional methods of sex determination. Advanced technology such as real-time polymerase chain reaction (PCR) offers accurate and reproducible results and is at the level of legal acceptance. But there are still situations, such as chimerism, where an individual possesses both male- and female-specific factors in their body. Sex determination analysis in such cases can give erroneous results. This paper discusses the phenomenon of chimerism and its impact on sex determination analysis in forensic investigations.
[Postmortem CT examination in a case of alleged drowning--a case report].
Woźniak, Krzysztof; Urbanik, Andrzej; Rzepecka-Woźniak, Ewa; Moskała, Artur; Kłys, Małgorzata
2009-01-01
The authors present an analysis of a postmortem CT examination in a case of drowning of a young male in fresh water. The results of the conventional forensic autopsy and the radiologic examination were compared. The analysis is illustrated by two-dimensional and three-dimensional reconstructions based on the DICOM files obtained during the postmortem CT examination.
Knowledge-Base Semantic Gap Analysis for the Vulnerability Detection
NASA Astrophysics Data System (ADS)
Wu, Raymond; Seki, Keisuke; Sakamoto, Ryusuke; Hisada, Masayuki
Web security has become a pressing concern in internet computing. To cope with ever-rising security complexity, semantic analysis is proposed to fill the gap that current approaches fail to address. Conventional methods limit their focus to the physical source code instead of the abstraction of its semantics; they therefore miss new types of vulnerability and cause tremendous business loss.
Nuclear magnetic resonance (NMR) based body composition analysis is an ideal means of assessing changes in the relative proportions of fat, lean, and fluid in rodents noninvasively. While the data are not as accurate as conventional chemical analysis, the systems allow one to follo...
Brown, Melissa M; Brown, Gary C; Brown, Heidi C; Peet, Jonathan; Roth, Zachary
2009-02-01
To assess the comparative effectiveness and cost-effectiveness (cost-utility) of a 0.05% emulsion of topical cyclosporine (Restasis; Allergan Inc, Irvine, California) for the treatment of moderate to severe dry eye syndrome that is unresponsive to conventional therapy. Data from 2 multicenter, randomized, clinical trials and Food and Drug Administration files for topical cyclosporine, 0.05%, emulsion were used in Center for Value-Based Medicine analyses. Analyses included value-based medicine as a comparative effectiveness analysis and average cost-utility analysis using societal and third-party insurer cost perspectives. Outcome measures of comparative effectiveness were quality-adjusted life-year (QALY) gain and percentage of improvement in quality of life, and for cost-effectiveness were cost-utility ratio (CUR) using dollars per QALY. Topical cyclosporine, 0.05%, confers a value gain (comparative effectiveness) of 0.0319 QALY per year compared with topical lubricant therapy, a 4.3% improvement in quality of life for the average patient with moderate to severe dry eye syndrome that is unresponsive to conventional lubricant therapy. The societal perspective incremental CUR for cyclosporine over vehicle therapy is $34,953 per QALY and the societal perspective average CUR is $11,199 per QALY. The third-party-insurer incremental CUR is $37,179 per QALY, while the third-party-insurer perspective average CUR is $34,343 per QALY. Topical cyclosporine emulsion, 0.05%, confers considerable patient value and is a cost-effective therapy for moderate to severe dry eye syndrome that is unresponsive to conventional therapy.
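The cost-utility arithmetic above works as follows: an average cost-utility ratio (CUR) divides a therapy's cost by the QALYs it confers, while an incremental CUR divides the extra cost over the comparator by the extra QALYs gained. A minimal sketch, in which only the 0.0319 annual QALY gain is taken from the study and all cost and baseline-QALY figures are hypothetical:

```python
# A minimal sketch of the cost-utility arithmetic: an average
# cost-utility ratio (CUR) divides a therapy's cost by the QALYs it
# confers, while an incremental CUR divides the extra cost over the
# comparator by the extra QALYs gained. Only the 0.0319 annual QALY
# gain is taken from the study; all cost and baseline-QALY figures
# below are hypothetical placeholders.

def average_cur(cost, qalys):
    return cost / qalys

def incremental_cur(cost_new, cost_old, qalys_new, qalys_old):
    return (cost_new - cost_old) / (qalys_new - qalys_old)

# Hypothetical annual costs; the QALY difference equals the reported 0.0319:
icur = incremental_cur(cost_new=1200.0, cost_old=200.0,
                       qalys_new=0.7819, qalys_old=0.7500)
```

Whether a resulting dollars-per-QALY figure counts as cost-effective then depends on the willingness-to-pay threshold adopted.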
Design, fabrication and skin-electrode contact analysis of polymer microneedle-based ECG electrodes
NASA Astrophysics Data System (ADS)
O'Mahony, Conor; Grygoryev, Konstantin; Ciarlone, Antonio; Giannoni, Giuseppe; Kenthao, Anan; Galvin, Paul
2016-08-01
Microneedle-based ‘dry’ electrodes have immense potential for use in diagnostic procedures such as electrocardiography (ECG) analysis, as they eliminate several of the drawbacks associated with the conventional ‘wet’ electrodes currently used for physiological signal recording. To be commercially successful in such a competitive market, it is essential that dry electrodes are manufacturable in high volumes and at low cost. In addition, the topographical nature of these emerging devices means that electrode performance is likely to be highly dependent on the quality of the skin-electrode contact. This paper presents a low-cost, wafer-level micromoulding technology for the fabrication of polymeric ECG electrodes that use microneedle structures to make a direct electrical contact to the body. The double-sided moulding process can be used to eliminate post-process via creation and wafer dicing steps. In addition, measurement techniques have been developed to characterize the skin-electrode contact force. We perform the first analysis of signal-to-noise ratio dependency on contact force, and show that although microneedle-based electrodes can outperform conventional gel electrodes, the quality of ECG recordings is significantly dependent on temporal and mechanical aspects of the skin-electrode interface.
Li, Zhijian; Wu, Chengqing; Olayiwola, J Nwando; Hilaire, Daniel St; Huang, John J
2012-02-01
To study the cost-benefit of using a telemedicine-based digital retinal imaging evaluation compared to conventional ophthalmologic fundus examination of diabetic patients for diabetic retinopathy. In this study, diabetic patients from Community Health Center, Inc. (CHCI), a large multi-site Federally Qualified Health Center, were evaluated by teleophthalmology using the Canon CR-1 nonmydriatic fundus camera. Digital images were acquired in the CHCI offices and saved on the EyePACS server network. The images were later evaluated by retinal specialists at the Yale Eye Center, Yale University Department of Ophthalmology and Visual Science. The costs of the standard-of-care ophthalmic examinations were calculated based on 2009 Medicaid reimbursement rates. The telemedicine-based diagnosis followed a take-store-forward-visualize process. The cost of the telemedicine-based digital retinal imaging examination included costs for devices, training, annual operation, and a transportation fee. Current Medicaid reimbursement, transportation, and staff labor costs were used to calculate the conventional retinal examination cost as a comparison. Among the 611 patients whose digital retinal images were screened in the first year of this program and for whom data are available, 166 (27.2%) cases of diabetic retinopathy were identified. Seventy-five (12.3%) patients screened positive with clinically significant disease and were referred for further ophthalmological evaluation and treatment. The primary direct costs of the telemedicine were $3.80, $15.00, $17.60, $1.50, and $2.50 per patient for the medical assistant, ophthalmologist, capital cost (equipment + training), equipment maintenance, and transportation fee, respectively. The total cost of the telemedicine-based digital retinal imaging and evaluation was $40.40.
The cost of conventional retinal examination was $8.70, $65.30, and $3.80 per patient for round-trip transportation, the 2009 national Medicaid Physician Fee Schedule allowable for a bilateral eye examination, and medical assistant personnel, respectively. The total cost of the conventional fundus examination was $77.80. An additional conventional ophthalmologic retinal examination was required for the 75 (12.3%) patients with clinically significant disease on telemedicine evaluation, which adds an average of $9.55 per patient across all patients in the study. With the cost of this subsequent examination added, the total cost of telemedicine-based digital fundus imaging was $49.95 per patient in our group of 611 patients. Our cost analysis indicates that telemedicine-based diabetic retinopathy screening costs less than conventional retinal examination ($49.95 vs $77.80), and that the telemedicine-based digital retinal imaging examination has the potential to provide an alternative method with greater convenience and access for remote and indigent populations. Diabetes mellitus and diabetic retinopathy are growing problems in the United States and worldwide. Large-scale adoption of telemedicine should be encouraged as a means of providing improved access and increased compliance with annual evaluation, at low cost, for patients with diabetes, with direct access to an eye care specialist.
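The per-patient totals above can be checked arithmetically; the only assumption in this sketch is that each of the 75 follow-up examinations costs the full $77.80 conventional amount, which reproduces the quoted $9.55 average.

```python
# Checking the per-patient cost arithmetic reported above. The one
# assumption is that each of the 75 follow-up examinations costs the
# full conventional amount, which reproduces the quoted $9.55 average.

tele_components = {
    "medical assistant": 3.80,
    "ophthalmologist": 15.00,
    "capital (equipment + training)": 17.60,
    "equipment maintenance": 1.50,
    "transportation": 2.50,
}
tele_total = round(sum(tele_components.values()), 2)      # $40.40

conv_components = {
    "round-trip transportation": 8.70,
    "Medicaid bilateral eye examination": 65.30,
    "medical assistant": 3.80,
}
conv_total = round(sum(conv_components.values()), 2)      # $77.80

# 75 of the 611 screened patients needed a follow-up conventional exam;
# spreading that cost over all patients gives the average add-on:
followup_avg = round(75 * conv_total / 611, 2)            # $9.55
tele_with_followup = round(tele_total + followup_avg, 2)  # $49.95
```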
NASA Astrophysics Data System (ADS)
Brothers, P.; Karaki, S.
Using a solar computer simulation package called TRNSYS, simulations of the direct contact liquid-liquid heat exchanger (DCLLHE) solar system and of a system with a conventional shell-and-tube heat exchanger were developed, based in part on performance measurements of the actual systems. The two systems were simulated over a full year on an hour-by-hour basis at five locations: Boston, Massachusetts; Charleston, South Carolina; Dodge City, Kansas; Madison, Wisconsin; and Phoenix, Arizona. Typically, the direct-contact system supplies slightly more heat for domestic hot water and space heating in all locations and about 5 percentage points more cooling than the conventional system. Using a common set of economic parameters and the appropriate federal and state income tax credits, as well as property tax legislation for solar systems in the corresponding states, the results of the study indicate that for heating-only systems, the DCLLHE system has a slight life-cycle cost disadvantage compared to the conventional system. For combined solar heating and cooling systems, the DCLLHE has a slight life-cycle cost advantage, which varies with location and amounts to a one to three percent difference from the conventional system.
NASA Astrophysics Data System (ADS)
Cox, M.; Shirono, K.
2017-10-01
A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM's Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
NASA Astrophysics Data System (ADS)
Chitanda, Jackson M.; Zhang, Haixia; Pahl, Erica; Purves, Randy W.; El-Aneed, Anas
2016-10-01
The utility of novel functionalized nanodiamonds (NDs) as matrices for matrix-assisted laser desorption ionization-mass spectrometry (MALDI-MS) is described herein. MALDI-MS analysis of small organic compounds (<1000 Da) is typically complex because of interferences from numerous cluster ions formed when using conventional matrices. To expand the use of MALDI for the analysis of small molecules, novel matrices were designed by covalently linking conventional matrices (or a lysine moiety) to detonated NDs. Four new functionalized NDs were evaluated for their ionization capabilities using five pharmaceuticals with varying molecular structures. Two ND matrices were able to ionize all tested pharmaceuticals in the negative ion mode, producing the deprotonated ions [M - H]-. Ion intensity for target analytes was generally strong with enhanced signal-to-noise ratios compared with conventional matrices. The negative ion mode is of great importance for biological samples as interference from endogenous compounds is inherently minimized in the negative ion mode. Since the molecular structures of the tested pharmaceuticals did not suggest that negative ion mode would be preferable, this result magnifies the importance of these findings. On the other hand, conventional matrices primarily facilitated the ionization as expected in the positive ion mode, producing either the protonated molecules [M + H]+ or cationic adducts (typically producing complex spectra with numerous adduct peaks). The data presented in this study suggests that these matrices may offer advantages for the analysis of low molecular weight pharmaceuticals/metabolites.
NASA Astrophysics Data System (ADS)
Arumugam, S.; Ramakrishna, P.; Sangavi, S.
2018-02-01
Improvements in heating technology with solar energy are gaining focus, especially solar parabolic collectors. Solar heating in conventional parabolic collectors is achieved by concentrating radiation on receiver tubes. Conventional receiver tubes are open to the atmosphere and lose heat to ambient air currents. In order to reduce these convection losses and also to improve the aperture area, we designed a tube with a cavity. This study compares the performance of the conventional tube and the cavity-model tube. Performance formulae were derived for the cavity model based on the conventional model. A reduction in the overall heat loss coefficient was observed for the cavity model, though the collector heat removal factor and collector efficiency were nearly the same for both models. An improvement in efficiency was also observed in the cavity model's performance. The approach of designing a cavity-model tube as the receiver tube in solar parabolic collectors gave improved results and proved to be a worthwhile design consideration.
Stringent DDI-based prediction of H. sapiens-M. tuberculosis H37Rv protein-protein interactions.
Zhou, Hufeng; Rezaei, Javad; Hugo, Willy; Gao, Shangzhi; Jin, Jingjing; Fan, Mengyuan; Yong, Chern-Han; Wozniak, Michal; Wong, Limsoon
2013-01-01
H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data are essential for illuminating the infection mechanism of M. tuberculosis H37Rv, but current H. sapiens-M. tuberculosis H37Rv PPI data are very scarce. This seriously limits the study of the interaction between this important pathogen and its host, H. sapiens. Computational prediction of H. sapiens-M. tuberculosis H37Rv PPIs is an important strategy to fill this gap. Domain-domain interaction (DDI) based prediction is one of the frequently used computational approaches for predicting both intra-species and inter-species PPIs. However, the performance of DDI-based host-pathogen PPI prediction has been rather limited. We develop a stringent DDI-based prediction approach with emphasis on (i) differences between the specific domain sequences on annotated regions of proteins under the same domain ID and (ii) calculation of the interaction strength of predicted PPIs based on the interacting residues in their interaction interfaces. We compare our stringent DDI-based approach to a conventional DDI-based approach for predicting PPIs, based on gold-standard intra-species PPIs and coherent informative Gene Ontology term assessment. The assessment results show that our stringent DDI-based approach achieves much better performance in predicting PPIs than the conventional approach. Using our stringent DDI-based approach, we have predicted a small set of reliable H. sapiens-M. tuberculosis H37Rv PPIs which could be very useful for a variety of related studies. We also analyze the predicted PPIs using cellular compartment distribution analysis, functional category enrichment analysis and pathway enrichment analysis; these analyses support the validity of our prediction result. Based on an analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent DDI-based approach, we have discovered some important properties of domains involved in host-pathogen PPIs. We find that both host and pathogen proteins involved in host-pathogen PPIs tend to have more domains than proteins involved in intra-species PPIs, and these domains have more interaction partners than domains on proteins involved in intra-species PPIs. The stringent DDI-based prediction approach reported in this work provides a stringent strategy for predicting host-pathogen PPIs, and it performs better than a conventional DDI-based approach. We have predicted a small set of accurate H. sapiens-M. tuberculosis H37Rv PPIs which could be very useful for a variety of related studies.
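The core DDI-based inference, before the stringent refinements described above (sequence-level differences under the same domain ID, interface-residue scoring), predicts a host-pathogen PPI whenever the two proteins carry a known interacting domain pair. A minimal sketch with illustrative placeholder domain assignments and DDIs:

```python
# A minimal sketch of the core DDI-based inference: predict a
# host-pathogen PPI whenever a host protein and a pathogen protein
# carry a domain pair known to interact. The stringent refinements
# (sequence-level differences under the same domain ID, interface-
# residue interaction strength) are not modeled; the domain
# assignments and DDI set below are illustrative placeholders.

def predict_ppis(host_domains, pathogen_domains, ddis):
    """host_domains / pathogen_domains: protein -> set of domain IDs;
    ddis: set of frozensets, each a known domain-domain interaction."""
    predictions = set()
    for h_prot, h_doms in host_domains.items():
        for p_prot, p_doms in pathogen_domains.items():
            if any(frozenset((hd, pd)) in ddis
                   for hd in h_doms for pd in p_doms):
                predictions.add((h_prot, p_prot))
    return predictions

host = {"H1": {"PF001", "PF002"}, "H2": {"PF003"}}
pathogen = {"P1": {"PF010"}, "P2": {"PF011"}}
ddis = {frozenset(("PF001", "PF010")), frozenset(("PF003", "PF011"))}
print(sorted(predict_ppis(host, pathogen, ddis)))
# → [('H1', 'P1'), ('H2', 'P2')]
```

The stringent approach then filters and scores these candidate pairs rather than accepting every domain-pair match.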
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, R; Kamima, T; Tachibana, H
2015-06-15
Purpose: To report the results of a multi-institutional study of independent dose verification for conventional and stereotactic radiosurgery and body radiotherapy (SRS and SBRT) plans, based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias in the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean ± 2SD) of the dose differences between the TPS and the Indp were evaluated for 18 sites (head, breast, lung, pelvis, etc.). Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0 ± 3.7%, 2.0 ± 2.5% and 6.2 ± 4.4%, respectively. In conventional plans, most sites fell within the 5% TG-114 action level; however, there were systematic differences (4.0 ± 4.0% and 2.5 ± 5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction strongly affects the dose distribution.
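The confidence-limit statistic used above (mean ± 2SD of the percentage dose differences between TPS and Indp) can be sketched as follows; the difference values are illustrative, not study data.

```python
# Sketch of the confidence-limit statistic used above: CL = mean ± 2SD
# of the percentage dose differences between the treatment planning
# system (TPS) and the independent verification program (Indp). The
# difference values below are illustrative, not study data.

def confidence_limits(diffs):
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    sd = var ** 0.5
    return mean, mean - 2 * sd, mean + 2 * sd

mean, lower, upper = confidence_limits([0.5, 1.2, -0.8, 2.0, 0.1, 3.0])
```

A plan whose dose difference falls outside [lower, upper] would then be flagged for investigation against the action level.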
Amerinatanzi, Amirhesam; Zamanian, Hashem; Shayesteh Moghaddam, Narges; Jahadakbar, Ahmadreza; Elahinia, Mohammad
2017-01-01
Hinge-based Ankle Foot Orthosis (HAFO) is one of the most common non-surgical solutions for foot drop. In conventional HAFOs, the ankle joint is almost locked, and plantar flexion is restricted due to the high stiffness of the hinge mechanism. This often leads to a rigid walking gait cycle, poor muscle activity, and muscle atrophy. Since the ankle torque-angle loop has a non-linear profile, the use of a superelastic NiTi spring within the hinge, due to its nonlinear behavior, could recreate a close-to-normal stiffness of the normal ankle joint, which, in turn, could create a more natural walk. The focus of this study is to evaluate the performance of a superelastic NiTi spring versus a conventional Stainless Steel spring in the hinge mechanism of a custom-fit HAFO. To this aim, a custom-fit HAFO was fabricated via the fast casting technique. Then, motion analysis was performed for two healthy subjects (Case I and Case II): (i) subjects with bare feet; (ii) subjects wearing a conventional HAFO with no spring; (iii) subjects wearing a conventional Stainless Steel-based HAFO; and (iv) subjects wearing a NiTi spring-based HAFO. The data on the ankle angle and the moment applied to the ankle during walking were recorded using Cortex software and used for the evaluations. Finally, Finite Element Analysis (FEA) was performed to evaluate the safety of the designed HAFO. The NiTi spring offers a higher range of motion (7.9 versus 4.14 degrees) and an increased level of moment (0.55 versus 0.36 N·m/kg). Furthermore, a NiTi spring offers an ankle torque-angle loop closer to that of the healthy subjects. PMID: 29215571
Effect of organic and conventional rearing system on the mineral content of pork.
Zhao, Yan; Wang, Donghua; Yang, Shuming
2016-08-01
Dietary composition and rearing regime largely determine the trace elemental composition of pigs, and consequently the concentrations of these elements in animal products. The present study evaluates thirteen macro- and trace element concentrations in pork from organic and conventional farms. Conventional pigs were given a commercial feed with added minerals; organic pigs were given a feed based on organic feedstuffs. The content of macro-elements (Na, K, Mg and Ca) and some trace elements (Ni, Fe, Zn and Sr) in organic and conventional meat samples showed no significant differences (P > 0.05). Several trace element concentrations in organic pork were significantly higher (P < 0.05) compared to conventional pork: Cr (808 and 500 μg/kg in organic and conventional pork, respectively), Mn (695 and 473 μg/kg) and Cu (1.80 and 1.49 mg/kg). The results showed considerable differences in mineral content between samples from pigs reared in organic and conventional systems. Our results also indicate that authentication of organic pork can be realized by applying multivariate chemometric methods, such as discriminant analysis, to this multi-element data. Copyright © 2016 Elsevier Ltd. All rights reserved.
Jaruzel, Candace B; Kelechi, Teresa J
2016-08-01
To analyze and clarify the concept of providing relief from anxiety using complementary therapies in the perioperative period, using the epistemological, pragmatic, linguistic, and logical principles of a principle-based concept analysis to examine the state of the science. The majority of patients scheduled for surgery experience anxiety in the perioperative period. Anxiety has the potential to limit patients' ability to participate in their care throughout hospitalization. Although medications are the conventional medical treatment for anxiety in the perioperative period, the addition of a complementary therapy could be an effective holistic approach to providing relief from anxiety. Principle-based concept analysis. In 2015, strategic literature searches of CINAHL and PubMed using keywords were performed. Fifty-six full-text articles were assessed for eligibility. Twelve studies were used in the final analysis to clarify the concept of relief from anxiety using complementary therapies in the perioperative period. This analysis has clarified the maturity and boundaries, within the four principles of a principle-based concept analysis, of the concept of relief from anxiety using complementary therapies in the perioperative period. A greater understanding of relief from anxiety using complementary therapies in the perioperative period as an adjunct to conventional medicine will allow perioperative nurses and anesthesia providers to modify and specify the plan of care for their surgical patients. The use of complementary therapies for relief in the perioperative period appears to be a promising area of research and treatment for patients, families, and providers. Copyright © 2016 Elsevier Ltd. All rights reserved.
Chung, Yun Won; Kwon, Jae Kyun; Park, Suwon
2014-01-01
One of the key technologies supporting the mobility of mobile stations (MS) in mobile communication systems is location management, which consists of location update and paging. In this paper, an improved movement-based location management scheme with two movement thresholds is proposed, considering the bursty traffic characteristics of packet-switched (PS) services. The analytical modeling of the location update and paging signaling loads of the proposed scheme is developed in detail, and the performance of the proposed scheme is compared with that of the conventional scheme. We show that the proposed scheme outperforms the conventional scheme in terms of total signaling load with an appropriate selection of movement thresholds.
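The trade-off the scheme balances (a larger movement threshold means fewer location updates but a larger area to page on each incoming call) can be sketched with a toy cost model. The hexagonal-cell area formula, unit costs, and traffic figures below are illustrative assumptions, not the paper's parameters:

```python
def total_cost(moves, calls, d, cu=10.0, cp=1.0):
    """Total signaling cost under a movement-based scheme with threshold d.

    moves: number of cell-boundary crossings in the period
    calls: number of incoming calls (each triggers a paging sweep)
    cu/cp: assumed unit costs of one location update / one paged cell
    """
    updates = moves // d                 # update after every d crossings
    paging_cells = 3 * d * (d - 1) + 1   # hexagonal cells within d - 1 rings
    return cu * updates + cp * calls * paging_cells

def best_threshold(moves, calls, d_max=10):
    """Threshold minimizing total cost; bursty (low-call) traffic favors larger d."""
    return min(range(1, d_max + 1), key=lambda d: total_cost(moves, calls, d))
```

For 600 crossings, a bursty user receiving 5 calls is best served by d = 6, while a user receiving 50 calls prefers d = 3, which mirrors why a second, traffic-aware threshold helps for PS services.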
Organic farming enhances soil microbial abundance and activity—A meta-analysis and meta-regression
Symnaczik, Sarah; Mäder, Paul; De Deyn, Gerlinde; Gattinger, Andreas
2017-01-01
Population growth and climate change challenge our food and farming systems and provide arguments for an increased intensification of agriculture. A promising option is eco-functional intensification through organic farming, an approach based on using and enhancing internal natural resources and processes to secure and improve agricultural productivity, while minimizing negative environmental impacts. In this concept an active soil microbiota plays an important role for various soil-based ecosystem services such as nutrient cycling, erosion control, and pest and disease regulation. Several studies have reported a positive effect of organic farming on soil health and quality, including microbial community traits. However, so far no systematic quantification of whether organic farming systems comprise larger and more active soil microbial communities compared to conventional farming systems has been performed on a global scale. Therefore, we conducted a meta-analysis of the current literature to quantify possible differences in key indicators of soil microbial abundance and activity in organic and conventional cropping systems. Altogether, we integrated data from 56 mainly peer-reviewed papers into our analysis, including 149 pairwise comparisons originating from different climatic zones and experimental durations ranging from 3 to more than 100 years. Overall, we found that organic systems had 32% to 84% greater microbial biomass carbon, microbial biomass nitrogen, total phospholipid fatty-acids, and dehydrogenase, urease and protease activities than conventional systems. Only the metabolic quotient, an indicator of stress on microbial communities, remained unaffected by the farming system. Categorical subgroup analysis revealed that crop rotation, the inclusion of legumes in the crop rotation, and organic inputs are important farming practices affecting soil microbial community size and activity.
Furthermore, we show that differences in microbial community size and activity between organic and conventional farming systems vary as a function of land use (arable, orchards, and grassland), plant life cycle (annual and perennial), and climatic zone. In summary, this study shows that organic farming enhances total microbial abundance and activity in agricultural soils on a global scale. PMID:28700609
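Percent differences of the kind reported above (e.g., 32% to 84% greater microbial biomass) are typically derived from log response ratios in meta-analysis; a minimal sketch of the back-and-forth transformation follows, using invented plot means rather than the study's data:

```python
import math

def log_response_ratio(mean_treatment, mean_control):
    """Effect size widely used in ecological meta-analysis: ln(Xt / Xc)."""
    return math.log(mean_treatment / mean_control)

def percent_change(lnrr):
    """Back-transform a (mean) log response ratio to a percent difference."""
    return (math.exp(lnrr) - 1.0) * 100.0
```

A hypothetical pair of microbial biomass carbon means, 450 mg/kg under organic versus 320 mg/kg under conventional management, gives lnRR ≈ 0.34, i.e., about a 41% increase.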
He, Zhixue; Li, Xiang; Luo, Ming; Hu, Rong; Li, Cai; Qiu, Ying; Fu, Songnian; Yang, Qi; Yu, Shaohua
2016-05-02
We propose and experimentally demonstrate two independent component analysis (ICA) based channel equalizers (CEs) for 6 × 6 MIMO-OFDM transmission over few-mode fiber. Compared with the conventional channel equalizer based on training symbols (TSs-CE), the proposed two ICA-based channel equalizers (ICA-CE-I and ICA-CE-II) achieve comparable performance while requiring far fewer training symbols. Consequently, the overheads for channel equalization can be substantially reduced from 13.7% to 0.4% and 2.6%, respectively. Meanwhile, we also experimentally investigate the convergence speed of the proposed ICA-based CEs.
Lee, Yoon-Kyung; Ryu, Joo-Hyung; Choi, Jong-Kuk; Lee, Seok; Woo, Han-Jun
2015-08-15
Spatial and temporal changes around an area of conventional coastal engineering can be easily observed from field surveys because of the clear cause-and-effect observable in the before and after stages of the project. However, it is more difficult to determine environmental changes in the vicinity of tidal flats and coastal areas that are a considerable distance from the project. To identify any unexpected environmental impacts of the construction of Saemangeum Dyke in the area, we examined morphological changes identified by satellite-based observations through a field survey on Gomso Bay tidal flats (15 km from Saemangeum Dyke), and changes in the suspended sediment distribution identified by satellite-based observations through a hydrodynamic analysis in the Saemangeum and Gomso coastal area. We argue that hydrodynamic changes due to conventional coastal engineering can affect the sedimentation pattern in the vicinity of tidal flats. We suggest that the environmental impact assessment conducted before a conventional coastal engineering project should include a larger area than is currently considered. Copyright © 2015 Elsevier Ltd. All rights reserved.
Video-based teleradiology for intraosseous lesions. A receiver operating characteristic analysis.
Tyndall, D A; Boyd, K S; Matteson, S R; Dove, S B
1995-11-01
Private dental practitioners often lack immediate access to off-site expert diagnostic consultants for unusual radiographic findings or radiographic quality assurance issues. Teleradiology, a system for transmitting radiographic images, offers a potential solution to this problem. Although much research has been done to evaluate the feasibility and utilization of teleradiology systems in medical imaging, little research on dental applications has been performed. In this investigation, 47 panoramic films with an equal distribution of images with intraosseous jaw lesions and no disease were viewed by a panel of observers with teleradiology and conventional viewing methods. The teleradiology system consisted of an analog video-based system simulating remote radiographic consultation between a general dentist and a dental imaging specialist. Conventional viewing consisted of traditional viewbox methods. Observers were asked to identify the presence or absence of 24 intraosseous lesions and to determine their locations. No statistically significant differences between modalities or observers were identified at the 0.05 level. The results indicate that viewing intraosseous lesions on video-based panoramic images is equivalent to conventional viewbox viewing.
Enhancement of optical polarization degree of AlGaN quantum wells by using staggered structure.
Wang, Weiying; Lu, Huimin; Fu, Lei; He, Chenguang; Wang, Mingxing; Tang, Ning; Xu, Fujun; Yu, Tongjun; Ge, Weikun; Shen, Bo
2016-08-08
Staggered AlGaN quantum wells (QWs) are designed to enhance the transverse-electric (TE) polarized optical emission in deep ultraviolet (DUV) light-emitting diodes (LEDs). The optical polarization properties of the conventional and staggered AlGaN QWs are investigated by a theoretical model based on the k·p method as well as polarized photoluminescence (PL) measurements. Based on an analysis of the valence subbands and momentum matrix elements, it is found that AlGaN QWs with a step-function-like Al content profile offer much stronger TE polarized emission than conventional AlGaN QWs. Experimental results show that the degree of PL polarization at room temperature can be enhanced from 20.8% for conventional AlGaN QWs to 40.2% for staggered AlGaN QWs grown by MOCVD, which is in good agreement with the theoretical simulation. This suggests that polarization band engineering via staggered AlGaN QWs can be applied in high-efficiency AlGaN-based DUV LEDs.
Automated lithology prediction from PGNAA and other geophysical logs.
Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T
2006-02-01
Different methods of lithology prediction from geophysical data have been developed over the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma), and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology: a success rate of 73% for lithology prediction was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
Performance and cost analysis of Siriraj liquid-based cytology: a direct-to-vial study.
Laiwejpithaya, Somsak; Benjapibal, Mongkol; Laiwejpithaya, Sujera; Wongtiraporn, Weerasak; Sangkarat, Suthi; Rattanachaiyanont, Manee
2009-12-01
To compare the cytological diagnoses, specimen adequacy, and cost of the Siriraj liquid-based cytology (LBC) with those of the conventional smear technique. An observational study with historical comparison was conducted in a tertiary university hospital. Cytological reports of 23,676 Siriraj-LBC specimens obtained in 2006 were compared with those of 25,510 conventional smears obtained in 2004. Overall prevalence of abnormal cervical cytology detected by conventional smear was 1.76% and by Siriraj-LBC was 3.70%. Compared with the conventional method, the Siriraj-LBC yielded a significantly higher overall detection rate of abnormal cervical cytology, with a 110.23% increase in the detection rate (P<0.001), mainly due to the increase in diagnosis of squamous intraepithelial lesions (SIL), both low and high grade, together with atypical squamous cells of undetermined significance, "atypical squamous cells cannot exclude HSIL", and malignancies, but not atypical glandular cells. The Siriraj-LBC had a smaller proportion of unsatisfactory slides (4.94% vs. 18.60%, P<0.001) and a higher negative predictive value (96.33% vs. 92.74%, P=0.001), but no difference in positive predictive value (83.03% vs. 86.83%, P=0.285). The cost of Siriraj-LBC was approximately 67% higher than that of the conventional cytology used in Siriraj Hospital and 50-70% lower than that of the commercially available LBC techniques in Thailand. The Siriraj-LBC increases the detection rate of abnormal cytology, improves specimen adequacy, and enhances the negative predictive value without compromising the positive predictive value. For centers where conventional Pap smear does not perform well, the introduction of a low cost Siriraj-LBC might help to improve performance and it may be an economical alternative to the commercially available liquid-based cytology.
Nguyen, N Q; Holloway, R H; Smout, A J; Omari, T I
2013-03-01
Automated integrated analysis of impedance and pressure signals has been reported to identify patients at risk of developing dysphagia post fundoplication. This study aimed to investigate this analysis in the evaluation of patients with non-obstructive dysphagia (NOD) and normal manometry (NOD/NM). Combined impedance-manometry was performed in 42 patients (27F : 15M; 56.2 ± 5.1 years) and compared with that of 24 healthy subjects (8F : 16M; 48.2 ± 2.9 years). Both liquid and viscous boluses were tested. MATLAB-based algorithms defined the median intrabolus pressure (IBP), IBP slope, peak pressure (PP), and timing of bolus flow relative to peak pressure (TNadImp-PP). An index of pressure and flow (PFI) in the distal esophagus was derived from these variables. Diagnoses based on conventional manometric assessment were: diffuse spasm (n = 5), non-specific motor disorders (n = 19), and normal (n = 11). Patients with achalasia (n = 7) were excluded from automated impedance-manometry (AIM) analysis. Only 2/11 (18%) patients with NOD/NM had evidence of flow abnormality on conventional impedance analysis. Several variables derived by integrated impedance-pressure analysis were significantly different in patients compared with healthy subjects: higher PNadImp (P < 0.01), IBP (P < 0.01), and IBP slope (P < 0.05), and shorter TNadImp-PP (P = 0.01). The PFI of NOD/NM patients was significantly higher than that in healthy subjects (liquid: 6.7 vs 1.2, P = 0.02; viscous: 27.1 vs 5.7, P < 0.001), and 9/11 NOD/NM patients had an abnormal PFI. Overall, the addition of AIM analysis provided diagnoses and/or a plausible explanation in 95% (40/42) of patients who presented with NOD. Compared with conventional pressure-impedance assessment, integrated analysis is more sensitive in detecting subtle abnormalities in esophageal function in patients with NOD and normal manometry. © 2012 Blackwell Publishing Ltd.
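As a rough illustration of how a pressure-flow index can be assembled from the swallow-level variables named above, the sketch below combines median IBP, median IBP slope, and median flow timing; the exact combination and the sample values are assumptions for illustration, not the published AIM formula or the study's measurements:

```python
from statistics import median

def pressure_flow_index(ibp, ibp_slope, t_nadimp_pp):
    """Hypothetical PFI: median intrabolus pressure times median IBP slope,
    divided by median flow timing (shorter TNadImp-PP -> higher index).
    Each argument is a list of per-swallow measurements."""
    return (median(ibp) * median(ibp_slope)) / median(t_nadimp_pp)
```

With per-swallow pressures [10, 12, 14] mmHg, slopes [2, 3, 4] mmHg/s, and timings [2, 2, 2] s, the index evaluates to 18.0; higher values flag pressurization against poor bolus flow.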
Effects of feedstock characteristics on microwave-assisted pyrolysis - A review.
Zhang, Yaning; Chen, Paul; Liu, Shiyu; Peng, Peng; Min, Min; Cheng, Yanling; Anderson, Erik; Zhou, Nan; Fan, Liangliang; Liu, Chenghui; Chen, Guo; Liu, Yuhuan; Lei, Hanwu; Li, Bingxi; Ruan, Roger
2017-04-01
Microwave-assisted pyrolysis is an important approach to obtain bio-oil from biomass. As with conventional electrical heating pyrolysis, microwave-assisted pyrolysis is significantly affected by feedstock characteristics. However, microwave heating has unique features which strongly depend on the physical and chemical properties of the biomass feedstock. In this review, the relationships among heating, bio-oil yield, and feedstock particle size, moisture content, inorganics, and organics in microwave-assisted pyrolysis are discussed and compared with those in conventional electrical heating pyrolysis. Quantitative analysis of data reported in the literature shows a strong contrast between conventional and microwave-based processes. Microwave-assisted pyrolysis is a relatively new process with limited research compared with conventional electrical heating pyrolysis; the lack of understanding of some observed results warrants further in-depth fundamental research. Copyright © 2017 Elsevier Ltd. All rights reserved.
Preeti, Bajaj; Ashish, Ahuja; Shriram, Gosavi
2013-12-01
As the science of medicine advances day by day, the need for better pedagogies and learning techniques is imperative. Problem Based Learning (PBL) is an effective way of delivering medical education in a coherent, integrated, and focused manner. It has several advantages over conventional, long-established teaching methods. It is based on principles of adult learning theory, including motivating students, encouraging them to set goals, and prompting them to think critically about decision making in day-to-day operations. Above all, it stimulates challenge acceptance and learning curiosity among students and creates a pragmatic educational program. The aim was to measure the effectiveness of problem-based learning compared with conventional didactic lecture-based learning. The study was conducted on 72 medical students from Dayanand Medical College & Hospital, Ludhiana. Two modules of problem-based sessions were designed and delivered. Pre- and post-test scores were statistically analyzed. Student feedback was collected via a questionnaire in the five-point Likert scale format. Significant improvement in overall performance was observed. Feedback revealed majority agreement that problem-based learning created interest (88.8%), improved understanding (86%), and promoted self-directed subject learning (91.6%). The substantial improvement in post-test scores clearly shows the advantage of PBL over conventional learning. PBL ensures better practical learning, stimulates interest, and improves subject understanding. It is a modern-day educational strategy and an effective tool to objectively improve knowledge acquisition in medical teaching.
Win, Khin Thanda; Vegas, Juan; Zhang, Chunying; Song, Kihwan; Lee, Sanghyeob
2017-01-01
QTL mapping using NGS-assisted BSA was successfully applied to an F2 population for downy mildew resistance in cucumber. QTLs detected by NGS-assisted BSA were confirmed by conventional QTL analysis. Downy mildew (DM), caused by Pseudoperonospora cubensis, is one of the most destructive foliar diseases in cucumber. QTL mapping is a fundamental approach for understanding the genetic inheritance of DM resistance in cucumber. Recently, many studies have reported that a combination of bulked segregant analysis (BSA) and next-generation sequencing (NGS) can be a rapid and cost-effective way of mapping QTLs. In this study, we applied NGS-assisted BSA to QTL mapping of DM resistance in cucumber and confirmed the results by conventional QTL analysis. By sequencing two DNA pools, each consisting of ten individuals showing high resistance or susceptibility to DM from an F2 population, we identified single nucleotide polymorphisms (SNPs) between the two pools. We employed a statistical method for QTL mapping based on these SNPs. Five QTLs, dm2.2, dm4.1, dm5.1, dm5.2, and dm6.1, were detected, and dm2.2 showed the largest effect on DM resistance. Conventional QTL analysis using the F2 population confirmed dm2.2 (R² = 10.8-24%) and dm5.2 (R² = 14-27.2%) as major QTLs and dm4.1 (R² = 8%) as a minor QTL, but could not detect dm5.1 and dm6.1. A new QTL on chromosome 2, dm2.1 (R² = 28.2%), was detected by the conventional QTL method using an F3 population. This study demonstrated the effectiveness of NGS-assisted BSA for mapping QTLs conferring DM resistance in cucumber and revealed the unique genetic inheritance of DM resistance in this population through two distinct major QTLs on chromosome 2 that mainly harbor DM resistance.
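In NGS-assisted BSA, QTL signals are usually localized by comparing the SNP-index (the fraction of reads carrying the alternate allele) between the two bulks; a minimal sketch follows, with read counts invented for illustration:

```python
def snp_index(alt_reads, total_reads):
    """Fraction of sequencing reads at a SNP that carry the alternate allele."""
    return alt_reads / total_reads

def delta_snp_index(res_alt, res_total, sus_alt, sus_total):
    """Difference between resistant- and susceptible-bulk SNP-indices.
    Values near +1 or -1 suggest a locus linked to the trait; values near 0
    suggest an unlinked locus segregating equally in both bulks."""
    return snp_index(res_alt, res_total) - snp_index(sus_alt, sus_total)
```

Scanning this delta along each chromosome, with a sliding-window average and permutation-based confidence bands, is the usual way such peaks are called.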
Daxini, S D; Prajapati, J M
2014-01-01
Meshfree methods are viewed as next-generation computational techniques. Given the evident limitations of conventional grid-based methods, like FEM, in dealing with problems of fracture mechanics, large deformation, and simulation of manufacturing processes, meshfree methods have gained much attention from researchers. A number of meshfree methods have been proposed for analyzing complex problems in various fields of engineering. The present work reviews recent developments and some earlier applications of well-known meshfree methods like EFG and MLPG to various types of structural mechanics and fracture mechanics applications, such as bending, buckling, free vibration analysis, sensitivity analysis and topology optimization, single and mixed-mode crack problems, fatigue crack growth, and dynamic crack analysis, as well as some typical applications like vibration of cracked structures, thermoelastic crack problems, and failure transition in impact problems. Due to the complex nature of meshfree shape functions and the evaluation of domain integrals, meshless methods are computationally more expensive than conventional mesh-based methods. Some improved versions of the original meshfree methods and other techniques suggested by researchers to improve the computational efficiency of meshfree methods are also reviewed here.
Tapia-Orozco, Natalia; Santiago-Toledo, Gerardo; Barrón, Valeria; Espinosa-García, Ana María; García-García, José Antonio; García-Arrazola, Roeb
2017-04-01
Environmental epigenomics is a developing field studying the epigenetic effects on human health of exposure to environmental factors. Endocrine-disrupting chemicals (EDCs) have been detected primarily in pharmaceutical drugs, personal care products, food additives, and food containers. Exposure to EDCs has been associated with a high incidence and prevalence of many endocrine-related disorders in humans. Nevertheless, further evidence is needed to establish a correlation between exposure to EDCs and human disorders. Conventional detection of EDCs is based on analysis of chemical structure and concentration in samples. However, substantial evidence has emerged suggesting that cell exposure to EDCs leads to epigenetic changes, independently of their chemical structure, with non-monotonic low-dose responses. Consequently, a paradigm shift in toxicology assessment of EDCs is proposed, based on a comprehensive review of analytical techniques used to evaluate the epigenetic effects. Fundamental insights reported elsewhere are compared in order to establish DNA methylation analysis as a viable method for assessing endocrine disruptors beyond the conventional study approach of chemical structure and concentration analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
Creep-Fatigue Interaction and Cyclic Strain Analysis in P92 Steel Based on Test
NASA Astrophysics Data System (ADS)
Ji, Dongmei; Zhang, Lai-Chang; Ren, Jianxing; Wang, Dexian
2015-04-01
This work focused on the interaction of creep and fatigue and on cyclic strain analysis in high-chromium ferritic P92 steel, based on load-controlled creep-fatigue (CF) tests and a conventional creep test at 873 K. Mechanical testing shows that cyclic load inhibits the propagation of creep damage in the P92 steel, and CF interaction becomes more severe with decreasing holding period duration and stress ratio. These results are also verified by the analysis of cyclic strain. The fatigue lifetime decreases with increasing holding period duration, and changes little with increasing stress ratio, especially under long holding periods. The cyclic strains (i.e., the strain range and creep strain) of CF tests consist of three stages, as in conventional creep behavior. Microscopic observation of the fracture surfaces revealed two different kinds of voids, with Laves phase precipitates at the bottoms of the voids.
Comparative study of Sperm Motility Analysis System and conventional microscopic semen analysis
KOMORI, KAZUHIKO; ISHIJIMA, SUMIO; TANJAPATKUL, PHANU; FUJITA, KAZUTOSHI; MATSUOKA, YASUHIRO; TAKAO, TETSUYA; MIYAGAWA, YASUSHI; TAKADA, SHINGO; OKUYAMA, AKIHIKO
2006-01-01
Background and Aim: Conventional manual sperm analysis still shows variations in structure, process and outcome although World Health Organization (WHO) guidelines present an appropriate method for sperm analysis. In the present study a new system for sperm analysis, Sperm Motility Analysis System (SMAS), was compared with manual semen analysis based on WHO guidelines. Materials and methods: Samples from 30 infertility patients and 21 healthy volunteers were subjected to manual microscopic analysis and SMAS analysis, simultaneously. We compared these two methods with respect to sperm concentration and percent motility. Results: Sperm concentrations obtained by SMAS (Csmas) and manual microscopic analyses on WHO guidelines (Cwho) were strongly correlated (Cwho = 1.325 × Csmas; r = 0.95, P < 0.001). If we excluded subjects with Csmas values >30 × 10⁶ sperm/mL, the results were more similar (Cwho = 1.022 × Csmas; r = 0.81, P < 0.001). Percent motility obtained by SMAS (Msmas) and manual analysis on WHO guidelines (Mwho) were strongly correlated (Mwho = 1.214 × Msmas; r = 0.89, P < 0.001). Conclusions: The data indicate that the results of SMAS and those of manual microscopic sperm analyses based on WHO guidelines are strongly correlated. SMAS is therefore a promising system for sperm analysis. (Reprod Med Biol 2006; 5: 195–200) PMID:29662398
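Through-origin calibrations of the kind reported above (e.g., Cwho = 1.325 × Csmas) and their correlation coefficients can be reproduced with least squares constrained through the origin; the sketch below uses invented sample pairs, not the study's measurements:

```python
def slope_through_origin(x, y):
    """Least-squares slope k for the no-intercept model y = k * x."""
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def pearson_r(x, y):
    """Ordinary Pearson correlation coefficient between paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```

Fitting through the origin is the natural choice here because a zero reading on one instrument should imply a zero reading on the other.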
Flow Cytometry Data Preparation Guidelines for Improved Automated Phenotypic Analysis.
Jimenez-Carretero, Daniel; Ligos, José M; Martínez-López, María; Sancho, David; Montoya, María C
2018-05-15
Advances in flow cytometry (FCM) increasingly demand adoption of computational analysis tools to tackle the ever-growing data dimensionality. In this study, we tested different data input modes to evaluate how cytometry acquisition configuration and data compensation procedures affect the performance of unsupervised phenotyping tools. An analysis workflow was set up and tested for the detection of changes in reference bead subsets and in a rare subpopulation of murine lymph node CD103+ dendritic cells acquired by conventional or spectral cytometry. Raw spectral data or pseudospectral data acquired with the full set of available detectors by conventional cytometry consistently outperformed datasets acquired and compensated according to FCM standards. Our results thus challenge the paradigm of one-fluorochrome/one-parameter acquisition in FCM for unsupervised cluster-based analysis. Instead, we propose to configure instrument acquisition to use all available fluorescence detectors and to avoid integration and compensation procedures, thereby using raw spectral or pseudospectral data for improved automated phenotypic analysis. Copyright © 2018 by The American Association of Immunologists, Inc.
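The unsupervised phenotyping step can be approximated by any clustering of per-cell detector intensities; below is a deliberately tiny k-means sketch over raw detector values. The deterministic seeding and toy data are assumptions for reproducibility here; real FCM pipelines use specialized, higher-dimensional tools:

```python
def kmeans(points, k, iters=20):
    """Toy k-means over equal-length feature tuples (one tuple per cell).

    Initialization is deterministic (first k points) purely so this sketch
    is reproducible; production clustering would seed more carefully.
    """
    centroids = [list(p) for p in points[:k]]
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, p in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for i, p in enumerate(points) if labels[i] == c]
            if members:
                centroids[c] = [sum(col) / len(col) for col in zip(*members)]
    return labels, centroids
```

The study's point is upstream of this step: feeding such a clusterer raw spectral or pseudospectral intensities, rather than compensated parameters, is what improved subset detection.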
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang Xiangyang; Yang Yi; Tang Shaojie
Purpose: Differential phase contrast CT (DPC-CT) is emerging as a new technology to improve the contrast sensitivity of conventional attenuation-based CT. The noise equivalent quanta as a function over spatial frequency, i.e., the spectrum of noise equivalent quanta NEQ(k), is a decisive indicator of the signal and noise transfer properties of an imaging system. In this work, we derive the functional form of NEQ(k) in DPC-CT. Via system modeling, analysis, and computer simulation, we evaluate and verify the derived NEQ(k) and compare it with that of the conventional attenuation-based CT. Methods: The DPC-CT is implemented with x-ray tube and gratings. The x-ray propagation and data acquisition are modeled and simulated through Fresnel and Fourier analysis. A monochromatic x-ray source (30 keV) is assumed to exclude any system imperfection and interference caused by scatter and beam hardening, while a 360° full scan is carried out in data acquisition to avoid any weighting scheme that may disrupt noise randomness. Adequate upsampling is implemented to simulate the x-ray beam's propagation through the gratings G1 and G2 with periods 8 and 4 μm, respectively, while the inter-grating distance is 193.6 mm (1/16 of the Talbot distance). The dimensions of the detector cell for data acquisition are 32 × 32, 64 × 64, 96 × 96, and 128 × 128 μm², respectively, corresponding to a 40.96 × 40.96 mm² field of view in data acquisition. An air phantom is employed to obtain the noise power spectrum NPS(k), spectrum of noise equivalent quanta NEQ(k), and detective quantum efficiency DQE(k).
A cylindrical water phantom at 5.1 mm diameter and complex refraction coefficient n = 1 - δ + iβ = 1 - 2.5604 × 10⁻⁷ + i1.2353 × 10⁻¹⁰ is placed in air to measure the edge transfer function, line spread function, and then modulation transfer function MTF(k), of both DPC-CT and the conventional attenuation-based CT. The x-ray flux is set at 5 × 10⁶ photons/cm² per projection and observes the Poisson distribution, which is consistent with that of a micro-CT for preclinical applications. Approximately 360 regions, each at 128 × 128 matrix, are used to calculate the NPS(k) via 2D Fourier transform, in which adequate zero padding is carried out to avoid aliasing in noise. Results: The preliminary data show that the DPC-CT possesses a signal transfer property [MTF(k)] comparable to that of the conventional attenuation-based CT. Meanwhile, though there exists a radical difference in their noise power spectra NPS(k) (trait 1/|k| in DPC-CT but |k| in the conventional attenuation-based CT), the NEQ(k) and DQE(k) of DPC-CT and the conventional attenuation-based CT are in principle identical. Conclusions: Under the framework of ideal observer study, the joint signal and noise transfer property NEQ(k) and detective quantum efficiency DQE(k) of DPC-CT are essentially the same as those of the conventional attenuation-based CT. The findings reported in this paper may provide insightful guidelines on the research, development, and performance optimization of DPC-CT for extensive preclinical and clinical applications in the future.
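The NPS(k) estimation step described above (ensemble-averaged squared Fourier magnitude of mean-subtracted noise regions) can be illustrated in one dimension with a naive DFT. This is a pure-Python sketch of the general technique under one common scaling convention, not the authors' 2D implementation:

```python
import cmath

def dft(x):
    """Naive O(N^2) discrete Fourier transform (adequate for short traces)."""
    n = len(x)
    return [
        sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n) for t in range(n))
        for f in range(n)
    ]

def noise_power_spectrum(traces, dx=1.0):
    """1D NPS estimate: average |DFT|^2 over mean-subtracted noise traces,
    scaled by the sampling interval dx and the trace length."""
    n = len(traces[0])
    acc = [0.0] * n
    for tr in traces:
        mean = sum(tr) / n
        spec = dft([v - mean for v in tr])
        for f in range(n):
            acc[f] += abs(spec[f]) ** 2
    return [dx * a / (n * len(traces)) for a in acc]
```

A quick sanity check on the scaling: an alternating trace [1, -1, 1, -1] concentrates all of its power in the Nyquist bin.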
NASA Astrophysics Data System (ADS)
Asmar, Joseph Al; Lahoud, Chawki; Brouche, Marwan
2018-05-01
Cogeneration and trigeneration systems can contribute to the reduction of primary energy consumption and greenhouse gas emissions in the residential and tertiary sectors, by reducing fossil fuel demand and grid losses with respect to conventional systems. Cogeneration systems are characterized by very high energy efficiency (80 to 90%) as well as lower pollution compared to conventional energy production. The integration of these systems into the energy network must simultaneously take into account their economic and environmental challenges. In this paper, a decision-making strategy is introduced that is divided into two parts: the first is based on a multi-objective optimization tool with data analysis, and the second on an optimization algorithm. The power dispatching of the Lebanese electricity grid is then simulated and considered as a case study in order to prove the compatibility of the cogeneration power calculated by our decision-making technique. In addition, the thermal energy produced by the cogeneration systems whose capacity is selected by our technique shows compatibility with the thermal demand for district heating.
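One common way to fold economic and environmental objectives into a single selection score is a normalized weighted sum; the sketch below is a generic illustration of that idea (the weights, normalization references, and candidate capacities are invented, not the paper's optimization tool):

```python
def weighted_score(cost_usd, co2_kg, w_cost=0.6, w_co2=0.4,
                   cost_ref=1.0e6, co2_ref=1.0e6):
    """Scalarized multi-objective score; lower is better.
    Weights and normalization references are illustrative assumptions."""
    return w_cost * cost_usd / cost_ref + w_co2 * co2_kg / co2_ref

def pick_capacity(options):
    """options: iterable of (capacity_kW, annual_cost_usd, annual_co2_kg).
    Returns the capacity with the lowest combined score."""
    return min(options, key=lambda o: weighted_score(o[1], o[2]))[0]
```

Sweeping the weights traces out the Pareto front between cost and emissions, which is the kind of trade-off a dispatch decision tool must expose to the planner.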
Simulation and analysis of plasmonic sensor in NIR with fluoride glass and graphene layer
NASA Astrophysics Data System (ADS)
Pandey, Ankit Kumar; Sharma, Anuj K.
2018-02-01
A calcium fluoride (CaF2) prism-based plasmonic biosensor with a graphene layer is proposed for operation in the near-infrared (NIR) region. The stacking of multilayer graphene is considered, with a dielectric interlayer sandwiched between two graphene layers. The excellent optical properties of CaF2 glass and the enhanced field at the graphene-analyte interface are exploited for the proposed sensor structure in the NIR spectral region. Performance parameters in terms of field enhancement at the interface and figure of merit (FOM) are analyzed and compared with those of a conventional SPR-based sensor. It is demonstrated that the same sensor probe can also be used for gas sensing, with nearly 3.5-4 times enhancement in FOM compared with the conventional sensor. The results show that the CaF2-based SPR sensor provides much better sensitivity than those based on other glasses.
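The FOM comparison above follows the usual SPR definition, spectral sensitivity divided by the resonance-dip width; a small sketch with invented reflectance curves (the wavelengths, dip widths, and index step are illustrative assumptions):

```python
def resonance_wavelength(wavelengths, reflectance):
    """Wavelength of the SPR reflectance minimum (the resonance dip)."""
    i_min = min(range(len(reflectance)), key=reflectance.__getitem__)
    return wavelengths[i_min]

def spectral_sensitivity(wavelengths, r_before, r_after, delta_n):
    """Dip shift per refractive-index unit (nm/RIU)."""
    shift = (resonance_wavelength(wavelengths, r_after)
             - resonance_wavelength(wavelengths, r_before))
    return shift / delta_n

def figure_of_merit(sensitivity_nm_per_riu, fwhm_nm):
    """FOM = sensitivity / full width at half maximum of the dip (per RIU)."""
    return sensitivity_nm_per_riu / fwhm_nm
```

A narrow dip (small FWHM) raises the FOM even at fixed sensitivity, which is one reason low-loss substrates such as CaF2 help.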
Security analysis of quadratic phase based cryptography
NASA Astrophysics Data System (ADS)
Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Healy, John J.; Sheridan, John T.
2016-09-01
The linear canonical transform (LCT) is essential in modeling coherent light field propagation through first-order optical systems. Recently, a generic optical system, known as a Quadratic Phase Encoding System (QPES), for encrypting a two-dimensional (2D) image has been reported, in which the individual LCT parameters, together with two phase keys, serve as the keys of the cryptosystem. However, it is important that such encryption systems also satisfy certain dynamic security properties. Therefore, in this work, we apply cryptographic evaluation methods, such as the Avalanche Criterion and Bit Independence, which indicate the degree of security of cryptographic algorithms, to QPES. We compare our simulation results with the conventional Fourier and Fresnel transform based DRPE systems. The results show that the LCT-based DRPE has better avalanche and bit independence characteristics than the conventional Fourier- and Fresnel-based encryption systems.
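The Avalanche Criterion mentioned above can be measured generically: flip a single input bit and record the fraction of output bits that change, which should approach 0.5 for a secure transform. A minimal sketch in Python, using SHA-256 as a stand-in transform (the QPES/LCT encryption itself is not reproduced here):

```python
import hashlib

def bits(data: bytes) -> list[int]:
    # Expand bytes into a flat list of individual bits.
    return [(byte >> i) & 1 for byte in data for i in range(8)]

def avalanche_ratio(msg: bytes, bit_index: int) -> float:
    """Fraction of output bits that flip when one input bit flips.
    Values near 0.5 indicate good avalanche behaviour."""
    flipped = bytearray(msg)
    flipped[bit_index // 8] ^= 1 << (bit_index % 8)   # flip one input bit
    out_a = bits(hashlib.sha256(msg).digest())
    out_b = bits(hashlib.sha256(bytes(flipped)).digest())
    return sum(a != b for a, b in zip(out_a, out_b)) / len(out_a)

ratio = avalanche_ratio(b"plaintext block", 0)
```

Averaging this ratio over many messages and bit positions gives the avalanche score that such evaluations report.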
Evaluation of bearing capacity of piles from cone penetration test data.
DOT National Transportation Integrated Search
2007-12-01
A statistical analysis and ranking criteria were used to compare the CPT methods and the conventional alpha design method. Based on the results, the de Ruiter/Beringen and LCPC methods showed the best capability in predicting the measured load carryi...
Hernandez, Wilmar; de Vicente, Jesús; Sergiyenko, Oleg Y.; Fernández, Eduardo
2010-01-01
In this paper, the fast least-mean-squares (LMS) algorithm was used both to eliminate noise corrupting the important information coming from a piezoresistive accelerometer for automotive applications and to improve the convergence rate of the filtering process based on the conventional LMS algorithm. The response of the accelerometer under test was corrupted by process and measurement noise, and the signal processing stage was carried out using both conventional filtering, which was already shown in a previous paper, and optimal adaptive filtering. The adaptive filtering process relied on the LMS adaptive filtering family, which has been shown to have very good convergence and robustness properties, and here a comparative analysis between the results of applying the conventional LMS algorithm and the fast LMS algorithm to solve a real-life filtering problem was carried out. In short, in this paper the piezoresistive accelerometer was tested under multi-frequency acceleration excitation. Due to the kind of test conducted, the use of conventional filtering was discarded, and the choice of one adaptive filter over the other was based on the signal-to-noise ratio improvement and the convergence rate. PMID:22315579
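The conventional LMS update at the heart of the comparison above is compact: the filter weights move along the instantaneous gradient of the squared error. A minimal sketch with a hypothetical system-identification setup (not the accelerometer data from the paper):

```python
import numpy as np

def lms_filter(x, d, num_taps=4, mu=0.01):
    """Conventional LMS: adapt FIR weights w so that w . x_k tracks d[k].
    x: reference input, d: desired signal, mu: step size (must be small
    relative to 1 / (num_taps * input power) for convergence)."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for k in range(num_taps, len(x)):
        xk = x[k - num_taps:k][::-1]   # most recent samples first
        e[k] = d[k] - w @ xk           # estimation error
        w += 2 * mu * e[k] * xk        # stochastic gradient weight update
    return w, e

# Identify a known 4-tap FIR system from its input/output record.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])                    # true system
d = np.concatenate(([0.0], np.convolve(x, h)[:len(x) - 1]))
w, e = lms_filter(x, d)
```

After adaptation the weights approximate the true taps and the error decays toward zero; the fast LMS variant in the paper reaches this point in fewer iterations by operating in the frequency domain.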
Analysis of PCR Thermocycling by Rayleigh-Bénard Convection
NASA Astrophysics Data System (ADS)
Sharma, Ruchi; Ugaz, Victor
2004-03-01
In previous studies, we demonstrated a novel device employing the circulatory flow field established by Rayleigh-Bénard convection to perform amplification of a 295 base target region from a human genomic DNA template inside a 35 µL cylindrical cavity using the polymerase chain reaction (PCR) [Krishnan, Ugaz & Burns, Science, Vol. 298, 2002, p. 793]. This design eliminates the need for dynamic external temperature control required in conventional thermocyclers that repeatedly heat and cool static sample volumes to denaturation, annealing, and extension temperatures. In this paper, we extend these studies by demonstrating the design and operation of a multiwell convective flow device capable of achieving amplification of a 191 base pair fragment associated with membrane channel proteins M1 and M2 of the influenza-A virus in as little as 15 minutes with performance comparable to a conventional thermocycler. We also study the effect of initial template concentration and observe no degradation in performance over four orders of magnitude of initial template loading dilution, consistent with conventional thermocycler results. These results illustrate the ability of convective flow PCR systems to achieve performance equal to or exceeding conventional thermocycling hardware, and demonstrate their suitability for use in rapid biodetection assays.
Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi
2014-01-01
The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms in stationary phase). Here, the development of novel open source software for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides the function of autogenerating metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were narrower than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is an open source software tool for the general application of INST-(13)C-MFA.
Rodriguez, S; Khabir, A; Keryer, C; Perrot, C; Drira, M; Ghorbel, A; Jlidi, R; Bernheim, A; Valent, A; Busson, P
2005-03-01
Nasopharyngeal carcinoma (NPC) occurs with a high incidence in Southeast Asia and to a lesser extent in the Mediterranean area, especially in Tunisia, Algeria, and Morocco. Cellular gene alterations combined with latent Epstein-Barr virus infection are thought to be essential for NPC oncogenesis. To date, chromosome analysis with comparative genomic hybridization (CGH) has been reported exclusively for NPCs from Southeast Asia. Although NPCs from the Mediterranean area have several distinct clinical and epidemiological features, CGH investigations have been lacking. Chromosome analysis was therefore undertaken on a series of NPC xenografts and biopsies derived from patients of Mediterranean origin. Four xenografts were investigated with a combination of conventional CGH, array-based CGH, and comparative expressed sequence hybridization. In addition, 23 fresh NPC biopsies were analyzed with conventional CGH. Data obtained from xenografts and fresh biopsies were consistent, except that amplification of genes at 18p was observed only in xenografts derived from metastatic tissues. Frequent gains associated with gene overexpression were detected at 1q25 approximately qter (64%) and 12p13 (50%). Losses were noticed mainly at 11q14 approximately q23 (50%), 13q12 approximately q31 (50%), 14q24 approximately q31 (43%), and 3p13 approximately p23 (43%). Comparison with previous reports suggests that Mediterranean NPCs have higher frequencies of gains at 1q and losses at 13q than their Asian counterparts.
Dynamic analysis of apoptosis using cyanine SYTO probes: From classical to microfluidic cytometry
Wlodkowic, Donald; Skommer, Joanna; Faley, Shannon; Darzynkiewicz, Zbigniew; Cooper, Jonathan M.
2013-01-01
Cell death is a stochastic process, often initiated and/or executed in a multi-pathway/multi-organelle fashion. Therefore, high-throughput single-cell analysis platforms are required to provide detailed characterization of the kinetics and mechanisms of cell death in heterogeneous cell populations. However, there is still a largely unmet need for inert fluorescent probes suitable for prolonged kinetic studies. Here, we compare the use of an innovative adaptation of unsymmetrical SYTO dyes for dynamic real-time analysis of apoptosis in conventional as well as microfluidic chip-based systems. We show that cyanine SYTO probes allow non-invasive tracking of intracellular events over extended time. Easy handling and “stain–no wash” protocols open up new opportunities for high-throughput analysis and live-cell sorting. Furthermore, SYTO probes are easily adaptable for detection of cell death using automated microfluidic chip-based cytometry. Overall, the combined use of SYTO probes and a state-of-the-art Lab-on-a-Chip platform emerges as a cost-effective solution for automated drug screening compared to conventional Annexin V or TUNEL assays. In particular, it should allow for dynamic analysis of samples where low cell number has so far been an obstacle, e.g. primary cancer stem cells or circulating minimal residual tumor cells. PMID:19298813
Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm
Veladi, H.
2014-01-01
A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm. PMID:25202717
de Castro, Denise Tornavoi; Lepri, César Penazzo; Valente, Mariana Lima da Costa; dos Reis, Andréa Cândido
2016-01-01
The aim of this study was to compare the compressive strength of a silorane-based composite resin (Filtek P90) to that of conventional composite resins (Charisma, Filtek Z250, Fill Magic, and NT Premium) before and after accelerated artificial aging (AAA). For each composite resin, 16 cylindrical specimens were prepared and divided into 2 groups. One group underwent analysis of compressive strength in a universal testing machine 24 hours after preparation, and the other was subjected first to 192 hours of AAA and then the compressive strength test. Data were analyzed by analysis of variance, followed by the Tukey HSD post hoc test (α = 0.05). Some statistically significant differences in compressive strength were found among the commercial brands (P < 0.001). The conventional composite resin Fill Magic presented the best performance before (P < 0.05) and after AAA (P < 0.05). Values for compressive strength of the silorane-based composite were among the lowest obtained, both before and after aging. Comparison of each material before and after AAA revealed that the aging process did not influence the compressive strength of the tested resins (P = 0.785).
Space-based surface wind vectors to aid understanding of air-sea interactions
NASA Technical Reports Server (NTRS)
Atlas, R.; Bloom, S. C.; Hoffman, R. N.; Ardizzone, J. V.; Brin, G.
1991-01-01
A novel and unique ocean-surface wind data-set has been derived by combining the Defense Meteorological Satellite Program Special Sensor Microwave Imager data with additional conventional data. The variational analysis used generates a gridded surface wind analysis that minimizes an objective function measuring the misfit of the analysis to the background, the data, and certain a priori constraints. In the present case, the European Center for Medium-Range Weather Forecasts surface-wind analysis is used as the background.
A new approach to modeling aviation accidents
NASA Astrophysics Data System (ADS)
Rao, Arjun Harsha
General Aviation (GA) is a catchall term for all aircraft operations in the US that are not categorized as commercial operations or military flights. GA aircraft account for almost 97% of the US civil aviation fleet. Unfortunately, GA flights have a much higher fatal accident rate than commercial operations. Recent estimates by the Federal Aviation Administration (FAA) showed that the GA fatal accident rate has remained relatively unchanged between 2010 and 2015, with 1566 fatal accidents accounting for 2650 fatalities. Several research efforts have been directed towards better understanding the causes of GA accidents. Many of these efforts use National Transportation Safety Board (NTSB) accident reports and data. Unfortunately, while these studies easily identify the top types of accidents (e.g., inflight loss of control (LOC)), they usually cannot identify why these accidents are happening. Most NTSB narrative reports for GA accidents are very short (many are only one paragraph long), and do not contain much information on the causes (likely because the causes were not fully identified). NTSB investigators also code each accident using an event-based coding system, which should facilitate identification of patterns and trends in causation, given the high number of GA accidents each year. However, this system is susceptible to investigator interpretation and error, meaning that two investigators may code the same accident differently, or omit applicable codes. To facilitate a potentially better understanding of GA accident causation, this research develops a state-based approach to check for logical gaps or omissions in NTSB accident records, and potentially fills in the omissions. The state-based approach offers more flexibility as it moves away from the conventional event-based representation of accidents, which classifies events in accidents into several categories such as causes, contributing factors, findings, occurrences, and phase of flight. 
The method views aviation accidents as a set of hazardous states of a system (pilot and aircraft), and triggers that cause the system to move between hazardous states. I used the NTSB's accident coding manual (that contains nearly 4000 different codes) to develop a "dictionary" of hazardous states, triggers, and information codes. Then, I created the "grammar", or a set of rules, that: (1) orders the hazardous states in each accident; and, (2) links the hazardous states using the appropriate triggers. This approach: (1) provides a more correct count of the causes for accidents in the NTSB database; and, (2) checks for gaps or omissions in NTSB accident data, and fills in some of these gaps using logic-based rules. These rules also help identify and count causes for accidents that were not discernable from previous analyses of historical accident data. I apply the model to 6200 helicopter accidents that occurred in the US between 1982 and 2015. First, I identify the states and triggers that are most likely to be associated with fatal and non-fatal accidents. The results suggest that non-fatal accidents, which account for approximately 84% of the accidents, provide valuable opportunities to learn about the causes for accidents. Next, I investigate the causes of inflight loss of control using both a conventional approach and using the state-based approach. The conventional analysis provides little insight into the causal mechanism for LOC. For instance, the top cause of LOC is "aircraft control/directional control not maintained", which does not provide any insight. In contrast, the state-based analysis showed that pilots' tendency to clip objects frequently triggered LOC (16.7% of LOC accidents)--this finding was not directly discernable from conventional analyses. Finally, I investigate the causes for improper autorotations using both a conventional approach and the state-based approach. 
The conventional approach uses modifiers (e.g., "improper", "misjudged") associated with "24520: Autorotation" to identify improper autorotations in the pre-2008 system. In the post-2008 system, the NTSB represents autorotation as a phase of flight, which has no modifier, making it impossible to determine whether the autorotation was unsuccessful. In contrast, the state-based analysis identified 632 improper autorotation accidents, compared to 174 with a conventional analysis. Results from the state-based analysis show that not maintaining rotor RPM and improper flare were among the top reasons for improper autorotations. The presence of the "not possible" trigger in 11.6% of improper autorotations suggests that in these cases it was impossible to make an autorotative landing. Improper use of collective is the sixth most frequent trigger for improper autorotation. Correct use of the collective pitch control is crucial to maintaining rotor RPM during an autorotation (considering that engines are generally not operational during autorotations).
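The state-and-trigger representation described above can be illustrated with a toy data structure. The codes below are hypothetical stand-ins (the real NTSB coding manual contains nearly 4000 codes), but the counting logic is the kind of query the state-based model enables:

```python
from collections import Counter

# Each accident is recorded as a set of hazardous states plus the
# triggers that moved the system between them (hypothetical codes).
accidents = [
    {"states": ["low_rotor_rpm", "loss_of_control"],
     "triggers": ["improper_collective", "object_strike"]},
    {"states": ["loss_of_control"], "triggers": ["object_strike"]},
    {"states": ["improper_autorotation"], "triggers": ["improper_flare"]},
]

def trigger_counts(accidents, state):
    """Count how often each trigger appears in accidents that reached
    the given hazardous state."""
    counts = Counter()
    for acc in accidents:
        if state in acc["states"]:
            counts.update(acc["triggers"])
    return counts

loc_triggers = trigger_counts(accidents, "loss_of_control")
```

Ranking `loc_triggers` over the full 6200-accident dataset is what surfaces findings such as object strikes frequently triggering LOC.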
A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation
NASA Astrophysics Data System (ADS)
Byun, K.; Hamlet, A. F.
2017-12-01
There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (using the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the unique GEV parameters estimated for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional MC and non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of the infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
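The super-ensemble idea can be sketched with inverse-CDF sampling from a year-dependent GEV: each Monte Carlo realization of the design lifespan draws one annual extreme per year from that year's distribution. The drift parameters below are illustrative assumptions, not values from the study:

```python
import numpy as np

def gev_sample(rng, loc, scale, xi, size):
    """Inverse-CDF sampling from a GEV(loc, scale, xi) distribution."""
    u = rng.random(size)
    return loc + scale / xi * ((-np.log(u)) ** (-xi) - 1.0)

rng = np.random.default_rng(42)
lifespan, n_real = 50, 10_000        # design lifespan (yr), MC realizations

# Hypothetical non-stationarity: the GEV location drifts upward in time.
loc = 100.0 + 0.5 * np.arange(lifespan)   # +0.5 units per year
scale, xi = 20.0, 0.1

# Non-stationary ensemble: each column (year) uses its own GEV parameters.
nonstat = gev_sample(rng, loc, scale, xi, (n_real, lifespan)).max(axis=1)
# Stationary comparison: every year uses the year-0 distribution.
stat = gev_sample(rng, 100.0, scale, xi, (n_real, lifespan)).max(axis=1)

p99_nonstat = np.percentile(nonstat, 99)
p99_stat = np.percentile(stat, 99)
```

Comparing the two percentile estimates reproduces, in miniature, the gap between conventional and non-stationary design values that the study reports.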
Barlow, Brian T; McLawhorn, Alexander S; Westrich, Geoffrey H
2017-05-03
Dislocation remains a clinically important problem following primary total hip arthroplasty, and it is a common reason for revision total hip arthroplasty. Dual mobility (DM) implants decrease the risk of dislocation but can be more expensive than conventional implants and have idiosyncratic failure mechanisms. The purpose of this study was to investigate the cost-effectiveness of DM implants compared with conventional bearings for primary total hip arthroplasty. Markov model analysis was conducted from the societal perspective with use of direct and indirect costs. Costs, expressed in 2013 U.S. dollars, were derived from the literature, the National Inpatient Sample, and the Centers for Medicare & Medicaid Services. Effectiveness was expressed in quality-adjusted life years (QALYs). The model was populated with health state utilities and state transition probabilities derived from previously published literature. The analysis was performed for a patient's lifetime, and costs and effectiveness were discounted at 3% annually. The principal outcome was the incremental cost-effectiveness ratio (ICER), with a willingness-to-pay threshold of $100,000/QALY. Sensitivity analyses were performed to explore relevant uncertainty. In the base case, DM total hip arthroplasty showed absolute dominance over conventional total hip arthroplasty, with lower accrued costs ($39,008 versus $40,031 U.S. dollars) and higher accrued utility (13.18 versus 13.13 QALYs) indicating cost-savings. DM total hip arthroplasty ceased being cost-saving when its implant costs exceeded those of conventional total hip arthroplasty by $1,023, and the cost-effectiveness threshold for DM implants was $5,287 greater than that for conventional implants. DM was not cost-effective when the annualized incremental probability of revision from any unforeseen failure mechanism or mechanisms exceeded 0.29%. The probability of intraprosthetic dislocation exerted the most influence on model results. 
This model determined that, compared with conventional bearings, DM implants can be cost-saving for routine primary total hip arthroplasty, from the societal perspective, if newer-generation DM implants meet specific economic and clinical benchmarks. The differences between these thresholds and the performance of other contemporary bearings were frequently quite narrow. The results have potential application to the postmarket surveillance of newer-generation DM components. Economic and decision analysis Level III. See Instructions for Authors for a complete description of levels of evidence.
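The decision rule behind the model can be made concrete with the base-case numbers from the abstract: when the new strategy both costs less and yields more QALYs, it dominates and no incremental cost-effectiveness ratio (ICER) needs to be computed. A minimal sketch, using the study's $100,000/QALY willingness-to-pay threshold:

```python
# Base-case accrued costs and utilities from the abstract.
cost_dm, qaly_dm = 39_008.0, 13.18       # dual mobility (DM)
cost_conv, qaly_conv = 40_031.0, 13.13   # conventional bearing

delta_cost = cost_dm - cost_conv   # negative: DM saves money
delta_qaly = qaly_dm - qaly_conv   # positive: DM adds utility

# DM dominates when it is both cheaper and more effective; the ICER is
# only meaningful when one quantity increases while the other decreases.
if delta_cost <= 0 and delta_qaly >= 0:
    verdict = "dominant (cost-saving)"
else:
    icer = delta_cost / delta_qaly
    verdict = "cost-effective" if icer < 100_000 else "not cost-effective"
```

The sensitivity thresholds in the abstract (e.g., the $1,023 implant cost margin) are exactly the points at which `delta_cost` changes sign and this rule switches branches.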
Virtual reality simulation training for health professions trainees in gastrointestinal endoscopy.
Walsh, Catharine M; Sherlock, Mary E; Ling, Simon C; Carnahan, Heather
2012-06-13
Traditionally, training in gastrointestinal endoscopy has been based upon an apprenticeship model, with novice endoscopists learning basic skills under the supervision of experienced preceptors in the clinical setting. Over the last two decades, however, the growing awareness of the need for patient safety has brought the issue of simulation-based training to the forefront. While the use of simulation-based training may have important educational and societal advantages, the effectiveness of virtual reality gastrointestinal endoscopy simulators has yet to be clearly demonstrated. To determine whether virtual reality simulation training can supplement and/or replace early conventional endoscopy training (apprenticeship model) in diagnostic oesophagogastroduodenoscopy, colonoscopy and/or sigmoidoscopy for health professions trainees with limited or no prior endoscopic experience. Health professions, educational and computer databases were searched until November 2011 including The Cochrane Central Register of Controlled Trials, MEDLINE, EMBASE, Scopus, Web of Science, Biosis Previews, CINAHL, Allied and Complementary Medicine Database, ERIC, Education Full Text, CBCA Education, Career and Technical Education @ Scholars Portal, Education Abstracts @ Scholars Portal, Expanded Academic ASAP @ Scholars Portal, ACM Digital Library, IEEE Xplore, Abstracts in New Technologies and Engineering and Computer & Information Systems Abstracts. The grey literature until November 2011 was also searched. Randomised and quasi-randomised clinical trials comparing virtual reality endoscopy (oesophagogastroduodenoscopy, colonoscopy and sigmoidoscopy) simulation training versus any other method of endoscopy training including conventional patient-based training, in-job training, training using another form of endoscopy simulation (e.g. low-fidelity simulator), or no training (however defined by authors) were included. 
Trials comparing one method of virtual reality training versus another method of virtual reality training (e.g. comparison of two different virtual reality simulators) were also included. Only trials measuring outcomes on humans in the clinical setting (as opposed to animals or simulators) were included. Two authors (CMS, MES) independently assessed the eligibility and methodological quality of trials, and extracted data on the trial characteristics and outcomes. Due to significant clinical and methodological heterogeneity it was not possible to pool study data in order to perform a meta-analysis. Where data were available for each continuous outcome we calculated standardized mean difference with 95% confidence intervals based on intention-to-treat analysis. Where data were available for dichotomous outcomes we calculated relative risk with 95% confidence intervals based on intention-to-treat-analysis. Thirteen trials, with 278 participants, met the inclusion criteria. Four trials compared simulation-based training with conventional patient-based endoscopy training (apprenticeship model) whereas nine trials compared simulation-based training with no training. Only three trials were at low risk of bias. Simulation-based training, as compared with no training, generally appears to provide participants with some advantage over their untrained peers as measured by composite score of competency, independent procedure completion, performance time, independent insertion depth, overall rating of performance or competency error rate and mucosal visualization. Alternatively, there was no conclusive evidence that simulation-based training was superior to conventional patient-based training, although data were limited. 
The results of this systematic review indicate that virtual reality endoscopy training can be used to effectively supplement early conventional endoscopy training (apprenticeship model) in diagnostic oesophagogastroduodenoscopy, colonoscopy and/or sigmoidoscopy for health professions trainees with limited or no prior endoscopic experience. However, there remains insufficient evidence to advise for or against the use of virtual reality simulation-based training as a replacement for early conventional endoscopy training (apprenticeship model) for health professions trainees with limited or no prior endoscopic experience. There is a great need for the development of a reliable and valid measure of endoscopic performance prior to the completion of further randomised clinical trials with high methodological quality.
NASA Astrophysics Data System (ADS)
Yuan, Rui; Zhu, Rui; Qu, Jianhua; Wu, Jun; You, Xincai; Sun, Yuqiu; Zhou, Yuanquan (Nancy)
2018-05-01
The Mahu Depression is an important hydrocarbon-bearing foreland sag located at the northwestern margin of the Junggar Basin, China. On the northern slope of the depression, large coarse-grained proximal fan-delta depositional systems developed in the Lower Triassic Baikouquan Formation (T1b). Some lithologic hydrocarbon reservoirs have been found in the conglomerates of the formation in recent years. However, the rapid vertical and horizontal lithology variations make it difficult to divide the base-level cycles of the formation using conventional methods. Spectral analysis technologies, such as Integrated Prediction Error Filter Analysis (INPEFA), provide another effective way to overcome this difficulty. In this paper, conventional resistivity logs processed by INPEFA are utilized to study the base-level cycles of the fan-delta depositional systems. A negative trend of the INPEFA curve indicates base-level fall semi-cycles; conversely, a positive trend suggests rise semi-cycles. Base-level cycles of the Baikouquan Formation are divided in single wells and correlated between wells. One long-term base-level rise semi-cycle, comprising three medium-term base-level cycles, is identified over the whole Baikouquan Formation. The medium-term base-level cycles are characterized mainly as rise semi-cycles in the fan-delta plain, symmetric cycles in the fan-delta front, and mainly fall semi-cycles in the pro-fan-delta. The short-term base-level rise semi-cycles developed mostly in the braided channels, sub-aqueous distributary channels, and sheet sands, while the interdistributary bays and pro-fan-delta mud indicate short-term base-level fall semi-cycles. Finally, based on the INPEFA method, a sequence filling model of the Baikouquan Formation is established.
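INPEFA itself is a commercial algorithm, but its core idea, integrating the prediction errors of an autoregressive filter fitted to a log curve so that trend turnarounds stand out, can be sketched as follows. This is a simplified reconstruction under stated assumptions, not the proprietary implementation:

```python
import numpy as np

def inpefa_sketch(log_curve, order=5):
    """Simplified INPEFA-style curve: fit a least-squares AR(order)
    prediction filter to a well-log curve, then cumulatively sum the
    prediction errors. Sustained negative/positive trends in the result
    are read as base-level fall/rise semi-cycles."""
    x = np.asarray(log_curve, dtype=float)
    n = len(x)
    # Lagged design matrix: predict x[t] from x[t-1] ... x[t-order].
    X = np.column_stack([x[order - k - 1:n - k - 1] for k in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    err = y - X @ coef            # prediction errors
    return np.cumsum(err)         # integrated prediction error curve

# Synthetic "resistivity log": oscillation over a slow trend.
depth = np.linspace(0, 20, 300)
curve = inpefa_sketch(np.sin(depth) + 0.1 * depth)
```

On real resistivity logs the turning points of this integrated curve, rather than the raw lithology, are what get correlated between wells.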
Effect of Sling Exercise Training on Balance in Patients with Stroke: A Meta-Analysis
Peng, Qiyuan; Chen, Jingjie; Zou, Yucong; Liu, Gang
2016-01-01
Objective This study aims to evaluate the effect of sling exercise training (SET) on balance in patients with stroke. Methods PubMed, Cochrane Library, Ovid LWW, CBM, CNKI, WanFang, and VIP databases were searched for randomized controlled trials of the effect of SET on balance in patients with stroke. The study design and participants were subjected to metrological analysis. The Berg Balance Scale (BBS), Barthel index (BI), and Fugl-Meyer Assessment (FMA) were used as independent parameters for evaluating balance function, activities of daily living (ADL), and motor function after stroke, respectively, and were subjected to meta-analysis with RevMan 5.3 software. Results Nine studies with 460 participants were analyzed. The meta-analysis showed that SET combined with conventional rehabilitation was superior to conventional rehabilitation alone, with increased BBS (WMD = 3.81, 95% CI [0.15, 7.48], P = 0.04), BI (WMD = 12.98, 95% CI [8.39, 17.56], P < 0.00001), and FMA (SMD = 0.76, 95% CI [0.41, 1.11], P < 0.0001) scores. Conclusion Based on limited evidence from 9 trials, SET combined with conventional rehabilitation was superior to conventional rehabilitation alone, with increased BBS, BI, and FMA scores, so SET can improve balance function after stroke. However, these findings should be interpreted with caution due to limitations of the included trials, such as small sample sizes and risk of bias. Therefore, more multi-center, large-sample randomized controlled trials are needed to confirm its clinical applications. PMID:27727288
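The pooled weighted mean difference (WMD) reported above comes from inverse-variance weighting of per-trial mean differences. A minimal fixed-effect sketch with hypothetical BBS mean differences (RevMan 5.3 additionally supports random-effects models, which this sketch omits):

```python
import math

def fixed_effect_md(studies):
    """Inverse-variance fixed-effect pooling of mean differences.
    studies: list of (mean_difference, standard_error) per trial."""
    weights = [1.0 / se ** 2 for _, se in studies]        # w_i = 1/SE_i^2
    pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Hypothetical BBS mean differences (points) from three trials.
pooled, (lo, hi) = fixed_effect_md([(3.2, 1.5), (4.5, 2.0), (3.9, 1.8)])
```

Trials with smaller standard errors (typically larger samples) dominate the pooled estimate, which is why the review stresses the small sample sizes of the included trials.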
Selective classification for improved robustness of myoelectric control under nonideal conditions.
Scheme, Erik J; Englehart, Kevin B; Hudgins, Bernard S
2011-06-01
Recent literature in pattern recognition-based myoelectric control has highlighted a disparity between classification accuracy and the usability of upper limb prostheses. This paper suggests that the conventionally defined classification accuracy may be idealistic and may not reflect true clinical performance. Herein, a novel myoelectric control system based on a selective multiclass one-versus-one classification scheme, capable of rejecting unknown data patterns, is introduced. This scheme is shown to outperform nine other popular classifiers when compared using conventional classification accuracy as well as a form of leave-one-out analysis that may be more representative of real prosthetic use. Additionally, the classification scheme allows for real-time, independent adjustment of individual class-pair boundaries making it flexible and intuitive for clinical use.
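The rejection idea in the selective one-versus-one scheme can be illustrated with a simple voting rule: a sample is only assigned a class when enough pairwise classifiers agree, and is otherwise rejected as an unknown pattern. A toy sketch (the paper's actual classifier boundaries and per-pair thresholds are not reproduced):

```python
def ovo_predict_with_reject(pairwise, classes, x, min_votes):
    """Selective one-vs-one voting: each pairwise classifier votes for one
    of its two classes; the sample is rejected (None) unless the winning
    class gathers at least min_votes votes."""
    votes = {c: 0 for c in classes}
    for clf in pairwise.values():
        votes[clf(x)] += 1
    winner = max(votes, key=votes.get)
    return winner if votes[winner] >= min_votes else None

# Hypothetical pairwise classifiers over classes A, B, C.
agree = {("A", "B"): lambda x: "A", ("A", "C"): lambda x: "A",
         ("B", "C"): lambda x: "B"}      # two of three vote for A
split = {("A", "B"): lambda x: "A", ("A", "C"): lambda x: "C",
         ("B", "C"): lambda x: "B"}      # one vote each: ambiguous

accepted = ovo_predict_with_reject(agree, "ABC", None, min_votes=2)  # "A"
rejected = ovo_predict_with_reject(split, "ABC", None, min_votes=2)  # None
```

Rejecting the ambiguous case trades a small loss in classification rate for fewer erroneous prosthesis motions, which is the usability gain the paper targets.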
Advances in Candida detection platforms for clinical and point-of-care applications
Safavieh, Mohammadali; Coarsey, Chad; Esiobu, Nwadiuto; Memic, Adnan; Vyas, Jatin Mahesh; Shafiee, Hadi; Asghar, Waseem
2016-01-01
Invasive candidiasis remains one of the most serious community- and healthcare-acquired infections worldwide. Conventional Candida detection methods based on blood and plate culture are time-consuming and require at least 2–4 days to identify various Candida species. Despite considerable advances in candidiasis detection, the development of simple, compact and portable point-of-care diagnostics for rapid and precise testing that automatically performs cell lysis, nucleic acid extraction, purification and detection still remains a challenge. Here, we systematically review the most prominent conventional and nonconventional techniques for the detection of various Candida species, including Candida staining, blood culture, serological testing and nucleic acid-based analysis. We also discuss the most advanced lab-on-a-chip devices for Candida detection. PMID:27093473
Medium-Duty Plug-in Electric Delivery Truck Fleet Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prohaska, Robert; Ragatz, Adam; Simpson, Mike
2016-06-29
In this paper, the authors present an overview of medium-duty electric vehicle (EV) operating behavior based on in-use data collected from Smith Newton electric delivery vehicles and compare their performance and operation to conventional diesel trucks operating in the same fleet. The vehicles' drive cycles and operation are analyzed and compared to demonstrate the importance of matching specific EV technologies to the appropriate operational duty cycle. The results of this analysis show that the Smith Newton EVs demonstrated a 68% reduction in energy consumption over the data reporting period compared to the conventional diesel vehicles, as well as a 46.4% reduction in carbon dioxide equivalent emissions based on the local energy generation source.
Girianelli, Vania R; Thuler, Luiz Claudio S; Szklo, Moyses; Donato, Alexandre; Zardo, Lucilia M G; Lozana, José A; Almeida Neto, Olimpio F; Carvalho, Aurenice C L; Matos, Jorge H; Figueiredo, Valeska
2006-12-01
To compare the performance of human papillomavirus DNA tests (samples collected by a healthcare professional and self-collected) and liquid-based cytology with conventional cytology in the detection of cancer of the cervix uteri and its precursor lesions. A cross-sectional study was carried out in 1777 women living in poor communities in Rio de Janeiro State, Brazil. Eligibility criteria included ages 25-59 years and not having had a Papanicolaou test within at least 3 years prior to the study. Cytology (conventional or liquid-based) and human papillomavirus DNA (collected by a healthcare professional or self-collected) tests were performed using samples collected in a single visit. Women with abnormalities in at least one test and a systematic sample of 70 women with negative test results were referred to a colposcopic examination. Test readings were double-masked, and the outcome of interest was high-grade squamous intraepithelial lesion or worse. The pathology report was used as the gold standard. The prevalence of high-grade squamous intraepithelial lesion or worse was 2.0%. The human papillomavirus DNA test collected by a health professional, alone or combined with conventional cytology, had the highest sensitivity (91.4 and 97.1%, respectively). The highest specificity was found for conventional cytology (91.6%) and for the human papillomavirus DNA test collected by a healthcare professional (90.2%). On the basis of test performance alone, the use of human papillomavirus DNA tests, alone or combined with cytology, would seem to be recommended. Its population-wide implementation, however, is conditional on a cost-effectiveness analysis.
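The headline figures in the abstract above (e.g., 91.4% sensitivity, 91.6% specificity) follow from the standard definitions against the pathology gold standard; a minimal, purely illustrative computation:

```python
def sensitivity_specificity(pairs):
    """pairs: iterable of (test_positive, disease_present) booleans,
    where disease status comes from the gold standard (pathology).
    Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(1 for t, d in pairs if t and d)
    fn = sum(1 for t, d in pairs if not t and d)
    tn = sum(1 for t, d in pairs if not t and not d)
    fp = sum(1 for t, d in pairs if t and not d)
    return tp / (tp + fn), tn / (tn + fp)
```

Fed the per-woman test results and colposcopy-confirmed outcomes, this yields exactly the kind of per-test sensitivity/specificity table the study reports.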
Pla, Maria; La Paz, José-Luis; Peñas, Gisela; García, Nora; Palaudelmàs, Montserrat; Esteve, Teresa; Messeguer, Joaquima; Melé, Enric
2006-04-01
Maize is one of the main crops worldwide and an increasing number of genetically modified (GM) maize varieties are cultivated and commercialized in many countries in parallel to conventional crops. Given the labeling rules established e.g. in the European Union and the necessary coexistence between GM and non-GM crops, it is important to determine the extent of pollen dissemination from transgenic maize to other cultivars under field conditions. The most widely used methods for quantitative detection of GMO are based on real-time PCR, which implies the results are expressed in genome percentages (in contrast to seed or grain percentages). Our objective was to assess the accuracy of real-time PCR based assays to accurately quantify the contents of transgenic grains in non-GM fields in comparison with the real cross-fertilization rate as determined by phenotypical analysis. We performed this study in a region where both GM and conventional maize are normally cultivated and used the predominant transgenic maize Mon810 in combination with a conventional maize variety which displays the characteristic of white grains (therefore allowing cross-pollination quantification as percentage of yellow grains). Our results indicated an excellent correlation between real-time PCR results and number of cross-fertilized grains at Mon810 levels of 0.1-10%. In contrast, Mon810 percentage estimated by weight of grains produced less accurate results. Finally, we present and discuss the pattern of pollen-mediated gene flow from GM to conventional maize in an example case under field conditions.
Pickup, William; Bremer, Phil; Peng, Mei
2018-03-01
The extensive time and cost associated with conventional sensory profiling methods has spurred sensory researchers to develop rapid method alternatives, such as Napping® with Ultra-Flash Profiling (UFP). Napping®-UFP generates sensory maps by requiring untrained panellists to separate samples based on perceived sensory similarities. Evaluations of this method have been restricted to manufactured/formulated food models, and predominantly structured on comparisons against the conventional descriptive method. The present study aims to extend the validation of Napping®-UFP (N = 72) to natural biological products, and to evaluate this method against Descriptive Analysis (DA; N = 8) with physicochemical measurements as an additional evaluative criterion. The results revealed that sample configurations generated by DA and Napping®-UFP were not significantly correlated (RV = 0.425, P = 0.077); however, both were correlated with the product map generated from the instrumental measures (P < 0.05). The findings also showed that sample characterisations from DA and Napping®-UFP were driven by different sensory attributes, indicating potential structural differences between these two methods in configuring samples. Overall, these findings lent support for the extended use of Napping®-UFP for evaluations of natural biological products. Although DA was shown to be a better method for establishing sensory-instrumental relationships, Napping®-UFP exhibited strengths in generating informative sample configurations based on holistic perception of products. © 2017 Society of Chemical Industry.
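The RV coefficient used above to compare sample configurations (e.g., RV = 0.425 between the DA and Napping®-UFP maps) measures agreement between two matrix configurations of the same samples. A compact numpy version, a sketch of the textbook formula rather than the software the authors used:

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two configurations of the same samples
    (rows = products, columns = arbitrary dimensions). Ranges 0..1;
    1 means the configurations agree up to rotation and scaling."""
    Xc = X - X.mean(axis=0)                 # column-center each map
    Yc = Y - Y.mean(axis=0)
    Sx, Sy = Xc @ Xc.T, Yc @ Yc.T           # sample cross-product matrices
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))
```

Because the statistic is built on the sample-by-sample cross-product matrices, it compares relative product positions, so a rotated or rescaled copy of the same map scores RV = 1.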
A Direct Cell Quenching Method for Cell-Culture Based Metabolomics
A crucial step in metabolomic analysis of cellular extracts is the cell quenching process. The conventional method first uses trypsin to detach cells from their growth surface. This inevitably changes the profile of cellular metabolites since the detachment of cells from the extr...
NASA Technical Reports Server (NTRS)
Achtemeier, Gary L.; Kidder, Stanley Q.; Scott, Robert W.
1988-01-01
The variational multivariate assimilation method described in a companion paper by Achtemeier and Ochs is applied to conventional and conventional plus satellite data. Ground-based and space-based meteorological data are weighted according to the respective measurement errors and blended into a data set that is a solution of numerical forms of the two nonlinear horizontal momentum equations, the hydrostatic equation, and an integrated continuity equation for a dry atmosphere. The analyses serve, first, to evaluate the accuracy of the model and, second, to contrast the analyses with and without satellite data. Evaluation criteria measure the extent to which: (1) the assimilated fields satisfy the dynamical constraints, (2) the assimilated fields depart from the observations, and (3) the assimilated fields are judged to be realistic through pattern analysis. The last criterion requires that the signs, magnitudes, and patterns of the hypersensitive vertical velocity and local tendencies of the horizontal velocity components be physically consistent with respect to the larger scale weather systems.
Untangling Brain-Wide Dynamics in Consciousness by Cross-Embedding
Tajima, Satohiro; Yanagawa, Toru; Fujii, Naotaka; Toyoizumi, Taro
2015-01-01
Brain-wide interactions generating complex neural dynamics are considered crucial for emergent cognitive functions. However, the irreducible nature of nonlinear and high-dimensional dynamical interactions challenges conventional reductionist approaches. We introduce a model-free method, based on embedding theorems in nonlinear state-space reconstruction, that permits a simultaneous characterization of complexity in local dynamics, directed interactions between brain areas, and how the complexity is produced by the interactions. We demonstrate this method in large-scale electrophysiological recordings from awake and anesthetized monkeys. The cross-embedding method captures structured interaction underlying cortex-wide dynamics that may be missed by conventional correlation-based analysis, demonstrating a critical role of time-series analysis in characterizing brain state. The method reveals a consciousness-related hierarchy of cortical areas, where dynamical complexity increases along with cross-area information flow. These findings demonstrate the advantages of the cross-embedding method in deciphering large-scale and heterogeneous neuronal systems, suggesting a crucial contribution by sensory-frontoparietal interactions to the emergence of complex brain dynamics during consciousness. PMID:26584045
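The core machinery behind cross-embedding, state-space reconstruction by time-delay embedding followed by cross-prediction between areas, can be sketched minimally. This is an illustration of the embedding-theorem idea (Takens-style delay vectors plus 1-nearest-neighbor cross-prediction), not the authors' exact method; the embedding dimension, lag, and 1-NN rule are simplifying assumptions:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series: row j is the state
    vector [x[j], x[j+tau], ..., x[j+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def cross_map(source, target, dim=3, tau=1):
    """Predict `target` from nearest neighbors in the delay embedding
    of `source`. High prediction skill (correlation) suggests the
    target's dynamics are recoverable from the source series."""
    E = delay_embed(source, dim, tau)
    y = target[(dim - 1) * tau:]           # align target with embedding
    preds = []
    for i in range(len(E)):
        d = np.linalg.norm(E - E[i], axis=1)
        d[i] = np.inf                      # exclude the self-match
        preds.append(y[np.argmin(d)])      # 1-NN cross-prediction
    return np.corrcoef(preds, y)[0, 1]
```

Two deterministically coupled series cross-map well, while an unrelated noise series does not, which is the kind of structured interaction a correlation-based analysis can miss.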
Neural correlates of conventional and harm/welfare-based moral decision-making.
White, Stuart F; Zhao, Hui; Leong, Kelly Kimiko; Smetana, Judith G; Nucci, Larry P; Blair, R James R
2017-12-01
The degree to which social norms are processed by a unitary system or dissociable systems remains debated. Much research on children's social-cognitive judgments has supported the distinction between "moral" (harm/welfare-based) and "conventional" norms. However, the extent to which these norms are processed by dissociable neural systems remains unclear. To address this issue, 23 healthy participants were scanned with functional magnetic resonance imaging (fMRI) while they rated the wrongness of harm/welfare-based and conventional transgressions and neutral vignettes. Activation significantly greater than the neutral vignette baseline was observed in regions implicated in decision-making, including rostral/ventral medial frontal, anterior insula, and dorsomedial frontal cortices, when evaluating both harm/welfare-based and social-conventional transgressions. Greater activation when rating harm/welfare-based relative to social-conventional transgressions was seen through much of ACC and bilateral inferior frontal gyrus. Greater activation was observed in superior temporal gyrus, bilateral middle temporal gyrus, left PCC, and temporal-parietal junction when rating social-conventional transgressions relative to harm/welfare-based transgressions. These data suggest that decisions regarding the wrongness of actions, irrespective of whether they involve harm/welfare-based or conventional transgressions, recruit regions generally implicated in affect-based decision-making. However, there is neural differentiation between harm/welfare-based and conventional transgressions. This may reflect the particular importance of processing the intent of transgressors of conventional norms and perhaps the greater emotional content or salience of harm/welfare-based transgressions.
NASA Technical Reports Server (NTRS)
Turner, M. J.; Grande, D. L.
1978-01-01
Based on estimated graphite and boron fiber properties, allowable stresses and strains were established for advanced composite materials. Stiffened panel and conventional sandwich panel concepts were designed and analyzed, using graphite/polyimide and boron/polyimide materials. The conventional sandwich panel was selected as the structural concept for the modified wing structure. Upper and lower surface panels of the arrow wing structure were then redesigned, using high strength graphite/polyimide sandwich panels, retaining the titanium spars and ribs from the prior study. The ATLAS integrated analysis and design system was used for stress analysis and automated resizing of surface panels. Flutter analysis of the hybrid structure showed a significant decrease in flutter speed relative to the titanium wing design. The flutter speed was increased to that of the titanium design by selective increase in laminate thickness and by using graphite fibers with properties intermediate between high strength and high modulus values.
Hulshof, Tessa A; Zuidema, Sytse U; Ostelo, Raymond W J G; Luijendijk, Hendrika J
2015-10-01
Numerous observational studies have reported an increased risk of mortality for conventional antipsychotics in elderly patients, and for haloperidol in particular. Subsequently, health authorities have warned against use of conventional antipsychotics in dementia. Experimental evidence is lacking. To assess the mortality risk of conventional antipsychotics in elderly patients with a meta-analysis of trials. Original studies were identified in electronic databases, online trial registers, and hand-searched references of published reviews. Two investigators found 28 potentially eligible studies, and they selected 17 randomized placebo-controlled trials in elderly patients with dementia, delirium, or a high risk of delirium. Two investigators independently abstracted trial characteristics and deaths, and 3 investigators assessed the risk of bias. Deaths were pooled with RevMan to obtain risk differences and risk ratios. Data of 17 trials with a total of 2387 participants were available. Thirty-two deaths occurred. The pooled risk difference of 0.1% was not statistically significant (95% confidence interval (CI) -1.0% to 1.2%). The risk ratio was 1.07 (95% CI 0.54-2.13). Eleven of 17 trials tested haloperidol (n = 1799). The risk difference was 0.4% (95% CI -0.9% to 1.6%), the risk ratio was 1.25 (95% CI 0.59-2.65). This meta-analysis of placebo-controlled randomized trials does not show that conventional antipsychotics in general or haloperidol in particular increase the risk of mortality in elderly patients. It questions the observational findings and the warning based on these findings. Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
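The pooled risk difference and confidence interval above come from standard fixed-effect (inverse-variance) pooling; the study used RevMan, but the textbook formula can be sketched directly (illustrative reimplementation, not the study's software):

```python
import math

def pooled_risk_difference(trials):
    """Fixed-effect (inverse-variance) pooled risk difference with a
    95% CI. trials: list of (events_treat, n_treat, events_ctrl,
    n_ctrl) tuples. Each trial is weighted by 1/variance."""
    num = den = 0.0
    for e1, n1, e0, n0 in trials:
        p1, p0 = e1 / n1, e0 / n0
        rd = p1 - p0
        var = p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0
        w = 1.0 / var
        num += w * rd
        den += w
    rd = num / den
    se = math.sqrt(1.0 / den)
    return rd, (rd - 1.96 * se, rd + 1.96 * se)
```

Trials with zero events in an arm give a zero variance and need a continuity correction before this formula applies; that detail (which matters in sparse-event meta-analyses like this one) is omitted here.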
Cost-utility analysis on telemonitoring of users with pacemakers: The PONIENTE study.
Lopez-Villegas, Antonio; Catalan-Matamoros, Daniel; Robles-Musso, Emilio; Bautista-Mesa, Rafael; Peiro, Salvador
2018-01-01
Introduction: Few studies have confirmed the cost-saving of telemonitoring of users with pacemakers (PMs). The purpose of this controlled, non-randomised, non-masked clinical trial was to perform an economic assessment of telemonitoring (TM) of users with PMs and check whether TM offers a cost-utility alternative to conventional follow-up in hospital.
Methods: Eighty-two patients implanted with an internet-based transmission PM were selected to receive either conventional follow-up in hospital (n = 52) or TM (n = 30) from their homes. The data were collected during 12 months while patients were being monitored. The economic assessment of the PONIENTE study was performed from the perspectives of the National Health Service (NHS) and patients. A cost-utility analysis was conducted to measure whether the TM of patients with PMs is cost-effective in terms of costs per gained quality-adjusted life years (QALYs).
Results: There was a significant cost-saving for participants in the TM group in comparison with the participants in the conventional follow-up group. From the NHS's perspective, the patients in the TM group gained 0.09 QALYs more than the patients in the conventional follow-up group over 12 months, with a cost saving of 57.64% (€46.51 versus €109.79, respectively; p < 0.001) per participant per year. In-office visits were reduced by 52.49% in the TM group. The costs related to the patient perspective were lower in the TM group than in the conventional follow-up group (€31.82 versus €73.48, respectively; p < 0.005). The costs per QALY were 61.68% higher in the in-office monitoring group.
Discussion: The cost-utility analysis performed in the PONIENTE study showed that the TM of users with PMs appears to be a significantly cost-effective alternative to conventional follow-up in hospital.
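A cost-utility comparison like the one above reduces to an incremental cost-effectiveness ratio (ICER). A minimal sketch: the absolute QALY values below are hypothetical placeholders; only the 0.09 QALY gain and the €46.51 vs €109.79 costs are quoted from the abstract.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY
    gained by the new option. A negative ratio together with a
    positive QALY gain means the new option dominates (it is both
    cheaper and more effective)."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)
```

With the abstract's figures (telemonitoring €63.28 cheaper per patient-year and 0.09 QALYs better), the ICER is negative, i.e. telemonitoring dominates conventional follow-up rather than merely trading cost for effect.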
Stringent DDI-based Prediction of H. sapiens-M. tuberculosis H37Rv Protein-Protein Interactions
2013-01-01
Background: H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data are key to illuminating the infection mechanism of M. tuberculosis H37Rv. However, current H. sapiens-M. tuberculosis H37Rv PPI data are very scarce, which seriously limits the study of the interaction between this important pathogen and its host, H. sapiens. Computational prediction of H. sapiens-M. tuberculosis H37Rv PPIs is an important strategy to fill this gap. Domain-domain interaction (DDI) based prediction is one of the most frequently used computational approaches for predicting both intra-species and inter-species PPIs. However, the performance of DDI-based host-pathogen PPI prediction has been rather limited.
Results: We develop a stringent DDI-based prediction approach with emphasis on (i) differences between the specific domain sequences on annotated regions of proteins under the same domain ID and (ii) calculation of the interaction strength of predicted PPIs based on the interacting residues in their interaction interfaces. We compare our stringent DDI-based approach to a conventional DDI-based approach for predicting PPIs, using gold-standard intra-species PPIs and coherent informative Gene Ontology term assessment. The assessment results show that our stringent DDI-based approach achieves much better performance in predicting PPIs than the conventional approach. Using our stringent DDI-based approach, we have predicted a small set of reliable H. sapiens-M. tuberculosis H37Rv PPIs which could be very useful for a variety of related studies. We also analyze the predicted PPIs using cellular compartment distribution analysis, functional category enrichment analysis, and pathway enrichment analysis; these analyses support the validity of our prediction result. Also, based on an analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent DDI-based approach, we have discovered some important properties of domains involved in host-pathogen PPIs. We find that both host and pathogen proteins involved in host-pathogen PPIs tend to have more domains than proteins involved in intra-species PPIs, and these domains have more interaction partners than domains on proteins involved in intra-species PPIs.
Conclusions: The stringent DDI-based prediction approach reported in this work provides a rigorous strategy for predicting host-pathogen PPIs and performs better than a conventional DDI-based approach. We have predicted a small set of accurate H. sapiens-M. tuberculosis H37Rv PPIs which could be very useful for a variety of related studies. PMID:24564941
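The conventional DDI-based baseline that the stringent approach improves on can be caricatured in a few lines: a host-pathogen protein pair is predicted to interact if any of their domain pairs is a known DDI. The data below are toy placeholders; the sequence-level and interface-strength refinements are precisely what the paper adds on top of this naive rule.

```python
def predict_ppis(host_proteins, pathogen_proteins, ddis):
    """Naive DDI-based PPI prediction: a host-pathogen pair is a
    predicted PPI if any of their domain pairs is a known DDI.
    Proteins are given as {name: set_of_domain_ids}; ddis is a set
    of frozensets of interacting domain IDs."""
    hits = []
    for h, hdoms in host_proteins.items():
        for p, pdoms in pathogen_proteins.items():
            if any(frozenset((a, b)) in ddis for a in hdoms for b in pdoms):
                hits.append((h, p))
    return hits
```

Because every protein pair sharing one annotated DDI is called an interaction, this rule over-predicts heavily, which is why filtering on domain-sequence differences and interface interaction strength is needed for a reliable host-pathogen set.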
Oosting, Ellen; Hoogeboom, Thomas J; Appelman-de Vries, Suzan A; Swets, Adam; Dronkers, Jaap J; van Meeteren, Nico L U
2016-01-01
The aim of this study was to evaluate the value of conventional factors, the Risk Assessment and Predictor Tool (RAPT), and performance-based functional tests as predictors of delayed recovery after total hip arthroplasty (THA). A prospective cohort study was conducted in a regional hospital in the Netherlands with 315 patients attending for THA in 2012. The dependent variable, recovery of function, was assessed with the Modified Iowa Levels of Assistance scale. Delayed recovery was defined as taking more than 3 days to walk independently. Independent variables were age, sex, BMI, Charnley score, RAPT score and scores for four performance-based tests [2-minute walk test, timed up and go test (TUG), 10-meter walking test (10 mW) and hand grip strength]. Regression analysis with all variables identified older age (>70 years), Charnley score C, slow walking speed (10 mW >10.0 s) and poor functional mobility (TUG >10.5 s) as the best predictors of delayed recovery of function. This model (AUC 0.85, 95% CI 0.79-0.91) performed better than a model with conventional factors and RAPT scores, and significantly better (p = 0.04) than a model with only conventional factors (AUC 0.81, 95% CI 0.74-0.87). The combination of performance-based tests and conventional factors predicted inpatient functional recovery after THA. Two simple functional performance-based tests have significant added value over a more conventional screening with age and comorbidities for predicting recovery of functioning immediately after total hip surgery. Patients over 70 years old, with comorbidities, with a TUG score >10.5 s and a 10-meter walk time >10.0 s (i.e., a walking speed below 1.0 m/s) are at risk of delayed recovery of functioning. Those high-risk patients need an accurate discharge plan and could benefit from targeted pre- and postoperative therapeutic exercise programs.
Gowda, Dhananjaya; Airaksinen, Manu; Alku, Paavo
2017-09-01
Recently, a quasi-closed phase (QCP) analysis of speech signals for accurate glottal inverse filtering was proposed. However, the QCP analysis, which belongs to the family of temporally weighted linear prediction (WLP) methods, uses the conventional forward type of sample prediction. This may not be the best choice, especially in computing WLP models with a hard-limiting weighting function: a sample-selective minimization of the prediction error in WLP reduces the effective number of samples available within a given window frame. To counter this problem, a modified quasi-closed phase forward-backward (QCP-FB) analysis is proposed, wherein each sample is predicted based on its past as well as future samples, thereby utilizing the available number of samples more effectively. Formant detection and estimation experiments on synthetic vowels generated using a physical modeling approach, as well as on natural speech utterances, show that the proposed QCP-FB method yields statistically significant improvements over the conventional linear prediction and QCP methods.
NASA Technical Reports Server (NTRS)
Wilkie, W. Keats; Belvin, W. Keith; Park, K. C.
1996-01-01
A simple aeroelastic analysis of a helicopter rotor blade incorporating embedded piezoelectric fiber composite, interdigitated electrode blade twist actuators is described. The analysis consists of a linear torsion and flapwise bending model coupled with a nonlinear ONERA based unsteady aerodynamics model. A modified Galerkin procedure is performed upon the rotor blade partial differential equations of motion to develop a system of ordinary differential equations suitable for dynamics simulation using numerical integration. The twist actuation responses for three conceptual full-scale blade designs with realistic constraints on blade mass are numerically evaluated using the analysis. Numerical results indicate that useful amplitudes of nonresonant elastic twist, on the order of one to two degrees, are achievable under one-g hovering flight conditions for interdigitated electrode poling configurations. Twist actuation for the interdigitated electrode blades is also compared with the twist actuation of a conventionally poled piezoelectric fiber composite blade. Elastic twist produced using the interdigitated electrode actuators was found to be four to five times larger than that obtained with the conventionally poled actuators.
NASA Technical Reports Server (NTRS)
Wilkie, W. Keats; Park, K. C.
1996-01-01
A simple aeroelastic analysis of a helicopter rotor blade incorporating embedded piezoelectric fiber composite, interdigitated electrode blade twist actuators is described. The analysis consists of a linear torsion and flapwise bending model coupled with a nonlinear ONERA based unsteady aerodynamics model. A modified Galerkin procedure is performed upon the rotor blade partial differential equations of motion to develop a system of ordinary differential equations suitable for numerical integration. The twist actuation responses for three conceptual full-scale blade designs with realistic constraints on blade mass are numerically evaluated using the analysis. Numerical results indicate that useful amplitudes of nonresonant elastic twist, on the order of one to two degrees, are achievable under one-g hovering flight conditions for interdigitated electrode poling configurations. Twist actuation for the interdigitated electrode blades is also compared with the twist actuation of a conventionally poled piezoelectric fiber composite blade. Elastic twist produced using the interdigitated electrode actuators was found to be four to five times larger than that obtained with the conventionally poled actuators.
Tsirogiannis, Panagiotis; Reissmann, Daniel R; Heydecke, Guido
2016-09-01
In existing published reports, some studies indicate the superiority of digital impression systems in terms of the marginal accuracy of ceramic restorations, whereas others show that the conventional method provides restorations with better marginal fit than fully digital fabrication. Which impression method provides the lowest mean values for marginal adaptation is inconclusive. The findings from those studies cannot be easily generalized, and in vivo studies that could provide valid and meaningful information are limited in the existing publications. The purpose of this study was to systematically review existing reports and evaluate the marginal fit of ceramic single-tooth restorations after either digital or conventional impression methods by combining the available evidence in a meta-analysis. The search strategy for this systematic review of the publications was based on a Population, Intervention, Comparison, and Outcome (PICO) framework. For the statistical analysis, the mean marginal fit values of each study were extracted and categorized according to the impression method to calculate the mean value, together with the 95% confidence intervals (CI) of each category, and to evaluate the impact of each impression method on the marginal adaptation by comparing digital and conventional techniques separately for in vitro and in vivo studies. Twelve studies were included in the meta-analysis from the 63 identified records after database searching. For the in vitro studies, where ceramic restorations were fabricated after conventional impressions, the mean value of the marginal fit was 58.9 μm (95% CI: 41.1-76.7 μm), whereas after digital impressions, it was 63.3 μm (95% CI: 50.5-76.0 μm). 
In the in vivo studies, the mean marginal discrepancy of the restorations after digital impressions was 56.1 μm (95% CI: 46.3-65.8 μm), whereas after conventional impressions, it was 79.2 μm (95% CI: 59.6-98.9 μm). No significant difference was observed regarding the marginal discrepancy of single-unit ceramic restorations fabricated after digital or conventional impressions. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellinger-Ziegelbauer, Heidrun, E-mail: heidrun.ellinger-ziegelbauer@bayerhealthcare.com; Adler, Melanie; Amberg, Alexander
2011-04-15
The InnoMed PredTox consortium was formed to evaluate whether conventional preclinical safety assessment can be significantly enhanced by incorporation of molecular profiling ('omics') technologies. In short-term toxicological studies in rats, transcriptomics, proteomics and metabolomics data were collected and analyzed in relation to routine clinical chemistry and histopathology. Four of the sixteen hepato- and/or nephrotoxicants given to rats for 1, 3, or 14 days at two dose levels induced similar histopathological effects. These were characterized by bile duct necrosis and hyperplasia and/or increased bilirubin and cholestasis, in addition to hepatocyte necrosis and regeneration, hepatocyte hypertrophy, and hepatic inflammation. Combined analysis of liver transcriptomics data from these studies revealed common gene expression changes which allowed the development of a potential sequence of events on a mechanistic level in accordance with classical endpoint observations. This included genes implicated in early stress responses, regenerative processes, inflammation with inflammatory cell immigration, fibrotic processes, and cholestasis encompassing deregulation of certain membrane transporters. Furthermore, a preliminary classification analysis using transcriptomics data suggested that prediction of cholestasis may be possible based on gene expression changes seen at earlier time-points. Targeted bile acid analysis, based on LC-MS metabonomics data demonstrating increased levels of conjugated or unconjugated bile acids in response to individual compounds, did not provide earlier detection of toxicity as compared to conventional parameters, but may allow distinction of different types of hepatobiliary toxicity. Overall, liver transcriptomics data delivered mechanistic and molecular details in addition to the classical endpoint observations, which were further enhanced by targeted bile acid analysis using LC-MS metabonomics.
Automated power management and control
NASA Technical Reports Server (NTRS)
Dolce, James L.
1991-01-01
A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. The initial station operation will use ground-based dispatches to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.
Formal hardware verification of digital circuits
NASA Technical Reports Server (NTRS)
Joyce, J.; Seger, C.-J.
1991-01-01
The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
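As a minimal illustration of the contrast drawn above, the sketch below checks a toy gate-level circuit against its specification by exhaustive simulation. The 2**n growth in input vectors is precisely what limits this conventional approach for large circuits and motivates the formal techniques the paper outlines; all names here are illustrative, not from the paper.

```python
from itertools import product

def mux_spec(sel, a, b):
    # Specification: a 2-to-1 multiplexer.
    return a if sel else b

def mux_impl(sel, a, b):
    # Gate-level implementation: (sel AND a) OR (NOT sel AND b).
    return (sel & a) | ((1 - sel) & b)

def exhaustive_verify(spec, impl, n_inputs):
    # Conventional verification: simulate every input vector.
    # Cost grows as 2**n_inputs, which is why symbolic simulation,
    # state machine analysis, and theorem proving scale better.
    return all(spec(*v) == impl(*v) for v in product((0, 1), repeat=n_inputs))

print(exhaustive_verify(mux_spec, mux_impl, 3))  # True
```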
High-performance wavelet engine
NASA Astrophysics Data System (ADS)
Taylor, Fred J.; Mellot, Jonathon D.; Strom, Erik; Koren, Iztok; Lewis, Michael P.
1993-11-01
Wavelet processing has shown great promise for a variety of image and signal processing applications. Wavelets are also among the most computationally expensive techniques in signal processing. It is demonstrated that a wavelet engine constructed with residue number system arithmetic elements offers significant advantages over commercially available wavelet accelerators based upon conventional arithmetic elements. Analysis is presented predicting the dynamic range requirements of the reported residue number system based wavelet accelerator.
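As a hedged sketch of the arithmetic this accelerator builds on (not the reported engine itself), the example below represents integers by their residues modulo pairwise-coprime moduli, performs carry-free addition channel by channel, and reconstructs the result with the Chinese Remainder Theorem. The moduli are chosen for illustration only; the product of the moduli sets the dynamic range, which is why the abstract highlights dynamic range analysis.

```python
from math import prod

MODULI = (13, 15, 16, 17)  # pairwise coprime; dynamic range = 53040

def to_rns(x):
    # Residue number system encoding: one residue per modulus.
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    # Carry-free addition: each residue channel operates independently,
    # which is the source of the RNS speed advantage.
    return tuple((ai + bi) % m for ai, bi, m in zip(a, b, MODULI))

def from_rns(r):
    # Chinese Remainder Theorem reconstruction.
    M = prod(MODULI)
    x = 0
    for ri, m in zip(r, MODULI):
        Mi = M // m
        x += ri * Mi * pow(Mi, -1, m)  # modular inverse (Python 3.8+)
    return x % M

a, b = 1234, 4321
assert from_rns(rns_add(to_rns(a), to_rns(b))) == a + b
```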
Comparing crop rotations between organic and conventional farming.
Barbieri, Pietro; Pellerin, Sylvain; Nesme, Thomas
2017-10-23
Cropland use activities are major drivers of global environmental changes and of farming system resilience. Rotating crops is a critical land-use driver and a key strategy for farmers to control environmental stresses and crop performance. Evidence has accumulated that crop rotations have been dramatically simplified over the last 50 years. In contrast, organic farming stands as an alternative production approach that promotes crop diversification. However, our understanding of crop rotations is surprisingly limited. In order to understand whether organic farming would result in more diversified and multifunctional landscapes, we provide here a novel, systematic comparison of organic versus conventional crop rotations at the global scale, based on a meta-analysis of the scientific literature paired with an independent analysis of organic versus conventional land-use. We show that organic farming leads to differences in land-use compared with conventional farming: overall, crop rotations are 15% longer and result in higher diversity and a more even crop species distribution. These changes are driven by a higher abundance of temporary fodders, catch and cover-crops, mostly to the detriment of cereals. We also highlighted differences in organic rotations between Europe and North America, two leading regions for organic production. This increased complexity of organic crop rotations is likely to enhance ecosystem service provisioning to agroecosystems.
Subtraction coronary CT angiography using second-generation 320-detector row CT.
Yoshioka, Kunihiro; Tanaka, Ryoichi; Muranaka, Kenta; Sasaki, Tadashi; Ueda, Takanori; Chiba, Takuya; Takeda, Kouta; Sugawara, Tsuyoshi
2015-06-01
The purpose of this study was to explore the feasibility of subtraction coronary computed tomography angiography (CCTA) by second-generation 320-detector row CT in patients with severe coronary artery calcification, using invasive coronary angiography (ICA) as the gold standard. This study was approved by the institutional board, and all subjects provided written consent. Twenty patients with calcium scores of >400 underwent conventional CCTA and subtraction CCTA followed by ICA. A total of 82 segments were evaluated for image quality using a 4-point scale and for the presence of significant (>50 %) luminal stenosis by two independent readers. The average image quality was 2.3 ± 0.8 with conventional CCTA and 3.2 ± 0.6 with subtraction CCTA (P < 0.001). The percentage of segments with non-diagnostic image quality was 43.9 % on conventional CCTA versus 8.5 % on subtraction CCTA (P = 0.004). The segment-based diagnostic accuracy for detecting significant stenosis according to ICA revealed an area under the receiver operating characteristics curve of 0.824 (95 % confidence interval [CI], 0.750-0.899) for conventional CCTA and 0.936 (95 % CI 0.889-0.936) for subtraction CCTA (P = 0.001). The sensitivity, specificity, positive predictive value, and negative predictive value for conventional CCTA were 88.2, 62.5, 62.5, and 88.2 %, respectively, and for subtraction CCTA they were 94.1, 85.4, 82.1, and 95.3 %, respectively. Compared with conventional CCTA, subtraction CCTA using a second-generation 320-detector row CT showed improved diagnostic accuracy in segment-based analysis in patients with severe calcifications.
Wang, Fang; Ouyang, Guang; Zhou, Changsong; Wang, Suiping
2015-01-01
A number of studies have explored the time course of Chinese semantic and syntactic processing. However, whether syntactic processing occurs earlier than semantics during Chinese sentence reading is still under debate. To further explore this issue, an event-related potentials (ERPs) experiment was conducted on 21 native Chinese speakers who read individually-presented Chinese simple sentences (NP1+VP+NP2) word-by-word for comprehension and made semantic plausibility judgments. The transitivity of the verbs was manipulated to form three types of stimuli: congruent sentences (CON), sentences with a semantically violated NP2 following a transitive verb (semantic violation, SEM), and sentences with a semantically violated NP2 following an intransitive verb (combined semantic and syntactic violation, SEM+SYN). The ERPs evoked from the target NP2 were analyzed by using the Residue Iteration Decomposition (RIDE) method to reconstruct the ERP waveform blurred by trial-to-trial variability, as well as by using the conventional ERP method based on stimulus-locked averaging. The conventional ERP analysis showed that, compared with the critical words in CON, those in SEM and SEM+SYN elicited an N400-P600 biphasic pattern. The N400 effects in both violation conditions were of similar size and distribution, but the P600 in SEM+SYN was larger than that in SEM. Compared with the conventional ERP analysis, RIDE analysis revealed a larger N400 effect and an earlier P600 effect (in the time window of 500-800 ms instead of 570-810 ms). Overall, the combination of conventional ERP analysis and the RIDE method for compensating for trial-to-trial variability confirmed the non-significant difference between SEM and SEM+SYN in the earlier N400 time window. Converging with previous findings on other Chinese structures, the current study provides further precise evidence that syntactic processing in Chinese does not occur earlier than semantic processing.
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data base management and analysis system, implemented on the research computer system, is presented. The research computer system provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data, and consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system, which together provide a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.
ASIL determination for motorbike's Electronics Throttle Control System (ETCS) malfunction
NASA Astrophysics Data System (ADS)
Zaman Rokhani, Fakhrul; Rahman, Muhammad Taqiuddin Abdul; Ain Kamsani, Noor; Sidek, Roslina Mohd; Saripan, M. Iqbal; Samsudin, Khairulmizam; Khair Hassan, Mohd
2017-11-01
The Electronics Throttle Control System (ETCS) is the principal electronic unit in all fuel-injection engine motorbikes, improving engine performance and efficiency in comparison to conventional carburetor-based engines. ETCS is regarded as a safety-critical component, since an ETCS malfunction can cause an unintended acceleration or deceleration event, which can be hazardous to riders. In this study, Hazard Analysis and Risk Assessment, an ISO 26262 functional safety standard analysis, has been applied to a motorbike's ETCS to determine the required automotive safety integrity level (ASIL). Based on the analysis, the established automotive safety integrity level can help derive technical and functional safety measures for ETCS development.
NASA Astrophysics Data System (ADS)
Sampurno, A. W.; Rahmat, A.; Diana, S.
2017-09-01
Conventional diagrams/pictures are one form of visual media often used to assist students in understanding biological concepts. The effectiveness of using diagrams/pictures in biology learning at the school level has also been widely reported. This study examines the ability of high school students to read conventional biological diagrams/pictures, described in terms of mental representation (MR) based on the formation of causal networks. The study involved 30 11th-grade MIA students at a senior high school in Banten, Indonesia, who were studying the excretory system. MR data were obtained with a worksheet instrument, developed based on the CNET protocol, containing diagrams/drawings of nephron structure and the urinary mechanism. Three MR patterns were formed, namely the Markov chain, feedback control with a single measurement, and repeated feedback control with multiple measurements. The third pattern was the most dominant; differences in the MR patterns reveal differences in how, and from which point, the students begin to uncover the important information contained in the diagram to establish a causal network. Further analysis shows that differences in MR pattern relate to how deeply the students process the information contained in the diagrams/pictures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heyman, Heino M.; Zhang, Xing; Tang, Keqi
2016-02-16
Metabolomics is the quantitative analysis of all metabolites in a given sample. Due to the chemical complexity of the metabolome, optimal separations are required for comprehensive identification and quantification of sample constituents. This chapter provides an overview of both conventional and advanced separation methods in practice for reducing the complexity of metabolite extracts delivered to the mass spectrometer detector, and covers gas chromatography (GC), liquid chromatography (LC), capillary electrophoresis (CE), supercritical fluid chromatography (SFC) and ion mobility spectrometry (IMS) separation techniques coupled with mass spectrometry (MS) as both uni-dimensional and multi-dimensional approaches.
Application of new radio tracking data types to critical spacecraft navigation problems
NASA Technical Reports Server (NTRS)
Ondrasik, V. J.; Rourke, K. H.
1972-01-01
Earth-based radio tracking data types are considered, which involve simultaneous or nearly simultaneous spacecraft tracking from widely separated tracking stations. These data types are conventional tracking instrumentation analogs of the very long baseline interferometry (VLBI) of radio astronomy, hence the name quasi-VLBI. A preliminary analysis of quasi-VLBI is presented using simplified tracking data models. The results of accuracy analyses are presented for a representative mission, Viking 1975. The results indicate that, contingent on projected tracking system accuracy, quasi-VLBI can be expected to significantly improve navigation performance over that expected from conventional tracking data types.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tejabhiram, Y., E-mail: tejabhiram@gmail.com; Pradeep, R.; Helen, A.T.
2014-12-15
Highlights: • Novel low temperature synthesis of nickel ferrite nanoparticles. • Comparison with two conventional synthesis techniques including hydrothermal method. • XRD results confirm the formation of crystalline nickel ferrites at 110 °C. • Superparamagnetic particles with applications in drug delivery and hyperthermia. • Magnetic properties superior to conventional methods found in new process. - Abstract: We report a simple, low temperature and surfactant-free co-precipitation method for the preparation of nickel ferrite nanostructures using ferrous sulfate as the iron precursor. The products obtained from this method were compared for their physical properties with nickel ferrites produced through conventional co-precipitation and hydrothermal methods which used ferric nitrate as the iron precursor. X-ray diffraction analysis confirmed the synthesis of single phase inverse spinel nanocrystalline nickel ferrites at temperatures as low as 110 °C in the low temperature method. Electron microscopy analysis of the samples revealed the formation of nearly spherical nanostructures in the size range of 20–30 nm, which are comparable to other conventional methods. Vibrating sample magnetometer measurements showed the formation of superparamagnetic particles with a high magnetic saturation of 41.3 emu/g, which corresponds well with conventional synthesis methods. The spontaneous synthesis of the nickel ferrite nanoparticles by the low temperature method was attributed to the presence of 0.808 kJ mol⁻¹ of excess Gibbs free energy due to the ferrous sulfate precursor.
NASA Astrophysics Data System (ADS)
Reinemann, Deborah Jean
2000-10-01
Measures of time are essential to human life, especially in the Western world. Human understanding of time develops from the preschool stages of using "before" and "after" to an adult understanding and appreciation of time. Previous researchers (for example, Piaget, Friedman) have investigated and described stages of time development. Time, as it was investigated here, can be classified as conventional, logical or experiential. Conventional time is the ordered representation of time: the days of the week, the months of the year, or clock time in seconds and hours. Logical time is the deduction of duration based on regular events; for example, calculating the passage of time based on two separate events. Experiential time involves the duration of events and estimating intervals. With the recent production of the National Science Education Standards (NSES), many schools are aligning their science curriculum with the NSES. Time appears both implicitly and explicitly in the NSES. Do middle school students possess the understanding of time necessary to meet the recommendations of the NSES? An interview protocol of four sessions was developed to investigate middle school students' understanding of time. The four sessions included: building and testing water clocks; an interview about water clocks and time intervals; a laserdisc presentation about relative time spans; and a mind mapping session. Students were also given the GALT test of logical thinking. The subjects of the study, eleven eighth-grade students and thirteen sixth-grade students, were interviewed. The data were transcribed and coded, and a rubric was developed to evaluate students based on their responses to the four sessions. The Time Analysis Rubric is a grid of the types of time (conventional, logical, and experiential) versus the degree of understanding of time. Student results were assigned to levels of understanding based on the Time Analysis Rubric.
There was a relationship (although not significant) between the students' GALT score and the Time Analysis Rubric results. There was no difference in Time Analysis levels between sixth and eighth grade students. On the basis of this study, Middle School students' level of understanding of time appears to be sufficient to master the requirements of the NSES.
Chang, Jenny Zwei-Chieng; Liu, Pao-Hsin; Chen, Yi-Jane; Yao, Jane Chung-Chen; Chang, Hong-Po; Chang, Chih-Han; Chang, Frank Hsin-Fu
2006-02-01
Face mask therapy is indicated for growing patients who suffer from maxillary retrognathia. Most previous studies used conventional cephalometric analysis to evaluate the effects of face mask treatment. Cephalometric analysis has been shown to be insufficient for complex craniofacial configurations. The purpose of this study was to investigate changes in the craniofacial structure of children with maxillary retrognathism following face mask treatment by means of thin-plate spline analysis. Thirty children with skeletal Class III malocclusions who had been treated with face masks were compared with a group of 30 untreated gender-matched, age-matched, observation period-matched, and craniofacial configuration-matched subjects. Average geometries, scaled to an equivalent size, were generated by means of Procrustes analysis. Thin-plate spline analysis was then performed for localization of the shape changes. Face mask treatment induced a forward displacement of the maxilla, a counterclockwise rotation of the palatal plane, a horizontal compression of the anterior border of the symphysis and the condylar region, and a downward deformation of the menton. The cranial base exhibited a counterclockwise deformation as a whole. We conclude that thin-plate spline analysis is a valuable supplement to conventional cephalometric analysis.
Laboratory Workflow Analysis of Culture of Periprosthetic Tissues in Blood Culture Bottles.
Peel, Trisha N; Sedarski, John A; Dylla, Brenda L; Shannon, Samantha K; Amirahmadi, Fazlollaah; Hughes, John G; Cheng, Allen C; Patel, Robin
2017-09-01
Culture of periprosthetic tissue specimens in blood culture bottles is more sensitive than conventional techniques, but the impact on laboratory workflow has yet to be addressed. Herein, we examined the impact of culture of periprosthetic tissues in blood culture bottles on laboratory workflow and cost. The workflow was process mapped, decision tree models were constructed using probabilities of positive and negative cultures drawn from our published study (T. N. Peel, B. L. Dylla, J. G. Hughes, D. T. Lynch, K. E. Greenwood-Quaintance, A. C. Cheng, J. N. Mandrekar, and R. Patel, mBio 7:e01776-15, 2016, https://doi.org/10.1128/mBio.01776-15), and the processing times and resource costs from the laboratory staff time viewpoint were used to compare periprosthetic tissue culture processes using conventional techniques with culture in blood culture bottles. Sensitivity analysis was performed using various rates of positive cultures. Annualized labor savings were estimated based on salary costs from the U.S. Labor Bureau for laboratory staff. The model demonstrated a 60.1% reduction in mean total staff time with the adoption of tissue inoculation into blood culture bottles compared to conventional techniques (mean ± standard deviation, 30.7 ± 27.6 versus 77.0 ± 35.3 h per month, respectively; P < 0.001). The estimated annualized labor cost savings of culture using blood culture bottles was $10,876.83 (±$337.16). Sensitivity analysis was performed using various rates of culture positivity (5 to 50%). Culture in blood culture bottles was cost-effective, based on the estimated labor cost savings of $2,132.71 for each percent increase in test accuracy. In conclusion, culture of periprosthetic tissue in blood culture bottles is not only more accurate than but is also cost-saving compared to conventional culture methods. Copyright © 2017 American Society for Microbiology.
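The decision-tree costing described above reduces to a simple expected-value calculation over culture outcomes; the probabilities and per-specimen minutes below are hypothetical placeholders, not the study's figures.

```python
def expected_time(p_pos, t_pos, t_neg):
    # Decision-tree expected hands-on time per specimen:
    # each branch's processing time weighted by its probability.
    return p_pos * t_pos + (1 - p_pos) * t_neg

# Hypothetical per-specimen minutes (not the study's measured values).
conv = expected_time(p_pos=0.10, t_pos=30.0, t_neg=20.0)
bcb = expected_time(p_pos=0.10, t_pos=12.0, t_neg=8.0)

# Sensitivity analysis over culture-positivity rates, as in the study.
for p in (0.05, 0.25, 0.50):
    saving = expected_time(p, 30.0, 20.0) - expected_time(p, 12.0, 8.0)
    print(f"positivity {p:.0%}: saving {saving:.1f} min/specimen")
```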
2015-01-01
for IC fault detection. This section provides background information on inversion methods. Conventional inversion techniques and their shortcomings are... physical techniques, electron beam imaging/analysis, ion beam techniques, scanning probe techniques. Electrical tests are used to detect faults in an... hand, there is also the second harmonic technique, through which duty cycle degradation faults are detected by collecting the magnitude and the phase of
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and provides a more rational, objective, and unbiased tool for flood susceptibility evaluation.
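A minimal sketch of the probabilistic OWA idea, assuming Gaussian jitter on the criteria weights; the weights and criteria values are invented for illustration, not taken from the Gucheng County study.

```python
import random

def owa(values, order_weights):
    # Ordered weighted averaging: weights apply to ranked values,
    # not to particular criteria; the weight vector encodes the
    # analyst's risk attitude (optimistic vs pessimistic).
    ranked = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(order_weights, ranked))

def monte_carlo_owa(values, base_weights, n=5000, noise=0.1, seed=1):
    # Propagate criteria-weight uncertainty: jitter the weights,
    # renormalize, and collect the distribution of OWA scores.
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        w = [max(1e-9, b + rng.gauss(0, noise)) for b in base_weights]
        s = sum(w)
        scores.append(owa(values, [wi / s for wi in w]))
    scores.sort()
    return scores[n // 2], scores[int(0.05 * n)], scores[int(0.95 * n)]

# Hypothetical susceptibility scores for one map cell on three criteria.
median, lo, hi = monte_carlo_owa([0.8, 0.6, 0.3], [0.5, 0.3, 0.2])
print(f"median {median:.3f}, 90% band [{lo:.3f}, {hi:.3f}]")
```

The 90% band is the per-cell uncertainty that a deterministic OWA run would hide.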
Analysis of Thick Sandwich Shells with Embedded Ceramic Tiles
NASA Technical Reports Server (NTRS)
Davila, Carlos G.; Smith, C.; Lumban-Tobing, F.
1996-01-01
The Composite Armored Vehicle (CAV) is an advanced technology demonstrator of an all-composite ground combat vehicle. The CAV upper hull is made of a tough light-weight S2-glass/epoxy laminate with embedded ceramic tiles that serve as armor. The tiles are bonded to a rubber mat with a carefully selected, highly viscoelastic adhesive. The integration of armor and structure offers an efficient combination of ballistic protection and structural performance. The analysis of this anisotropic construction, with its inherent discontinuous and periodic nature, however, poses several challenges. The present paper describes a shell-based 'element-layering' technique that properly accounts for these effects and for the concentrated transverse shear flexibility in the rubber mat. One of the most important advantages of the element-layering technique over advanced higher-order elements is that it is based on conventional elements. This advantage allows the models to be portable to other structural analysis codes, a prerequisite in a program that involves the computational facilities of several manufacturers and government laboratories. The element-layering technique was implemented into an auto-layering program that automatically transforms a conventional shell model into a multi-layered model. The effects of tile layer homogenization, tile placement patterns, and tile gap size on the analysis results are described.
NASA Technical Reports Server (NTRS)
Robinson, A. C.; Gorman, H. J.; Hillman, M.; Lawhon, W. T.; Maase, D. L.; Mcclure, T. A.
1976-01-01
The potential U.S. market for tertiary municipal wastewater treatment facilities which make use of water hyacinths was investigated. A baseline design was developed which approximates the "typical" or "average" situation under which hyacinth-based systems can be used. The total market size for tertiary treatment was then estimated for those geographical regions in which hyacinths appear to be applicable. Market penetration of the baseline hyacinth system when competing with conventional chemical and physical processing systems was approximated, based primarily on cost differences. A limited analysis was made of the sensitivity of market penetration to individual changes in these assumptions.
CNN based approach for activity recognition using a wrist-worn accelerometer.
Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R
2017-07-01
In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, feature engineering has dominated conventional methods, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data is collected from four different subjects using a single wrist-worn accelerometer sensor. The validation of the proposed model is done under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8% as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machines.
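The contrast between hand-engineered features and learned convolutional features can be sketched in miniature: below, a fixed 1-D kernel plays the role that a backprop-trained kernel would play in a CNN pipeline. The trace and kernel are toy values, not the study's data or network.

```python
def conv1d(signal, kernel):
    # Valid-mode 1-D convolution: the learned-kernel analogue of the
    # hand-crafted features used in conventional pipelines.
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(x):
    # Elementwise non-linearity.
    return [max(0.0, v) for v in x]

def global_max_pool(x):
    # Pooling collapses the feature map to one activation per kernel.
    return max(x)

# Toy accelerometer trace; in a CNN the kernel would be learned.
trace = [0.0, 0.1, 0.9, 1.0, 0.2, -0.1, 0.0]
edge_kernel = [-1.0, 0.0, 1.0]  # responds to rising edges
feature = global_max_pool(relu(conv1d(trace, edge_kernel)))
print(feature)
```

A real model stacks many such kernels and feeds the pooled activations to a classifier, replacing manual feature selection entirely.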
Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K
2017-12-01
Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for extraction of the maxillary third molar, the Joedds technique, is introduced in this study and compared with the conventional technique. One hundred people were included in the study and divided into two groups by means of simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t test. Analysis of the 100 patients based on these parameters showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min compared to the other group of patients. This novel technique has proved to be better than the conventional third molar extraction technique, with minimal complications, provided proper selection of cases and the right technique are used.
Han, Jubong; Lee, K B; Lee, Jong-Man; Park, Tae Soon; Oh, J S; Oh, Pil-Jei
2016-03-01
We discuss a new method to incorporate Type B uncertainty into least-squares procedures. The new method is based on an extension of the likelihood function from which a conventional least-squares function is derived. The extended likelihood function is the product of the original likelihood function with additional PDFs (probability density functions) that characterize the Type B uncertainties. The PDFs are considered to describe one's incomplete knowledge of correction factors, referred to as nuisance parameters. We use the extended likelihood function to make point and interval estimations of parameters in essentially the same way as in the conventional least-squares method. Since the nuisance parameters are not of interest and should be prevented from appearing in the final result, we eliminate them by using the profile likelihood. As an example, we present a case study of a linear regression analysis with a common component of Type B uncertainty. In this example we compare the analysis results obtained using our procedure with those from conventional methods. Copyright © 2015. Published by Elsevier Ltd.
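A hedged sketch of the profiling idea on an example like the paper's (linear regression with a common Type B offset): the nuisance offset delta gets its own Gaussian PDF and is minimized out over a grid. All numbers are invented; in this exactly-linear toy the intercept absorbs the offset, so the profile minimum lands at delta = 0.

```python
def fit_line(x, y):
    # Ordinary least squares for y = a + b*x (closed form).
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - b * sx) / n, b

def profile_fit(x, y, sigma=0.1, sigma_b=0.5, grid=201):
    # Extended least squares: a common offset `delta` (the Type B
    # nuisance parameter) shifts all y, with a Gaussian PDF of width
    # sigma_b multiplying the likelihood, i.e. a (delta/sigma_b)^2
    # penalty on the chi-square. Profile delta out by minimization.
    best = None
    for i in range(grid):
        delta = -3 * sigma_b + 6 * sigma_b * i / (grid - 1)
        a, b = fit_line(x, [yi - delta for yi in y])
        chi2 = sum(((yi - delta - a - b * xi) / sigma) ** 2
                   for xi, yi in zip(x, y)) + (delta / sigma_b) ** 2
        if best is None or chi2 < best[0]:
            best = (chi2, a, b, delta)
    return best

x = [0.0, 1.0, 2.0, 3.0]
y = [0.2, 1.2, 2.2, 3.2]  # true slope 1, common offset 0.2
chi2, a, b, delta = profile_fit(x, y)
print(round(b, 2), round(a + delta, 2))  # 1.0 0.2
```

Only a + delta and b survive into the reported result; delta itself never appears, which is the point of profiling.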
Shrestha, Rojeet; Miura, Yusuke; Hirano, Ken-Ichi; Chen, Zhen; Okabe, Hiroaki; Chiba, Hitoshi; Hui, Shu-Ping
2018-01-01
Fatty acid (FA) profiling of milk has important applications in human health and nutrition. Conventional methods for the saponification and derivatization of FA are time-consuming and laborious. We aimed to develop a simple, rapid, and economical method for the determination of FA in milk. We applied a beneficial approach of microwave-assisted saponification (MAS) of milk fats and microwave-assisted derivatization (MAD) of FA to its hydrazides, integrated with HPLC-based analysis. The optimal conditions for MAS and MAD were determined. Microwave irradiation significantly reduced the sample preparation time from 80 min in the conventional method to less than 3 min. We used three internal standards for the measurement of short-, medium-, and long-chain FA. The proposed method showed satisfactory analytical sensitivity, recovery, and reproducibility. There was a significant correlation in the milk FA concentrations between the proposed and conventional methods. Being quick, economical, and convenient, the proposed method for milk FA measurement can substitute for the conventional method.
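Internal-standard quantitation of the kind this method relies on reduces to a peak-area ratio; the areas, concentration, and response factor below are hypothetical, not the paper's data.

```python
def quantify(peak_area, istd_area, istd_conc, response_factor=1.0):
    # Internal-standard quantitation: analyte concentration from the
    # ratio of analyte to internal-standard peak areas, scaled by the
    # internal standard's known concentration and a response factor
    # from calibration.
    return (peak_area / istd_area) * istd_conc * response_factor

# Hypothetical HPLC peak areas for one fatty acid and its
# chain-length-matched internal standard.
conc = quantify(peak_area=5.2e4, istd_area=2.6e4, istd_conc=10.0)
print(conc)  # 20.0 (units follow istd_conc)
```

Using three internal standards, as the paper does, amounts to choosing the standard matched to each analyte's chain length before applying this ratio.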
Vihervaara, Hanna; Grönroos, Juha M; Hurme, Saija; Gullichsen, Risto; Salminen, Paulina
2017-01-01
Endoscopic stents are used to relieve obstructive jaundice. The purpose of this prospective randomized study was to compare the patency of antireflux and conventional plastic biliary stent in relieving distal malignant biliary obstruction. All jaundiced patients admitted to hospital with suspected unresectable malignant distal biliary stricture between October 2009 and September 2010 were evaluated for the study. Eligible patients were randomized either to antireflux or conventional plastic stent arms. The primary endpoint was stent patency and the follow-up was continued either until the stent was occluded or until 6 months after the stent placement. At an interim analysis, antireflux stents (ARSs; n = 6) had a significantly shorter median patency of 34 (8-49) days compared with the conventional stent (n = 7) patency of 167 (38-214) days (P = .0003). Based on these results, the study was terminated due to ethical concerns. According to these results, the use of this ARS is not recommended.
NASA Astrophysics Data System (ADS)
Gray, Bonnie L.
2012-04-01
Microfluidics is revolutionizing laboratory methods and biomedical devices, offering new capabilities and instrumentation in multiple areas such as DNA analysis, proteomics, enzymatic analysis, single cell analysis, immunology, point-of-care medicine, personalized medicine, drug delivery, and environmental toxin and pathogen detection. For many applications (e.g., wearable and implantable health monitors, drug delivery devices, and prosthetics) mechanically flexible polymer devices and systems that can conform to the body offer benefits that cannot be achieved using systems based on conventional rigid substrate materials. However, difficulties in implementing active devices and reliable packaging technologies have limited the success of flexible microfluidics. Using highly compliant materials such as PDMS that are typically employed for prototyping, we review mechanically flexible polymer microfluidic technologies based on free-standing polymer substrates and novel electronic and microfluidic interconnection schemes. Central to these new technologies are hybrid microfabrication methods employing novel nanocomposite polymer materials and devices. We review microfabrication methods using these materials, along with demonstrations of example devices and packaging schemes that employ them. We review these recent developments and place them in the context of the fields of flexible microfluidics and conformable systems, and discuss cross-over applications to conventional rigid-substrate microfluidics.
Combining real-time monitoring and knowledge-based analysis in MARVEL
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.
1993-01-01
Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.
14 CFR 35.37 - Fatigue limits and evaluation.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.37 Fatigue limits and evaluation. This section does not apply to fixed-pitch wood propellers of conventional design. (a) Fatigue limits must be established by tests, or analysis based on tests, for propeller: (1) Hubs. (2) Blades. (3) Blade retention...
14 CFR 35.37 - Fatigue limits and evaluation.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.37 Fatigue limits and evaluation. This section does not apply to fixed-pitch wood propellers of conventional design. (a) Fatigue limits must be established by tests, or analysis based on tests, for propeller: (1) Hubs. (2) Blades. (3) Blade retention...
14 CFR 35.37 - Fatigue limits and evaluation.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.37 Fatigue limits and evaluation. This section does not apply to fixed-pitch wood propellers of conventional design. (a) Fatigue limits must be established by tests, or analysis based on tests, for propeller: (1) Hubs. (2) Blades. (3) Blade retention...
14 CFR 35.37 - Fatigue limits and evaluation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.37 Fatigue limits and evaluation. This section does not apply to fixed-pitch wood propellers of conventional design. (a) Fatigue limits must be established by tests, or analysis based on tests, for propeller: (1) Hubs. (2) Blades. (3) Blade retention...
Document-Oriented E-Learning Components
ERIC Educational Resources Information Center
Piotrowski, Michael
2009-01-01
This dissertation questions the common assumption that e-learning requires a "learning management system" (LMS) such as Moodle or Blackboard. Based on an analysis of the current state of the art in LMSs, we come to the conclusion that the functionality of conventional e-learning platforms consists of basic content management and…
Tutorial on Generalized Programming Languages and Systems. Instructor Edition.
ERIC Educational Resources Information Center
Fasana, Paul J., Ed.; Shank, Russell, Ed.
This instructor's manual is a comparative analysis and review of the various computer programing languages currently available and their capabilities for performing text manipulation, information storage, and data retrieval tasks. Based on materials presented at the 1967 Convention of the American Society for Information Science, the manual…
Benchmarking and Modeling of a Conventional Mid-Size Car Using ALPHA (SAE Paper 2015-01-1140)
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...
NASA Astrophysics Data System (ADS)
Mert, A.
2016-12-01
The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGFs). For the frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGFs), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies 0.1-20 Hz. The physically-based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazard levels for all studied sites in the Marmara region.
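The hybrid broadband scheme described above (EGF-based simulation above 0.5 Hz, SGF synthetics below) can be illustrated with a minimal sketch. The complementary cosine-tapered crossover filter used here is a generic merging assumption for illustration, not the authors' exact matching procedure:

```python
import numpy as np

def combine_broadband(lf, hf, dt, fc=0.5, width=0.1):
    """Merge a low-frequency synthetic seismogram (lf) with a
    high-frequency EGF-based simulation (hf) at crossover fc [Hz],
    using complementary cosine-tapered weights in the FFT domain."""
    n = len(lf)
    freqs = np.fft.rfftfreq(n, dt)
    # Low-pass weight: 1 below fc-width, smooth taper to 0 above fc+width
    w = np.clip((fc + width - freqs) / (2 * width), 0.0, 1.0)
    w = 0.5 - 0.5 * np.cos(np.pi * w)
    LF = np.fft.rfft(lf) * w
    HF = np.fft.rfft(hf) * (1.0 - w)
    return np.fft.irfft(LF + HF, n)

# Synthetic demonstration on exact FFT bins (~0.24 Hz and ~2.4 Hz components)
dt, n = 0.01, 4096
t = np.arange(n) * dt
f_low, f_high = 10 / (n * dt), 100 / (n * dt)
lf_sig = np.sin(2 * np.pi * f_low * t)
hf_sig = 0.3 * np.sin(2 * np.pi * f_high * t)
broadband = combine_broadband(lf_sig, hf_sig, dt)
```

Below the crossover the merged trace reproduces the SGF component, above it the EGF component, with a smooth transition around 0.5 Hz.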
NASA Astrophysics Data System (ADS)
Inamdar, Kirti; Kosta, Y. P.; Patnaik, S.
2014-10-01
In this paper, we present the design of a metamaterial-based microstrip patch antenna, optimized for bandwidth and multiple frequency operations. A criss-cross structure, inspired by the famous Jerusalem cross, has been proposed. The theory and design formulas to calculate various parameters of the proposed antenna have been presented. Design starts with the analysis of the proposed unit cell structure, validating its response using HFSS Version 13 software to obtain the negative ε and μ response characteristic of a metamaterial. Following this, a metamaterial-based microstrip patch antenna is designed. A detailed comparative study is conducted exploring the response of the designed metamaterial patch and that of the conventional patch. Finally, antenna parameters such as gain, bandwidth, radiation pattern, and multiple frequency responses are investigated and optimized, and presented in tables and response graphs. It is also observed that the physical dimension of the metamaterial-based patch antenna is smaller compared to its conventional counterpart operating at the same fundamental frequency. The challenging part was to develop metamaterials based on signature structures and techniques that offer advantages in terms of bandwidth and multiple frequency operation, which is demonstrated in this paper. The unique shape proposed in this paper improves bandwidth without reducing the gain of the antenna.
Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw
2017-01-01
Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare . However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. 
Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes.
Boeker, Martin; Andel, Peter; Vach, Werner; Frankenschmidt, Alexander
2013-01-01
When compared with more traditional instructional methods, Game-based e-learning (GbEl) promises a higher motivation of learners by presenting contents in an interactive, rule-based and competitive way. Most recent systematic reviews and meta-analyses of studies on Game-based learning and GbEl in the medical professions have shown limited effects of these instructional methods. To compare the effectiveness on the learning outcome of a Game-based e-learning (GbEl) instruction with a conventional script-based instruction in the teaching of phase contrast microscopy urinalysis under routine training conditions of undergraduate medical students, a randomized controlled trial was conducted with 145 medical students in their third year of training in the Department of Urology at the University Medical Center Freiburg, Germany. 82 subjects were allocated for training with an educational adventure game (GbEl group) and 69 subjects for conventional training with a written script-based approach (script group). Learning outcome was measured with a 34-item single-choice test. Students' attitudes were collected by a questionnaire regarding fun with the training, motivation to continue the training and self-assessment of acquired knowledge. The students in the GbEl group achieved significantly better results in the cognitive knowledge test than the students in the script group: the mean score was 28.6 for the GbEl group and 26.0 for the script group of a total of 34.0 points, with a Cohen's d effect size of 0.71 (ITT analysis). Attitudes towards the recent learning experience were significantly more positive with GbEl. Students reported having more fun while learning with the game when compared to the script-based approach. Game-based e-learning is more effective than a script-based approach for the training of urinalysis in regard to cognitive learning outcome and has a high positive motivational impact on learning.
Game-based e-learning can be used as an effective teaching method for self-instruction.
NASA Astrophysics Data System (ADS)
Singh, Manpreet; Alabanza, Anginelle; Gonzalez, Lorelis E.; Wang, Weiwei; Reeves, W. Brian; Hahm, Jong-In
2016-02-01
Determining ultratrace amounts of protein biomarkers in patient samples in a straightforward and quantitative manner is extremely important for early disease diagnosis and treatment. Here, we successfully demonstrate the novel use of zinc oxide nanorods (ZnO NRs) in the ultrasensitive and quantitative detection of two acute kidney injury (AKI)-related protein biomarkers, tumor necrosis factor (TNF)-α and interleukin (IL)-8, directly from patient samples. We first validate the ZnO NRs-based IL-8 results via comparison with those obtained from using a conventional enzyme-linked immunosorbent method in samples from 38 individuals. We further assess the full detection capability of the ZnO NRs-based technique by quantifying TNF-α, whose levels in human urine are often below the detection limits of conventional methods. Using the ZnO NR platforms, we determine the TNF-α concentrations of all 46 patient samples tested, down to the fg per mL level. Subsequently, we screen for TNF-α levels in approximately 50 additional samples collected from different patient groups in order to demonstrate a potential use of the ZnO NRs-based assay in assessing cytokine levels useful for further clinical monitoring. Our research efforts demonstrate that ZnO NRs can be straightforwardly employed in the rapid, ultrasensitive, quantitative, and simultaneous detection of multiple AKI-related biomarkers directly in patient urine samples, providing an unparalleled detection capability beyond those of conventional analysis methods. Additional key advantages of the ZnO NRs-based approach include a fast detection speed, low-volume assay condition, multiplexing ability, and easy automation/integration capability to existing fluorescence instrumentation. 
Therefore, we anticipate that our ZnO NRs-based detection method will be highly beneficial for overcoming the frequent challenges in early biomarker development and treatment assessment, pertaining to the facile and ultrasensitive quantification of hard-to-trace biomolecules. Electronic supplementary information (ESI) available: Typical SEM images of the ZnO NRs used in the biomarker assays are provided in Fig. S1. See DOI: 10.1039/c5nr08706f
Conceptual Design of a 150-Passenger Civil Tiltrotor
NASA Technical Reports Server (NTRS)
Costa, Guillermo
2012-01-01
The conceptual design of a short-haul civil tiltrotor aircraft is presented. The concept vehicle is designed for runway-independent operations to increase the capacity of the National Airspace System without the need for increased infrastructure. This necessitates a vehicle that is capable of integrating with conventional air traffic without interfering with established flightpaths. The NASA Design and Analysis of Rotorcraft software was used to size the concept vehicle based on the mission requirements of this market. The final configuration was selected based upon performance metrics such as acquisition and maintenance costs, fuel fraction, empty weight, and required engine power. The concept presented herein has a proposed initial operating capability date of 2035, and is intended to integrate with conventional air traffic as well as proposed future air transportation concepts.
NASA Astrophysics Data System (ADS)
Fotin, Sergei V.; Yin, Yin; Haldankar, Hrishikesh; Hoffmeister, Jeffrey W.; Periaswamy, Senthil
2016-03-01
Computer-aided detection (CAD) has been used in screening mammography for many years and is likely to be utilized for digital breast tomosynthesis (DBT). Higher detection performance is desirable as it may have an impact on radiologists' decisions and clinical outcomes. Recently, algorithms based on deep convolutional architectures have been shown to achieve state-of-the-art performance in object classification and detection. Similarly, we trained a deep convolutional neural network directly on patches sampled from two-dimensional mammography and reconstructed DBT volumes and compared its performance to a conventional CAD algorithm that is based on computation and classification of hand-engineered features. The detection performance was evaluated on an independent test set of 344 DBT reconstructions (GE SenoClaire 3D, iterative reconstruction algorithm) containing 328 suspicious and 115 malignant soft tissue densities including masses and architectural distortions. Detection sensitivity was measured on a region of interest (ROI) basis at the rate of five detection marks per volume. Moving from the conventional to the deep learning approach resulted in an increase of ROI sensitivity from 0.832 ± 0.040 to 0.893 ± 0.033 for suspicious ROIs, and from 0.852 ± 0.065 to 0.930 ± 0.046 for malignant ROIs. These results indicate the high utility of deep feature learning in the analysis of DBT data and the high potential of the method for broader medical image analysis tasks.
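Measuring ROI sensitivity at a fixed mark rate, as described above (five detection marks per volume), can be sketched as follows. The global score-thresholding scheme below is an illustrative assumption, not the authors' evaluation code:

```python
import numpy as np

def sensitivity_at_rate(scores, is_true, volume_ids, marks_per_volume=5):
    """Keep the highest-scoring candidates until the average number of
    detection marks per volume reaches `marks_per_volume`, then report
    the fraction of true ROIs retained (ROI sensitivity)."""
    scores = np.asarray(scores, float)
    is_true = np.asarray(is_true, bool)
    n_vol = len(set(volume_ids))
    k = int(marks_per_volume * n_vol)        # total marks allowed
    order = np.argsort(scores)[::-1]         # best candidates first
    kept = np.zeros(len(scores), bool)
    kept[order[:k]] = True
    n_true = is_true.sum()
    return kept[is_true].sum() / n_true if n_true else float("nan")

# Toy example: 2 volumes, 12 candidate detections, 4 true lesions
scores  = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.95, 0.85, 0.3, 0.2, 0.1, 0.05]
is_true = [1,   0,   1,   0,   0,   0,   1,    0,    1,   0,   0,   0  ]
volumes = [0,   0,   0,   0,   0,   0,   1,    1,    1,   1,   1,   1  ]
sens = sensitivity_at_rate(scores, is_true, volumes, marks_per_volume=2)
```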
Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals
NASA Astrophysics Data System (ADS)
Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam
A power analysis attack is a well-known side-channel attack, but its efficiency is frequently degraded by the existence of power components, irrelevant to the encryption, in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts from the measured power signals. Experimental results show that attacks with the preprocessed signals detect correct keys with far fewer signals, compared to conventional power analysis attacks.
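A correlation power analysis of the kind this preprocessing is meant to improve can be sketched on synthetic traces. The leakage model below (Hamming weight of plaintext XOR key, with the AES S-box omitted for brevity) is a deliberate simplification for illustration, not a complete AES attack:

```python
import numpy as np

rng = np.random.default_rng(1)
SECRET = 0x3C   # key byte the attack will try to recover

def hw(x):
    """Hamming weight of each byte in x."""
    return np.unpackbits(np.asarray(x, np.uint8)[:, None], axis=1).sum(axis=1)

# Simulated power traces: leakage = HW(plaintext XOR key) + Gaussian noise.
# (A real AES attack would target HW(Sbox[p ^ k]); the S-box is omitted
#  here only to keep the sketch short.)
n = 2000
plaintexts = rng.integers(0, 256, n, dtype=np.uint8)
traces = hw(plaintexts ^ SECRET) + rng.normal(0, 1.0, n)

# Correlation power analysis: correlate measured traces with the
# hypothetical leakage for every candidate key byte.
corrs = np.array([np.corrcoef(hw(plaintexts ^ np.uint8(k)), traces)[0, 1]
                  for k in range(256)])
recovered = int(np.argmax(corrs))
```

The correct key byte yields the strongest correlation; noisy or encryption-irrelevant power components lower that correlation, which is exactly what the proposed preprocessing counteracts.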
Analysis of biomedical time signals for characterization of cutaneous diabetic micro-angiopathy
NASA Astrophysics Data System (ADS)
Kraitl, Jens; Ewald, Hartmut
2007-02-01
Photo-plethysmography (PPG) is frequently used in research on the microcirculation of blood. It is a non-invasive procedure and takes minimal time to be carried out. Usually PPG time series are analyzed by conventional linear methods, mainly Fourier analysis. These methods may not be optimal for the investigation of nonlinear effects of the heart circulation system such as vasomotion, autoregulation, thermoregulation, breathing, heartbeat and vessel dynamics. The wavelet analysis of the PPG time series is a specific, sensitive nonlinear method for the in vivo identification of heart circulation patterns and human health status. This nonlinear analysis of PPG signals provides additional information which cannot be detected using conventional approaches. The wavelet analysis has been used to study healthy subjects and to characterize the health status of patients with a functional cutaneous microangiopathy which was associated with diabetic neuropathy. The non-invasive in vivo method is based on the radiation of monochromatic light through an area of skin on the finger. A Photometrical Measurement Device (PMD) has been developed. The PMD is suitable for non-invasive continuous online monitoring of one or more biologic constituent values and blood circulation patterns.
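A minimal Morlet-wavelet analysis of a PPG-like signal can be sketched in plain NumPy. The wavelet parameters and the synthetic signal below (a ~1.2 Hz "heartbeat" plus a slow 0.1 Hz "vasomotion" component) are illustrative assumptions, not the authors' processing chain:

```python
import numpy as np

def morlet_cwt(signal, dt, freqs, w0=6.0):
    """Continuous wavelet transform with an analytic Morlet mother
    wavelet, computed by FFT-domain convolution (a minimal sketch)."""
    n = len(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n, dt)
    S = np.fft.fft(signal)
    out = np.empty((len(freqs), n), complex)
    for i, f in enumerate(freqs):
        scale = w0 / (2 * np.pi * f)
        # Fourier transform of the Morlet wavelet at this scale
        psi = np.pi ** -0.25 * np.exp(-0.5 * (scale * omega - w0) ** 2)
        psi[omega <= 0] = 0.0            # analytic: positive freqs only
        out[i] = np.fft.ifft(S * psi * np.sqrt(scale))
    return out

dt = 0.02
t = np.arange(0, 60, dt)
x = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.1 * t)
freqs = np.linspace(0.05, 3.0, 60)
power = np.abs(morlet_cwt(x, dt, freqs)) ** 2
```

Unlike a global Fourier spectrum, the resulting time-frequency map shows when each rhythm (heartbeat, vasomotion, breathing) is present, which is what makes the method attractive for microcirculation patterns.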
Development of a numerical model for vehicle-bridge interaction analysis of railway bridges
NASA Astrophysics Data System (ADS)
Kim, Hee Ju; Cho, Eun Sang; Ham, Jun Su; Park, Ki Tae; Kim, Tae Heon
2016-04-01
In the field of civil engineering, analyzing dynamic response has long been a main concern. These analysis methods can be divided into the moving load analysis method and the moving mass analysis method, and formulating separate equations of motion for vehicles and bridges has recently been studied. In this study, a numerical method is presented which can consider various train types and can solve the equations of motion for vehicle-bridge interaction analysis by a non-iterative procedure through formulating the coupled equations of motion. Also, an accurate 3-dimensional numerical model of the KTX vehicle was developed in order to analyze dynamic response characteristics. The equations of motion for the conventional trains are derived, and the numerical models of the conventional trains are idealized by a set of linear springs and dashpots with 18 degrees of freedom. The bridge models are simplified using the 3-dimensional space frame element, which is based on the Euler-Bernoulli theory. The rail irregularities in the vertical and lateral directions are generated by PSD functions of the Federal Railroad Administration (FRA).
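A far simpler relative of such vehicle-bridge models, a constant load crossing a simply supported Euler-Bernoulli beam solved by modal superposition, can be sketched as follows. All parameter values are hypothetical, and damping and the vehicle degrees of freedom are omitted:

```python
import numpy as np

L = 30.0        # span [m]
EI = 2.0e10     # flexural rigidity [N m^2]
m = 2.0e3       # mass per unit length [kg/m]
P = 1.0e5       # axle load [N]
v = 40.0        # crossing speed [m/s]
n_modes = 5

k = np.arange(1, n_modes + 1)
omega = (k * np.pi / L) ** 2 * np.sqrt(EI / m)       # modal frequencies
dt = 1e-4
steps = int(L / v / dt)
q = np.zeros(n_modes); qd = np.zeros(n_modes)
midspan = np.empty(steps)
for i in range(steps):
    x = v * i * dt                                   # load position
    f = (2 * P / (m * L)) * np.sin(k * np.pi * x / L)  # modal forces
    qdd = f - omega ** 2 * q                         # undamped modal EOM
    qd += qdd * dt
    q += qd * dt                                     # semi-implicit Euler
    midspan[i] = q @ np.sin(k * np.pi / 2)           # midspan deflection
max_defl = midspan.max()
static = P * L ** 3 / (48 * EI)                      # static midspan deflection
dynamic_amplification = max_defl / static
```

The ratio of peak dynamic to static deflection is the dynamic amplification factor, the quantity that full vehicle-bridge interaction models like the one above refine by coupling the 18-DOF vehicle to the bridge.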
Optimizing national immunization program supply chain management in Thailand: an economic analysis.
Riewpaiboon, A; Sooksriwong, C; Chaiyakunapruk, N; Tharmaphornpilas, P; Techathawat, S; Rookkapan, K; Rasdjarmrearnsook, A; Suraratdecha, C
2015-07-01
This study aimed to conduct an economic analysis of the transition of the conventional vaccine supply and logistics systems to the vendor managed inventory (VMI) system in Thailand. Cost analysis of a health care program. An ingredients-based approach was used to design the survey and collect data for an economic analysis of the immunization supply and logistics systems covering procurement, storage and distribution of vaccines from the central level to the lowest level of vaccine administration facility. Costs are presented in 2010 US dollars. The total cost of the vaccination program, including the cost of vaccine procured and logistics, was US$0.60 per packed volume procured (cm³) and US$1.35 per dose procured under the conventional system, compared to US$0.66 per packed volume procured (cm³) and US$1.43 per dose procured under the VMI system. However, the findings revealed that the transition to the VMI system and outsourcing of the supply chain system reduced the cost of the immunization program by US$6.6 million per year because of reduced wastage of un-opened vaccine. The findings demonstrated that the new supply chain system would result in efficiency improvement and potential savings to the immunization program compared to the conventional system. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
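The per-dose figures above admit a simple consistency check. The calculation below only rearranges the abstract's numbers; the break-even framing is our illustration, not part of the study's analysis:

```python
# Figures taken from the abstract (2010 US dollars)
conventional_per_dose = 1.35
vmi_per_dose = 1.43
annual_saving = 6.6e6      # from reduced wastage of un-opened vaccine

extra_unit_cost = vmi_per_dose - conventional_per_dose   # higher VMI unit cost
# The VMI system saves money overall as long as the wastage reduction
# outweighs the higher unit cost, i.e. while annual doses procured stay
# below annual_saving / extra_unit_cost.
breakeven_doses = annual_saving / extra_unit_cost
```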
Matsuura, Yusuke; Rokkaku, Tomoyuki; Suzuki, Takane; Thoreson, Andrew Ryan; An, Kai-Nan; Kuniyoshi, Kazuki
2017-08-01
Forearm diaphysis fractures are usually managed by open reduction internal fixation. Recently, locking plates have been used for treatment. In the long-term period after surgery, some patients present with bone atrophy adjacent to the plate. However, a comparison of locking and conventional plates as a cause of atrophy has not been reported. The aim of this study was to investigate long-term bone atrophy associated with use of locking and conventional plates for forearm fracture treatment. In this study we included 15 patients with forearm fracture managed by either locking or conventional plates and with more than 5 years of follow-up. Computed tomographic imaging of both forearms was performed to assess bone thickness and local bone mineral density and to predict bone strength without plate reinforcement based on finite element analysis. Mean patient age at surgery was 48.0 years. Eight patients underwent reduction with fixed locking plates and were followed up for a mean of 79.5 months; the remaining 7 patients were treated with conventional plates and were followed up for a mean of 105.0 months. Compared with the conventional plate group, the locking plate group had the same fractured limb-contralateral limb ratio of cortex bone thickness, but had significantly lower ratios of mineral density adjacent to the plate and adjusted bone strength. This study demonstrated bone atrophy after locking plate fixation for forearm fractures. Treatment plans for forearm fracture should take into consideration the impact of bone atrophy long after plate fixation. Therapeutic IV. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Kharfan-Dabaja, M A; Pidala, J; Kumar, A; Terasawa, T; Djulbegovic, B
2012-09-01
Despite therapeutic advances, relapsed/refractory CLL, particularly after fludarabine-based regimens, remains a major challenge for which optimal therapy is undefined. No randomized comparative data exist to suggest the superiority of reduced-toxicity allogeneic hematopoietic cell transplantation (RT-allo-HCT) over conventional chemo-(immuno) therapy (CCIT). By using estimates from a systematic review and by meta-analysis of available published evidence, we constructed a Markov decision model to examine these competing modalities. Cohort analysis demonstrated superior outcome for RT-allo-HCT, with a 10-month overall life expectancy (and 6-month quality-adjusted life expectancy (QALE)) advantage over CCIT. Although the model was sensitive to changes in base-case assumptions and transition probabilities, RT-allo-HCT provided superior overall life expectancy through a range of values supported by the meta-analysis. QALE was superior for RT-allo-HCT compared with CCIT. This conclusion was sensitive to change in the anticipated state utility associated with the post-allogeneic HCT state; however, RT-allo-HCT remained the optimal strategy for values supported by existing literature. This analysis provides a quantitative comparison of outcomes between RT-allo-HCT and CCIT for relapsed/refractory CLL in the absence of randomized comparative trials. Confirmation of these findings requires a prospective randomized trial, which compares the most effective RT-allo-HCT and CCIT regimens for relapsed/refractory CLL.
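A Markov cohort model of the kind used above can be sketched with a toy three-state example. The transition probabilities and state utilities below are invented for illustration and are not the paper's estimates:

```python
import numpy as np

# Toy 3-state monthly Markov cohort model: well, relapsed, dead.
# All numbers are hypothetical.
P = np.array([
    [0.96, 0.02, 0.02],   # from well
    [0.00, 0.93, 0.07],   # from relapsed
    [0.00, 0.00, 1.00],   # dead is absorbing
])
utility = np.array([0.8, 0.6, 0.0])   # quality weight per state

cohort = np.array([1.0, 0.0, 0.0])    # everyone starts well
life_months = 0.0                     # accumulates overall life expectancy
qale_months = 0.0                     # accumulates quality-adjusted life expectancy
for _ in range(600):                  # 50-year horizon, monthly cycles
    life_months += cohort[:2].sum()   # fraction alive this month
    qale_months += cohort @ utility
    cohort = cohort @ P
life_expectancy_years = life_months / 12
qale_years = qale_months / 12
```

Running the same cohort through two transition matrices (one per treatment strategy) and comparing `life_expectancy_years` and `qale_years` is, in miniature, the comparison the decision model above performs for RT-allo-HCT versus CCIT.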
NASA Astrophysics Data System (ADS)
Maguen, Ezra I.; Salz, James J.; McDonald, Marguerite B.; Pettit, George H.; Papaioannou, Thanassis; Grundfest, Warren S.
2002-06-01
A study was undertaken to assess whether results of laser vision correction with the LADARVISION 193-nm excimer laser (Alcon Autonomous Technologies) can be improved with the use of wavefront analysis generated by a proprietary system including a Hartmann-Shack sensor and expressed using Zernike polynomials. A total of 82 eyes underwent LASIK in several centers with an improved algorithm, using the CustomCornea system. A subgroup of 48 eyes of 24 patients was randomized so that one eye underwent conventional treatment and the other underwent treatment based on wavefront analysis. Treatment parameters were equal for each type of refractive error. 83% of all eyes had uncorrected vision of 20/20 or better and 95% were 20/25 or better. In all groups, uncorrected visual acuities did not improve significantly in eyes treated with wavefront analysis compared to conventional treatments. Higher-order aberrations were consistently better corrected in eyes undergoing treatment based on wavefront analysis for LASIK at 6 months postop. In addition, the number of eyes with reduced RMS was significantly higher in the subset of eyes treated with a wavefront algorithm (38% vs. 5%). Wavefront technology may improve the outcomes of laser vision correction with the LADARVISION excimer laser. Further refinements of the technology and clinical trials will contribute to this goal.
Integrated microfluidic technology for sub-lethal and behavioral marine ecotoxicity biotests
NASA Astrophysics Data System (ADS)
Huang, Yushi; Reyes Aldasoro, Constantino Carlos; Persoone, Guido; Wlodkowic, Donald
2015-06-01
Changes in behavioral traits exhibited by small aquatic invertebrates are increasingly postulated as ethically acceptable and more sensitive endpoints for the detection of water-borne ecotoxicity than conventional mortality assays. Despite the importance of such behavioral biotests, their implementation is profoundly limited by the lack of appropriate biocompatible automation, integrated optoelectronic sensors, and the associated electronics and analysis algorithms. This work outlines the development of a proof-of-concept miniaturized Lab-on-a-Chip (LOC) platform for rapid water toxicity tests based on changes in the swimming patterns exhibited by Artemia franciscana (Artoxkit M™) nauplii. In contrast to conventionally performed end-point analysis based on counting the number of dead/immobile specimens, we performed a time-resolved video data analysis to dynamically assess the impact of a reference toxicant on the swimming pattern of A. franciscana. Our system design combined: (i) an innovative microfluidic device keeping free-swimming Artemia sp. nauplii under continuous microperfusion as a means of toxin delivery; (ii) a mechatronic interface for user-friendly fluidic actuation of the chip; and (iii) miniaturized video acquisition for movement analysis of test specimens. The system was capable of performing fully programmable time-lapse and video-microscopy of multiple samples for rapid ecotoxicity analysis. It enabled the development of a user-friendly and inexpensive test protocol to dynamically detect sub-lethal behavioral end-points such as changes in the speed of movement or the distance traveled by each animal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khan, Yasin; Khare, Vaibhav Rai; Mathur, Jyotirmay
The paper describes a parametric study developed to estimate the energy savings potential of a radiant cooling system installed in a commercial building in India. The study is based on numerical modeling of a radiant cooling system installed in an Information Technology (IT) office building sited in the composite climate of Hyderabad. To evaluate thermal performance and energy consumption, simulations were carried out using the ANSYS FLUENT and EnergyPlus software, respectively. The building model was calibrated using the measured data for the installed radiant system. This calibrated model was then used to simulate the energy consumption of a building using a conventional all-air system to determine the proportional energy savings. For proper handling of the latent load, a dedicated outside air system (DOAS) was used as an alternative to a fan coil unit (FCU). A comparison of energy consumption showed that the radiant system was 17.5% more efficient than a conventional all-air system and that a 30% savings was achieved by using a DOAS compared with a conventional system. Computational Fluid Dynamics (CFD) simulation was performed to evaluate indoor air quality and thermal comfort. It was found that the radiant system offers more uniform temperatures, as well as a better mean air temperature range, than a conventional system. To further enhance the energy savings of the radiant system, different operational strategies were analyzed based on thermal analysis using EnergyPlus. The energy savings achieved in this parametric run were more than 10% compared with a conventional all-air system.
Evaluation of mobile digital light-emitting diode fluorescence microscopy in Hanoi, Viet Nam.
Chaisson, L H; Reber, C; Phan, H; Switz, N; Nilsson, L M; Myers, F; Nhung, N V; Luu, L; Pham, T; Vu, C; Nguyen, H; Nguyen, A; Dinh, T; Nahid, P; Fletcher, D A; Cattamanchi, A
2015-09-01
Hanoi Lung Hospital, Hanoi, Viet Nam. To compare the accuracy of CellScopeTB, a manually operated mobile digital fluorescence microscope, with conventional microscopy techniques. Patients referred for sputum smear microscopy to the Hanoi Lung Hospital from May to September 2013 were included. Ziehl-Neelsen (ZN) smear microscopy, conventional light-emitting diode (LED) fluorescence microscopy (FM), CellScopeTB-based LED FM and Xpert® MTB/RIF were performed on sputum samples. The sensitivity and specificity of microscopy techniques were determined in reference to Xpert results, and differences were compared using McNemar's paired test of proportions. Of 326 patients enrolled, 93 (28.5%) were Xpert-positive for TB. The sensitivity of ZN microscopy, conventional LED FM, and CellScopeTB-based LED FM was respectively 37.6% (95%CI 27.8-48.3), 41.9% (95%CI 31.8-52.6), and 35.5% (95%CI 25.8-46.1). The sensitivity of CellScopeTB was similar to that of conventional LED FM (difference -6.5%, 95%CI -18.2 to 5.3, P = 0.33) and ZN microscopy (difference -2.2%, 95%CI -9.2 to 4.9, P = 0.73). The specificity was >99% for all three techniques. CellScopeTB performed similarly to conventional microscopy techniques in the hands of experienced TB microscopists. However, the sensitivity of all sputum microscopy techniques was low. Options enabled by digital microscopy, such as automated imaging with real-time computerized analysis, should be explored to increase sensitivity.
NASA Astrophysics Data System (ADS)
Fukumoto, S.; Minami, M.; Soeda, A.; Matsushima, M.; Takahashi, M.; Yokoyama, Y.; Fujimoto, K.
2012-08-01
Zr-based bulk metallic glasses are expected to be welded to conventional structural alloys. Dissimilar welding of metallic glasses to stainless steel was carried out by resistance microwelding. The metallurgical analysis of the weld interface revealed the welding mechanism. A thin reaction layer was formed between the two liquid materials. The melting of stainless steel should be limited to obtain sound joints.
NASA Astrophysics Data System (ADS)
Danilov, A. A.; Kramarenko, V. K.; Nikolaev, D. V.; Rudnev, S. G.; Salamatova, V. Yu; Smirnov, A. V.; Vassilevski, Yu V.
2013-04-01
In this work, an adaptive unstructured tetrahedral mesh generation technology is applied for simulation of segmental bioimpedance measurements using high-resolution whole-body model of the Visible Human Project man. Sensitivity field distributions for a conventional tetrapolar, as well as eight- and ten-electrode measurement configurations are obtained. Based on the ten-electrode configuration, we suggest an algorithm for monitoring changes in the upper lung area.
ERIC Educational Resources Information Center
Maushak, Nancy J., Ed.; Schlosser, Charles, Ed.; Lloyd, Thomas N., Ed.; Simonson, Michael, Ed.
1998-01-01
Subjects addressed by the 55 papers in this proceedings include: teaching literacy; hypermedia navigation and design; creating a community of thinkers; analysis-based message design; learner-instruction interactions; representation of time-based information in visual design; presentation interference; professional development through anecdotes;…
Średnicka-Tober, Dominika; Barański, Marcin; Seal, Chris; Sanderson, Roy; Benbrook, Charles; Steinshamn, Håvard; Gromadzka-Ostrowska, Joanna; Rembiałkowska, Ewa; Skwarło-Sońta, Krystyna; Eyre, Mick; Cozzi, Giulio; Krogh Larsen, Mette; Jordon, Teresa; Niggli, Urs; Sakowski, Tomasz; Calder, Philip C; Burdge, Graham C; Sotiraki, Smaragda; Stefanakis, Alexandros; Yolcu, Halil; Stergiadis, Sokratis; Chatzidimitriou, Eleni; Butler, Gillian; Stewart, Gavin; Leifert, Carlo
2016-03-28
Demand for organic meat is partially driven by consumer perceptions that organic foods are more nutritious than non-organic foods. However, there have been no systematic reviews comparing specifically the nutrient content of organic and conventionally produced meat. In this study, we report results of a meta-analysis based on sixty-seven published studies comparing the composition of organic and non-organic meat products. For many nutritionally relevant compounds (e.g. minerals, antioxidants and most individual fatty acids (FA)), the evidence base was too weak for meaningful meta-analyses. However, significant differences in FA profiles were detected when data from all livestock species were pooled. Concentrations of SFA and MUFA were similar or slightly lower, respectively, in organic compared with conventional meat. Larger differences were detected for total PUFA and n-3 PUFA, which were an estimated 23 (95 % CI 11, 35) % and 47 (95 % CI 10, 84) % higher in organic meat, respectively. However, for these and many other composition parameters, for which meta-analyses found significant differences, heterogeneity was high, and this could be explained by differences between animal species/meat types. Evidence from controlled experimental studies indicates that the high grazing/forage-based diets prescribed under organic farming standards may be the main reason for differences in FA profiles. Further studies are required to enable meta-analyses for a wider range of parameters (e.g. antioxidant, vitamin and mineral concentrations) and to improve both precision and consistency of results for FA profiles for all species. Potential impacts of composition differences on human health are discussed.
Khogli, Ahmed Eltigani; Cauwels, Rita; Vercruysse, Chris; Verbeeck, Ronald; Martens, Luc
2013-01-01
Optimal pit and fissure sealing is determined by surface preparation techniques and choice of materials. This study aimed (i) to compare the microleakage and penetration depth of a hydrophilic sealant and a conventional resin-based sealant using one of the following preparation techniques: acid etching (AE) only, a diamond bur + AE, and Er:YAG laser combined with AE, and (ii) to evaluate the microleakage and penetration depth of the hydrophilic pit and fissure sealant on different surface conditions. Eighty recently extracted third molars were randomly assigned to eight groups of ten teeth according to the material, preparation technique, and surface condition. For saliva contamination, 0.1 mL of fresh whole human saliva was used. All samples were submitted to 1000 thermal cycles and immersed in 2% methylene blue dye for 4 h. Sections were examined by light microscope and analysed using image analysis software (Sigmascan®). The combination of Er:YAG + AE + conventional sealant showed the least microleakage. The sealing ability of the hydrophilic sealant was influenced by the surface condition. Er:YAG ablation significantly decreased the microleakage at the tooth-sealant interface compared to the non-invasive technique. The hydrophilic sealant applied on different surface conditions showed results comparable to the conventional resin-based sealant. © 2012 The Authors. International Journal of Paediatric Dentistry © 2012 BSPD, IAPD and Blackwell Publishing Ltd.
Ogourtsova, Tatiana; Souza Silva, Wagner; Archambault, Philippe S; Lamontagne, Anouk
2017-04-01
Unilateral spatial neglect (USN) is a highly prevalent post-stroke deficit. Currently, there is no gold-standard USN assessment that encompasses the heterogeneity of this disorder and is sensitive enough to detect mild deficits. Similarly, there is a limited number of high-quality studies suggesting that conventional USN treatments are effective in improving functional outcomes and reducing disability. Virtual reality (VR) provides enhanced methods for USN assessment and treatment. To establish best-practice recommendations with respect to its use, it is necessary to appraise the existing evidence. This systematic review aimed to identify and appraise existing VR-based USN assessments and to determine whether VR is more effective than conventional therapy. Assessment tools were critically appraised using standard criteria. The methodological quality of the treatment trials was rated by two authors. The level of evidence according to stage of recovery was determined. Findings were compiled into a VR-based USN Assessment and Treatment Toolkit (VR-ATT). Twenty-three studies were identified. The proposed VR tools augmented the conventional assessment strategies; however, most studies lacked analysis of psychometric properties. There is limited evidence that VR is more effective than conventional therapy in improving USN symptoms in patients with stroke. It was concluded that the VR-ATT could facilitate identification and decision-making as to the appropriateness of VR-based USN assessments and treatments across the continuum of stroke care, but more evidence is required on treatment effectiveness.
Network-Based Community Brings forth Sustainable Society
NASA Astrophysics Data System (ADS)
Kikuchi, Toshiko
It has already been shown that an artificial society based on the three relations of social configuration (market, communal, and obligatory relations), functioning in balance with each other, forms a sustainable society in which social reproduction is possible. In this artificial society model, communal relations exist in a network-based community with alternating members rather than in a conventional community with cooperative mutual assistance, as practiced in some agricultural communities. In this paper, the significance of a network-based community is considered by comparing network-based communities with alternating members to conventional communities with fixed members. In concrete terms, the differences in the appearance rate of sustainable societies, economic activity, and asset inequality between network-based and conventional communities are analyzed. The appearance rate of sustainable societies is higher for network-based communities than for conventional communities. Moreover, most network-based communities had a larger total trade volume than conventional communities. However, the value of the Gini coefficient in conventional communities is smaller than in network-based communities. These results show that communal relations based on a network-based community are significant for social reproduction and economic efficiency; however, in such an artificial society, equality is sacrificed.
How the Hilbert integral theorem inspired flow lines
NASA Astrophysics Data System (ADS)
Winston, Roland; Jiang, Lun
2017-09-01
Nonimaging optics has been shown to achieve the theoretical limits constrained only by thermodynamic principles. The design principles of nonimaging optics allow a non-conventional way of thinking about and generating new optical devices. Compared to conventional imaging optics, which rarely utilizes the framework of thermodynamic arguments, nonimaging optics chooses to map etendue instead of rays. This fundamental shift of design paradigm frees optical design from ray-based approaches that rely heavily on error-tolerance analysis. Instead, the underlying thermodynamic principles guide the nonimaging design to be naturally constructed for extended light sources for illumination, non-tracking concentrators, and sensors that require sharp cut-off angles. We argue in this article that such optical devices, which have enabled a multitude of applications, depend on probabilities, the geometric flux field, and radiative heat transfer, while "optics" in the conventional sense recedes into the background.
NASA Astrophysics Data System (ADS)
Czán, Andrej; Kubala, Ondrej; Danis, Igor; Czánová, Tatiana; Holubják, Jozef; Mikloš, Matej
2017-12-01
The ever-increasing production and use of hard-to-machine progressive materials is the main driver of the continual search for new machining methods. One such method is the ceramic milling tool, which combines the advantages of conventional ceramic cutting materials with those of conventional coated steel-based inserts. These properties allow cutting conditions to be improved, and thus productivity to be increased, while preserving the quality known from conventional tools. In this paper, the properties and possibilities of this tool are identified when machining hard-to-machine materials such as the nickel alloys used in aircraft engines. The article focuses on the analysis and evaluation of ordinary technological parameters and surface quality, mainly surface roughness, the quality of the machined surface, and tool wear.
Rapid in vivo vertical tissue sectioning by multiphoton tomography
NASA Astrophysics Data System (ADS)
Batista, Ana; Breunig, Hans Georg; König, Karsten
2018-02-01
A conventional tool in the pathological field is histology, which involves the analysis of thin sections of tissue in which specific cellular structures are stained with different dyes. The process of obtaining these stained tissue sections is time-consuming and invasive, as it requires tissue removal, fixation, sectioning, and staining. Moreover, imaging of live tissue is not possible. We demonstrate that multiphoton tomography can provide, within seconds, non-invasive, label-free, vertical images of live tissue which are similar in quality to conventional light micrographs of histologically stained specimens. In contrast to conventional setups based on laser scanning, which image horizontal sections, the vertical in vivo images are directly recorded by combined line scanning and timed adjustments of the height of the focusing optics. In addition, multiphoton tomography provides autofluorescence lifetimes which can be used to determine the metabolic states of cells.
NASA Astrophysics Data System (ADS)
Rizvi, Imran; Bulin, Anne-Laure; Anbil, Sriram R.; Briars, Emma A.; Vecchio, Daniela; Celli, Jonathan P.; Broekgaarden, Mans; Hasan, Tayyaba
2017-02-01
Targeting the molecular and cellular cues that influence treatment resistance in tumors is critical to effectively treating unresponsive populations of stubborn disease. The informed design of mechanism-based combinations is emerging as increasingly important to targeting resistance and improving the efficacy of conventional treatments, while minimizing toxicity. Photodynamic therapy (PDT) has been shown to synergize with conventional agents and to overcome the evasion pathways that cause resistance. Increasing evidence shows that PDT-based combinations cooperate mechanistically with, and improve the therapeutic index of, traditional chemotherapies. These and other findings emphasize the importance of including PDT as part of comprehensive treatment plans for cancer, particularly in complex disease sites. Identifying effective combinations requires a multi-faceted approach that includes the development of bioengineered cancer models and corresponding image analysis tools. The molecular and phenotypic basis of verteporfin-mediated PDT-based enhancement of chemotherapeutic efficacy and predictability in complex 3D models for ovarian cancer will be presented.
Holland, Tanja; Blessing, Daniel; Hellwig, Stephan; Sack, Markus
2013-10-01
Radio frequency impedance spectroscopy (RFIS) is a robust method for the determination of cell biomass during fermentation. RFIS allows non-invasive in-line monitoring of the passive electrical properties of cells in suspension and can distinguish between living and dead cells based on their distinct behavior in an applied radio frequency field. We used continuous in situ RFIS to monitor batch-cultivated plant suspension cell cultures in stirred-tank bioreactors and compared the in-line data to conventional off-line measurements. RFIS-based analysis was more rapid and more accurate than conventional biomass determination, and was sensitive to changes in cell viability. The higher resolution of the in-line measurement revealed subtle changes in cell growth which were not accessible using conventional methods. Thus, RFIS is well suited for correlating such changes with intracellular states and product accumulation, providing unique opportunities for employing systems biotechnology and process analytical technology approaches to increase product yield and quality. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Control algorithms and applications of the wavefront sensorless adaptive optics
NASA Astrophysics Data System (ADS)
Ma, Liang; Wang, Bin; Zhou, Yuanshen; Yang, Huizhen
2017-10-01
Compared with a conventional adaptive optics (AO) system, the wavefront sensorless (WFSless) AO system does not need to measure and reconstruct the wavefront. It is simpler than conventional AO in system architecture and can be applied under complex conditions. Based on an analysis of the principle and system model of the WFSless AO system, wavefront correction methods for the WFSless AO system were divided into two categories: model-free and model-based control algorithms. A WFSless AO system based on model-free control algorithms commonly treats the performance metric as a function of the control parameters and then uses a suitable control algorithm to improve that metric. The model-based control algorithms include modal control algorithms, nonlinear control algorithms, and control algorithms based on geometrical optics. Following a brief description of the above typical control algorithms, hybrid methods combining model-free with model-based control algorithms are generalized. Additionally, the characteristics of the various control algorithms are compared and analyzed. We also discuss the extensive applications of WFSless AO systems in free-space optical communication (FSO), retinal imaging in the human eye, confocal microscopy, coherent beam combination (CBC) techniques, and extended objects.
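As an illustration of the model-free category, a stochastic parallel gradient descent (SPGD) style correction loop can be sketched as follows. The quadratic "metric" and all gains here are toy assumptions standing in for a real image-quality reading (e.g. focal-spot intensity) and a real deformable-mirror interface:

```python
import random

# Model-free (SPGD-style) loop: perturb the corrector commands, measure
# only a scalar performance metric, and climb the estimated gradient.
# The metric below peaks when the commands cancel the (hidden) aberration;
# it is a stand-in, not a physical wavefront model.

def metric(u, aberration):
    return -sum((ui - ai) ** 2 for ui, ai in zip(u, aberration))

def spgd(aberration, n=5, gain=0.3, sigma=0.05, iters=2000, seed=1):
    rng = random.Random(seed)
    u = [0.0] * n                         # corrector (e.g. DM) commands
    for _ in range(iters):
        du = [rng.choice((-sigma, sigma)) for _ in range(n)]
        j_plus = metric([a + b for a, b in zip(u, du)], aberration)
        j_minus = metric([a - b for a, b in zip(u, du)], aberration)
        dj = j_plus - j_minus             # two-sided metric difference
        u = [a + gain * dj * b for a, b in zip(u, du)]
    return u

ab = [0.4, -0.2, 0.1, 0.3, -0.5]          # hidden aberration coefficients
u = spgd(ab)                              # converges toward ab
```

The loop never sees the aberration directly, only the scalar metric, which is exactly the constraint the WFSless architecture operates under.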
NASA Astrophysics Data System (ADS)
Venkata Subbaiah, K.; Raju, Ch.; Suresh, Ch.
2017-08-01
The present study aims to compare conventional cutting inserts with wiper cutting inserts during the hard turning of AISI 4340 steel at different workpiece hardness levels. Type of insert, hardness, cutting speed, feed, and depth of cut were taken as process parameters. Taguchi's L18 orthogonal array was used to conduct the experimental tests. Parametric analysis was carried out to determine the influence of each process parameter on three important surface roughness characteristics (Ra, Rz, and Rt) and the material removal rate. Taguchi-based Grey Relational Analysis (GRA) was used to optimize the process parameters for individual-response and multi-response outputs. Additionally, analysis of variance (ANOVA) was applied to identify the most significant factor.
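The core GRA computation can be sketched briefly: each response is normalized, the deviation from the ideal is converted into a grey relational coefficient, and the coefficients are averaged into a single grade per experimental run. The response values below are illustrative placeholders, not the study's measured data:

```python
# Grey Relational Analysis sketch: normalize responses ("larger is better"
# or "smaller is better"), then grade each run by its closeness to the
# ideal. zeta is the usual distinguishing coefficient (commonly 0.5).

def grey_relational_grade(runs, larger_better, zeta=0.5):
    """runs: list of response tuples; larger_better: one flag per response.
    Assumes each response column is not constant (avoids divide-by-zero)."""
    cols = list(zip(*runs))
    norm = []
    for col, lb in zip(cols, larger_better):
        lo, hi = min(col), max(col)
        norm.append([(x - lo) / (hi - lo) if lb else (hi - x) / (hi - lo)
                     for x in col])
    grades = []
    for row in zip(*norm):
        # deviation from the ideal (normalized value 1) per response
        coeffs = [zeta / (abs(1 - v) + zeta) for v in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Hypothetical runs; columns: Ra (smaller better), MRR (larger better).
runs = [(0.8, 120.0), (0.5, 150.0), (1.1, 180.0)]
grades = grey_relational_grade(runs, larger_better=(False, True))
best = grades.index(max(grades))          # run with the best trade-off
```

The run with the highest grade represents the best multi-response compromise, which is how GRA collapses Ra, Rz, Rt, and MRR into a single ranking.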
Matsudate, Yoshihiro; Naruto, Takuya; Hayashi, Yumiko; Minami, Mitsuyoshi; Tohyama, Mikiko; Yokota, Kenji; Yamada, Daisuke; Imoto, Issei; Kubo, Yoshiaki
2017-06-01
Nevoid basal cell carcinoma syndrome (NBCCS) is an autosomal dominant disorder mainly caused by heterozygous mutations of PTCH1. In addition to characteristic clinical features, detection of a mutation in causative genes is reliable for the diagnosis of NBCCS; however, no mutations have been identified in some patients using conventional methods. To improve the method for the molecular diagnosis of NBCCS, we performed targeted exome sequencing (TES) analysis using a multi-gene panel, including PTCH1, PTCH2, SUFU, and other sonic hedgehog signaling pathway-related genes, based on next-generation sequencing (NGS) technology in 8 cases in whom possible causative mutations were not detected by previously performed conventional analysis and in 2 recent cases of NBCCS. Subsequent analysis of gross deletions within or around PTCH1 detected by TES was performed using chromosomal microarray (CMA). Through TES analysis, specific single-nucleotide variants or small indels of PTCH1 causing inferred amino acid changes were identified in 2 novel cases and 2 undiagnosed cases, whereas gross deletions within or around PTCH1, validated by CMA, were found in 3 undiagnosed cases. However, no mutations were detected, even by TES, in 3 cases. Among the 3 cases with gross deletions of PTCH1, deletions containing the entire PTCH1 gene and additional neighboring genes were detected in 2 cases, one of which exhibited atypical clinical features, such as severe mental retardation, likely associated with genes located within the 4.3-Mb deleted region. TES-based simultaneous evaluation of sequences and copy number status in all targeted coding exons by NGS is likely to be more useful for the molecular diagnosis of NBCCS than conventional methods. CMA is recommended as a subsequent analysis for validation and detailed mapping of deleted regions, which may explain the atypical clinical features of NBCCS cases. Copyright © 2017 Japanese Society for Investigative Dermatology. 
Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Yupeng; Chang, Kyunghi
In this paper, we analyze the coexistence issues of M-WiMAX TDD and WCDMA FDD systems. Smart antenna techniques are applied to mitigate the performance loss induced by adjacent channel interference (ACI) in scenarios where performance is heavily degraded. In addition, an ACI model is proposed to capture the effect of transmit beamforming at the M-WiMAX base station. Furthermore, an MCS-based throughput analysis is proposed to jointly consider the effects of ACI, the system packet error rate requirement, and the available modulation and coding schemes, which is not possible using the conventional Shannon-equation-based analysis. From the results, we find that the proposed MCS-based analysis method is well suited to analyzing the theoretical system throughput in a practical manner.
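The distinction between the two throughput analyses can be illustrated numerically: the Shannon equation gives a smooth upper bound in the SINR, whereas an MCS-based estimate is quantized to the highest modulation and coding scheme whose SINR requirement is met and discounted by the packet error rate target. The MCS table below is a hypothetical example, not the M-WiMAX specification:

```python
import math

# Shannon-equation throughput vs. a quantized MCS-based estimate.
# The (min SINR dB, spectral efficiency bit/s/Hz) entries are invented
# for illustration only.
MCS = [
    (3.0, 0.5), (6.0, 1.0), (9.0, 1.5), (12.0, 2.0), (15.0, 3.0), (18.0, 4.5),
]

def shannon_bps_hz(sinr_db):
    # idealized capacity bound: log2(1 + SINR)
    return math.log2(1.0 + 10 ** (sinr_db / 10.0))

def mcs_bps_hz(sinr_db, target_per=0.01):
    eff = 0.0
    for thr, e in MCS:
        if sinr_db >= thr:
            eff = e            # highest scheme whose threshold is met
    return eff * (1.0 - target_per)   # discount by the PER requirement

sinr = 14.0
cap = shannon_bps_hz(sinr)     # smooth Shannon bound
real = mcs_bps_hz(sinr)        # quantized, PER-discounted estimate
```

The MCS estimate stays below the Shannon bound and changes only at scheme boundaries, which is the "practical manner" the abstract contrasts with the conventional analysis.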
Analysis of In-Route Wireless Charging for the Shuttle System at Zion National Park
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meintz, Andrew; Prohaska, Robert; Konan, Arnaud
System right-sizing is critical to the implementation of wireless power transfer (WPT) for electric vehicles (EVs). This study analyzes potential WPT scenarios for the electrification of shuttle buses at Zion National Park utilizing a modeling tool developed by NREL called WPTSim. This tool uses second-by-second speed, location, and road grade data from the conventional shuttles in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. The outcome of this work is an analysis of the design tradeoffs for the electrification of the shuttle fleet with wireless charging versus conventional overnight charging.
Nauleau, Pierre; Apostolakis, Iason; McGarry, Matthew; Konofagou, Elisa
2018-05-29
The stiffness of the arteries is known to be an indicator of the progression of various cardiovascular diseases. Clinically, the pulse wave velocity (PWV) is used as a surrogate for arterial stiffness. Pulse wave imaging (PWI) is a non-invasive, ultrasound-based imaging technique capable of mapping the motion of the vessel walls, allowing the local assessment of arterial properties. Conventionally, a distinctive feature of the displacement wave (e.g. the 50% upstroke) is tracked across the map to estimate the PWV. However, the presence of reflections, such as those generated at the carotid bifurcation, can bias the PWV estimation. In this paper, we propose a two-step cross-correlation based method to characterize arteries using the information available in the PWI spatio-temporal map. First, the area under the cross-correlation curve is proposed as an index for locating the regions of different properties. Second, a local peak of the cross-correlation function is tracked to obtain a less biased estimate of the PWV. Three series of experiments were conducted in phantoms to evaluate the capabilities of the proposed method compared with the conventional method. In the ideal case of a homogeneous phantom, the two methods performed similarly and correctly estimated the PWV. In the presence of reflections, the proposed method provided a more accurate estimate than conventional processing: e.g. for the soft phantom, biases of −0.27 and −0.71 m·s⁻¹ were observed. In a third series of experiments, the correlation-based method was able to locate two regions of different properties with an error smaller than 1 mm. It also provided more accurate PWV estimates than conventional processing (biases: −0.12 versus −0.26 m·s⁻¹). Finally, the in vivo feasibility of the proposed method was demonstrated in eleven healthy subjects. The results indicate that the correlation-based method might be less precise in vivo but more accurate than the conventional method.
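The underlying cross-correlation idea can be sketched with synthetic data: wall-displacement waveforms recorded at two positions along the vessel are cross-correlated, the lag of the correlation peak gives the transit time, and PWV = spacing / delay. The Gaussian pulse and all parameters below are illustrative assumptions, not the paper's processing chain:

```python
import math

# Correlation-based transit-time estimate between two wall waveforms.

def xcorr_lag(a, b):
    """Sample lag at which b best matches a (positive: b is delayed)."""
    n = len(a)
    best_lag, best_val = 0, -math.inf
    for lag in range(-n + 1, n):
        v = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
        if v > best_val:
            best_val, best_lag = v, lag
    return best_lag

fs = 10000.0                  # frame rate (high-frame-rate imaging), Hz
dx = 0.02                     # 20 mm between the two tracking positions
true_pwv = 5.0                # m/s, plausible for a soft vessel
delay = dx / true_pwv         # 4 ms transit time

t = [i / fs for i in range(300)]
pulse = lambda t0: [math.exp(-((ti - t0) ** 2) / (2 * 0.002 ** 2)) for ti in t]
w1, w2 = pulse(0.010), pulse(0.010 + delay)   # synthetic displacement waves

lag = xcorr_lag(w1, w2)       # transit time in samples
pwv = dx / (lag / fs)         # recovers the 5 m/s velocity
```

Tracking a correlation peak rather than a single waveform feature is what makes the estimate less sensitive to reflections distorting the upstroke.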
ERIC Educational Resources Information Center
Hu, Li-tze; Bentler, Peter M.
1999-01-01
The adequacy of "rule of thumb" conventional cutoff criteria and several alternatives for fit indices in covariance structure analysis was evaluated through simulation. Analyses suggest that, for all recommended fit indexes except one, a cutoff criterion greater than (or sometimes smaller than) the conventional rule of thumb is required…
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis (PCA) for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA-based technique provides a performance improvement over conventional feature extraction methods.
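A toy version of the eigen-image idea: each observed motion is flattened to a vector, the leading principal component of the training set is extracted (power iteration here, to keep the sketch stdlib-only), and a test motion is classified by its distance to the class means in the projected space. The 4-pixel "spectrograms" are synthetic stand-ins for radar time-frequency images, not the paper's data:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def leading_pc(data, iters=200):
    """Leading principal component of mean-centred row vectors,
    found by power iteration on X^T X (never formed explicitly)."""
    n, d = len(data), len(data[0])
    mean = [sum(col) / n for col in zip(*data)]
    x = [[v - m for v, m in zip(row, mean)] for row in data]
    w = [float(j + 1) for j in range(d)]      # arbitrary start vector
    for _ in range(iters):
        proj = [dot(row, w) for row in x]
        w = [sum(p * row[j] for p, row in zip(proj, x)) for j in range(d)]
        nrm = math.sqrt(dot(w, w))
        w = [v / nrm for v in w]
    return mean, w

# Synthetic "images": falls concentrate Doppler energy early, walks late.
falls = [[9.0, 8.0, 1.0, 0.5], [8.5, 9.2, 1.2, 0.4], [9.3, 7.8, 0.8, 0.6]]
walks = [[1.0, 1.2, 8.8, 9.1], [0.8, 1.1, 9.0, 8.7], [1.2, 0.9, 8.5, 9.3]]
mean, w = leading_pc(falls + walks)

def project(img):
    return dot([v - m for v, m in zip(img, mean)], w)

fall_c = sum(project(f) for f in falls) / len(falls)
walk_c = sum(project(s) for s in walks) / len(walks)

def classify(img):
    p = project(img)
    return "fall" if abs(p - fall_c) < abs(p - walk_c) else "walk"
```

Classifying in the low-dimensional eigen-space is what replaces the hand-tuned features of the conventional approach.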
Fluorescence method for enzyme analysis which couples aromatic amines with aromatic aldehydes
Smith, R.E.; Dolbeare, F.A.
1980-10-21
Analysis of proteinases is accomplished using conventional amino acid containing aromatic amine substrates. Aromatic amines such as 4-methoxy-2-naphthylamine (4M2NA), 2-naphthylamine, aminoisophthalic acid dimethyl ester, p-nitroaniline, 4-methoxy-1-aminofluorene and coumarin derivatives resulting from enzymatic hydrolysis of the substrate couples with aromatic aldehydes such as 5-nitrosalicylaldehyde (5-NSA), benzaldehyde and p-nitrobenzaldehyde to produce Schiff-base complexes which are water insoluble. Certain Schiff-base complexes produce a shift from blue to orange-red (visible) fluorescence. Such complexes are useful in the assay of enzymes.
Analysis of high voltage step-up nonisolated DC-DC boost converters
NASA Astrophysics Data System (ADS)
Alisson Alencar Freitas, Antônio; Lessa Tofoli, Fernando; Junior, Edilson Mineiro Sá; Daher, Sergio; Antunes, Fernando Luiz Marcelo
2016-05-01
A high voltage step-up nonisolated DC-DC converter based on coupled inductors, suitable for photovoltaic (PV) system applications, is proposed in this paper. Considering that numerous approaches exist for extending the voltage conversion ratio of DC-DC converters that do not use transformers, a detailed comparison is also presented among the proposed converter and other popular topologies such as the conventional boost converter and the quadratic boost converter. The qualitative analysis of the coupled-inductor-based topology is developed so that a design procedure can be obtained, from which an experimental prototype is implemented to validate the theoretical assumptions.
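The voltage-gain comparison among these topologies can be made concrete with the ideal continuous-conduction-mode expressions: the conventional boost gains 1/(1−D), the quadratic boost 1/(1−D)², and a typical coupled-inductor boost roughly (1 + nD)/(1−D) for turns ratio n. The last formula varies with the specific topology, so treat it as a representative textbook form, not this paper's converter:

```python
# Ideal CCM voltage gains as a function of duty cycle D.
# The coupled-inductor expression is one common form, used here only to
# illustrate how the turns ratio n extends the conversion ratio.

def boost_gain(d):
    return 1.0 / (1.0 - d)

def quadratic_boost_gain(d):
    return 1.0 / (1.0 - d) ** 2

def coupled_inductor_gain(d, n):
    return (1.0 + n * d) / (1.0 - d)

d = 0.6
g1 = boost_gain(d)                    # conventional boost
g2 = quadratic_boost_gain(d)          # quadratic boost
g3 = coupled_inductor_gain(d, 3)      # coupled inductors, n = 3
```

At a moderate duty cycle the coupled-inductor topology already exceeds both transformerless alternatives, which is the motivation for using it in high step-up PV front ends.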
Fluorescence method for enzyme analysis which couples aromatic amines with aromatic aldehydes
Smith, Robert E. [557 Escondido Cir., Livermore, CA 94550; Dolbeare, Frank A. [5178 Diane La., Livermore, CA 94550
1980-10-21
Analysis of proteinases is accomplished using conventional amino acid containing aromatic amine substrates. Aromatic amines such as 4-methoxy-2-naphthylamine (4M2NA), 2-naphthylamine, aminoisophthalic acid dimethyl ester, p-nitroaniline, 4-methoxy-1-aminofluorene and coumarin derivatives resulting from enzymatic hydrolysis of the substrate couples with aromatic aldehydes such as 5-nitrosalicylaldehyde (5-NSA), benzaldehyde and p-nitrobenzaldehyde to produce Schiff-base complexes which are water insoluble. Certain Schiff-base complexes produce a shift from blue to orange-red (visible) fluorescence. Such complexes are useful in the assay of enzymes.
Fluorescence method for enzyme analysis which couples aromatic amines with aromatic aldehydes
Smith, Robert E.; Dolbeare, Frank A.
1979-01-01
Analysis of proteinases is accomplished using conventional amino-acid-containing aromatic amine substrates. Aromatic amines such as 4-methoxy-2-naphthylamine (4M2NA), 2-naphthylamine, aminoisophthalic acid dimethyl ester, p-nitroaniline, 5-methoxy-1-aminofluorene and coumarin derivatives resulting from enzymatic hydrolysis of the substrate couple with aromatic aldehydes such as 5-nitrosalicylaldehyde (5-NSA), benzaldehyde and p-nitrobenzaldehyde to produce Schiff-base complexes which are water insoluble. Certain Schiff-base complexes produce a shift from blue to orange-red (visible) fluorescence. Such complexes are useful in the assay of enzymes.
Electrofacies analysis for coal lithotype profiling based on high-resolution wireline log data
NASA Astrophysics Data System (ADS)
Roslin, A.; Esterle, J. S.
2016-06-01
The traditional approach to coal lithotype analysis is based on a visual characterisation of coal in core, mine or outcrop exposures. As not all wells are fully cored, the petroleum and coal mining industries increasingly use geophysical wireline logs for lithology interpretation. This study demonstrates a method for interpreting coal lithotypes from geophysical wireline logs, and in particular for discriminating, at a decimetre scale, between bright or banded coal and dull coal of similar density. The study explores the optimum combination of geophysical log suites for training the coal electrofacies interpretation using a neural-network approach, and then propagates the results to wells with fewer wireline data. This approach is objective and has a recordable reproducibility and rule set. In addition to conventional gamma ray and density logs, laterolog resistivity, microresistivity and PEF data were used in the study. Array resistivity data from a compact micro imager (CMI tool) were processed into a single microresistivity curve and integrated with the conventional resistivity data in the cluster analysis. Microresistivity data were included to test the hypothesis that the improved vertical resolution of the microresistivity curve can enhance the accuracy of the clustering analysis. The addition of the PEF log allowed discrimination between low-density bright to banded coal electrofacies and low-density inertinite-rich dull electrofacies. The results of the clustering analysis were validated statistically, and the electrofacies results were compared to manually derived coal lithotype logs.
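The idea of clustering standardized log responses into electrofacies can be illustrated with a plain two-cluster k-means toy example; this is a stand-in for the neural-network clustering actually used in the study, and the log values, noise levels, and two-facies setup below are invented for illustration:

```python
import numpy as np

def kmeans2(X, n_iter=50):
    """Two-cluster k-means with a deterministic init (the extreme samples
    along the first curve). A simple stand-in for the study's
    neural-network clustering of wireline log responses."""
    centers = X[[np.argmin(X[:, 0]), np.argmax(X[:, 0])]]
    for _ in range(n_iter):
        # Assign each sample to the nearest center, then recompute centers.
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(2)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels

# Invented log responses: columns = gamma ray, bulk density, log10 resistivity, PEF.
rng = np.random.default_rng(1)
bright = rng.normal([30.0, 1.30, 3.0, 1.8], [3.0, 0.02, 0.15, 0.1], size=(50, 4))
dull   = rng.normal([60.0, 1.45, 2.0, 2.6], [3.0, 0.02, 0.15, 0.1], size=(50, 4))
X = np.vstack([bright, dull])
Xz = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each curve
labels = kmeans2(Xz)
```

With well-separated synthetic facies the clustering recovers the two groups exactly; on real log data the cluster count and the choice of input curves (e.g. adding PEF) are the tuning decisions the study explores.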
Self-calibration method without joint iteration for distributed small satellite SAR systems
NASA Astrophysics Data System (ADS)
Xu, Qing; Liao, Guisheng; Liu, Aifei; Zhang, Juan
2013-12-01
The performance of distributed small satellite synthetic aperture radar systems degrades significantly due to unavoidable array errors, including gain, phase, and position errors, in real operating scenarios. In the conventional method proposed in (IEEE T Aero. Elec. Sys. 42:436-451, 2006), the spectrum components within one Doppler bin are considered as calibration sources. However, it is found in this article that the gain error estimation and the position error estimation in the conventional method can interact with each other. The conventional method may converge to suboptimal solutions under large position errors, since it requires joint iteration between gain-phase error estimation and position error estimation. In addition, it is also found that phase errors can be estimated well regardless of position errors when the zero Doppler bin is chosen. In this article, we propose a method obtained by modifying the conventional one, based on these two observations. In this modified method, gain errors are first estimated and compensated, which eliminates the interaction between gain error estimation and position error estimation. Then, by using the zero Doppler bin data, the phase error estimation can be performed independently of position errors. Finally, position errors are estimated based on a Taylor-series expansion. Meanwhile, the joint iteration between gain-phase error estimation and position error estimation is not required. Therefore, the problem of suboptimal convergence, which occurs in the conventional method, can be avoided at low computational cost. The modified method has the merits of faster convergence and lower estimation error compared to the conventional one. Theoretical analysis and computer simulation results verify the effectiveness of the modified method.
A simple, less invasive stripper micropipetter-based technique for day 3 embryo biopsy.
Cedillo, Luciano; Ocampo-Bárcenas, Azucena; Maldonado, Israel; Valdez-Morales, Francisco J; Camargo, Felipe; López-Bayghen, Esther
2016-01-01
Preimplantation genetic screening (PGS) is an important procedure for in vitro fertilization (IVF). A key step of PGS, blastomere removal, is fraught with technical issues. The aim of this study was to compare a simpler procedure based on the Stripper Micropipetter, named S-biopsy, to the conventional aspiration method. On Day 3, 368 high-quality embryos (>7 cells on Day 3 with <10% fragmentation) were collected from 38 women. For each patient, their embryos were equally separated between the conventional method (n = 188) and the S-biopsy method (n = 180). The conventional method was performed using a standardized protocol. For the S-biopsy method, a laser was used to remove a significantly smaller portion of the zona pellucida. Afterwards, the complete embryo was aspirated with a Stripper Micropipetter, forcing the removal of the blastomere. Selected blastomeres underwent PGS using CGH microarrays. Embryo integrity and blastocyst formation were assessed on Day 5. Differences between groups were assessed by either the Mann-Whitney test or the Fisher exact test. Both methods resulted in the removal of only one blastomere. The S-biopsy and the conventional method did not differ in terms of affecting embryo integrity (95.0% vs. 95.7%) or blastocyst formation (72.7% vs. 70.7%). PGS analysis indicated that aneuploidy rates were similar between the two methods (63.1% vs. 65.2%). However, the time required to perform the S-biopsy method (179.2 ± 17.5 s) was significantly shorter (5-fold) than the conventional method. The S-biopsy method is comparable to the conventional method used to remove a blastomere for PGS, but requires less time. Furthermore, due to its simplicity, the S-biopsy technique is better suited for IVF laboratories.
The Effect of Right Colon Retroflexion on Adenoma Detection: A Systematic Review and Meta-analysis.
Cohen, Jonah; Grunwald, Douglas; Grossberg, Laurie B; Sawhney, Mandeep S
2017-10-01
Although colonoscopy with polypectomy can prevent up to 80% of colorectal cancers, a significant adenoma miss rate still exists, particularly in the right colon. Previous studies addressing right colon retroflexion have revealed discordant evidence regarding the benefit of this maneuver on adenoma detection with concomitant concerns about safety and rates of maneuver success. In this meta-analysis, we sought to determine the effect of right colon retroflexion on improving adenoma detection compared with conventional colonoscopy without retroflexion, as well as determine the rates of retroflexion maneuver success and adverse events. Multiple databases including MEDLINE, Embase, and Web of Science were searched for studies on right colon retroflexion and its impact on adenoma detection compared with conventional colonoscopy. Pooled analyses of adenoma detection and retroflexion success were based on mixed-effects and random-effects models with heterogeneity analyses. Eight studies met the inclusion criteria (N=3660). The primary analysis comparing colonoscopy with right-sided retroflexion versus conventional colonoscopy to determine the per-adenoma miss rate in the right colon was 16.9% (95% confidence interval, 12.5%-22.5%). The overall rate of successful retroflexion was 91.9% (95% confidence interval, 86%-95%) and rate of adverse events was 0.03%. Colonoscopy with right-sided retroflexion significantly increases the detection of adenomas in the right colon compared with conventional colonoscopy with a high rate of maneuver success and small risk of adverse events. Thus, reexamination of the right colon in retroflexed view should be strongly considered in future standard of care colonoscopy guidelines for quality improvement in colon cancer prevention.
Sparse dictionary learning for resting-state fMRI analysis
NASA Astrophysics Data System (ADS)
Lee, Kangjoo; Han, Paul Kyu; Ye, Jong Chul
2011-09-01
Recently, there has been increased interest in the use of neuroimaging techniques to investigate what happens in the brain at rest. Functional imaging studies have revealed that default-mode network activity is disrupted in Alzheimer's disease (AD). However, there is no consensus, as yet, on the choice of analysis method for the application of resting-state analysis to disease classification. This paper proposes a novel compressed-sensing-based resting-state fMRI analysis tool called Sparse-SPM. As the brain's functional systems have been shown to have features of complex networks according to graph-theoretical analysis, we apply a graph model to represent a sparse combination of information flows from a complex-network perspective. In particular, a new concept of a spatially adaptive design matrix is proposed, implemented through sparse dictionary learning. The proposed approach shows better performance in classifying AD patients from normal subjects using resting-state analysis compared to other conventional methods, such as independent component analysis (ICA) and seed-based approaches.
Directed Student Inquiry: Modeling in Roborovsky Hamsters
ERIC Educational Resources Information Center
Elwess, Nancy L.; Bouchard, Adam
2007-01-01
In this inquiry-based activity, Roborovsky hamsters are used to provide students with an opportunity to develop their skills of analysis, inquiry, and design. These hamsters are easy to maintain, yet offer students a means to use conventional techniques and those of their own design to make further observations through measuring, assessing, and…
User Centered Reading Intervention for Individuals with Autism and Intellectual Disability.
Yakkundi, Anita; Dillenburger, Karola; Goodman, Lizbeth; Dounavi, Katerina
2017-01-01
Individuals with autism and intellectual disability (ID) have complex learning needs and often have difficulty in acquiring reading comprehension skills using conventional teaching tools. Evidence based reading interventions for these learners and the use of assistive technology and application of behaviour analysis to develop user-centered teaching is discussed in this paper.
DOT National Transportation Integrated Search
2010-10-01
Ultra-high performance concrete (UHPC) is an advanced cementitious composite material which has been developed in recent decades. When compared to more conventional cement-based concrete materials, UHPC tends to exhibit superior properties such as in...
ERIC Educational Resources Information Center
Talib, Othman; Matthews, Robert; Secombe, Margaret
2005-01-01
This paper discusses the potential of applying computer-animated instruction (CAnI) as an effective conceptual change strategy in teaching electrochemistry in comparison to conventional lecture-based instruction (CLI). The core assumption in this study is that conceptual change in learners is an active, constructive process that is enhanced by the…
Current trends in endotoxin detection and analysis of endotoxin-protein interactions.
Dullah, Elvina Clarie; Ongkudon, Clarence M
2017-03-01
Endotoxin is a type of pyrogen found in Gram-negative bacteria. Endotoxin can form stable interactions with other biomolecules, making its removal difficult, especially during the production of biopharmaceutical drugs. Preventing endotoxin from contaminating biopharmaceutical products is paramount, as endotoxin contamination, even in small quantities, can result in fever, inflammation, sepsis, tissue damage and even death. Highly sensitive and accurate detection of endotoxins is key to the development of biopharmaceutical products derived from Gram-negative bacteria. It facilitates the study of the intermolecular interactions of an endotoxin with other biomolecules, and hence the selection of appropriate endotoxin removal strategies. Currently, most researchers rely on the conventional LAL-based endotoxin detection method. However, new methods have been and are being developed to overcome the problems associated with the LAL-based method. This review paper highlights current research trends in endotoxin detection, from conventional methods to newly developed biosensors. Additionally, it provides an overview of the use of electron microscopy, dynamic light scattering (DLS), fluorescence resonance energy transfer (FRET) and docking programs in endotoxin-protein analysis.
Cuevas, F J; Moreno-Rojas, J M; Arroyo, F; Daza, A; Ruiz-Moreno, M J
2016-05-15
The volatile profiles of six plum cultivars ('Laetitia', 'Primetime', 'Sapphire', 'Showtime', 'Songold' and 'Souvenir') produced under two management systems (conventional and organic) and harvested in two consecutive years were obtained by HS-SPME-GC-MS. Twenty-five metabolites were determined, five of which (pentanal, (E)-2-heptenal, 1-octanol, eucalyptol and 2-pentylfuran) are reported for the first time in Prunus salicina Lindl. Hexanal stood out as a major volatile compound affected by the management system. In addition, partial least square discriminant analysis (PLS-DA) achieved an effective classification of genotypes based on their volatile profiles. A high classification accuracy model was obtained with a sensitivity of 97.9% and a specificity of 99.6%. Furthermore, the application of a dual criterion, based on a method of variable selection, VIP (variable importance in projection) and the results of a univariate analysis (ANOVA), allowed the identification of potential volatile markers in 'Primetime', 'Showtime' and 'Souvenir' genotypes (cultivars not characterised to date). Copyright © 2015 Elsevier Ltd. All rights reserved.
Hot-compress: A new postdeposition treatment for ZnO-based flexible dye-sensitized solar cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haque Choudhury, Mohammad Shamimul, E-mail: shamimul129@gmail.com; Department of Electrical and Electronic Engineering, International Islamic University Chittagong, b154/a, College Road, Chittagong 4203; Kishi, Naoki
2016-08-15
Highlights: • A new postdeposition treatment named hot-compress is introduced. • Hot-compression gives a homogeneous, compact-layer ZnO photoanode. • I-V and EIS analysis data confirm the efficacy of this method. • Charge transport resistance was reduced by the application of hot-compression. - Abstract: This article introduces a new postdeposition treatment named hot-compress for flexible zinc oxide-based dye-sensitized solar cells. This postdeposition treatment consists of the application of compression pressure at an elevated temperature. The optimum compression pressure of 130 MPa at an optimum compression temperature of 70 °C gives better photovoltaic performance compared to conventional cells. The aptness of this method was confirmed by investigating scanning electron microscopy images, X-ray diffraction, current-voltage and electrochemical impedance spectroscopy analyses of the prepared cells. Proper heating during compression lowers the charge transport resistance and lengthens the electron lifetime of the device. As a result, the overall power conversion efficiency of the device was improved by about 45% compared to the conventional room-temperature-compressed cell.
Barlough, J E; Jacobson, R H; Downing, D R; Lynch, T J; Scott, F W
1987-01-01
The computer-assisted, kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats was calibrated to the conventional indirect immunofluorescence assay by linear regression analysis and computerized interpolation (generation of "immunofluorescence assay-equivalent" titers). Procedures were developed for normalization and standardization of kinetics-based enzyme-linked immunosorbent assay results through incorporation of five different control sera of predetermined ("expected") titer in daily runs. When used with such sera and with computer assistance, the kinetics-based enzyme-linked immunosorbent assay minimized both within-run and between-run variability while allowing also for efficient data reduction and statistical analysis and reporting of results. PMID:3032390
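The calibration described (linear regression of kinetics-based ELISA readings against control sera of known titer, then interpolation of "immunofluorescence assay-equivalent" titers) can be sketched roughly as follows; the kinetic rates, titers, and log2 transform are invented for illustration, since the abstract does not give the regression details:

```python
import numpy as np

# Invented calibration data: ELISA kinetic rates for five control sera of
# known ("expected") IFA titer, mimicking the daily control runs described.
rates  = np.array([20.0, 60.0, 100.0, 140.0, 180.0])     # kinetic rate units
titers = np.array([25.0, 100.0, 400.0, 1600.0, 6400.0])  # IFA endpoint titers

# Least-squares linear regression of log2(titer) on kinetic rate.
slope, intercept = np.polyfit(rates, np.log2(titers), 1)

def ifa_equivalent_titer(rate):
    """Interpolate an 'IFA-equivalent' titer for a measured kinetic rate."""
    return float(2.0 ** (slope * rate + intercept))
```

Re-fitting the regression from each day's control sera is what normalizes run-to-run variability in such a scheme.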
Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.
Purpose: The purpose of this technical note is to introduce variance component analysis for the estimation of systematic and random components in the setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications of setup error analysis in radiotherapy.
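For the balanced one-factor random-effects model the note assumes, the ANOVA estimates of the systematic (interpatient) and random (interfraction) components can be sketched as follows; the simulated data and the chosen parameter values are illustrative, not the note's:

```python
import numpy as np

# Synthetic balanced setup-error data: rows = patients, columns = fractions (mm).
rng = np.random.default_rng(0)
n_pat, n_frac = 20, 5
true_mean, true_Sigma, true_sigma = 1.0, 2.0, 3.0
pat_means = rng.normal(true_mean, true_Sigma, n_pat)        # patient-specific offsets
data = rng.normal(pat_means[:, None], true_sigma, (n_pat, n_frac))

# One-factor random-effects ANOVA estimates.
grand_mean = data.mean()
msb = n_frac * ((data.mean(1) - grand_mean) ** 2).sum() / (n_pat - 1)       # between patients
msw = ((data - data.mean(1, keepdims=True)) ** 2).sum() / (n_pat * (n_frac - 1))  # within
sigma_random = np.sqrt(msw)                                  # random (interfraction) SD
Sigma_systematic = np.sqrt(max((msb - msw) / n_frac, 0.0))   # systematic (interpatient) SD
```

Subtracting MSW before dividing by the fraction count is what removes the random-error contribution from the between-patient spread; skipping that correction is one way the conventional approach can overestimate the systematic component with few fractions.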
Sadeghi, Rokhsareh; Miremadi, Asghar
2015-01-01
Objectives: Implant primary stability is one of the important factors in achieving implant success. The osteotome technique may improve primary stability in patients with poor bone quality. The aim of this study was to compare implant stability using two different techniques, namely osteotome versus conventional drilling, in the posterior maxilla. Materials and Methods: In this randomized controlled clinical trial, 54 dental implants were placed in 32 patients; 29 implants were placed in the osteotome group and 25 in the conventional drilling group. Implant stability was assessed at four time points, namely at baseline and at one, two and three months after implant placement, using resonance frequency analysis (RFA). Results: Primary stability based on implant stability quotient (ISQ) units was 71.4±7 for the osteotome group and 67.4±10 for the control group. There was no statistically significant difference between the two groups in implant stability at any of the measurement times. In each group, changes in implant stability from baseline to one month and from two to three months post-operatively were not significant, but from one month to two months after implant placement, implant stability showed a significant increase in both groups. Conclusion: The results of this study revealed that good implant stability was achieved with both techniques, and the osteotome technique did not have any advantage over conventional drilling in this regard. PMID:27148375
Pergola, M; D'Amico, M; Celano, G; Palese, A M; Scuderi, A; Di Vita, G; Pappalardo, G; Inglese, P
2013-10-15
The island of Sicily has a long standing tradition in citrus growing. We evaluated the sustainability of orange and lemon orchards, under organic and conventional farming, using an energy, environmental and economic analysis of the whole production cycle by using a life cycle assessment approach. These orchard systems differ only in terms of a few of the inputs used and the duration of the various agricultural operations. The quantity of energy consumption in the production cycle was calculated by multiplying the quantity of inputs used by the energy conversion factors drawn from the literature. The production costs were calculated considering all internal costs, including equipment, materials, wages, and costs of working capital. The performance of the two systems (organic and conventional), was compared over a period of fifty years. The results, based on unit surface area (ha) production, prove the stronger sustainability of the organic over the conventional system, both in terms of energy consumption and environmental impact, especially for lemons. The sustainability of organic systems is mainly due to the use of environmentally friendly crop inputs (fertilizers, not use of synthetic products, etc.). In terms of production costs, the conventional management systems were more expensive, and both systems were heavily influenced by wages. In terms of kg of final product, the organic production system showed better environmental and energy performances. Copyright © 2013 Elsevier Ltd. All rights reserved.
Ethical Sensitivity in Nursing Ethical Leadership: A Content Analysis of Iranian Nurses Experiences
Esmaelzadeh, Fatemeh; Abbaszadeh, Abbas; Borhani, Fariba; Peyrovi, Hamid
2017-01-01
Background: Considering that many nursing actions affect other people's health and life, sensitivity to ethics in nursing practice is highly important for ethical leaders, who serve as role models. Objective: The study aims to explore ethical sensitivity in ethical nursing leaders in Iran. Method: This was a qualitative study based on conventional content analysis, conducted in 2015. Data were collected using deep, semi-structured interviews with 20 Iranian nurses. The participants were chosen using purposive sampling. Data were analyzed using conventional content analysis. In order to increase the accuracy and integrity of the data, Lincoln and Guba's criteria were considered. Results: Fourteen sub-categories and five main categories emerged. The main categories consisted of sensitivity to care, sensitivity to errors, sensitivity to communication, sensitivity in decision making and sensitivity to ethical practice. Conclusion: Ethical sensitivity appears to be a valuable attribute for ethical nurse leaders, having an important effect on various aspects of professional practice and helping the development of ethics in nursing practice. PMID:28584564
NASA Astrophysics Data System (ADS)
Ikeda, Hayato; Nagaoka, Ryo; Lafond, Maxime; Yoshizawa, Shin; Iwasaki, Ryosuke; Maeda, Moe; Umemura, Shin-ichiro; Saijo, Yoshifumi
2018-07-01
High-intensity focused ultrasound is a noninvasive treatment in which ultrasound is irradiated from outside the body to thermally coagulate the target tissue. Recently, it has been proposed as a noninvasive treatment for vascular occlusion to replace conventional invasive treatments. Cavitation bubbles generated by the focused ultrasound can accelerate the thermal coagulation effect. However, the tissues surrounding the target may be damaged by cavitation bubbles generated outside the treatment area. Conventional methods based on Doppler analysis in the time domain only are not suitable for monitoring blood flow in the presence of cavitation. In this study, we propose a novel filtering method, based on differences in spatiotemporal characteristics, to separate tissue, blood flow, and cavitation by employing singular value decomposition. Signals from cavitation and blood flow were extracted automatically using spatial and temporal covariance matrices.
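The core idea, separating components by their spatiotemporal characteristics via an SVD of the frame stack (Casorati matrix), can be illustrated with a minimal synthetic example; the amplitudes, frequencies, and rank threshold below are assumptions for illustration, not the paper's values or its automatic selection scheme:

```python
import numpy as np

# Synthetic sequence: a strong, spatially uniform, slowly varying "tissue"
# component plus a weak, fast "flow" signal confined to a few pixels.
rng = np.random.default_rng(0)
nx, ny, nt = 8, 8, 100
t = np.arange(nt)
tissue = 10.0 * np.outer(np.ones(nx * ny), np.cos(2 * np.pi * 0.01 * t))
flow_mask = np.zeros(nx * ny)
flow_mask[:8] = 1.0                       # a few "vessel" pixels
flow = np.outer(flow_mask, np.sin(2 * np.pi * 0.3 * t))
frames = tissue + flow + 0.01 * rng.normal(size=(nx * ny, nt))

# Casorati-matrix SVD: the leading singular components capture the spatially
# coherent, slowly varying tissue; rejecting them leaves flow (plus noise).
U, s, Vt = np.linalg.svd(frames, full_matrices=False)
n_clutter = 1                             # rank threshold is an assumption
filtered = U[:, n_clutter:] @ np.diag(s[n_clutter:]) @ Vt[n_clutter:, :]
```

In practice the clutter/flow/noise rank boundaries are what the spatial and temporal covariance analysis in the paper determines automatically, rather than the fixed threshold used here.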
Oszust, Karolina; Frąc, Magdalena; Gryta, Agata; Bilińska, Nina
2014-01-01
Knowledge about microbial activity and diversity under hop production is still limited. We assumed that different systems of hop production (within the same soil and climatic conditions) significantly influence the composition of soil microbial populations and their functional activity (metabolic potential). Therefore, we compared a set of soil microbial properties in a field experiment covering two hop production systems: (a) ecological, based on the use of probiotic preparations and organic fertilization, and (b) conventional, with the use of chemical pesticides and mineral fertilizers. Soil analyses included the following microbial properties: the total number of microorganisms and a set of soil enzyme activities; the catabolic potential was also assessed using Biolog EcoPlates®. Moreover, the abundance of ammonia-oxidizing archaea (AOA) was characterized by terminal restriction fragment length polymorphism (T-RFLP) analysis of PCR ammonia monooxygenase α-subunit (amoA) gene products. The conventional and ecological systems of hop production affected the soil microbial state in different seasonal manners. The favorable effect on soil microbial activity observed under ecological production was most probably due to the application of livestock-based manure and fermented plant extracts. No negative influence on the conventional hopyard soil was revealed. Both types of production fulfilled fertilizing demands; under ecological production this was due to the application of livestock-based manure fertilizers and fermented plant extracts. PMID:24897025
Strappini, Francesca; Gilboa, Elad; Pitzalis, Sabrina; Kay, Kendrick; McAvoy, Mark; Nehorai, Arye; Snyder, Abraham Z
2017-03-01
Temporal and spatial filtering of fMRI data is often used to improve statistical power. However, conventional methods, such as smoothing with fixed-width Gaussian filters, remove fine-scale structure in the data, necessitating a tradeoff between sensitivity and specificity. Specifically, smoothing may increase sensitivity (reduce noise and increase statistical power) but at the cost of specificity, in that fine-scale structure in neural activity patterns is lost. Here, we propose an alternative smoothing method based on Gaussian process (GP) regression for single-subject fMRI experiments. This method adapts the level of smoothing on a voxel-by-voxel basis according to the characteristics of the local neural activity patterns. GP-based fMRI analysis has heretofore been impractical owing to computational demands. Here, we demonstrate a new implementation of GP that makes it possible to handle the massive data dimensionality of the typical fMRI experiment. We demonstrate how GP can be used as a drop-in replacement for conventional preprocessing steps for temporal and spatial smoothing in a standard fMRI pipeline. We present simulated and experimental results that show the increased sensitivity and specificity compared to conventional smoothing strategies. Hum Brain Mapp 38:1438-1459, 2017. © 2016 Wiley Periodicals, Inc.
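A minimal voxel-wise GP smoother (the posterior mean under a squared-exponential kernel) illustrates the principle; the fixed hyperparameters and the plain O(n³) solve below are simplifications of the adaptive, accelerated implementation the paper describes:

```python
import numpy as np

def gp_smooth(y, length_scale=3.0, signal_var=1.0, noise_var=0.5):
    """GP-regression smoother for one voxel's time series: posterior mean
    under a squared-exponential kernel. Hyperparameters are illustrative;
    the paper adapts the effective smoothing per voxel."""
    n = len(y)
    t = np.arange(n, dtype=float)
    K = signal_var * np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / length_scale ** 2)
    # Posterior mean at the training points: K (K + sigma^2 I)^{-1} y.
    return K @ np.linalg.solve(K + noise_var * np.eye(n), y)

# Noisy sinusoid standing in for one voxel's BOLD time series.
rng = np.random.default_rng(0)
t = np.arange(80)
signal = np.sin(2 * np.pi * t / 40.0)
y = signal + 0.5 * rng.normal(size=80)
smoothed = gp_smooth(y)
```

Unlike a fixed-width Gaussian filter, the amount of smoothing here follows from the kernel hyperparameters, which can be fit to the data, which is what makes the adaptation voxel-specific in the paper's pipeline.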
Static analysis of a sonar dome rubber window
NASA Technical Reports Server (NTRS)
Lai, J. L.
1978-01-01
The application of NASTRAN (level 16.0.1) to the static analysis of a sonar dome rubber window (SDRW) was demonstrated. The assessment of the conventional model (neglecting the enclosed fluid) for the stress analysis of the SDRW was made by comparing its results to those based on a sophisticated model (including the enclosed fluid). The fluid was modeled with isoparametric linear hexahedron elements with approximate material properties whose shear modulus was much smaller than its bulk modulus. The effect of the chosen material property for the fluid is discussed.
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.
1976-01-01
A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and application of the models to correlation of well documented hot fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid propellants injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.
dada - a web-based 2D detector analysis tool
NASA Astrophysics Data System (ADS)
Osterhoff, Markus
2017-06-01
The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored by different detectors, in different file formats, and saved under varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, from pixel binning and azimuthal integration to raster-scan processing. Typical user interaction with dada is through a web frontend, but all parameters of an analysis are encoded in a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
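One of the routines named above, azimuthal integration, reduces a 2D detector image to a radial intensity profile. A minimal numpy sketch follows; the binning scheme is an assumption for illustration, not dada's actual implementation:

```python
import numpy as np

def azimuthal_integration(img, center, n_bins=50):
    """Average the image intensity in annular bins around `center`,
    returning bin-center radii and the mean intensity per bin."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(yy - center[0], xx - center[1]).ravel()
    bins = np.linspace(0.0, r.max(), n_bins + 1)
    idx = np.clip(np.digitize(r, bins) - 1, 0, n_bins - 1)
    sums = np.bincount(idx, weights=img.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    radii = 0.5 * (bins[:-1] + bins[1:])
    return radii, sums / np.maximum(counts, 1)

# A radially symmetric test image: intensity falls off as exp(-r/10).
yy, xx = np.indices((101, 101))
img = np.exp(-np.hypot(yy - 50, xx - 50) / 10.0)
radii, profile = azimuthal_integration(img, (50, 50))
```

In a URI-driven backend like the one described, the center coordinates and bin count would simply be two more encoded analysis parameters.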
Daniel, Kaemmerer; Maria, Athelogou; Amelie, Lupp; Isabell, Lenhardt; Stefan, Schulz; Luisa, Peter; Merten, Hommann; Vikas, Prasad; Gerd, Binnig; Paul, Baum Richard
2014-01-01
Background: Manual evaluation of somatostatin receptor (SSTR) immunohistochemistry (IHC) is a time-consuming and cost-intensive procedure. The aim of the study was to compare manual evaluation of SSTR subtype IHC to an automated software-based analysis, and to in-vivo imaging by SSTR-based PET/CT. Methods: We examined 25 gastroenteropancreatic neuroendocrine tumor (GEP-NET) patients and correlated their in-vivo SSTR-PET/CT data (determined by the standardized uptake values SUVmax and SUVmean) with the corresponding ex-vivo IHC data of SSTR subtype (1, 2A, 4, 5) expression. Exactly the same lesions were imaged by PET/CT, resected and analyzed by IHC in each patient. After manual evaluation, the IHC slides were digitized and automatically evaluated for SSTR expression by Definiens XD software. A virtual IHC score "BB1" was created for comparing the manual and automated analysis of SSTR expression. Results: BB1 showed a significant correlation with the corresponding conventionally determined Her2/neu score of the SSTR subtypes 2A (rs: 0.57), 4 (rs: 0.44) and 5 (rs: 0.43). BB1 of SSTR2A also significantly correlated with the SUVmax (rs: 0.41) and the SUVmean (rs: 0.50). Likewise, a significant correlation was seen between the conventionally evaluated SSTR2A status and the SUVmax (rs: 0.42) and SUVmean (rs: 0.62). Conclusion: Our data demonstrate that the evaluation of the SSTR status by automated analysis (BB1 score), using digitized histopathology slides ("virtual microscopy"), corresponds well with the SSTR2A, 4 and 5 expression as determined by conventional manual histopathology. The BB1 score also exhibited a significant association with the SSTR-PET/CT data, in accordance with the high-affinity profile of the SSTR analogues used for imaging. PMID:25197368
Current Issues and Challenges in Global Analysis of Parton Distributions
NASA Astrophysics Data System (ADS)
Tung, Wu-Ki
2007-01-01
A new implementation of precise perturbative QCD calculation of deep inelastic scattering structure functions and cross sections, incorporating heavy quark mass effects, is applied to the global analysis of the full HERA I data sets on NC and CC cross sections, in conjunction with other experiments. Improved agreement between the NLO QCD theory and the global data sets are obtained. Comparison of the new results to that of previous analysis based on conventional zero-mass parton formalism is made. Exploratory work on implications of new fixed-target neutrino scattering and Drell-Yan data on global analysis is also discussed.
Kwon, Jinhyeong; Cho, Hyunmin; Eom, Hyeonjin; Lee, Habeom; Suh, Young Duk; Moon, Hyunjin; Shin, Jaeho; Hong, Sukjoon; Ko, Seung Hwan
2016-05-11
Copper nanomaterials suffer from severe oxidation problems despite their considerable cost advantage. The effects of two different processes, conventional tube furnace heating and selective laser sintering, on copper nanoparticle paste are compared in terms of chemistry, electrical properties, and surface morphology. The thermal behavior of the copper thin films produced by furnace and laser is compared by SEM, XRD, FT-IR, and XPS analysis. The selective laser sintering process ensures a low annealing temperature and fast processing speed with remarkable oxidation suppression even in air, while conventional tube furnace heating experiences moderate oxidation even in an Ar environment. Moreover, the laser-sintered copper nanoparticle thin film shows better electrical properties and less oxidation than films produced by conventional thermal heating. Consequently, the proposed selective laser sintering process can be compatible with plastic substrates for copper based flexible electronics applications.
NASA Astrophysics Data System (ADS)
Rahe, Manfred; Ristau, Detlev; Schmidt, Holger
1993-06-01
In this paper, data of single layers of YbF3, BaF2, YF3, and NaF and multilayer coatings produced by conventional thermal evaporation (boat, e-beam) and ion assisted deposition (IAD) are compared. Hydrogen concentration depth profiling was performed using nuclear reaction analysis based on the reaction 1H(15N,αγ)12C. Absorption was measured with the aid of a laser calorimeter and a cw CO2 laser. A computer-controlled test facility with a TEA CO2 laser was used for determining the 1-on-1 damage thresholds of the coatings. The results point out that the absorption and damage behavior of coatings for the CO2 laser wavelength are related to the total amount of species containing hydrogen. Most of the IAD coatings exhibit a lower hydrogen contamination than conventional thin films.
Initial assessment of hearing loss using a mobile application for audiological evaluation.
Derin, S; Cam, O H; Beydilli, H; Acar, E; Elicora, S S; Sahan, M
2016-03-01
This study aimed to compare an Apple iOS mobile operating system application for audiological evaluation with conventional audiometry, and to determine its accuracy and reliability in the initial evaluation of hearing loss. The study comprised 32 patients (16 females) diagnosed with hearing loss. The patients were first evaluated with conventional audiometry and the degree of hearing loss was recorded. Then they underwent a smartphone-based hearing test and the data were compared using Cohen's kappa analysis. Patients' mean age was 53.59 ± 18.01 years (range, 19-85 years). The mobile phone audiometry results for 39 of the 64 ears were fully compatible with the conventional audiometry results. There was a statistically significant concordant relationship between the two sets of audiometry results (p < 0.05). Ear Trumpet version 1.0.2 is a compact and simple mobile application on the Apple iPhone 5 that can measure hearing loss with reliable results.
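The agreement analysis described above uses Cohen's kappa, which corrects observed agreement between two raters (here, the two audiometry methods) for chance agreement. A minimal sketch, with hypothetical per-ear hearing-loss grades rather than the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed - expected agreement) / (1 - expected agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    # Chance agreement from the two raters' marginal category frequencies
    expected = sum((pa[c] / n) * (pb[c] / n) for c in set(pa) | set(pb))
    return (observed - expected) / (1 - expected)

# Hypothetical grades (0 = normal .. 3 = severe) for ten ears
conventional = [0, 1, 1, 2, 2, 3, 0, 1, 2, 3]
app_based    = [0, 1, 2, 2, 2, 3, 0, 1, 1, 3]
print(round(cohens_kappa(conventional, app_based), 2))  # → 0.73
```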
Revealing retroperitoneal liposarcoma morphology using optical coherence tomography
NASA Astrophysics Data System (ADS)
Carbajal, Esteban F.; Baranov, Stepan A.; Manne, Venu G. R.; Young, Eric D.; Lazar, Alexander J.; Lev, Dina C.; Pollock, Raphael E.; Larin, Kirill V.
2011-02-01
A new approach to distinguish normal fat, well-differentiated (WD), and dedifferentiated liposarcoma (LS) tumors is demonstrated, based on the use of optical coherence tomography (OCT). OCT images show the same structures seen with conventional histological methods. Our visual grading analysis is supported by numerical analysis of observed structures for normal fat and WDLS samples. Further development could apply the real-time and high resolution advantages of OCT for use in liposarcoma diagnosis and clinical procedures.
Mass-Selective Chiral Analysis
NASA Astrophysics Data System (ADS)
Boesl, Ulrich; Kartouzian, Aras
2016-06-01
Three ways of realizing mass-selective chiral analysis are reviewed. The first is based on the formation of diastereomers that are of homo- and hetero-type with respect to the enantiomers of the involved chiral molecules. This way is quite well-established, with numerous applications. The other two ways are more recent developments, both based on circular dichroism (CD). In one, conventional or nonlinear electronic CD is linked to mass spectrometry (MS) by resonance-enhanced multiphoton ionization. The other is based on CD in the angular distribution of photoelectrons, which is measured in combination with MS via photoion photoelectron coincidence. Among the many important applications of mass-selective chiral analysis, this review focuses on its use as an analytical tool for the development of heterogeneous enantioselective chemical catalysis. There exist other approaches to combine chiral analysis and mass-selective detection, such as chiral chromatography MS, which are not discussed here.
Adventitious sounds identification and extraction using temporal-spectral dominance-based features.
Jin, Feng; Krishnan, Sridhar Sri; Sattar, Farook
2011-11-01
Respiratory sound (RS) signals carry significant information about the underlying functioning of the pulmonary system through the presence of adventitious sounds (ASs). Although many studies have addressed the problem of pathological RS classification, only a limited number of scientific works have focused on the analysis of the evolution of symptom-related signal components in the joint time-frequency (TF) plane. This paper proposes a new signal identification and extraction method for various ASs based on instantaneous frequency (IF) analysis. The presented TF decomposition method produces a noise-resistant, high-definition TF representation of RS signals compared with conventional linear TF analysis methods, yet preserves the low computational complexity compared with quadratic TF analysis methods. The phase information discarded in the conventional spectrogram has been adopted for the estimation of IF and group delay, and a temporal-spectral dominance spectrogram has subsequently been constructed by investigating the TF spreads of the computed time-corrected IF components. The proposed dominance measure enables the extraction of signal components corresponding to ASs from noisy RS signals at high noise levels. A new set of TF features has also been proposed to quantify the shapes of the obtained TF contours, and therefore strongly enhances the identification of multicomponent signals such as polyphonic wheezes. An overall accuracy of 92.4±2.9% for the classification of real RS recordings shows the promising performance of the presented method.
Hwang, Hee Sang; Yoon, Dok Hyun; Suh, Cheolwon; Huh, Jooryung
2016-08-01
Extranodal involvement is a well-known prognostic factor in patients with diffuse large B-cell lymphomas (DLBCL). Nevertheless, the prognostic impact of the extranodal scoring system included in the conventional international prognostic index (IPI) has been questioned in an era where rituximab treatment has become widespread. We investigated the prognostic impacts of individual sites of extranodal involvement in 761 patients with DLBCL who received rituximab-based chemoimmunotherapy. Subsequently, we established a new extranodal scoring system based on extranodal sites, showing significant prognostic correlation, and compared this system with conventional scoring systems, such as the IPI and the National Comprehensive Cancer Network-IPI (NCCN-IPI). An internal validation procedure, using bootstrapped samples, was also performed for both univariate and multivariate models. Using multivariate analysis with a backward variable selection, we found nine extranodal sites (the liver, lung, spleen, central nervous system, bone marrow, kidney, skin, adrenal glands, and peritoneum) that remained significant for use in the final model. Our newly established extranodal scoring system, based on these sites, was better correlated with patient survival than standard scoring systems, such as the IPI and the NCCN-IPI. Internal validation by bootstrapping demonstrated an improvement in model performance of our modified extranodal scoring system. Our new extranodal scoring system, based on the prognostically relevant sites, may improve the performance of conventional prognostic models of DLBCL in the rituximab era and warrants further external validation using large study populations.
Ramey, Stephen James; Padgett, Kyle R; Lamichhane, Narottam; Neboori, Hanmath J; Kwon, Deukwoo; Mellon, Eric A; Brown, Karen; Duffy, Melissa; Victoria, James; Dogan, Nesrin; Portelance, Lorraine
2018-03-01
This study aims to perform a dosimetric comparison of 2 magnetic resonance (MR)-guided radiation therapy systems capable of performing online adaptive radiation therapy versus a conventional radiation therapy system for pancreas stereotactic body radiation therapy. Ten cases of patients with pancreatic adenocarcinoma previously treated in our institution were used for this analysis. MR-guided tri-cobalt 60 therapy (MR-cobalt) and MR-LINAC plans were generated and compared with conventional LINAC (volumetric modulated arc therapy) plans. The prescription dose was 40 Gy in 5 fractions covering 95% of the planning tumor volume (PTV) for the 30 plans. The same organs at risk (OARs) dose constraints were used in all plans. Dose-volume-based indices were used to compare PTV coverage and OAR sparing. The conformity index demonstrated higher conformity in both LINAC-based plans compared with MR-cobalt plans. Although there was no difference in mean conformity index between LINAC and MR-LINAC plans (1.08 in both), there was a large difference between LINAC and MR-cobalt plans (1.08 vs 1.52). Overall, 79%, 72%, and 78% of critical structure dosimetric constraints were met with LINAC, MR-cobalt, and MR-LINAC plans, respectively. The MR-cobalt plans delivered higher doses to all OARs compared with the LINAC plans. In contrast, the doses to the OARs in the MR-LINAC plans were similar to those in the LINAC plans except in 2 cases: liver mean dose (MR-LINAC, 2.8 Gy vs LINAC, 2.1 Gy) and volume of duodenum receiving at least 15 Gy (MR-LINAC, 13.2 mL vs LINAC, 15.4 mL). Both differences are likely not clinically significant. This study demonstrates that dosimetrically similar plans were achieved with conventional LINAC and MR-LINAC, whereas doses to OARs were statistically higher for MR-cobalt compared with conventional LINAC plans because of low-dose spillage. 
Given the improved tumor-tracking capabilities of MR-LINAC, further studies should evaluate potential benefits of adaptive radiation therapy-capable MR-guided LINAC treatment. Copyright © 2018. Published by Elsevier Inc.
Pagotto, Luis Eduardo Charles; de Santana Santos, Thiago; de Vasconcellos, Sara Juliana de Abreu; Santos, Joanes Silva; Martins-Filho, Paulo Ricardo Saquete
2017-10-01
The purpose of this study was to perform a systematic review and meta-analysis of complications after orthognathic surgery comparing piezo-surgery with conventional osteotomy. We conducted this study according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We performed a systematic search of PubMed, Scopus, Science Direct, Lilacs, Cochrane Central Register of Controlled Trials, Google Scholar, and OpenThesis to identify randomized and nonrandomized controlled trials (RCTs and nRCTs, respectively) comparing patient outcomes (operative time, intraoperative blood loss, postoperative swelling, pain, neurosensitivity) after orthognathic surgery by piezoelectric or conventional osteotomy. We pooled individual results of continuous and dichotomous outcome data using the mean difference (MD) and risk difference (RD) with the 95% confidence interval, respectively. Three RCTs and five nRCTs were selected. No difference in operative time was observed between piezo-surgery and conventional osteotomies. We found a decrease of intraoperative blood loss with piezo-surgery (MD -128 mL; P < 0.001) and a pooled difference in severe blood loss of 35% (P = 0.008) favouring piezo-surgery. Based on pooled individual results of studies evaluating neurosensitivity by clinical neurosensory testing, our meta-analysis showed a pooled difference in severe nerve disturbance of 25% (P < 0.0001) favouring piezo-surgery. A test for subgroup differences (I² = 26.6%) indicated that follow-up time may have an effect on neurosensory disturbance. We found differences between piezo-surgery and conventional osteotomy at 3 months (RD 28%; P < 0.001) and 6 months (RD 15%; P = 0.001) after surgery. Meta-analyses for pain and swelling were not performed because of a lack of sufficient studies. 
Currently available evidence suggests that piezo-surgery has favorable effects on complications associated with orthognathic surgery, including reductions in intraoperative blood loss and severe nerve disturbance. Copyright © 2017 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
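The pooled risk differences reported above come from standard meta-analytic pooling; a minimal fixed-effect inverse-variance sketch, with illustrative event counts rather than the review's data:

```python
import math

def pooled_risk_difference(studies):
    """Fixed-effect inverse-variance pooling of per-study risk differences.

    Each study is (events_treat, n_treat, events_ctrl, n_ctrl).
    Returns the pooled RD and its 95% confidence interval.
    """
    num = den = 0.0
    for e1, n1, e0, n0 in studies:
        p1, p0 = e1 / n1, e0 / n0
        rd = p1 - p0
        var = p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0  # binomial variance of RD
        w = 1.0 / var                                   # inverse-variance weight
        num += w * rd
        den += w
    pooled = num / den
    se = math.sqrt(1.0 / den)
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Illustrative counts for three small trials: severe nerve disturbance
# with piezo-surgery vs. conventional osteotomy (not the review's data)
rd, ci = pooled_risk_difference([(2, 20, 7, 20), (1, 15, 5, 15), (3, 25, 9, 25)])
print(round(rd, 2))  # → -0.25
```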
Roibás, Laura; Martínez, Ismael; Goris, Alfonso; Barreiro, Rocío; Hospido, Almudena
2016-10-01
This study compares a premium brand of UHT milk, Unicla, characterised by an improved nutritional composition, to conventional milk, in terms of health effects and environmental impacts. Unlike enriched milks, in which nutrients are added to the final product, Unicla is obtained naturally by improving the diet of the dairy cows. Health effects have been analysed based on literature findings, while the environmental analysis focused on those spheres of the environment where milk is expected to cause the highest impacts, and thus carbon (CF) and water footprints (WF) have been determined. Five final products have been compared: 3 conventional (skimmed, semi-skimmed, whole) and 2 Unicla (skimmed, semi-skimmed) milks. As a functional unit, one litre of packaged UHT milk entering the regional distribution centre has been chosen. The improved composition of Unicla milk is expected to decrease the risk of cardiovascular disease and to protect consumers against oxidative damage, among other health benefits. Concerning the environmental aspect, the CF of Unicla products is, on average, 10% lower than that of their conventional equivalents, mainly due to the lower enteric emissions caused by the Unicla diet. No significant differences were found between the WF of Unicla and conventional milk. Raw milk is the main contributor to both footprints (on average, 83.2 and 84.3% of the total CF of Unicla and conventional milk, respectively, and 99.9% of WF). The results have been compared to those found in the literature, and a sensitivity analysis has been performed to verify their robustness. The study concludes that switching to healthier milk compositions can help slow global warming without contributing to other environmental issues such as water scarcity. The results should encourage other milk companies to commit to the development of healthier, less environmentally damaging products, and also encourage consumers to choose them. Copyright © 2016 Elsevier B.V. 
All rights reserved.
Bookstein, Fred L; Domjanić, Jacqueline
2014-09-01
The relationship of geometric morphometrics (GMM) to functional analysis of the same morphological resources is currently a topic of active interest among functional morphologists. Although GMM is typically advertised as free of prior assumptions about shape features or morphological theories, it is common for GMM findings to be concordant with findings from studies based on a-priori lists of shape features whenever prior insights or theories have been properly accounted for in the study design. The present paper demonstrates this happy possibility by revisiting a previously published GMM analysis of footprint outlines for which there is also functionally relevant information in the form of a-priori foot measurements. We show how to convert the conventional measurements into the language of shape, thereby affording two parallel statistical analyses. One is the classic multivariate analysis of "shape features", the other the equally classic GMM of semilandmark coordinates. In this example, the two data sets, analyzed by protocols that are remarkably different in both their geometry and their algebra, nevertheless result in one common biometrical summary: wearing high heels is bad for women inasmuch as it leads to the need for orthotic devices to treat the consequently flattened arch. This concordance bears implications for other branches of applied anthropology. To carry out a good biomedical analysis of applied anthropometric data it may not matter whether one uses GMM or instead an adequate assortment of conventional measurements. What matters is whether the conventional measurements have been selected in order to match the natural spectrum of functional variation.
Economics of Undiscovered Oil and Gas in the North Slope of Alaska: Economic Update and Synthesis
Attanasi, E.D.; Freeman, P.A.
2009-01-01
The U.S. Geological Survey (USGS) has published assessments by geologists of undiscovered conventional oil and gas accumulations in the North Slope of Alaska; these assessments contain a set of scientifically based estimates of undiscovered, technically recoverable quantities of oil and gas in discrete oil and gas accumulations that can be produced with conventional recovery technology. The assessments do not incorporate economic factors such as recovery costs and product prices. The assessors considered undiscovered conventional oil and gas resources in four areas of the North Slope: (1) the central North Slope, (2) the National Petroleum Reserve in Alaska (NPRA), (3) the 1002 Area of the Arctic National Wildlife Refuge (ANWR), and (4) the area west of the NPRA, called in this report the 'western North Slope'. These analyses were prepared at different times with various minimum assessed oil and gas accumulation sizes and with slightly different assumptions. Results of these past studies were recently supplemented with information by the assessment geologists that allowed adjustments for uniform minimum assessed accumulation sizes and a consistent set of assumptions. The effort permitted the statistical aggregation of the assessments of the four areas composing the study area. This economic analysis is based on undiscovered assessed accumulation distributions represented by the four-area aggregation and incorporates updates of costs and technological and fiscal assumptions used in the initial economic analysis that accompanied the geologic assessment of each study area.
NASA Astrophysics Data System (ADS)
Mukherjee, Bappa; Roy, P. N. S.
The identification of prospective and dry zones from well log data is of major importance, and reliable identification of potential zones is a crucial issue in hydrocarbon exploration. The problem has received considerable attention, and many conventional techniques have been proposed. The purpose of this study is to recognize the hydrocarbon-bearing and non-hydrocarbon-bearing portions of a reservoir using a non-conventional technique. This paper demonstrates the application of wavelet based fractal analysis (WBFA) to wire-line log data in order to distinguish pre-defined hydrocarbon (HC) and non-hydrocarbon (NHC) zones by the self-affine nature of the signals. The feasibility of the proposed technique is tested on the most commonly used logs: self-potential, gamma ray, resistivity, and porosity responses. These industry-supplied logs cover several HC and NHC zones in all wells of the study region, which belongs to the upper Assam basin. For a given log response, the HC-bearing zones are found to be situated mainly in a variety of sandstone lithologies, which leads to higher Hurst exponents, whereas the NHC zones correspond to lithologies with higher shale content and lower Hurst exponents. The proposed technique can reduce the chance of misinterpretation in conventional reservoir characterization.
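The discriminating statistic here is the Hurst exponent, which quantifies the self-affinity of a signal. As an illustration of the idea (using the classical rescaled-range estimator rather than the paper's wavelet-based approach, and synthetic series rather than well logs), a persistent series yields a higher estimate than uncorrelated noise:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent by the rescaled-range (R/S) method:
    slope of log(mean R/S) vs. log(chunk size)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviations
            r = dev.max() - dev.min()              # range
            s = chunk.std()                        # standard deviation
            if s > 0:
                vals.append(r / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)               # uncorrelated: H near 0.5
trendy = np.cumsum(rng.standard_normal(4096))   # persistent random walk: H near 1
print(hurst_rs(noise) < hurst_rs(trendy))  # → True
```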
Moura, Renata Vasconcellos; Kojima, Alberto Noriyuki; Saraceni, Cintia Helena Coury; Bassolli, Lucas; Balducci, Ivan; Özcan, Mutlu; Mesquita, Alfredo Mikail Melo
2018-05-01
The increased use of CAD systems can generate doubt about the accuracy of digital impressions for angulated implants. The aim of this study was to evaluate the accuracy of different impression techniques, two conventional and one digital, for implants with and without angulation. We used a polyurethane cast that simulates the human maxilla according to ASTM F1839, and 6 tapered implants were installed with external hexagonal connections to simulate tooth positions 17, 15, 12, 23, 25, and 27. Implants 17 and 23 were placed with 15° of mesial angulation and distal angulation, respectively. Mini cone abutments were installed on these implants with a metal strap 1 mm in height. Conventional and digital impression procedures were performed on the maxillary master cast, and the implants were separated into 6 groups based on the technique used and measurement type: G1 - control, G2 - digital impression, G3 - conventional impression with an open tray, G4 - conventional impression with a closed tray, G5 - conventional impression with an open tray and a digital impression, and G6 - conventional impression with a closed tray and a digital impression. A statistical analysis was performed using two-way repeated measures ANOVA to compare the groups, and a Kruskal-Wallis test was conducted to analyze the accuracy of the techniques. No significant difference in the accuracy of the techniques was observed between the groups. Therefore, no differences were found among the conventional impression and the combination of conventional and digital impressions, and the angulation of the implants did not affect the accuracy of the techniques. All of the techniques exhibited trueness and had acceptable precision. The variation of the angle of the implants did not affect the accuracy of the techniques. © 2018 by the American College of Prosthodontists.
Curl, Cynthia L; Fenske, Richard A; Elgethun, Kai
2003-01-01
We assessed organophosphorus (OP) pesticide exposure from diet by biological monitoring among Seattle, Washington, preschool children. Parents kept food diaries for 3 days before urine collection, and they distinguished organic and conventional foods based on label information. Children were then classified as having consumed either organic or conventional diets based on analysis of the diary data. Residential pesticide use was also recorded for each home. We collected 24-hr urine samples from 18 children with organic diets and 21 children with conventional diets and analyzed them for five OP pesticide metabolites. We found significantly higher median concentrations of total dimethyl alkylphosphate metabolites than total diethyl alkylphosphate metabolites (0.06 and 0.02 µmol/L, respectively; p = 0.0001). The median total dimethyl metabolite concentration was approximately six times higher for children with conventional diets than for children with organic diets (0.17 and 0.03 µmol/L; p = 0.0003); mean concentrations differed by a factor of nine (0.34 and 0.04 µmol/L). We calculated dose estimates from urinary dimethyl metabolites and from agricultural pesticide use data, assuming that all exposure came from a single pesticide. The dose estimates suggest that consumption of organic fruits, vegetables, and juice can reduce children's exposure levels from above to below the U.S. Environmental Protection Agency's current guidelines, thereby shifting exposures from a range of uncertain risk to a range of negligible risk. Consumption of organic produce appears to provide a relatively simple way for parents to reduce their children's exposure to OP pesticides. PMID:12611667
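The dose-estimation step described above can be illustrated with a back-of-envelope reconstruction; a minimal sketch assuming a single parent pesticide, complete 24-h urine collection, and mole-for-mole excretion. The molecular weight, urine volume, and body weight below are illustrative assumptions, not study data:

```python
def estimated_dose_ug_per_kg(conc_umol_per_L, urine_L_per_day,
                             parent_mw_g_per_mol, body_weight_kg):
    """Daily intake (µg/kg/day) implied by a urinary metabolite concentration,
    assuming all metabolite derives from one parent compound (1 g/mol = 1 µg/µmol)."""
    excreted_umol = conc_umol_per_L * urine_L_per_day
    return excreted_umol * parent_mw_g_per_mol / body_weight_kg

# e.g. 0.17 µmol/L (the reported conventional-diet median), with assumed
# 0.5 L urine/day, parent MW ~300 g/mol, and a 20-kg child:
print(round(estimated_dose_ug_per_kg(0.17, 0.5, 300.0, 20.0), 2))
```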
Khan, Yasin; Khare, Vaibhav Rai; Mathur, Jyotirmay; ...
2015-03-26
The paper describes a parametric study developed to estimate the energy savings potential of a radiant cooling system installed in a commercial building in India. The study is based on numerical modeling of a radiant cooling system installed in an Information Technology (IT) office building sited in the composite climate of Hyderabad. To evaluate thermal performance and energy consumption, simulations were carried out using ANSYS FLUENT and EnergyPlus, respectively. The building model was calibrated using the measured data for the installed radiant system. This calibrated model was then used to simulate the energy consumption of the building using a conventional all-air system to determine the proportional energy savings. For proper handling of the latent load, a dedicated outside air system (DOAS) was used as an alternative to a fan coil unit (FCU). A comparison of energy consumption showed that the radiant system was 17.5% more efficient than a conventional all-air system and that a 30% savings was achieved by using a DOAS compared with a conventional system. Computational fluid dynamics (CFD) simulation was performed to evaluate indoor air quality and thermal comfort. It was found that the radiant system offers more uniform temperatures, as well as a better mean air temperature range, than a conventional system. To further enhance the energy savings of the radiant system, different operational strategies were analyzed based on thermal analysis using EnergyPlus. Lastly, the energy savings achieved in this parametric run were more than 10% compared with a conventional all-air system.
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
NASA Astrophysics Data System (ADS)
Tse, P. W.; Liu, X. C.; Liu, Z. H.; Wu, B.; He, C. F.; Wang, X. J.
2011-05-01
Magnetostrictive sensors (MsSs) that can excite and receive guided waves are commonly used in detecting defects that may occur in cables and strands for supporting heavy structures. A conventional MsS has a hard sensing coil that is wound onto a bobbin with electric wires to generate the necessary dynamic magnetic field to excite the desired guided waves. This tailor-made hard coil is usually bulky and is not flexible enough to fit steel strands of various sizes. The conventional MsS also cannot be mounted to any steel strand that does not have a free end to allow the bobbin to pass through the structure of the tested strand. Such inflexibilities limit the use of conventional MsSs in practical situations. To solve these limitations, an innovative type of coil, called a flexible printed coil (FPC), which is made out of flexible printed film, has been designed to replace the inflexible hard coil. The flexible structure of the FPC ensures that the new MsS can be easily installed on and removed from steel strands with different diameters and without free ends. Moreover, the FPC-based MsS can be wrapped into multiple layers due to its thin and flexible design. Although multi-layer FPC creates a minor asymmetry in the dynamic magnetic field, the results of finite element analysis and experiments confirm that the longitudinal guided waves excited by a FPC-based MsS are comparable to those excited by a conventional hard coil MsS. No significant reduction in defect inspection performance was found; in fact, further advantages were identified when using the FPC-based MsS. When acting as the transmitter, the innovative FPC-based MsS can cover a longer inspection length of strand. When acting as the receiver, the FPC-based MsS is more sensitive to smaller defects that are impossible to detect using a hard coil MsS. 
Hence, the multi-layer FPC-based MsS has great potential for replacing the conventional hard coil MsS because of its convenient installation, and ease of fitting to different strand diameters; it is smaller, and, most importantly, performs much better in strand defect detection.
Zhang, Bo; Pirmoradian, Mohammad; Chernobrovkin, Alexey; Zubarev, Roman A.
2014-01-01
Based on the conventional data-dependent acquisition strategy of shotgun proteomics, we present a new workflow, DeMix, which significantly increases the efficiency of peptide identification for in-depth shotgun analysis of complex proteomes. Capitalizing on the high resolution and mass accuracy of Orbitrap-based tandem mass spectrometry, we developed a simple deconvolution method of “cloning” chimeric tandem spectra for cofragmented peptides. In addition to a database search, a simple rescoring scheme utilizes mass accuracy and converts the unwanted cofragmentation events into a surprising advantage of multiplexing. With the combination of cloning and rescoring, we obtained on average nine peptide-spectrum matches per second on a Q-Exactive workbench, whereas the actual MS/MS acquisition rate was close to seven spectra per second. This efficiency boost to 1.24 identified peptides per MS/MS spectrum enabled analysis of over 5000 human proteins in single-dimensional LC-MS/MS shotgun experiments with only a two-hour gradient. These findings suggest a change in the dominant “one MS/MS spectrum - one peptide” paradigm for data acquisition and analysis in shotgun data-dependent proteomics. DeMix also demonstrated higher robustness than conventional approaches in terms of lower variation among the results of consecutive LC-MS/MS runs. PMID:25100859
The relation between periods’ identification and noises in hydrologic series data
NASA Astrophysics Data System (ADS)
Sang, Yan-Fang; Wang, Dong; Wu, Ji-Chun; Zhu, Qing-Ping; Wang, Ling
2009-04-01
Identification of dominant periods is a typical and important issue in hydrologic series data analysis, since it is the basis for building effective stochastic models, understanding complex hydrologic processes, etc. However, it is still a difficult task due to the influence of many interrelated factors, such as noise in the data. In this paper, the strong influence of noise on period identification is first analyzed. Then, based on two conventional methods of hydrologic series analysis, wavelet analysis (WA) and maximum entropy spectral analysis (MESA), a new method of period identification, main series spectral analysis (MSSA), is put forward; its main idea is to identify the periods of the main series after reducing hydrologic noise. Several methods (fast Fourier transform (FFT), MESA, and MSSA) have been applied to both synthetic and observed hydrologic series. The results show that the conventional methods (FFT and MESA) do not perform as well as expected because of the strong influence of noise, whereas this influence is much weaker for the new MSSA method. In addition, the de-noising method proposed in this paper, which is suitable for both normal and skewed noise, yields more reasonable results, since the noise separated from hydrologic series generally follows skewed probability distributions. In conclusion, the proposed MSSA method can improve period identification by effectively reducing the influence of hydrologic noise.
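The baseline task the abstract describes, picking the dominant period of a noisy series from its spectrum, can be sketched with a plain FFT periodogram; a minimal example on a synthetic series (the 12-step period and noise level are illustrative, not from the paper):

```python
import numpy as np

# Synthetic "hydrologic" series: a 12-step periodic component buried in
# heavy noise. The dominant period is the frequency bin of largest power.
n = 600
t = np.arange(n)
rng = np.random.default_rng(1)
series = np.sin(2 * np.pi * t / 12) + 1.5 * rng.standard_normal(n)

power = np.abs(np.fft.rfft(series - series.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=1.0)
dominant_period = 1.0 / freqs[np.argmax(power[1:]) + 1]  # skip the DC bin
print(round(dominant_period))  # → 12
```

With stronger noise or noise following a skewed distribution, the largest peak can easily land on a spurious bin, which is the failure mode that motivates de-noising before spectral analysis.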
Reliability Analysis of a Green Roof Under Different Storm Scenarios
NASA Astrophysics Data System (ADS)
William, R. K.; Stillwell, A. S.
2015-12-01
Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the efficiency in runoff reduction provided by a green roof compared to a conventional roof under different storm scenarios. We used the 2D distributed surface water-groundwater coupled model MIKE SHE to model the impact that a real green roof might have on runoff in different storm events, and then employed a multiple regression analysis to generate an algebraic demand model that was input into the Matlab-based reliability analysis model FERUM to calculate the probability of failure. The use of reliability analysis as part of green infrastructure design code can provide insights into green roof weaknesses and areas for improvement. It also supports the design of code that is more resilient than current standards and is easily testable for failure. Finally, understanding the reliability of a single green roof module under different scenarios can support holistic testing of system reliability.
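The fragility-curve idea — probability of failure as a function of storm demand — can be sketched with a toy Monte Carlo model. The capacity/demand formulation and every coefficient here are illustrative assumptions, not outputs of MIKE SHE or FERUM.

```python
import numpy as np

rng = np.random.default_rng(42)

def prob_failure(storm_depth_mm, n=100_000):
    """Monte Carlo estimate of P(runoff-reduction efficiency < 50%) for a
    toy green roof. Storage capacity and demand scatter are hypothetical."""
    capacity = rng.normal(loc=60.0, scale=10.0, size=n)              # storage (mm-equivalent)
    demand = storm_depth_mm * rng.lognormal(mean=0.0, sigma=0.2, size=n)
    efficiency = np.clip(capacity / (capacity + demand), 0.0, 1.0)
    return float(np.mean(efficiency < 0.5))

# One point per storm scenario traces out the fragility curve.
curve = {depth: prob_failure(depth) for depth in (10, 30, 60, 120)}
```

Plotting `curve` (failure probability versus storm depth) gives the familiar S-shaped fragility curve borrowed from earthquake engineering.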
Analysis of a flux-coupling type superconductor fault current limiter with pancake coils
NASA Astrophysics Data System (ADS)
Liu, Shizhuo; Xia, Dong; Zhang, Zhifeng; Qiu, Qingquan; Zhang, Guomin
2017-10-01
The characteristics of a flux-coupling type superconductor fault current limiter (SFCL) with pancake coils are investigated in this paper. The conventional double-wound non-inductive pancake coil used in AC power systems has an inevitable defect in Voltage Sourced Converter based High Voltage DC (VSC-HVDC) power systems: due to its special structure, flashover can easily occur during a fault in a high-voltage environment. Considering these shortcomings of conventional resistive SFCLs with non-inductive coils, a novel flux-coupling type SFCL with pancake coils is proposed. The module connections of the pancake coils are described, and the electromagnetic field and force analyses of the module are compared under different parameters. To ensure proper operation of the module, the impedance of the module under representative operating conditions is calculated. Finally, the feasibility of the flux-coupling type SFCL in VSC-HVDC power systems is discussed.
Molecular pathology and genetics of gastrointestinal neuroendocrine tumours.
Lewis, Mark A; Yao, James C
2014-02-01
Neuroendocrine tumours (NETs) of the luminal gastrointestinal tract and pancreas are increasing in incidence and prevalence. Prior assumptions about the benign nature of 'carcinoids' and the clinical importance of distinguishing functional vs. nonfunctional tumours are being overturned through greater understanding of disease behaviour and heterogeneity. This review highlights the most contemporary genetic and molecular insights into gastroenteropancreatic NETs. Biomarkers such as neuron-specific enolase or chromogranin A could be supplemented or supplanted by PCR-based analysis of NET genes detectable in the blood transcriptome. Conventional pathology, including Ki67 testing, could be enhanced with immunohistochemistry and exome analysis. Prognostic markers and/or putative therapeutic targets uncovered through recent studies include heparanase, Id, ATM, SRC, EGFR, hsp90 and PDGFR. After a long-standing paucity of options for conventional cytotoxic therapy, the comprehension and treatment of gastroenteropancreatic NETs has been enriched by advancements in taxonomy, molecular pathology and genetic/epigenetic testing.
Analysis of In-Route Wireless Charging for the Shuttle System at Zion National Park
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meintz, Andrew; Prohaska, Robert; Konan, Arnaud
System right-sizing is critical to implementation of wireless power transfer (WPT) for electric vehicles. This study will analyze potential WPT scenarios for the electrification of shuttle buses at Zion National Park utilizing a modelling tool developed by the National Renewable Energy Laboratory called WPTSim. This tool uses second-by-second speed, location, and road grade data from the conventional shuttles in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. The outcome of this work is an analysis of the design tradeoffs for the electrification of the shuttle fleet with wireless charging versus conventional overnight charging.
Camelo-Méndez, Gustavo A; Flores-Silva, Pamela C; Agama-Acevedo, Edith; Bello-Pérez, Luis A
2017-12-01
The phenolic compounds, color and antioxidant capacity of gluten-free pasta prepared with non-conventional flours such as chickpea (CHF), unripe plantain (UPF), white maize (WMF) and blue maize (BMF) were analyzed. Fifteen phenolic compounds (five anthocyanins, five hydroxybenzoic acids, three hydroxycinnamic acids, one hydroxyphenylacetic acid and one flavonol) were identified in pasta prepared with blue maize, and 10 compounds were identified for samples prepared with white maize. The principal component analysis (PCA) led to results describing 98% of the total variance, establishing a clear separation for each pasta. Both the proportion (25, 50 and 75%) and type of maize flour (white and blue) affected the color parameters (L*, C*ab, hab and ΔE*ab) and antioxidant properties (DPPH, ABTS and FRAP methods) of samples, thus producing gluten-free products with potential health benefits intended for general consumers (including the population with celiac disease).
Enzymatic signal amplification for sensitive detection of intracellular antigens by flow cytometry.
Karkmann, U; Radbruch, A; Hölzel, V; Scheffold, A
1999-11-19
Flow cytometry is the method of choice for the analysis of single cells with respect to the expression of specific antigens. Antigens can be detected with specific antibodies either on the cell surface or within the cells, after fixation and permeabilization of the cell membrane. Using conventional fluorochrome-labeled antibodies, several thousand antigens are required for clear-cut separation of positive and negative cells. More sensitive reagents, e.g., magnetofluorescent liposomes conjugated to specific antibodies, permit the detection of fewer than 200 molecules per cell but cannot be used for the detection of intracellular antigens. Here, we describe an enzymatic amplification technique (intracellular tyramine-based signal amplification, ITSA) for the sensitive cytometric analysis of intracellular cytokines by immunofluorescence. This approach results in a 10- to 15-fold improvement of the signal-to-noise ratio compared to conventional fluorochrome-labeled antibodies and permits the detection of as few as 300-400 intracellular antigens per cell.
NASA Astrophysics Data System (ADS)
Valotto, Gabrio; Cattaruzza, Elti; Bardelli, Fabrizio
2017-02-01
The appropriate selection of representative pure compounds to be used as references is a crucial step for successful analysis of X-ray absorption near edge spectroscopy (XANES) data, and it is often not a trivial task. This is particularly true when complex environmental matrices are investigated, since their elemental speciation is a priori unknown. In this paper, an investigation of the speciation of Cu, Zn, and Sb based on the use of conventional (stoichiometric compounds) and non-conventional (environmental samples or relevant certified materials) references is explored. This approach can be useful when the effectiveness of XANES analysis is limited by the difficulty of obtaining a set of references sufficiently representative of the investigated samples. Road dust samples collected along the bridge connecting Venice to the mainland were used to show the potentialities and the limits of this approach.
Energy-Discriminative Performance of a Spectral Micro-CT System
He, Peng; Yu, Hengyong; Bennett, James; Ronaldson, Paul; Zainon, Rafidah; Butler, Anthony; Butler, Phil; Wei, Biao; Wang, Ge
2013-01-01
Experiments were performed to evaluate the energy-discriminative performance of a spectral (multi-energy) micro-CT system. The system, designed by MARS (Medipix All Resolution System) Bio-Imaging Ltd. (Christchurch, New Zealand), employs a photon-counting energy-discriminative detector technology developed by CERN (European Organization for Nuclear Research). We used the K-edge attenuation characteristics of some known materials to calibrate the detector’s photon energy discrimination. For tomographic analysis, we used the compressed sensing (CS) based ordered-subset simultaneous algebraic reconstruction technique (OS-SART) to reconstruct sample images, which is effective in reducing noise and suppressing artifacts. Unlike conventional CT, spectral CT allows the principal component analysis (PCA) method to be applied to extract and quantify additional attenuation information from the dataset. Our results show that the spectral CT has good energy-discriminative performance and provides more attenuation information than conventional CT. PMID:24004864
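The PCA step — extracting the independent attenuation components carried by a multi-energy dataset — can be sketched on synthetic data. The material signatures, bin count, and noise level below are illustrative, not MARS measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical attenuation signatures for three materials over 8 energy
# bins; the middle row mimics a K-edge jump between bins 3 and 4.
signatures = np.array([
    [5.0, 4.2, 3.5, 2.9, 2.4, 2.0, 1.7, 1.5],
    [9.0, 7.0, 5.5, 8.5, 6.5, 5.0, 4.0, 3.2],   # K-edge material
    [3.0, 2.5, 2.1, 1.8, 1.6, 1.4, 1.3, 1.2],
])

# 500 "voxels", each a random mixture of the three materials, plus noise.
fractions = rng.dirichlet(np.ones(3), size=500)
voxels = fractions @ signatures + 0.01 * rng.standard_normal((500, 8))

# PCA via SVD of the mean-centred data matrix.
centred = voxels - voxels.mean(axis=0)
_, s, components = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance fraction per component
```

Because the mixture fractions sum to one, the centred data is effectively rank two, so the first two components capture almost all the attenuation variation — the kind of compact material summary a single-energy CT cannot provide.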
An analysis of stepped trapezoidal-shaped microcantilever beams for MEMS-based devices
NASA Astrophysics Data System (ADS)
Ashok, Akarapu; Gangele, Aparna; Pal, Prem; Pandey, Ashok Kumar
2018-07-01
Microcantilever beams are the most widely used mechanical elements in the design and fabrication of MEMS/NEMS-based sensors and actuators. In this work, we have proposed a new microcantilever beam design based on a stepped trapezoidal-shaped microcantilever. Single-, double-, triple- and quadruple-stepped trapezoidal-shaped microcantilever beams along with conventional rectangular-shaped microcantilever beams were analysed experimentally, numerically and analytically. The microcantilever beams were fabricated from silicon dioxide material using wet bulk micromachining in 25 wt% TMAH. The length, width and thickness of the microcantilever beams were fixed at 200, 40 and 0.96 µm, respectively. A laser vibrometer was utilized to measure the resonance frequency and Q-factor of the microcantilever beams in vacuum as well as in ambient conditions. Furthermore, the finite element analysis software ANSYS was employed to numerically analyse the resonance frequency, maximum deflection and torsional end rotation of all the microcantilever beam designs. The analytical and numerical resonance frequencies are found to be in good agreement with the experimental resonance frequencies. In the stepped trapezoidal-shaped microcantilever beams with an increasing number of steps, the Q-factor, maximum deflection and torsional end rotation were improved, whereas the resonance frequency was slightly reduced. Nevertheless, the resonance frequency remains higher than that of the basic rectangular-shaped microcantilever beam. The observed Q-factor, maximum deflection and torsional end rotation for the quadruple-stepped trapezoidal-shaped microcantilever are 38%, 41% and 52% higher, respectively, than those of the conventional rectangular-shaped microcantilever beam. Furthermore, for an applied concentrated mass of 1 picogram on the cantilever surface, a greater shift in frequency is obtained for all the stepped trapezoidal-shaped microcantilever beam designs compared to the conventional rectangular microcantilever beam.
Mino, Takuya; Maekawa, Kenji; Ueda, Akihiro; Higuchi, Shizuo; Sejima, Junichi; Takeuchi, Tetsuo; Hara, Emilio Satoshi; Kimura-Ono, Aya; Sonoyama, Wataru; Kuboki, Takuo
2015-04-01
The aim of this article was to investigate the accuracy of reproducing full-arch implant provisional restorations in final restorations, comparing a 3D Scan/CAD/CAM technique with the conventional method. We fabricated two final restorations for rehabilitation of maxillary and mandibular complete edentulous areas and performed a computer-based comparative analysis of the accuracy of reproducing the provisional restoration in the final restoration between a 3D scanning and CAD/CAM (Scan/CAD/CAM) technique and the conventional silicone-mold transfer technique. Final restorations fabricated either by the conventional or the Scan/CAD/CAM method were successfully installed in the patient. The total concave/convex volume discrepancy observed with the Scan/CAD/CAM technique was 503.50 mm³ and 338.15 mm³ for maxillary and mandibular implant-supported prostheses (ISPs), respectively. On the other hand, the total concave/convex volume discrepancy observed with the conventional method was markedly higher (1106.84 mm³ and 771.23 mm³ for maxillary and mandibular ISPs, respectively). The results of the present report suggest that the Scan/CAD/CAM method enables a more precise and accurate transfer of provisional restorations to final restorations compared to the conventional method. Copyright © 2014 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Nuclear Magnetic Resonance Spectroscopy-Based Identification of Yeast.
Himmelreich, Uwe; Sorrell, Tania C; Daniel, Heide-Marie
2017-01-01
Rapid and robust high-throughput identification of environmental, industrial, or clinical yeast isolates is important whenever relatively large numbers of samples need to be processed in a cost-efficient way. Nuclear magnetic resonance (NMR) spectroscopy generates complex data based on metabolite profiles, chemical composition and possibly medium consumption, which can be used not only for the assessment of metabolic pathways but also for accurate identification of yeast down to the subspecies level. Initial results on NMR-based yeast identification were comparable with conventional and DNA-based identification. Potential advantages of NMR spectroscopy in mycological laboratories include not only accurate identification but also the potential for automated sample delivery, automated analysis using computer-based methods, rapid turnaround time, high throughput, and low running costs. We describe here the sample preparation, data acquisition and analysis for NMR-based yeast identification. In addition, a roadmap for the development of classification strategies is given that will result in the acquisition of a database and analysis algorithms for yeast identification in different environments.
ERIC Educational Resources Information Center
Hermannsson, Kristinn; Lisenkova, Katerina; McGregor, Peter G.; Swales, J. Kim
2015-01-01
This paper analyses the impact of London-based higher education institutions (HEIs) on the English economy. When we treat each of the HEIs as separate sectors in conventional input-output analysis, their expenditure impacts appear rather homogenous, with the apparent heterogeneity of their overall impacts being primarily driven by scale. However,…
NASA Astrophysics Data System (ADS)
He, Xin; Frey, Eric C.
2007-03-01
Binary ROC analysis has solid decision-theoretic foundations and a close relationship to linear discriminant analysis (LDA). In particular, for the case of Gaussian equal covariance input data, the area under the ROC curve (AUC) value has a direct relationship to the Hotelling trace. Many attempts have been made to extend binary classification methods to multi-class. For example, Fukunaga extended binary LDA to obtain multi-class LDA, which uses the multi-class Hotelling trace as a figure-of-merit, and we have previously developed a three-class ROC analysis method. This work explores the relationship between conventional multi-class LDA and three-class ROC analysis. First, we developed a linear observer, the three-class Hotelling observer (3-HO). For Gaussian equal covariance data, the 3-HO provides equivalent performance to the three-class ideal observer and, under less strict conditions, maximizes the signal to noise ratio for classification of all pairs of the three classes simultaneously. The 3-HO templates are not the eigenvectors obtained from multi-class LDA. Second, we show that the three-class Hotelling trace, which is the figure-of-merit in the conventional three-class extension of LDA, has significant limitations. Third, we demonstrate that, under certain conditions, there is a linear relationship between the eigenvectors obtained from multi-class LDA and 3-HO templates. We conclude that the 3-HO based on decision theory has advantages both in its decision theoretic background and in the usefulness of its figure-of-merit. Additionally, there exists the possibility of interpreting the two linear features extracted by the conventional extension of LDA from a decision theoretic point of view.
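As background for the decision-theoretic relationships discussed above, the two-class Hotelling observer can be sketched numerically. This is the standard binary construction on Gaussian equal-covariance data, not the paper's three-class 3-HO; dimensions, means, and sample sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def hotelling_template(x0, x1):
    """Linear Hotelling observer: w = S^-1 (mu1 - mu0), with S the
    average within-class covariance (binary case, for illustration)."""
    S = 0.5 * (np.cov(x0, rowvar=False) + np.cov(x1, rowvar=False))
    return np.linalg.solve(S, x1.mean(axis=0) - x0.mean(axis=0))

# Gaussian equal-covariance data, where the Hotelling observer is ideal.
d = 16
x0 = rng.multivariate_normal(np.zeros(d), np.eye(d), size=2000)
x1 = rng.multivariate_normal(np.full(d, 0.5), np.eye(d), size=2000)

w = hotelling_template(x0, x1)
t0, t1 = x0 @ w, x1 @ w        # observer test statistics per class
snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t0.var() + t1.var()))
# Theoretical SNR for this setup is sqrt(d * 0.5**2 / 1) = 2.
```

The separability SNR of the template's scalar output is what the Hotelling trace generalizes; the paper's point is that the three-class generalization of this trace is not the right figure-of-merit.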
Evaluation of mechanical and thermal properties of commonly used denture base resins.
Phoenix, Rodney D; Mansueto, Michael A; Ackerman, Neal A; Jones, Robert E
2004-03-01
The purpose of this investigation was to evaluate and compare the mechanical and thermal properties of 6 commonly used polymethyl methacrylate denture base resins. Sorption, solubility, color stability, adaptation, flexural stiffness, and hardness were assessed to determine compliance with ADA Specification No. 12. Thermal assessments were performed using differential scanning calorimetry and dynamic mechanical analysis. Results were assessed using statistical and observational analyses. All materials satisfied ADA requirements for sorption, solubility, and color stability. Adaptation testing indicated that microwave-activated systems provided better adaptation to associated casts than conventional heat-activated resins. According to flexural testing results, microwaveable resins were relatively stiff, while rubber-modified resins were more flexible. Differential scanning calorimetry indicated that microwave-activated systems were more completely polymerized than conventional heat-activated materials. The microwaveable resins displayed better adaptation, greater stiffness, and greater surface hardness than other denture base resins included in this investigation. Elastomeric toughening agents yielded decreased stiffness, decreased surface hardness, and decreased glass transition temperatures.
Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara
2017-01-01
Tandem mass tags (TMT) are usually introduced at the levels of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The obtained results indicated that TMT protein labeling using intact cells is generally possible, if it is coupled to a subsequent enrichment using anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell surface proteins was observed in the case of cell labeling. In summary, the conducted study revealed first evidence for the general possibility of TMT cell labeling and highlighted limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane impermeable TMTs to increase specificity towards cell surface proteins.
Zhou, Hufeng; Gao, Shangzhi; Nguyen, Nam Ninh; Fan, Mengyuan; Jin, Jingjing; Liu, Bing; Zhao, Liang; Xiong, Geng; Tan, Min; Li, Shijun; Wong, Limsoon
2014-04-08
H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data are essential for understanding the infection mechanism of the formidable pathogen M. tuberculosis H37Rv. Computational prediction is an important strategy to fill the gap in experimental H. sapiens-M. tuberculosis H37Rv PPI data. Homology-based prediction is frequently used in predicting both intra-species and inter-species PPIs. However, some limitations are not properly resolved in several published works that predict eukaryote-prokaryote inter-species PPIs using intra-species template PPIs. We develop a stringent homology-based prediction approach by taking into account (i) differences between eukaryotic and prokaryotic proteins and (ii) differences between inter-species and intra-species PPI interfaces. We compare our stringent homology-based approach to a conventional homology-based approach for predicting host-pathogen PPIs, based on cellular compartment distribution analysis, disease gene list enrichment analysis, pathway enrichment analysis and functional category enrichment analysis. These analyses support the validity of our prediction result and clearly show that our approach has better performance in predicting H. sapiens-M. tuberculosis H37Rv PPIs. Using our stringent homology-based approach, we have predicted a set of highly plausible H. sapiens-M. tuberculosis H37Rv PPIs which might be useful for many related studies. Based on our analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent homology-based approach, we have discovered several interesting properties which are reported here for the first time. We find that both host proteins and pathogen proteins involved in the host-pathogen PPIs tend to be hubs in their own intra-species PPI networks. Both host and pathogen proteins involved in host-pathogen PPIs also tend to have longer primary sequences, more domains, and higher hydrophilicity, and the protein domains from both tend to have lower charge and to be more hydrophilic. Our stringent homology-based prediction approach provides a better strategy for predicting PPIs between eukaryotic hosts and prokaryotic pathogens than a conventional homology-based approach. The properties we have observed from the predicted H. sapiens-M. tuberculosis H37Rv PPI network are useful for understanding inter-species host-pathogen PPI networks and provide novel insights for host-pathogen interaction studies.
The Cost-Effectiveness of Nuclear Power for Navy Surface Ships
2011-05-01
shipbuilding plan. All of the Navy’s aircraft carriers (and submarines) are powered by nuclear reactors; its other surface combatants are powered by... in whether the ships were powered by conventional systems that used petroleum-based fuels or by nuclear reactors. Estimates of the relative costs... would existing ships be retrofitted with nuclear reactors. Those fuel-reduction findings are based on CBO’s analysis and on data provided to CBO by
NASA Astrophysics Data System (ADS)
Zhang, Rui; Xin, Binjie
2016-08-01
Yarn density is always considered the fundamental structural parameter for the quality evaluation of woven fabrics. The conventional yarn density measurement method is based on one-side analysis. In this paper, a novel density measurement method is developed for yarn-dyed woven fabrics based on a dual-side fusion technique. Firstly, a lab-built dual-side imaging system is established to acquire both face-side and back-side images of the woven fabric, and the affine transform is used for the alignment and fusion of the dual-side images. Then, the color images of the woven fabrics are transferred from the RGB to the CIE-Lab color space, and the intensity information extracted from the L component is used for texture fusion and analysis. Subsequently, three image fusion methods are developed and utilized to merge the dual-side images: the weighted average method, the wavelet transform method and the Laplacian pyramid blending method. The fusion efficacy of each method is evaluated by three evaluation indicators, and the best of them is selected for reconstruction of the complete fabric texture. Finally, the yarn density of the fused image is measured based on the fast Fourier transform, and the yarn alignment image can be reconstructed using the inverse fast Fourier transform. Our experimental results show that the accuracy of density measurement using the proposed method is close to 99.44% relative to the traditional method, and the robustness of the new method is better than that of conventional analysis methods.
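The final FFT-based density measurement can be sketched on a synthetic fabric image. The dual-side fusion steps are omitted; a one-sided cosine texture with a known yarn count stands in for a fused image, and the pixel pitch is an assumed calibration value.

```python
import numpy as np

def yarn_density(image, px_per_mm):
    """Yarns per mm from the dominant spatial frequency of the
    column-averaged intensity profile (FFT along the yarn-crossing axis)."""
    profile = image.mean(axis=0)
    power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    k = np.argmax(power[1:]) + 1            # skip the DC bin
    yarns_per_px = k / profile.size
    return yarns_per_px * px_per_mm

# Synthetic fabric image: 40 warp yarns across 400 px, i.e. 0.1 yarns/px.
x = np.arange(400)
row = 0.5 + 0.5 * np.cos(2 * np.pi * 40 * x / 400)
image = np.tile(row, (100, 1))

density = yarn_density(image, px_per_mm=10.0)   # expected: 1.0 yarn/mm
```

Zeroing all but the dominant bin and applying the inverse FFT to the profile would give the cleaned yarn-alignment reconstruction the paper describes.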
Granchi, Simona; Vannacci, Enrico; Biagi, Elena
2017-04-22
To evaluate the capability of the HyperSPACE (Hyper SPectral Analysis for Characterization in Echography) method in tissue characterization, in order to provide information for the laser treatment of benign thyroid nodules beyond that of conventional B-mode images and elastography. The method, based on the spectral analysis of the raw radiofrequency ultrasonic signal, was applied to characterize the nodule before and after laser treatment. Thirty patients (25 females and 5 males, aged between 37 and 81 years) with a benign thyroid nodule at cytology (Thyr 2) were evaluated by conventional ultrasonography, elastography, and HyperSPACE, before and after laser ablation. The images processed by HyperSPACE exhibit different color distributions that correspond to different tissue features. By calculating the percentages of the color coverages, the analysed nodules were subdivided into 3 groups; each nodule belonging to the same group experienced, on average, similar necrosis extension. The nodules exhibit different configuration (color) distributions that could be indicative of the response of nodular tissue to the laser treatment. Conclusions: HyperSPACE can characterize benign nodules by providing additional information with respect to conventional ultrasound and elastography, which is useful for supporting the laser treatment of nodules in order to increase the probability of success.
FEA and microstructure characterization of a one-piece Y-TZP abutment.
da Silva, Lucas Hian; Ribeiro, Sebastião; Borges, Alexandre Luís Souto; Cesar, Paulo Francisco; Tango, Rubens Nisie
2014-11-01
The most important drawback of dental implant/abutment assemblies is the need for a fixing screw. This study aimed to develop an esthetic one-piece Y-TZP abutment that eliminates the need for the screw. Material characterization was performed using a bar-shaped specimen obtained by slip-casting to validate the method prior to prototype abutment fabrication by the same process. The mechanical behavior of the prototype abutment was verified and compared with a conventional abutment by finite element analysis (FEA). The abutment was evaluated by micro-CT analysis and its density was measured. FEA showed stress concentration at the first thread pitch during installation and in the cervical region during oblique loading for both abutments. However, stress concentration was observed at the base of the screw head and stem in the conventional abutment. The relative density of the fabricated abutment was 95.68%. Micro-CT analysis revealed the presence of elongated cracks with sharp edges over the surface and porosity in the central region. In light of these findings, the behavior of a one-piece abutment is expected to be better than that of the conventional model. New studies should be conducted to clarify the performance and longevity of this one-piece Y-TZP abutment. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Dai, Yuntao; Rozema, Evelien; Verpoorte, Robert; Choi, Young Hae
2016-02-19
Natural deep eutectic solvents (NADES) have attracted a great deal of attention in recent times as promising green media. They are generally composed of neutral, acidic or basic compounds that form liquids of high viscosity when mixed in certain molar ratios. Despite their potential, the viscosity and the acidic or basic nature of some ingredients may affect the extraction capacity and the stabilizing ability for the target compounds. To investigate these effects, extraction with a series of NADES was employed for the analysis of anthocyanins in flower petals of Catharanthus roseus in combination with HPLC-DAD-based metabolic profiling. Along with the extraction yields of anthocyanins, their stability in NADES was also studied. Multivariate data analysis indicates that the lactic acid-glucose (LGH) and 1,2-propanediol-choline chloride (PCH) NADES present extraction power for anthocyanins similar to that of conventional organic solvents. Furthermore, among the NADES employed, LGH exhibits an at least three times higher stabilizing capacity for cyanidins than acidified ethanol, which facilitates their extraction and analysis. Compared to conventional organic solvents, in addition to their reduced environmental impact, NADES proved to provide higher stability for anthocyanins, and therefore have great potential as alternatives to organic solvents in health-related areas such as food, pharmaceuticals and cosmetics. Copyright © 2016 Elsevier B.V. All rights reserved.
Next generation sequencing (NGS): a golden tool in forensic toolkit.
Aly, S M; Sabri, D M
DNA analysis is a cornerstone of contemporary forensic sciences. DNA sequencing technologies are powerful tools that enriched the molecular sciences in the past based on Sanger sequencing and continue to advance these sciences based on next generation sequencing (NGS). NGS has excellent potential to flourish and increase molecular applications in forensic sciences by overcoming the pitfalls of the conventional sequencing method. The main advantages of NGS compared to the conventional method are that it utilizes a large number of genetic markers simultaneously, with high resolution of the genetic data. These advantages will help in solving several challenges, such as mixture analysis and dealing with minute, degraded samples. Based on these new technologies, many markers can be examined to obtain important biological data such as age, geographical origin, tissue type, externally visible traits and monozygotic twin identification. Data related to microbes, insects, plants and soil, which are of great medico-legal importance, can also be obtained. Despite the dozens of forensic research studies involving NGS, there are requirements to be met before using this technology routinely in forensic cases; thus, there is a great need for more studies that address the robustness of these techniques. Therefore, this work highlights the applications of forensic sciences in the era of massively parallel sequencing.
Patil, Nagaraj; Soni, Jalpa; Ghosh, Nirmalya; De, Priyadarsi
2012-11-29
Thermodynamically favored polymer-water interactions below the lower critical solution temperature (LCST) caused swelling-induced optical anisotropy (linear retardance) of thermoresponsive hydrogels based on poly(2-(2-methoxyethoxy)ethyl methacrylate). This was exploited to study the macroscopic deswelling kinetics quantitatively by a generalized polarimetry analysis method, based on measurement of the Mueller matrix and its subsequent inverse analysis via the polar decomposition approach. The derived medium polarization parameters, namely, linear retardance (δ), diattenuation (d), and depolarization coefficient (Δ), of the hydrogels showed interesting differences between the gels prepared by conventional free radical polymerization (FRP) and reversible addition-fragmentation chain transfer polymerization (RAFT) and also between dry and swollen state. The effect of temperature, cross-linking density, and polymerization technique employed to synthesize hydrogel on deswelling kinetics was systematically studied via conventional gravimetry and corroborated further with the corresponding Mueller matrix derived quantitative polarimetry characteristics (δ, d, and Δ). The RAFT gels exhibited higher swelling ratio and swelling-induced optical anisotropy compared to FRP gels and also deswelled faster at 30 °C. On the contrary, at 45 °C, deswelling was significantly retarded for the RAFT gels due to formation of a skin layer, which was confirmed and quantified via the enhanced diattenuation and depolarization parameters.
Bashir, Saba; Qamar, Usman; Khan, Farhan Hassan
2015-06-01
Conventional clinical decision support systems are based on individual classifiers, or on simple combinations of classifiers, which tend to show moderate performance. This research paper presents a novel classifier ensemble framework based on an enhanced bagging approach with a multi-objective weighted voting scheme for prediction and analysis of heart disease. The proposed model overcomes the limitations of conventional approaches by utilizing an ensemble of five heterogeneous classifiers: naïve Bayes, linear regression, quadratic discriminant analysis, an instance-based learner, and support vector machines. Five different datasets, obtained from publicly available data repositories, are used for experimentation, evaluation, and validation. Effectiveness of the proposed ensemble is investigated by comparison of results with several classifiers. Prediction results of the proposed ensemble model are assessed by ten-fold cross validation and ANOVA statistics. The experimental evaluation shows that the proposed framework handles all types of attributes and achieves a high diagnostic accuracy of 84.16 %, with 93.29 % sensitivity, 96.70 % specificity, and an 82.15 % F-measure. An F-ratio higher than the F-critical value and p-values less than 0.05 at the 95 % confidence level indicate that the results are statistically significant for most of the datasets.
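As a minimal sketch of the weighted-voting step (the paper's multi-objective scheme and its five base classifiers are considerably more elaborate), the function below combines the class labels predicted by several classifiers using per-classifier weights, such as each model's validation accuracy; all labels and weights here are hypothetical.

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Combine class labels from several base classifiers by weighted voting.

    predictions : one predicted label per base classifier
    weights     : per-classifier weights (e.g. each model's validation accuracy)
    Returns the label with the highest total weight.
    """
    scores = defaultdict(float)
    for label, w in zip(predictions, weights):
        scores[label] += w
    return max(scores, key=scores.get)

# Three of five classifiers vote "disease", but the two dissenting
# classifiers carry more weight, so the ensemble returns "healthy":
label = weighted_vote(["disease", "disease", "disease", "healthy", "healthy"],
                      [0.55, 0.52, 0.50, 0.93, 0.91])
```

A plain majority vote is the special case where every weight is equal.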
A system for the rapid detection of bacterial contamination in cell-based therapeutics
NASA Astrophysics Data System (ADS)
Bolwien, Carsten; Erhardt, Christian; Sulz, Gerd; Thielecke, Hagen; Johann, Robert; Pudlas, Marieke; Mertsching, Heike; Koch, Steffen
2010-02-01
Monitoring the sterility of cell or tissue cultures is of major concern, particularly in the fields of regenerative medicine and tissue engineering when implanting cells into the human body. Our sterility-control system is based on a Raman micro-spectrometer and is able to perform fast sterility testing on microliters of liquid samples. In conventional sterility control, samples are incubated for weeks to proliferate the contaminants to concentrations above the detection limit of conventional analysis. By contrast, our system filters particles from the liquid sample. The filter chip fabricated in microsystem technology comprises a silicon nitride membrane with millions of sub-micrometer holes to retain particles of critical sizes and is embedded in a microfluidic cell specially suited for concomitant microscopic observation. After filtration, identification is carried out on the single particle level: image processing detects possible contaminants and prepares them for Raman spectroscopic analysis. A custom-built Raman-spectrometer attachment coupled to the commercial microscope uses 532 nm or 785 nm Raman excitation and records spectra up to 3400 cm-1. In the final step, the recorded spectrum of a single particle is compared to an extensive library of GMP-relevant organisms, and classification is carried out based on a support vector machine.
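The final step pairs a recorded single-particle spectrum against a reference library. The sketch below uses a simple cosine-similarity nearest match in place of the support vector machine described above; the spectra are short toy intensity vectors and the organism names are placeholders.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two spectra given as intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def best_match(spectrum, library):
    """Return the library organism whose reference spectrum is most similar."""
    return max(library, key=lambda name: cosine_similarity(spectrum, library[name]))

library = {  # toy reference spectra (hypothetical intensities, not real Raman data)
    "E. coli":   [0.10, 0.90, 0.30, 0.05],
    "S. aureus": [0.80, 0.20, 0.10, 0.40],
}
organism = best_match([0.75, 0.25, 0.12, 0.35], library)  # -> "S. aureus"
```

A real library search would also compare against a rejection threshold so that unknown particles are not forced into the nearest class.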
PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURE ON ΔB METHOD
NASA Astrophysics Data System (ADS)
Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao
Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes stability easier to understand. The ΔB method (a simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability computational software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures using the distribution of the restraining force found along survey lines, which is based on the distribution of survey-line safety factors derived from the above-stated analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization on the basis of an example analysis.
Kuesten, Carla; Bi, Jian
2018-06-03
Conventional drivers-of-liking analysis was extended with a time dimension into temporal drivers of liking (TDOL), based on functional data analysis methodology and on non-additive models for multiple-attribute time-intensity (MATI) data. The non-additive models, which consider both the direct effects of attributes on consumer overall liking and their interaction effects, include the Choquet integral with respect to a fuzzy measure from multi-criteria decision-making, and linear regression based on variance decomposition. The dynamics of TDOL, i.e., the derivatives of the relative-importance functional curves, were also explored. The well-established R packages 'fda', 'kappalab' and 'relaimpo' were used in the paper for developing TDOL. Applied use of these methods shows that the relative importance of MATI curves offers insights into the temporal aspects of consumer liking for fruit chews.
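The Choquet integral underlying the non-additive models can be sketched in a few lines. The function below computes the discrete Choquet integral of attribute scores with respect to a fuzzy measure (capacity), analogous in spirit to what the R package 'kappalab' provides; the criteria names and capacity values are invented for illustration.

```python
def choquet_integral(values, measure):
    """Discrete Choquet integral of attribute scores w.r.t. a fuzzy measure.

    values  : dict criterion -> score
    measure : dict frozenset of criteria -> capacity in [0, 1]; it must contain
              every "upper set" encountered below, with the full set mapping to 1.
    """
    items = sorted(values.items(), key=lambda kv: kv[1])  # scores ascending
    total, prev = 0.0, 0.0
    for i, (crit, x) in enumerate(items):
        upper = frozenset(c for c, _ in items[i:])  # criteria scoring >= x
        total += (x - prev) * measure[upper]
        prev = x
    return total

# Two attributes: capacity 0.3 on {"b"} alone, 1.0 on both together.
# Integral = 0.2 * 1.0 + (0.6 - 0.2) * 0.3 = 0.32
c = choquet_integral({"a": 0.2, "b": 0.6},
                     {frozenset({"a", "b"}): 1.0, frozenset({"b"}): 0.3})
```

When the capacity is additive the Choquet integral reduces to an ordinary weighted average; the interesting cases for TDOL are the non-additive ones, where attribute interactions shift the effective weights.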
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
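A minimal illustration of why a bounded-support model can outperform a Gaussian on methylation-like data: the sketch below fits a Beta distribution by the method of moments and compares its log-likelihood against a Gaussian fit on the same data. The paper's models and inference are more sophisticated; the data values here are invented to mimic skewed, bounded beta-values.

```python
import math

def beta_mom(data):
    """Method-of-moments estimates for Beta(alpha, beta) on data in (0, 1)."""
    n = len(data)
    m = sum(data) / n
    v = sum((x - m) ** 2 for x in data) / n
    common = m * (1 - m) / v - 1
    return m * common, (1 - m) * common

def beta_loglik(data, a, b):
    """Log-likelihood of data under Beta(a, b)."""
    lbeta = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return sum((a - 1) * math.log(x) + (b - 1) * math.log(1 - x) - lbeta
               for x in data)

def normal_loglik(data):
    """Log-likelihood of data under a Gaussian with MLE mean and variance."""
    n = len(data)
    m = sum(data) / n
    v = sum((x - m) ** 2 for x in data) / n
    return -0.5 * n * (math.log(2 * math.pi * v) + 1)

# Skewed, bounded toy "beta-values" (e.g. mostly hypomethylated CpGs):
data = [0.01, 0.02, 0.02, 0.03, 0.05, 0.08, 0.12, 0.2, 0.35, 0.6]
a, b = beta_mom(data)
```

On such data the Beta log-likelihood exceeds the Gaussian one, which is the intuition behind preferring bounded-support models for methylation analysis.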
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirvani, Shervin M.; Jiang, Jing; Chang, Joe Y.
2012-12-01
Purpose: The incidence of early-stage non-small cell lung cancer (NSCLC) among older adults is expected to increase because of demographic trends and computed tomography-based screening; yet, optimal treatment in the elderly remains controversial. Using the Surveillance, Epidemiology, and End Results (SEER)-Medicare cohort spanning 2001-2007, we compared survival outcomes associated with 5 strategies used in contemporary practice: lobectomy, sublobar resection, conventional radiation therapy, stereotactic ablative radiation therapy (SABR), and observation. Methods and Materials: Treatment strategy and covariates were determined in 10,923 patients aged {>=}66 years with stage IA-IB NSCLC. Cox regression, adjusted for patient and tumor factors, compared overall and disease-specific survival for the 5 strategies. In a second exploratory analysis, propensity-score matching was used for comparison of SABR with other options. Results: The median age was 75 years, and 29% had moderate to severe comorbidities. Treatment distribution was lobectomy (59%), sublobar resection (11.7%), conventional radiation (14.8%), observation (12.6%), and SABR (1.1%). In Cox regression analysis with a median follow-up time of 3.2 years, SABR was associated with the lowest risk of death within 6 months of diagnosis (hazard ratio [HR] 0.48; 95% confidence interval [CI] 0.38-0.63; referent is lobectomy). After 6 months, lobectomy was associated with the best overall and disease-specific survival. In the propensity-score matched analysis, survival after SABR was similar to that after lobectomy (HR 0.71; 95% CI 0.45-1.12; referent is SABR). Conventional radiation and observation were associated with poor outcomes in all analyses. Conclusions: In this population-based experience, lobectomy was associated with the best long-term outcomes in fit elderly patients with early-stage NSCLC. Exploratory analysis of SABR early adopters suggests efficacy comparable with that of surgery in select populations. Evaluation of these therapies in randomized trials is urgently needed.
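The propensity-score matching used in the exploratory analysis can be sketched as greedy nearest-neighbor matching within a caliper, assuming the scores have already been estimated (e.g. by a logistic regression of treatment assignment on patient covariates); the IDs, scores, and caliper below are illustrative.

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedily pair each treated unit with the nearest unmatched control.

    treated, control : dicts id -> propensity score (assumed precomputed)
    Returns (treated_id, control_id) pairs whose score difference is
    within the caliper; each control is used at most once.
    """
    available = dict(control)
    pairs = []
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

# Toy scores: T1 matches C1, T2 matches C2; C3 is too far from everyone.
pairs = greedy_match({"T1": 0.30, "T2": 0.62},
                     {"C1": 0.28, "C2": 0.61, "C3": 0.90})
```

Survival models such as the Cox regression above are then refit on the matched pairs, which is what makes the SABR-versus-lobectomy comparison less confounded by treatment selection.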
Tissue classification for laparoscopic image understanding based on multispectral texture analysis
NASA Astrophysics Data System (ADS)
Zhang, Yan; Wirkert, Sebastian J.; Iszatt, Justin; Kenngott, Hannes; Wagner, Martin; Mayer, Benjamin; Stock, Christian; Clancy, Neil T.; Elson, Daniel S.; Maier-Hein, Lena
2016-03-01
Intra-operative tissue classification is one of the prerequisites for providing context-aware visualization in computer-assisted minimally invasive surgeries. As many anatomical structures are difficult to differentiate in conventional RGB medical images, we propose a classification method based on multispectral image patches. In a comprehensive ex vivo study we show (1) that multispectral imaging data is superior to RGB data for organ tissue classification when used in conjunction with widely applied feature descriptors and (2) that combining the tissue texture with the reflectance spectrum improves the classification performance. Multispectral tissue analysis could thus evolve as a key enabling technique in computer-assisted laparoscopy.
Longo, Dario Livio; Dastrù, Walter; Consolino, Lorena; Espak, Miklos; Arigoni, Maddalena; Cavallo, Federica; Aime, Silvio
2015-07-01
The objective of this study was to compare a clustering approach to conventional analysis methods for assessing changes in pharmacokinetic parameters obtained from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) during antiangiogenic treatment in a breast cancer model. BALB/c mice bearing established transplantable her2+ tumors were treated with a DNA-based antiangiogenic vaccine or with an empty plasmid (untreated group). DCE-MRI was carried out by administering a dose of 0.05 mmol/kg of Gadocoletic acid trisodium salt, a Gd-based blood pool contrast agent (CA) at 1T. Changes in pharmacokinetic estimates (K(trans) and vp) in a nine-day interval were compared between treated and untreated groups on a voxel-by-voxel analysis. The tumor response to therapy was assessed by a clustering approach and compared with conventional summary statistics, with sub-regions analysis and with histogram analysis. Both the K(trans) and vp estimates, following blood-pool CA injection, showed marked and spatial heterogeneous changes with antiangiogenic treatment. Averaged values for the whole tumor region, as well as from the rim/core sub-regions analysis were unable to assess the antiangiogenic response. Histogram analysis resulted in significant changes only in the vp estimates (p<0.05). The proposed clustering approach depicted marked changes in both the K(trans) and vp estimates, with significant spatial heterogeneity in vp maps in response to treatment (p<0.05), provided that DCE-MRI data are properly clustered in three or four sub-regions. This study demonstrated the value of cluster analysis applied to pharmacokinetic DCE-MRI parametric maps for assessing tumor response to antiangiogenic therapy. Copyright © 2015 Elsevier Inc. All rights reserved.
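A minimal version of the voxel-wise clustering of pharmacokinetic maps: the sketch below runs a tiny k-means over per-voxel [K(trans), vp] feature vectors. The initial centroids are fixed to keep the example deterministic, the voxel values are hypothetical, and the study's actual clustering procedure may differ.

```python
import numpy as np

def kmeans(X, centroids, n_iter=50):
    """Minimal k-means over voxel feature vectors (e.g. [Ktrans, vp] per voxel).

    centroids : initial (k, d) array; fixed here so the sketch is deterministic.
    Returns the final voxel labels and centroids.
    """
    C = np.asarray(centroids, dtype=float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # assign each voxel to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centroids of non-empty clusters
        for k in range(len(C)):
            if np.any(labels == k):
                C[k] = X[labels == k].mean(axis=0)
    return labels, C

# Toy voxels: a poorly perfused core and a well-perfused rim (invented values)
X = np.array([[0.02, 0.01], [0.03, 0.02], [0.25, 0.10], [0.30, 0.12]])
labels, C = kmeans(X, centroids=[[0.02, 0.01], [0.30, 0.12]])
```

Treatment response is then assessed per sub-region by comparing the parameter distributions of each cluster before and after therapy, rather than averaging over the whole tumor.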
Impacts on Conventional Generators | Energy Analysis | NREL
NREL is working to understand and quantify the impacts of increased penetrations of solar and wind generation on conventional plant efficiency and emissions.
Tutorial: Crystal orientations and EBSD — Or which way is up?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Britton, T.B., E-mail: b.britton@imperial.ac.uk; Jiang, J.; Guo, Y.
2016-07-15
Electron backscatter diffraction (EBSD) is an automated technique that can measure the orientation of crystals in a sample very rapidly. There are many sophisticated software packages that present measured data. Unfortunately, due to crystal symmetry and differences in the set-up of microscope and EBSD software, there may be accuracy issues when linking the crystal orientation to a particular microstructural feature. In this paper we outline a series of conventions used to describe crystal orientations and coordinate systems. These conventions have been used to successfully demonstrate that a consistent frame of reference is used in the sample, unit cell, pole figure and diffraction pattern frames of reference. We establish a coordinate system rooted in measurement of the diffraction pattern and subsequently link this to all other coordinate systems. A fundamental outcome of this analysis is to note that the beamshift coordinate system needs to be precisely defined for consistent 3D microstructure analysis. This is supported through a series of case studies examining particular features of the microscope settings and/or unambiguous crystallographic features. These case studies can be generated easily in most laboratories and represent an opportunity to demonstrate confidence in use of recorded orientation data. Finally, we include a simple software tool, written in both MATLAB® and Python, which the reader can use to compare consistency with their own microscope set-up and which may act as a springboard for further offline analysis. - Highlights: • Presentation of conventions used to describe crystal orientations • Three case studies that outline how conventions are consistent • Demonstrates a pathway for calibration and validation of EBSD based orientation measurements • EBSD computer code supplied for validation by the reader.
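In the spirit of the simple software tool mentioned above, the sketch below builds an orientation matrix from Bunge Euler angles in the common Z-X'-Z'' convention. Since the paper's point is precisely that conventions differ between set-ups, treat the rotation order and sign choices here as one possible convention rather than the paper's own.

```python
import numpy as np

def bunge_to_matrix(phi1, Phi, phi2):
    """Orientation matrix from Bunge Euler angles (radians), Z-X'-Z'' order:
    rotate phi1 about Z, then Phi about the new X, then phi2 about the new Z.
    The result maps sample-frame vectors into the crystal frame."""
    c1, s1 = np.cos(phi1), np.sin(phi1)
    c, s = np.cos(Phi), np.sin(Phi)
    c2, s2 = np.cos(phi2), np.sin(phi2)
    Z1 = np.array([[c1, s1, 0], [-s1, c1, 0], [0, 0, 1]])
    X = np.array([[1, 0, 0], [0, c, s], [0, -s, c]])
    Z2 = np.array([[c2, s2, 0], [-s2, c2, 0], [0, 0, 1]])
    return Z2 @ X @ Z1

# Zero angles: crystal axes coincide with the sample axes (identity matrix)
R = bunge_to_matrix(0.0, 0.0, 0.0)
```

Any such matrix should be a proper rotation (orthogonal, determinant +1); checking that invariant against a package's output is a quick consistency test of the kind the paper advocates.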
Kiefer, Lukas; Menzel, Friederike; Bahrs, Enno
2014-12-01
The reduction of product-related greenhouse gas (GHG) emissions in milk production appears to be necessary. The reduction of emissions on an individual farm might be highly accepted by farm owners if it were accompanied by an increase in profitability. Using life cycle assessments to determine the product carbon footprints (PCF) and farm-level evaluations to record profitability, we explored opportunities for optimization based on analysis of 81 organic and conventional pasture-based dairy farms in southern Germany. The objective of the present study was to detect common determining factors for low PCF and high management incomes (MI) to achieve GHG reductions at the lowest possible operational cost. In our sample, organic farms, which performed economically better than conventional farms, produced PCF that were significantly higher than those produced by conventional farms [1.61 ± 0.29 vs. 1.45 ± 0.28 kg of CO₂ equivalents (CO₂eq) per kg of milk; means ± SD]. A multiple linear regression analysis of the sample demonstrated that low feed demand per kilogram of milk, high grassland yield, and low forage area requirements per cow are the main factors that decrease PCF. These factors are also useful for improving a farm's profitability in principle. For organic farms, a reduction of feed demand of 100 g/kg of milk resulted in a PCF reduction of 105 g of CO₂eq/kg of milk and an increase in MI of approximately 2.1 euro cents (c)/kg of milk. For conventional farms, a decrease of feed demand of 100 g/kg of milk corresponded to a reduction in PCF of 117 g of CO₂eq/kg of milk and an increase in MI of approximately 3.1 c/kg of milk. Accordingly, farmers could achieve higher profits while reducing GHG emissions. Improved education and training of farmers and consultants regarding GHG mitigation and farm profitability appear to be the best methods of improving efficiency under traditional and organic farming practices.
Analysis of accelerated motion in the theory of relativity
NASA Technical Reports Server (NTRS)
Jones, R. T.
1976-01-01
Conventional treatments of accelerated motion in the theory of relativity have led to certain difficulties of interpretation. Certain reversals in the apparent gravitational field of an accelerated body may be avoided by simpler analysis based on the use of restricted conformal transformations. In the conformal theory the velocity of light remains constant even for experimenters in accelerated motion. The problem considered is that of rectilinear motion with a variable velocity. The motion takes place along the x or x' axis of two coordinate systems.
Evaluation of direct and indirect additive manufacture of maxillofacial prostheses.
Eggbeer, Dominic; Bibb, Richard; Evans, Peter; Ji, Lu
2012-09-01
The efficacy of computer-aided technologies in the design and manufacture of maxillofacial prostheses has not been fully proven. This paper presents research into the evaluation of direct and indirect additive manufacture of a maxillofacial prosthesis against conventional laboratory-based techniques. An implant/magnet-retained nasal prosthesis case from a UK maxillofacial unit was selected as a case study. A benchmark prosthesis was fabricated using conventional laboratory-based techniques for comparison against additive manufactured prostheses. For the computer-aided workflow, photogrammetry, computer-aided design and additive manufacture (AM) methods were evaluated in direct prosthesis body fabrication and indirect production using an additively manufactured mould. Qualitative analysis of position, shape, colour and edge quality was undertaken. Mechanical testing to ISO standards was also used to compare the silicone rubber used in the conventional prosthesis with the AM material. Critical evaluation has shown that utilising a computer-aided workflow can produce a prosthesis body that is comparable to that produced using existing best practice. Technical limitations currently prevent the direct fabrication method demonstrated in this paper from being clinically viable. This research helps prosthesis providers understand the application of a computer-aided approach and guides technology developers and researchers to address the limitations identified.
Fatania, Nita; Fraser, Mark; Savage, Mike; Hart, Jason; Abdolrasouli, Alireza
2015-12-01
Performance of matrix-assisted laser desorption ionisation-time of flight mass spectrometry (MALDI-TOF MS) was compared in a side-by-side analysis with the conventional phenotypic methods currently in use in our laboratory for identification of yeasts in a routine diagnostic setting. A diverse collection of 200 clinically important yeasts (19 species, five genera) were identified by both methods using standard protocols. Discordant or unreliable identifications were resolved by sequencing of the internal transcribed spacer region of the rRNA gene. MALDI-TOF and conventional methods were in agreement for 182 isolates (91%) with correct identification to species level. Eighteen discordant results (9%) were due to rarely encountered species, hence the difficulty in their identification using traditional phenotypic methods. MALDI-TOF MS enabled rapid, reliable and accurate identification of clinically important yeasts in a routine diagnostic microbiology laboratory. Isolates with rare, unusual or low probability identifications should be confirmed using robust molecular methods. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Fast, high temperature and thermolabile GC--MS in supersonic molecular beams
NASA Astrophysics Data System (ADS)
Dagan, Shai; Amirav, Aviv
1994-05-01
This work describes and evaluates the coupling of a fast gas chromatograph (GC) based on a short column and a high carrier gas flow rate to a supersonic molecular beam mass spectrometer (MS). A 50 cm long megabore column serves for fast GC separation and connects the injector to the supersonic nozzle source. Sampling is achieved with a conventional syringe-based splitless sample injection. The injector contains no septum and is open to the atmosphere. The linear velocity of the carrier gas is controlled by a by-pass (make-up) gas flow introduced after the column and prior to the supersonic nozzle. The supersonic expansion serves as a jet separator, and the skimmed supersonic molecular beam (SMB) is highly enriched with the heavier organic molecules. The supersonic molecular beam constituents are ionized either by electron impact (EI) or hyperthermal surface ionization (HSI) and mass analyzed. A 1 s fast GC-MS of four aromatic molecules in methanol is demonstrated and some fundamental aspects of fast GC-MS with time limit constraints are outlined. The flow control (programming) of the speed of analysis is shown, and the analysis of thermolabile and relatively non-volatile molecules is demonstrated and discussed. The tail-free, fast GC-MS of several mixtures is shown and the peak tailing of caffeine is compared with that of conventional GC-MS. The improvement of the peak shapes with the SMB-MS is analyzed with respect to the elimination of thermal vacuum chamber background. The extrapolated minimum detected amount was about 400 ag of anthracene-d10, with an elution time shorter than 2 s. Repetitive injections could be performed within less than 10 s. Fast GC-MS in SMB seems to be ideal for fast target compound analysis even in real-world, complex mixtures. The few-seconds GC-MS separation and quantification of lead (as tetraethyllead) in gasoline, caffeine in coffee, and codeine in a drug is demonstrated. Controlled HSI selectivity is demonstrated in the range of 10¹ to 10⁴ anthracene/decane, which helped to simplify the selective analysis of aromatic molecules in gasoline. The contribution of the SMB to the operation of the fast GC-MS is summarized and the compatibility with a conventional GC having a megabore column is shown. Splitless injections of 100 μL sample solutions for trace-level concentration detection are also presented (with a conventional GC).
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive the end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using that technique. In this paper, we present a new reachability-based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called the clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
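Plain (untimed) reachability analysis, which the clock-stamped state classes extend with timing information, can be sketched as a breadth-first enumeration of markings; the two-place net below is a toy example, not the C2 model from the paper.

```python
from collections import deque

def reachable_markings(initial, transitions):
    """Breadth-first enumeration of the reachable markings of a plain Petri net.

    initial     : tuple of token counts, one entry per place
    transitions : list of (consume, produce) pairs of per-place token vectors
    TPN state-class analysis augments each such marking with clock intervals,
    which is what enables end-to-end delay computation.
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            # a transition is enabled if every place holds enough tokens
            if all(have >= need for have, need in zip(m, consume)):
                nxt = tuple(h - c + p for h, c, p in zip(m, consume, produce))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

# Two places, one transition moving a token from place 0 to place 1:
marks = reachable_markings((2, 0), [((1, 0), (0, 1))])  # {(2,0), (1,1), (0,2)}
```

The BFS terminates here because the net is bounded; for unbounded nets the reachability set is infinite and coverability techniques are needed instead.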
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
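A minimal DFA implementation makes the detrending operation studied above concrete: integrate the series, remove a local polynomial trend in each window, and average the RMS residuals; the scaling exponent is then the slope of log F(n) versus log n. This sketch is standard first-order DFA, not the paper's analytical frequency-response machinery.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: fluctuation function F(n) per scale n.

    For each scale n, the integrated series (profile) is split into
    non-overlapping windows, a polynomial of the given order is removed
    from each window, and the RMS residual is averaged over windows.
    """
    y = np.cumsum(x - np.mean(x))  # profile (integrated series)
    F = []
    for n in scales:
        n_win = len(y) // n
        ms = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, order)           # local trend
            ms.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

# Scaling exponent alpha from a log-log fit; ~0.5 is expected for white noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(2**14)
scales = [16, 32, 64, 128, 256]
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
```

Raising `order` gives the higher-order DFA variants whose frequency responses the paper analyzes; DMA replaces the per-window polynomial with a moving-average trend.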
Rosen, Jules; Mulsant, Benoit H; Marino, Patricia; Groening, Christopher; Young, Robert C; Fox, Debra
2008-10-30
Despite the importance of establishing shared scoring conventions and assessing interrater reliability in clinical trials in psychiatry, these elements are often overlooked. Obstacles to rater training and reliability testing include logistic difficulties in providing live training sessions, or mailing videotapes of patients to multiple sites and collecting the data for analysis. To address some of these obstacles, a web-based interactive video system was developed. It uses actors of diverse ages, gender and race to train raters how to score the Hamilton Depression Rating Scale and to assess interrater reliability. This system was tested with a group of experienced and novice raters within a single site. It was subsequently used to train raters of a federally funded multi-center clinical trial on scoring conventions and to test their interrater reliability. The advantages and limitations of using interactive video technology to improve the quality of clinical trials are discussed.
Jin, Hong-Ying; Li, Da-Wei; Zhang, Na; Gu, Zhen; Long, Yi-Tao
2015-06-10
We demonstrated a practical method to analyze carbohydrate-protein interactions based on single plasmonic nanoparticles by conventional dark field microscopy (DFM). The protein concanavalin A (ConA) was modified on large gold nanoparticles (AuNPs), and dextran was conjugated on small AuNPs. As the interaction between ConA and dextran caused the two kinds of gold nanoparticles to couple together, with the attendant coupling of their plasmonic oscillations, apparent color changes (from green to yellow) of the single AuNPs were observed through DFM. The color information was then instantly transformed into a statistical peak-wavelength distribution in less than 1 min by a self-developed statistical program (nanoparticleAnalysis). In addition, the interaction between ConA and dextran was proved to involve biospecific recognition. This high-throughput, real-time approach is a convenient method to analyze carbohydrate-protein interactions efficiently at the single-nanoparticle level.
Field-Controlled Electrical Switch with Liquid Metal.
Wissman, James; Dickey, Michael D; Majidi, Carmel
2017-12-01
When immersed in an electrolyte, droplets of Ga-based liquid metal (LM) alloy can be manipulated in ways not possible with conventional electrocapillarity or electrowetting. This study demonstrates how LM electrochemistry can be exploited to coalesce and separate droplets under moderate voltages of ~1-10 V. This novel approach to droplet interaction can be explained with a theory that accounts for oxidation and reduction as well as fluidic instabilities. Based on simulations and experimental analysis, this study finds that droplet separation is governed by a unique limit-point instability that arises from gradients in bipolar electrochemical reactions that lead to gradients in interfacial tension. The LM coalescence and separation are used to create a field-programmable electrical switch. As with conventional relays or flip-flop latch circuits, the system can transition between bistable (separated or coalesced) states, making it useful for memory storage, logic, and shape-programmable circuitry using entirely liquids instead of solid-state materials.
Bottom, William P
2009-01-01
Conventional history of the predominant, research-based model of business education (RBM) traces its origins to programs initiated by the Ford Foundation after World War II. This paper maps the elite network responsible for developing behavioral science and the Ford Foundation agenda. Archival records of the actions taken by central nodes in the network permit identification of the original vision statement for the model. Analysis also permits tracking progress toward realizing that vision over several decades. Behavioral science was married to business education from the earliest stages of development. The RBM was a fundamental promise made by advocates for social science funding. Appraisals of the model and recommendations for reform must address its full history, not the partial, distorted view that is the conventional account. Implications of this more complete history for business education and for behavioral theory are considered.
[The use of fenspiride for the combined treatment of exacerbation of chronic laryngitis].
Ryabova, M A
The present study was carried out at the Department of Otorhinolaryngology of I.P. Pavlov First State Medical University of Saint-Petersburg. The objective of this work was to elucidate the efficacy and safety of fenspiride therapy for the treatment of exacerbation of chronic laryngitis associated with an acute respiratory infection. The patients comprising the main group received fenspiride (Eurespal, 'Servier', France) at the standard dose in addition to the conventional therapy with the use of antibiotics, inhalation, and voice rest. The patients in the comparison group were treated following the conventional protocol without fenspiride. The clinical symptoms evaluated based on the scoring system, the results of videolaryngoscopy, and computer-assisted analysis of the voice were compared before and after treatment in the patients of both groups. The results of the study have confirmed the high effectiveness and safety of fenspiride therapy for exacerbation of chronic laryngitis.
NASA Astrophysics Data System (ADS)
Yeo, U. J.; Taylor, M. L.; Kron, T.; Pham, D.; Siva, S.; Franich, R. D.
2013-06-01
Respiratory motion induces dosimetric uncertainties for thoracic and abdominal cancer radiotherapy (RT) due to deforming and moving anatomy. This study investigates the extent of dosimetric differences between conventional 3D treatment planning and path-integrated 4D treatment planning in liver stereotactic body radiotherapy (SBRT). Respiratory-correlated 4DCT image sets with 10 phases were acquired for patients with liver tumours. Path-integrated 4D dose accumulation was performed using dose-warping techniques based on deformable image registration. Dose-volume histogram analysis demonstrated that the 3D planning approach overestimated doses to targets by up to 24% and underestimated dose to normal liver by ~4.5%, compared to the 4D planning methodology. Therefore, 4D planning has the potential to quantify such issues of under- and/or over-dosage and improve treatment accuracy.
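The 3D-versus-4D comparison above rests on dose-volume histogram (DVH) analysis. As an illustrative sketch only (not the authors' code; the voxel doses and thresholds below are invented), a cumulative DVH reports the fraction of voxels receiving at least each threshold dose:

```python
def cumulative_dvh(doses, bin_edges):
    """Fraction of voxels receiving at least each threshold dose (in Gy)."""
    n = len(doses)
    return [sum(1 for d in doses if d >= edge) / n for edge in bin_edges]

# toy example: a 3D plan overestimating target dose relative to 4D accumulation
doses_3d = [60, 58, 61, 59, 62, 57]   # hypothetical per-voxel doses, 3D plan
doses_4d = [55, 50, 57, 52, 56, 48]   # same voxels after 4D dose warping
print(cumulative_dvh(doses_3d, [50, 55, 60]))  # → [1.0, 1.0, 0.5]
print(cumulative_dvh(doses_4d, [50, 55, 60]))
```

Comparing the two curves at fixed thresholds is how a dose over- or under-estimate such as the reported 24% would be quantified.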
Denoising time-domain induced polarisation data using wavelet techniques
NASA Astrophysics Data System (ADS)
Deo, Ravin N.; Cull, James P.
2016-05-01
Time-domain induced polarisation (TDIP) methods are routinely used for near-surface evaluations in quasi-urban environments harbouring networks of buried civil infrastructure. A conventional technique for improving the signal-to-noise ratio in such environments is analogue or digital low-pass filtering followed by stacking and rectification. However, this induces large distortions in the processed data. In this study, we have conducted the first application of wavelet-based denoising techniques for processing raw TDIP data. Our investigation included laboratory and field measurements to better understand the advantages and limitations of this technique. It was found that distortions arising from conventional filtering can be largely avoided with the use of wavelet-based denoising techniques. With recent advances in full-waveform acquisition and analysis, incorporation of wavelet denoising techniques can further enhance surveying capabilities. In this work, we present the rationale for utilising wavelet denoising methods and discuss some important implications, which can positively influence TDIP methods.
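Wavelet denoising of the kind described transforms the trace, shrinks the small (noise-dominated) detail coefficients, and inverts the transform. A minimal single-level Haar sketch in pure Python, purely illustrative (real TDIP processing would use multi-level transforms, better wavelets, and data-driven thresholds):

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoise: transform, soft-threshold the
    detail coefficients, inverse transform. Assumes even-length input."""
    s2 = 2 ** 0.5
    approx = [(a + b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    # soft threshold: shrink each detail coefficient toward zero
    soft = lambda d: max(abs(d) - threshold, 0.0) * (1 if d >= 0 else -1)
    detail = [soft(d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s2, (a - d) / s2]
    return out
```

With `threshold=0` the round trip reproduces the input, which is the basic sanity check that the transform pair is correct; with a large threshold each sample pair collapses to its local average, suppressing high-frequency noise without the phase distortion of a causal low-pass filter.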
Model-based spectral estimation of Doppler signals using parallel genetic algorithms.
Solano González, J; Rodríguez Vázquez, K; García Nocetti, D F
2000-05-01
Conventional spectral analysis methods use a fast Fourier transform (FFT) on consecutive or overlapping windowed data segments. For Doppler ultrasound signals, this approach suffers from inadequate frequency resolution due to the time segment duration and the non-stationary characteristics of the signals. Parametric or model-based estimators can give significant improvements in time-frequency resolution at the expense of a higher computational complexity. This work describes an approach which implements in real-time a parametric spectral estimation method using genetic algorithms (GAs) in order to find the optimum set of parameters for the adaptive filter that minimises the error function. The aim is to reduce the computational complexity of the conventional algorithm by using the simplicity associated with GAs and exploiting their parallel characteristics. This will allow the implementation of higher-order filters, increasing the spectrum resolution and opening a greater scope for using more complex methods.
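The paper's implementation is parallel and targets higher-order adaptive filters; as a hedged toy of the same idea (all parameters invented), a tiny GA can search for the coefficients of an AR(2) model that minimise the one-step prediction error on a synthetic signal:

```python
import random

def ar2_error(coefs, x):
    """Sum of squared one-step prediction errors of an AR(2) model."""
    a1, a2 = coefs
    return sum((x[n] - a1 * x[n - 1] - a2 * x[n - 2]) ** 2
               for n in range(2, len(x)))

def ga_fit_ar2(x, pop_size=40, gens=60, seed=1):
    """Tiny real-coded GA: sort by error, keep the best half,
    refill with Gaussian-mutated copies of the survivors."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2), rng.uniform(-1, 1)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda c: ar2_error(c, x))
        survivors = pop[:pop_size // 2]
        pop = survivors + [[c + rng.gauss(0, 0.05) for c in rng.choice(survivors)]
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda c: ar2_error(c, x))

# synthetic AR(2) signal with known coefficients (1.0, -0.5)
rng = random.Random(0)
x = [0.0, 0.0]
for _ in range(300):
    x.append(1.0 * x[-1] - 0.5 * x[-2] + rng.gauss(0, 0.1))
best = ga_fit_ar2(x)
```

The fitted model's power spectrum would then be read off the AR coefficients, which is where the time-frequency resolution advantage over segment-wise FFTs comes from.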
Examination of simplified travel demand model. [Internal Volume Forecasting model]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, R.L. Jr.; McFarlane, W.J.
1978-01-01
A simplified travel demand model, the Internal Volume Forecasting (IVF) model, proposed by Low in 1972, is evaluated as an alternative to the conventional urban travel demand modeling process. The calibration of the IVF model for a county-level study area in Central Wisconsin results in what appears to be a reasonable model; however, analysis of the structure of the model reveals two primary mis-specifications. Correction of the mis-specifications leads to a simplified gravity model version of the conventional urban travel demand models. Application of the original IVF model to "forecast" 1960 traffic volumes based on the model calibrated for 1970 produces accurate estimates. Shortcut and ad hoc models may appear to provide reasonable results in both the base and horizon years; however, as shown by the IVF model, such models will not always provide a reliable basis for transportation planning and investment decisions.
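The simplified gravity model that the corrected IVF reduces to distributes trips in proportion to zonal productions, attractions, and a decaying function of travel cost. A sketch with invented zone data (the specific impedance function and scaling are assumptions, not taken from Low's model):

```python
import math

def gravity_trips(productions, attractions, cost, beta=0.1):
    """Simple gravity model: T_ij proportional to P_i * A_j * exp(-beta*c_ij),
    scaled so total trips match total productions."""
    raw = [[p * a * math.exp(-beta * cost[i][j])
            for j, a in enumerate(attractions)]
           for i, p in enumerate(productions)]
    total = sum(sum(row) for row in raw)
    k = sum(productions) / total          # single global scaling factor
    return [[k * t for t in row] for row in raw]

# two hypothetical zones: trips concentrate on the low-cost interchange
trips = gravity_trips([100.0, 200.0], [150.0, 150.0], [[1.0, 5.0], [5.0, 1.0]])
```

A full conventional model would iterate row and column balancing factors; the single-factor scaling here is the "simplified" part.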
Reliability of patient specific instrumentation in total knee arthroplasty.
Jennart, Harold; Ngo Yamben, Marie-Ange; Kyriakidis, Theofylaktos; Zorman, David
2015-12-01
The aim of this study was to compare the precision of Patient Specific Instrumentation (PSI) and Conventional Instrumentation (CI) as determined intra-operatively by a pinless navigation system. Eighty patients were included in this prospective comparative study and were divided into two homogeneous groups. We defined an original score from 6 to 30 points to evaluate the accuracy of the position of the cutting guides. This score is based on 6 objective criteria. The analysis indicated that PSI was not superior to conventional instrumentation in the overall score (p = 0.949). Moreover, no statistically significant difference was observed for any individual criterion of our score. Level of evidence II.
Prediction of Groundwater Level at Slope Areas using Electrical Resistivity Method
NASA Astrophysics Data System (ADS)
Baharuddin, M. F. T.; Hazreek, Z. A. M.; Azman, M. A. A.; Madun, A.
2018-04-01
Groundwater level plays an important role as an agent that triggers landslides. Commonly, the conventional method used to monitor the groundwater level is done by using standpipe piezometer. There were several disadvantages of the conventional method related to cost, time and data coverage. The aim of this study is to determine groundwater level at slope areas using electrical resistivity method and to verify groundwater level of the study area with standpipe piezometer data. The data acquisition was performed using ABEM Terrameter SAS4000. For data analysis and processing, RES2DINV and SURFER were used. The groundwater level was calibrated with reference of standpipe piezometer based on electrical resistivity value (ERV).
Ingraffea, Anthony R; Wells, Martin T; Santoro, Renee L; Shonkoff, Seth B C
2014-07-29
Casing and cement impairment in oil and gas wells can lead to methane migration into the atmosphere and/or into underground sources of drinking water. An analysis of 75,505 compliance reports for 41,381 conventional and unconventional oil and gas wells in Pennsylvania drilled from January 1, 2000-December 31, 2012, was performed with the objective of determining complete and accurate statistics of casing and cement impairment. Statewide data show a sixfold higher incidence of cement and/or casing issues for shale gas wells relative to conventional wells. The Cox proportional hazards model was used to estimate risk of impairment based on existing data. The model identified both temporal and geographic differences in risk. For post-2009 drilled wells, risk of a cement/casing impairment is 1.57-fold [95% confidence interval (CI) (1.45, 1.67); P < 0.0001] higher in an unconventional gas well relative to a conventional well drilled within the same time period. Temporal differences between well types were also observed and may reflect more thorough inspections and greater emphasis on finding well leaks, more detailed note taking in the available inspection reports, or real changes in rates of structural integrity loss due to rushed development or other unknown factors. Unconventional gas wells in northeastern (NE) Pennsylvania are at a 2.7-fold higher risk relative to the conventional wells in the same area. The predicted cumulative risk for all wells (unconventional and conventional) in the NE region is 8.5-fold [95% CI (7.16, 10.18); P < 0.0001] greater than that of wells drilled in the rest of the state.
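The ratios above come from a Cox proportional hazards model fitted to inspection histories, which needs survival times and censoring and is beyond a short sketch. As a hedged back-of-envelope analogue only (invented counts, not the study's data), a crude incidence ratio with a log-normal confidence interval shows the general shape of estimates like 1.57 [1.45, 1.67]:

```python
import math

def rate_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Crude incidence ratio of group A vs group B with a log-normal 95% CI.
    Requires nonzero event counts in both groups."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(rate ratio)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# hypothetical: 60 impairments in 1000 unconventional wells vs 10 in 1000 conventional
print(rate_ratio_ci(60, 1000, 10, 1000))
```

A Cox model additionally adjusts for when each well entered service and was inspected, which is what allows the temporal and geographic contrasts the authors report.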
Gassner, C; Karlsson, R; Lipsmeier, F; Moelleken, J
2018-05-30
Previously we have introduced two SPR-based assay principles (dual-binding assay and bridging assay), which allow the determination of two out of three possible interaction parameters for bispecific molecules within one assay setup: two individual interactions to both targets, and/or one simultaneous/overall interaction, which potentially reflects the inter-dependency of both individual binding events. However, activity and similarity are determined by comparing report points over a concentration range, which also mirrors the way data is generated by conventional ELISA-based methods. So far, binding kinetics have not been specifically considered in generic approaches for activity assessment. Here, we introduce an improved slope-ratio model which, together with a sensorgram-comparison-based similarity assessment, allows the development of a detailed, USP-conformal ligand binding assay using only a single sample concentration. We compare this novel analysis method to the usual concentration-range approach for both SPR-based assay principles and discuss its impact on data quality and increased sample throughput. Copyright © 2018 Elsevier B.V. All rights reserved.
Cluster-based exposure variation analysis
2013-01-01
Background Static posture, repetitive movements and lack of physical variation are known risk factors for work-related musculoskeletal disorders, and thus need to be properly assessed in occupational studies. The aims of this study were (i) to investigate the effectiveness of a conventional exposure variation analysis (EVA) in discriminating exposure time lines and (ii) to compare it with a new cluster-based method for analysis of exposure variation. Methods For this purpose, we simulated a repeated cyclic exposure varying within each cycle between “low” and “high” exposure levels in a “near” or “far” range, and with “low” or “high” velocities (exposure change rates). The duration of each cycle was also manipulated by selecting a “small” or “large” standard deviation of the cycle time. These parameters reflected three dimensions of exposure variation, i.e. range, frequency and temporal similarity. Each simulation trace included two realizations of 100 concatenated cycles with either low (ρ = 0.1), medium (ρ = 0.5) or high (ρ = 0.9) correlation between the realizations. These traces were analyzed by conventional EVA, and a novel cluster-based EVA (C-EVA). Principal component analysis (PCA) was applied on the marginal distributions of 1) the EVA of each of the realizations (univariate approach), 2) a combination of the EVA of both realizations (multivariate approach) and 3) C-EVA. The least number of principal components describing more than 90% of variability in each case was selected and the projection of marginal distributions along the selected principal component was calculated. A linear classifier was then applied to these projections to discriminate between the simulated exposure patterns, and the accuracy of classified realizations was determined.
Results C-EVA classified exposures more correctly than univariate and multivariate EVA approaches; classification accuracy was 49%, 47% and 52% for EVA (univariate and multivariate), and C-EVA, respectively (p < 0.001). All three methods performed poorly in discriminating exposure patterns differing with respect to the variability in cycle time duration. Conclusion While C-EVA had a higher accuracy than conventional EVA, both failed to detect differences in temporal similarity. The data-driven optimality of data reduction and the capability of handling multiple exposure time lines in a single analysis are the advantages of the C-EVA. PMID:23557439
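Conventional EVA, as referenced above, tabulates what fraction of total time the exposure spends at each combination of level band and uninterrupted spell duration. A minimal sketch of that joint matrix (bin edges and the toy trace are invented; the clustering step of C-EVA is not shown):

```python
def eva_matrix(trace, level_edges, dur_edges):
    """Conventional EVA: fraction of total time spent in each
    (level band, uninterrupted spell-duration band) cell."""
    def level_bin(v):
        return sum(v >= e for e in level_edges)
    def dur_bin(d):
        return sum(d >= e for e in dur_edges)
    n_lev, n_dur = len(level_edges) + 1, len(dur_edges) + 1
    counts = [[0] * n_dur for _ in range(n_lev)]
    cur, run = level_bin(trace[0]), 1
    for v in trace[1:]:
        b = level_bin(v)
        if b == cur:
            run += 1
        else:
            counts[cur][dur_bin(run)] += run   # close the finished spell
            cur, run = b, 1
    counts[cur][dur_bin(run)] += run           # close the final spell
    total = len(trace)
    return [[c / total for c in row] for row in counts]
```

PCA on the flattened (marginal) distributions of such matrices, followed by a linear classifier, is the comparison pipeline the study describes.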
Fernández-Arévalo, T; Lizarralde, I; Fdz-Polanco, F; Pérez-Elvira, S I; Garrido, J M; Puig, S; Poch, M; Grau, P; Ayesa, E
2017-07-01
The growing development of technologies and processes for resource treatment and recovery is offering endless possibilities for creating new plant-wide configurations or modifying existing ones. However, the complexity of the configurations, the interrelation between technologies and the influent characteristics turn decision-making into a complex or unobvious process. In this frame, the Plant-Wide Modelling (PWM) library presented in this paper allows a thorough, comprehensive and refined analysis of different plant configurations, which is a basic requirement for decision-making from an energy and resource recovery perspective. In order to demonstrate the potential of the library and the need to run simulation analyses, this paper carries out a comparative techno-economic analysis of WWTPs. The selected layouts were (1) a conventional WWTP based on a modified version of the Benchmark Simulation Model No. 2, (2) an upgraded or retrofitted WWTP, and (3) a new Wastewater Resource Recovery Facility (WRRF) concept termed the C/N/P decoupling WWTP. The study was based on a preliminary analysis of the organic matter and nutrient energy use and recovery options, a comprehensive mass and energy flux distribution analysis in each configuration in order to compare and identify areas for improvement, and a cost analysis of each plant for different influent COD/TN/TP ratios. Analysing the plants from the standpoint of resource and energy utilization, a low utilization of the energy content of the components could be observed in all configurations. The COD used to produce biogas was around 29% in the conventional plant, around 36% in the upgraded plant, and 34% in the C/N/P decoupling WWTP.
With regard to the self-sufficiency of plants, achieving self-sufficiency was not possible in the conventional plant, in the upgraded plant it depended on the influent C/N ratio, and in the C/N/P decoupling WWTP layout self-sufficiency was feasible for almost all influents, especially at high COD concentrations. The plant layouts proposed in this paper are just a sample of the possibilities offered by current technologies. Even so, the library presented here is generic and can be used to construct any other plant layout, provided that a model is available. Copyright © 2017 Elsevier Ltd. All rights reserved.
The utility of indocyanine green fluorescence imaging during robotic adrenalectomy.
Colvin, Jennifer; Zaidi, Nisar; Berber, Eren
2016-08-01
Indocyanine green (ICG) has been used for medical imaging since the 1950s, but has more recently become available for use in minimally invasive surgery owing to improvements in technology. This study investigates the use of ICG fluorescence to guide an accurate dissection by delineating the borders of adrenal tumors during robotic adrenalectomy (RA). This prospective study compared the conventional robotic view with ICG fluorescence imaging in 40 consecutive patients undergoing RA. Independent, non-blinded observers assessed how accurately ICG fluorescence delineated the borders of adrenal tumors compared to the conventional robotic view. A total of 40 patients underwent 43 adrenalectomies. ICG imaging was superior, equivalent, or inferior to the conventional robotic view in 46.5% (n = 20), 25.6% (n = 11), and 27.9% (n = 12) of the procedures, respectively. On univariate analysis, the only parameter that predicted the superiority of ICG imaging over the conventional robotic view was tumor type, with adrenocortical tumors being delineated more accurately on ICG imaging. This study demonstrates the utility of ICG to guide the dissection and removal of adrenal tumors during RA. A simple reproducible method is reported, with a detailed description of the utility based on tumor type, approach and side. J. Surg. Oncol. 2016;114:153-156. © 2016 Wiley Periodicals, Inc.
Xu, Aili; Du, Hongbo
2017-01-01
Objective The aim was to evaluate the effect of Sijunzi decoction (SJZD) in treating chronic atrophic gastritis (CAG). Methods We performed searches in seven databases. Randomized controlled trials (RCTs) comparing SJZD with standard medical care or inactive intervention for CAG were enrolled. Combined therapy of SJZD plus conventional therapies compared with conventional therapies alone was also retrieved. The primary outcomes included the incidence of gastric cancer and the improvement of atrophy, intestinal metaplasia, and dysplasia based on gastroscopy and pathology. The secondary outcomes were Helicobacter pylori clearance rate, quality of life, and adverse events/adverse drug reactions. Results Six RCTs met the inclusion criteria. The research quality of the trials was low. For the overall effect rate, pooled analysis from 4 trials showed that modified SJZD plus conventional medications exhibited a significant improvement (OR = 4.86; 95% CI: 2.80 to 8.44; P < 0.00001), without significant heterogeneity, compared with the conventional medications alone. No trial reported adverse effects. Conclusions Modified SJZD combined with conventional western medicines appears to have benefits for CAG. Due to the limited number of trials and their methodological flaws, the beneficial and harmful effects of SJZD for CAG could not be definitively identified. More high-quality clinical trials are needed to confirm the results. PMID:29138645
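Pooled odds ratios such as the OR = 4.86 above are commonly obtained by inverse-variance fixed-effect pooling of per-trial log odds ratios. A sketch under that assumption (the 2x2 counts below are invented, not the review's data; the review's exact pooling method may differ):

```python
import math

def pooled_or(tables):
    """Fixed-effect inverse-variance pooled odds ratio from 2x2 tables
    given as (events_tx, n_tx, events_ctrl, n_ctrl); all cells must be > 0."""
    num = den = 0.0
    for a, n1, c, n2 in tables:
        b, d = n1 - a, n2 - c
        log_or = math.log((a * d) / (b * c))
        w = 1.0 / (1 / a + 1 / b + 1 / c + 1 / d)  # inverse variance weight
        num += w * log_or
        den += w
    return math.exp(num / den)

# two hypothetical trials, each with OR = 4
print(pooled_or([(20, 30, 10, 30), (20, 30, 10, 30)]))
```

Heterogeneity statistics (e.g. I²) would be computed from the spread of the per-trial log odds ratios around this pooled value.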
Effects of different soil management practices on soil properties and microbial diversity
NASA Astrophysics Data System (ADS)
Gajda, Anna M.; Czyż, Ewa A.; Dexter, Anthony R.; Furtak, Karolina M.; Grządziel, Jarosław; Stanek-Tarkowska, Jadwiga
2018-01-01
The effects of different tillage systems on the properties and microbial diversity of an agricultural soil were investigated. Soil physical, chemical and biological properties were analysed in 2013-2015 in a long-term field experiment on a loamy sand at the IUNG-PIB Experimental Station in Grabów, Poland. Winter wheat was grown under two tillage treatments: conventional tillage using a mouldboard plough and traditional soil tillage equipment, and reduced tillage based on soil crushing-loosening equipment and a rigid-tine cultivator. Chopped wheat straw was used as a mulch on both treatments. Reduced tillage resulted in increased water content throughout the whole soil profile, in comparison with conventional tillage. Under reduced tillage, the content of readily dispersible clay was also reduced and, therefore, soil stability was increased in the top layers, compared with conventional tillage. In addition, the beneficial effects of reduced tillage were reflected in higher soil microbial activity as measured with dehydrogenases and hydrolysis of fluorescein diacetate, compared with conventional tillage. Moreover, polymerase chain reaction - denaturing gradient gel electrophoresis analysis showed that soil under reduced tillage had greater diversity of microbial communities, compared with conventionally-tilled soil. Finally, reduced tillage increased organic matter content, stability in water and microbial diversity in the top layer of the soil.
Vivostat®: an autologous fibrin sealant as useful adjunct in endoscopic transnasal CSF-leak repair.
Tomazic, Peter Valentin; Edlinger, Stefan; Gellner, Verena; Koele, Wolfgang; Gerstenberger, Claus; Braun, Hannes; Mokry, Michael; Stammberger, Heinz
2015-06-01
The benefit of fibrin glue for the reduction of postoperative CSF-leaks after endoscopic skull base surgery is not clearly evident in the literature. However, its use is thought to be beneficial in fixing grafting material. To date, no specific data are available for otolaryngological procedures. A retrospective data analysis of 73 patients treated endoscopically transnasally for CSF-leaks at the ENT department Graz, a tertiary care referral center, between 2009 and 2012 was performed. Primary closure rates with conventional fibrin glue and autologous fibrin glue were analyzed. The Vivostat(®) system was used in 33 CSF-leak closures and conventional fibrin glue in 40 cases. Comparing the two methods, the primary closure rate was 75.8% using the autologous Vivostat(®) system and 85.0% with conventional fibrin glue. The secondary closure rates were 90.9% with Vivostat(®) and 92.5% with conventional fibrin glue. The Vivostat(®) system is a useful adjunct in endoscopic CSF-leak closure. Its advantage over conventional fibrin glue is its application system for the fixation of grafting material, particularly in underlay techniques. Despite this advantage, it cannot replace grafting material and is not a substitute for proper endoscopic closure, which is reflected by the closure rates.
Life-cycle greenhouse gas emissions of shale gas, natural gas, coal, and petroleum.
Burnham, Andrew; Han, Jeongwoo; Clark, Corrie E; Wang, Michael; Dunn, Jennifer B; Palou-Rivera, Ignasi
2012-01-17
The technologies and practices that have enabled the recent boom in shale gas production have also brought attention to the environmental impacts of its use. It has been debated whether the fugitive methane emissions during natural gas production and transmission outweigh the lower carbon dioxide emissions during combustion when compared to coal and petroleum. Using the current state of knowledge of methane emissions from shale gas, conventional natural gas, coal, and petroleum, we estimated up-to-date life-cycle greenhouse gas emissions. In addition, we developed distribution functions for key parameters in each pathway to examine uncertainty and identify data gaps such as methane emissions from shale gas well completions and conventional natural gas liquid unloadings that need to be further addressed. Our base case results show that shale gas life-cycle emissions are 6% lower than conventional natural gas, 23% lower than gasoline, and 33% lower than coal. However, the range in values for shale and conventional gas overlap, so there is a statistical uncertainty whether shale gas emissions are indeed lower than conventional gas. Moreover, this life-cycle analysis, among other work in this area, provides insight on critical stages that the natural gas industry and government agencies can work together on to reduce the greenhouse gas footprint of natural gas.
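The overlap caveat above is a Monte Carlo point: when the emission distributions of two pathways overlap, a fraction of draws reverses the ranking of the means. A hedged sketch with invented normal distributions (the study built empirical distribution functions for many parameters, not a single normal per pathway):

```python
import random

def overlap_fraction(mu_a, sd_a, mu_b, sd_b, n=20000, seed=0):
    """Fraction of Monte Carlo draws in which pathway A emits less than B.
    Values near 0.5 mean the ranking is statistically ambiguous."""
    rng = random.Random(seed)
    wins = sum(rng.gauss(mu_a, sd_a) < rng.gauss(mu_b, sd_b) for _ in range(n))
    return wins / n

# illustrative shale (A) vs conventional gas (B): close means, wide spreads
print(overlap_fraction(94, 8, 100, 8))   # well below 1.0 -> overlapping ranges
```

A result like ~0.7 rather than ~1.0 is exactly the "statistical uncertainty whether shale gas emissions are indeed lower" that the abstract describes.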
AlBarakati, SF; Kula, KS; Ghoneima, AA
2012-01-01
Objective The aim of this study was to assess the reliability and reproducibility of angular and linear measurements of conventional and digital cephalometric methods. Methods A total of 13 landmarks and 16 skeletal and dental parameters were defined and measured on pre-treatment cephalometric radiographs of 30 patients. The conventional and digital tracings and measurements were performed twice by the same examiner with a 6 week interval between measurements. The reliability within the method was determined using Pearson's correlation coefficient (r2). The reproducibility between methods was calculated by paired t-test. The level of statistical significance was set at p < 0.05. Results All measurements for each method were above 0.90 r2 (strong correlation) except maxillary length, which had a correlation of 0.82 for conventional tracing. Significant differences between the two methods were observed in most angular and linear measurements except for ANB angle (p = 0.5), angle of convexity (p = 0.09), anterior cranial base (p = 0.3) and the lower anterior facial height (p = 0.6). Conclusion In general, both methods of conventional and digital cephalometric analysis are highly reliable. Although the reproducibility of the two methods showed some statistically significant differences, most differences were not clinically significant. PMID:22184624
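The two statistics used above are standard and easy to compute by hand. A self-contained sketch (toy numbers; the study's software and data are not reproduced here), returning the Pearson correlation for within-method reliability and the paired t statistic for between-method reproducibility:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two measurement series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def paired_t(x, y):
    """Paired t statistic (df = n - 1); compare against t tables
    for the significance decision."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in d) / (n - 1))
    return mean / (sd / math.sqrt(n))
```

High r between repeated tracings indicates reliability within a method; a large |t| on paired conventional-versus-digital measurements indicates a systematic difference between methods, which is the pattern the study reports.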
Stabile, Sueli Aparecida Batista; Evangelista, Dilson Henrique Ramos; Talamonte, Valdely Helena; Lippi, Umberto Gazi; Lopes, Reginaldo Guedes Coelho
2012-01-01
To compare two oncotic cervical cytology techniques, conventional and liquid-based cytology, in patients at low risk for uterine cervical cancer. Comparative prospective study with 100 patients who came for their annual gynecological exam and were submitted simultaneously to both techniques. We used the McNemar test, with a significance level of p < 0.05, to compare the results obtained regarding adequacy of smear quality, descriptive diagnosis prevalence, guided biopsy confirmation and histology. Adequacy of the smear was similar for both methods. The presence of the squamocolumnar junction in 93% of conventional cytology smears versus 84% of liquid-based cytology smears was statistically significant. As for the diagnosis of atypical cells, they were detected in 3% of conventional cytology and in 10% of liquid-based cytology (p = 0.06). Atypical squamous cells of undetermined significance were the most prevalent abnormality. Liquid-based cytology performed better when compared with colposcopy (guided biopsy), presenting sensitivity of 66.7% and specificity of 100%. There was no cytological-histological concordance for conventional cytology.
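The McNemar test used above operates only on the discordant pairs (patients classified positive by one method but not the other). A minimal sketch using the chi-square approximation (counts invented; exact binomial variants exist for small counts):

```python
import math

def mcnemar(b, c):
    """McNemar chi-square for paired methods. b and c are the discordant
    counts (positive on one method only). Returns (chi2, p) using the
    1-df chi-square approximation: p = erfc(sqrt(chi2 / 2))."""
    chi2 = (b - c) ** 2 / (b + c)
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# hypothetical: 10 positives only on liquid-based, 2 only on conventional
chi2, p = mcnemar(10, 2)
print(chi2, p)   # significant at p < 0.05
```

The identity p = erfc(sqrt(chi2/2)) holds because a 1-df chi-square variable is the square of a standard normal.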
15 CFR 742.18 - Chemical Weapons Convention (CWC or Convention).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false Chemical Weapons Convention (CWC or... REGULATIONS CONTROL POLICY-CCL BASED CONTROLS § 742.18 Chemical Weapons Convention (CWC or Convention). States... Use of Chemical Weapons and on Their Destruction, also known as the Chemical Weapons Convention (CWC...
15 CFR 742.18 - Chemical Weapons Convention (CWC or Convention).
Code of Federal Regulations, 2013 CFR
2013-01-01
... 15 Commerce and Foreign Trade 2 2013-01-01 2013-01-01 false Chemical Weapons Convention (CWC or... REGULATIONS CONTROL POLICY-CCL BASED CONTROLS § 742.18 Chemical Weapons Convention (CWC or Convention). States... Use of Chemical Weapons and on Their Destruction, also known as the Chemical Weapons Convention (CWC...
MacGillivray, Brian H
2017-08-01
In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. 
directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.
Cole, Pamela S; Quisberg, Jennifer; Melin, M Mark
2009-01-01
Small studies have indicated that the addition of acoustic pressure wound therapy (APWT) to conventional wound care may hasten healing of chronic wounds. We evaluated our early clinical experience using APWT as an adjunct to conventional wound care. The study was a retrospective chart review of consecutive patients receiving APWT in addition to conventional wound care in a hospital-based, primarily outpatient setting. Medical records of all patients treated with APWT between August 2006 and October 2007 were reviewed. Analysis included the 41 patients with 52 wounds who received APWT at least 2 times per week during the study period. Statistical comparisons were made for wound dimensions, tissue characteristics, and pain at start versus end of APWT. Thirty-eight percent of wounds (N = 20) healed completely with a mean 6.8 weeks of APWT. Median wound area and volume decreased significantly (88% [P < .0001] and 100% [P < .0001], respectively) from start to end of APWT. The proportion of wounds with greater than 75% granulation tissue increased from 26% (n = 12) to 80% (n = 41) (P < .0001), and normal periwound skin increased from 25% (n = 13) to 54% (n = 28) (P = .0001). Presence of greater than 50% fibrin slough decreased from 50% (n = 24) to 9% (n = 4) of wounds (P = .006). This early experience supplementing conventional wound care with APWT suggests it may promote healing in chronic wounds, where the ordered cellular and molecular processes leading to healing have stalled.
Evaluation of a new imaging tool for use with major trauma cases in the emergency department.
Crönlein, Moritz; Holzapfel, Konstantin; Beirer, Marc; Postl, Lukas; Kanz, Karl-Georg; Pförringer, Dominik; Huber-Wagner, Stefan; Biberthaler, Peter; Kirchhoff, Chlodwig
2016-11-17
The aim of this study was to evaluate the potential benefits of a new diagnostic software prototype (Trauma Viewer, TV), which automatically reformats computed tomography (CT) data, on diagnostic speed and quality, compared to CT-image data evaluation using a conventional CT console. Multiple trauma CT data sets were each analysed twice, independently, by one expert radiology and one expert traumatology fellow: once using the TV and once using the secondary conventional CT console placed in the CT control room. Actual analysis time and precision of diagnosis assessment were evaluated. The TV and CT-console results were compared with each other and also with the initial multiple trauma CT reports assessed by emergency radiology fellows, which were considered the gold standard. Finally, the design and function of the Trauma Viewer were evaluated in a descriptive manner. CT data sets of 30 multiple trauma patients were enrolled. The mean time needed for analysis of one CT dataset was 2.43 min using the CT console and 3.58 min using the TV. Thus, secondary conventional CT console analysis was on average 1.15 min shorter than the TV analysis. Both readers missed a total of 11 diagnoses using the secondary conventional CT console compared to 12 missed diagnoses using the TV. However, none of these overlooked diagnoses resulted in an Abbreviated Injury Scale (AIS) > 2, corresponding to life-threatening injuries. Even though it took the two expert fellows slightly longer to analyse the CT scans on the prototype TV compared to the CT console, which can be explained by the new user interface of the TV, our preliminary results demonstrate that, after further development, the TV might serve as a new diagnostic feature in trauma room management. Its high potential to improve the time and quality of CT-based diagnoses might help in fast decision-making regarding treatment of severely injured patients.
NASA Astrophysics Data System (ADS)
Bártová, H.; Trojek, T.; Čechák, T.; Šefců, R.; Chlumská, Š.
2017-10-01
Heavy chemical elements in old pigments can be identified in historical paintings using X-ray fluorescence analysis (XRF). This is a non-destructive analytical method frequently used in the examination of objects that require in situ analysis, where it is necessary to avoid damaging the object by taking samples. Different modalities are available, such as microanalysis, scanning of selected areas, and depth profiling techniques. Surface scanning is particularly profitable, since 2D element distribution maps are much more understandable than the results of individual analyses. Information on the layered structure of a painting can also be obtained with handheld portable systems. The results presented in our paper combine 2D element distribution maps obtained by scanning analysis with depth profiling using conventional XRF. The latter is very suitable for objects of art, as it can be evaluated from data measured with a portable XRF device. Depth profiling by conventional XRF is based on the differences in X-ray absorption in paint layers. The XRF technique was applied to the analysis of panel paintings of the Master of the St George Altarpiece, who was active in Prague in the 1470s and 1480s. The results were evaluated by taking micro-samples and performing a material analysis.
Input-output Transfer Function Analysis of a Photometer Circuit Based on an Operational Amplifier.
Hernandez, Wilmar
2008-01-09
In this paper an input-output transfer function analysis based on the frequency response of a photometer circuit based on an operational amplifier (op amp) is carried out. Op amps are universally used in monitoring photodetectors and there are a variety of amplifier connections for this purpose. However, the electronic circuits that are usually used to carry out the signal treatment in photometer circuits introduce some limitations in the performance of the photometers that influence the selection of the op amps and other electronic devices. For example, the bandwidth, slew rate, noise, input impedance and gain, among other characteristics of the op amp, are often the performance-limiting factors of photometer circuits. For this reason, in this paper a comparative analysis between two photodiode amplifier circuits is carried out. One circuit is based on a conventional current-to-voltage converter connection and the other circuit is based on a robust current-to-voltage converter connection. The results are satisfactory and show that the photodiode amplifier performance can be improved by using robust control techniques.
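The bandwidth limitation mentioned above has a standard quantitative form for the conventional current-to-voltage (transimpedance) connection: with feedback resistor R_f, total input capacitance C_in, and op-amp gain-bandwidth product GBW, the achievable -3 dB bandwidth is roughly sqrt(GBW / (2π R_f C_in)). A minimal sketch (component values are illustrative, not from the paper):

```python
import math

def tia_bandwidth_hz(r_f, c_in, gbw):
    """Approximate -3 dB bandwidth of a transimpedance (current-to-voltage)
    photodiode amplifier, limited by the op amp's gain-bandwidth product.

    Standard single-pole estimate: f_3dB ~ sqrt(GBW / (2*pi*R_f*C_in)),
    assuming the feedback capacitor is chosen for a flat response.
    """
    return math.sqrt(gbw / (2 * math.pi * r_f * c_in))

# Illustrative values: 1 Mohm feedback, 50 pF photodiode capacitance,
# op amp with 10 MHz gain-bandwidth product
f3db = tia_bandwidth_hz(1e6, 50e-12, 10e6)
```

For these example values the estimate comes out near 180 kHz, illustrating why op-amp characteristics are often the performance-limiting factor of photometer circuits.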
Lööf, Gunilla; Liljeberg, Cecilia; Eksborg, Staffan; Lönnqvist, Per-Arne
2017-06-01
Information transfer to patients is an integral part of modern medicine. Internet-based alternatives represent a new and attractive way for information transfer. The study used a prospective observer-blinded design. Children (3-12 years) and parents were instructed to get further preoperative information either through an interactive web-based platform, the Anaesthesia-Web, or conventional brochure material until the day of outpatient surgery. On the day of surgery, children and parents were separately asked six different questions. The primary end-point was to compare the total question score in children between the two information options (maximum score = 36). Secondary aims were the total question score for parents and the influence of age, sex, and time between the preoperative visit and the day of surgery. A total of 125 children were recruited, of which 103 were included in the final analysis (the Anaesthesia-Web group, n = 49; the brochure material group, n = 54). At the predetermined interim analysis, the total question score in children was found to be substantially higher in the Anaesthesia-Web group than in the brochure material group (median score: 27; IQR: 16.5-33 vs. median score: 19.5; IQR: 11.25-27.75; P = 0.0076). The median difference in score was 6; 95% CI: 0-9. The total question score in parents was also higher in the Anaesthesia-Web group than in the brochure material group. Increasing child age was associated with a higher total question score in both groups. Sex did not influence the total question score in the Anaesthesia-Web group, whereas girls scored better than boys in the brochure material group. Children aged 3-12 years, as well as their parents, better attain preoperative information from an interactive web-based platform than from conventional brochure material. © 2017 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Ho, K. K.; Moody, G. B.; Peng, C. K.; Mietus, J. E.; Larson, M. G.; Levy, D.; Goldberger, A. L.
1997-01-01
BACKGROUND: Despite much recent interest in quantification of heart rate variability (HRV), the prognostic value of conventional measures of HRV and of newer indices based on nonlinear dynamics is not universally accepted. METHODS AND RESULTS: We have designed algorithms for analyzing ambulatory ECG recordings and measuring HRV without human intervention, using robust methods for obtaining time-domain measures (mean and SD of heart rate), frequency-domain measures (power in the bands of 0.001 to 0.01 Hz [VLF], 0.01 to 0.15 Hz [LF], and 0.15 to 0.5 Hz [HF] and total spectral power [TP] over all three of these bands), and measures based on nonlinear dynamics (approximate entropy [ApEn], a measure of complexity, and detrended fluctuation analysis [DFA], a measure of long-term correlations). The study population consisted of chronic congestive heart failure (CHF) case patients and sex- and age-matched control subjects in the Framingham Heart Study. After exclusion of technically inadequate studies and those with atrial fibrillation, we used these algorithms to study HRV in 2-hour ambulatory ECG recordings of 69 participants (mean age, 71.7+/-8.1 years). By use of separate Cox proportional-hazards models, the conventional measures SD (P<.01), LF (P<.01), VLF (P<.05), and TP (P<.01) and the nonlinear measure DFA (P<.05) were predictors of survival over a mean follow-up period of 1.9 years; other measures, including ApEn (P>.3), were not. In multivariable models, DFA was of borderline predictive significance (P=.06) after adjustment for the diagnosis of CHF and SD. CONCLUSIONS: These results demonstrate that HRV analysis of ambulatory ECG recordings based on fully automated methods can have prognostic value in a population-based study and that nonlinear HRV indices may contribute prognostic value to complement traditional HRV measures.
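Of the nonlinear indices above, detrended fluctuation analysis (DFA) is straightforward to sketch: integrate the series, detrend it in windows of increasing size, and read the scaling exponent off the log-log slope of fluctuation versus window size. A minimal first-order implementation (window sizes, series length, and test signals are arbitrary choices, not those of the study):

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis (first order): estimate the scaling
    exponent alpha of a time series from the slope of log F(n) vs log n."""
    y = np.cumsum(x - np.mean(x))              # integrated (profile) series
    flucts = []
    for n in scales:
        n_seg = len(y) // n
        f2 = 0.0
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
        flucts.append(np.sqrt(f2 / n_seg))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# Sanity checks on synthetic signals
rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
alpha_noise = dfa_alpha(noise)            # white noise: alpha ~ 0.5
alpha_walk = dfa_alpha(np.cumsum(noise))  # random walk: alpha ~ 1.5
```

White noise gives alpha near 0.5 and a random walk near 1.5; alpha computed from interbeat-interval series is the long-term-correlation index referred to above.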
NASA Astrophysics Data System (ADS)
Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali
2016-08-01
The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D /3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, and this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50 % hazards for all sites studied in the Marmara region.
Laser-Based Lighting: Experimental Analysis and Perspectives
Yushchenko, Maksym; Buffolo, Matteo; Meneghini, Matteo; Zanoni, Enrico
2017-01-01
This paper presents an extensive analysis of the operating principles, theoretical background, advantages and limitations of laser-based lighting systems. In the first part of the paper we discuss the main advantages and issues of laser-based lighting, and present a comparison with conventional LED-lighting technology. In the second part of the paper, we present original experimental data on the stability and reliability of phosphor layers for laser lighting, based on high light-intensity and high-temperature degradation tests. In the third part of the paper (for the first time) we present a detailed comparison between three different solutions for laser lighting, based on (i) transmissive phosphor layers; (ii) a reflective/angled phosphor layer; and (iii) a parabolic reflector, by discussing the advantages and drawbacks of each approach. The results presented within this paper can be used as a guideline for the development of advanced lighting systems based on laser diodes. PMID:29019958
Innovations in diagnostic imaging of localized prostate cancer.
Pummer, Karl; Rieken, Malte; Augustin, Herbert; Gutschi, Thomas; Shariat, Shahrokh F
2014-08-01
In recent years, various imaging modalities have been developed to improve diagnosis, staging, and localization of early-stage prostate cancer (PCa). A MEDLINE literature search of the time frame between 01/2007 and 06/2013 was performed on imaging of localized PCa. Conventional transrectal ultrasound (TRUS) is mainly used to guide prostate biopsy. Contrast-enhanced ultrasound is based on the assumption that PCa tissue is hypervascularized and might be better identified after intravenous injection of a microbubble contrast agent. However, results on its additional value for cancer detection are controversial. Computer-based analysis of the transrectal ultrasound signal (C-TRUS) appears to detect cancer in a high rate of patients with previous biopsies. Real-time elastography seems to have higher sensitivity, specificity, and positive predictive value than conventional TRUS. However, the method still awaits prospective validation. The same is true for prostate histoscanning, an ultrasound-based method for tissue characterization. Currently, multiparametric MRI provides improved tissue visualization of the prostate, which may be helpful in the diagnosis and targeting of prostate lesions. However, most published series are small and suffer from variations in indication, methodology, quality, interpretation, and reporting. Among ultrasound-based techniques, real-time elastography and C-TRUS seem the most promising techniques. Multiparametric MRI appears to have advantages over conventional T2-weighted MRI in the detection of PCa. Despite these promising results, currently, no recommendation for the routine use of these novel imaging techniques can be made. Prospective studies defining the value of various imaging modalities are urgently needed.
Clinical applications of cell-based approaches in alveolar bone augmentation: a systematic review.
Shanbhag, Siddharth; Shanbhag, Vivek
2015-01-01
Cell-based approaches, utilizing adult mesenchymal stem cells (MSCs), are reported to overcome the limitations of conventional bone augmentation procedures. The study aims to systematically review the available evidence on the characteristics and clinical effectiveness of cell-based ridge augmentation, socket preservation, and sinus-floor augmentation, compared to current evidence-based methods in human adult patients. MEDLINE, EMBASE, and CENTRAL databases were searched for related literature. Both observational and experimental studies reporting outcomes of "tissue engineered" or "cell-based" augmentation in ≥5 adult patients alone, or in comparison with non-cell-based (conventional) augmentation methods, were eligible for inclusion. Primary outcome was histomorphometric analysis of new bone formation. Effectiveness of cell-based augmentation was evaluated based on outcomes of controlled studies. Twenty-seven eligible studies were identified. Of these, 15 included a control group (8 randomized controlled trials [RCTs]), and were judged to be at a moderate-to-high risk of bias. Most studies reported the combined use of cultured autologous MSCs with an osteoconductive bone substitute (BS) scaffold. Iliac bone marrow and mandibular periosteum were frequently reported sources of MSCs. In vitro culture of MSCs took between 12 days and 1.5 months. A range of autogenous, allogeneic, xenogeneic, and alloplastic scaffolds was identified. Bovine bone mineral scaffold was frequently reported with favorable outcomes, while polylactic-polyglycolic acid copolymer (PLGA) scaffold resulted in graft failure in three studies. The combination of MSCs and BS resulted in outcomes similar to autogenous bone (AB) and BS. Three RCTs and one controlled trial reported significantly greater bone formation in cell-based than conventionally grafted sites after 3 to 8 months. 
Based on limited controlled evidence at a moderate-to-high risk of bias, cell-based approaches are comparable, if not superior, to current evidence-based bone grafting methods, with a significant advantage of avoiding AB harvesting. Future clinical trials should additionally evaluate patient-based outcomes and the time-/cost-effectiveness of these approaches. © 2013 Wiley Periodicals, Inc.
On Bi-Grid Local Mode Analysis of Solution Techniques for 3-D Euler and Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Ibraheem, S. O.; Demuren, A. O.
1994-01-01
A procedure is presented for utilizing a bi-grid stability analysis as a practical tool for predicting multigrid performance in a range of numerical methods for solving Euler and Navier-Stokes equations. Model problems based on the convection, diffusion and Burger's equation are used to illustrate the superiority of the bi-grid analysis as a predictive tool for multigrid performance in comparison to the smoothing factor derived from conventional von Neumann analysis. For the Euler equations, bi-grid analysis is presented for three upwind difference based factorizations, namely Spatial, Eigenvalue and Combination splits, and two central difference based factorizations, namely LU and ADI methods. In the former, both the Steger-Warming and van Leer flux-vector splitting methods are considered. For the Navier-Stokes equations, only the Beam-Warming (ADI) central difference scheme is considered. In each case, estimates of multigrid convergence rates from the bi-grid analysis are compared to smoothing factors obtained from single-grid stability analysis. Effects of grid aspect ratio and flow skewness are examined. Both predictions are compared with practical multigrid convergence rates for 2-D Euler and Navier-Stokes solutions based on the Beam-Warming central scheme.
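The smoothing factor from conventional single-grid von Neumann analysis, against which the bi-grid estimates are compared, is easy to illustrate on a model problem. For weighted Jacobi applied to the 1-D Poisson equation the Fourier amplification factor is g(θ) = 1 − ω(1 − cos θ), and the smoothing factor is its maximum modulus over the high-frequency range [π/2, π]. A sketch (the model problem and smoother are textbook examples, not the factorizations analyzed in the paper):

```python
import numpy as np

def jacobi_smoothing_factor(omega, n=1000):
    """Single-grid von Neumann smoothing factor for weighted Jacobi on the
    1-D Poisson problem: amplification g(theta) = 1 - omega*(1 - cos theta),
    maximized in modulus over the high-frequency range [pi/2, pi]."""
    theta = np.linspace(np.pi / 2, np.pi, n)
    return np.max(np.abs(1 - omega * (1 - np.cos(theta))))

mu = jacobi_smoothing_factor(2.0 / 3.0)  # optimal weight gives mu = 1/3
```

With the optimal weight ω = 2/3 this gives μ = 1/3, the classic benchmark figure; the paper's point is that such single-grid smoothing factors can mispredict practical multigrid convergence, which the bi-grid analysis estimates more faithfully.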
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Rachana; Al-Hallaq, Hania; Pelizzari, Charles A.
2003-12-31
The purpose of this study was to compare conventional low-dose-rate prostate brachytherapy dosimetric quality parameters with their biological effective dose (BED) counterparts. To validate a model for transformation from conventional dose to BED, the postimplant plans of 31 prostate brachytherapy patients were evaluated using conventional dose-volume histogram (DVH) quality endpoints and analogous BED-DVH endpoints. Based on CT scans obtained 4 weeks after implantation, DVHs were computed and standard dosimetric endpoints V100 (volume receiving 100% of the prescribed dose), V150, V200, HI (1-[V150/V100]), and D90 (dose that 90% of the target volume received) were obtained for quality analysis. Using known and reported transformations, dose grids were transformed to BED-early ({alpha}/{beta} = 10 Gy) and BED-late ({alpha}/{beta} = 3 Gy) grids, and the same dosimetric endpoints were analyzed. For conventional, BED-early and BED-late DVHs, no differences in V100 were seen (0.896, 0.893, and 0.894, respectively). However, V150 and V200 were significantly higher for both BED-early (0.582 and 0.316) and BED-late (0.595 and 0.337), compared with the conventional (0.539 and 0.255) DVHs. D90 was significantly lower for the BED-early (103.1 Gy) and BED-late transformations (106.9 Gy) as compared with the conventional (119.5 Gy) DVHs. The conventional prescription parameter V100 is the same for the corresponding BED-early and BED-late transformed DVHs. The toxicity parameters V150 and V200 are slightly higher using the BED transformations, suggesting that the BED doses are somewhat higher than predicted using conventional DVHs. The prescription/quality parameter D90 is slightly lower, implying that target coverage is lower than predicted using conventional DVHs. This methodology can be applied to analyze BED dosimetric endpoints to improve clinical outcome and reduce complications of prostate brachytherapy.
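The dose-to-BED transformation rests on the linear-quadratic model; in its simplest fractionated form BED = D(1 + d/(α/β)), with additional source-decay and repair terms for permanent implants. A minimal sketch of the basic relation (fraction sizes are illustrative; this is not the specific transformation used in the study):

```python
def bed(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy):
    """Biologically effective dose under the basic linear-quadratic model:
    BED = D * (1 + d / (alpha/beta)). Permanent-implant brachytherapy adds
    source-decay and sublethal-damage-repair terms omitted in this sketch."""
    return total_dose_gy * (1 + dose_per_fraction_gy / alpha_beta_gy)

# Illustrative: 60 Gy in 2 Gy fractions, evaluated for early-responding
# (alpha/beta = 10 Gy) and late-responding (alpha/beta = 3 Gy) tissue
bed_early = bed(60, 2, 10)  # 72 Gy
bed_late = bed(60, 2, 3)    # 100 Gy
```

The lower α/β of late-responding tissue inflates BED more for the same physical dose, consistent with the pattern above in which the BED-transformed toxicity parameters exceed the conventional ones.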
NASA Astrophysics Data System (ADS)
Delgado, Carlos; Cátedra, Manuel Felipe
2018-05-01
This work presents a technique that allows a very noticeable relaxation of the computational requirements for full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results, requiring a fraction of the resources that the conventional analysis would utilize.
Method for improving accuracy in full evaporation headspace analysis.
Xie, Wei-Qi; Chai, Xin-Sheng
2017-05-01
We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. In conventional full evaporation headspace analysis, the pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
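In multiple headspace extraction, successive extractions of the same vial yield peak areas that decay geometrically, so the total analyte response can be recovered by fitting the decay ratio and summing the series. A minimal sketch of that calculation (the area values are invented; this is the generic MHE relation, not the authors' specific calibration):

```python
import math

def mhe_total_area(areas):
    """Multiple headspace extraction (MHE): successive peak areas decay
    geometrically, A_i = A_1 * q**(i-1). Fit ln(q) as the least-squares
    slope of ln(A_i) vs i, then sum the geometric series A_1 / (1 - q)."""
    n = len(areas)
    xs = list(range(n))
    ys = [math.log(a) for a in areas]
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    q = math.exp(slope)
    return areas[0] / (1 - q)

# Illustrative peak areas from three successive extractions (q = 0.5)
total = mhe_total_area([100.0, 50.0, 25.0])  # -> 200.0
```

Because the series total is reconstructed from the decay ratio rather than a single absolute measurement, the approach is less sensitive to per-vial pressure differences, which is the error source the combined method above targets.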
Failure modes and effects analysis automation
NASA Technical Reports Server (NTRS)
Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron
1988-01-01
A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man years the time required to perform the analysis.
DNA barcode-based molecular identification system for fish species.
Kim, Sungmin; Eo, Hae-Seok; Koo, Hyeyoung; Choi, Jun-Kil; Kim, Won
2010-12-01
In this study, we applied DNA barcoding to identify species using short DNA sequence analysis. We examined the utility of DNA barcoding by identifying 53 Korean freshwater fish species, 233 other freshwater fish species, and 1339 saltwater fish species. We successfully developed a web-based molecular identification system for fish (MISF) using a profile hidden Markov model. MISF facilitates efficient and reliable species identification, overcoming the limitations of conventional taxonomic approaches. MISF is freely accessible at http://bioinfosys.snu.ac.kr:8080/MISF/misf.jsp .
Design and analysis of a silicon-based antiresonant reflecting optical waveguide chemical sensor
NASA Astrophysics Data System (ADS)
Remley, Kate A.; Weisshaar, Andreas
1996-08-01
The design of a silicon-based antiresonant reflecting optical waveguide (ARROW) chemical sensor is presented, and its theoretical performance is compared with that of a conventional structure. The use of an ARROW structure permits incorporation of a thick guiding region for efficient coupling to a single-mode fiber. A high-index overlay is added to fine tune the sensitivity of the ARROW chemical sensor. The sensitivity of the sensor is presented, and design trade-offs are discussed.
2011-06-01
paradigm historically based on conventional warfare and state-based actors. Underlying questions related to the research question are: Is there a CoG … analysis is still useful by proposing 5 new interpretations of this aged concept. This thesis will examine whether CoG is still useful in … cuts through the confusion and contradictions related to CoG by translating it functionally for the war-fighter. He does this through a …
Comparison of nutritional quality between conventional and organic dairy products: a meta-analysis.
Palupi, Eny; Jayanegara, Anuraga; Ploeger, Angelika; Kahl, Johannes
2012-11-01
As a contribution to the debate on the comparison of nutritional quality between conventional and organic products, the present study provides new results on this issue, specifically for dairy products, by integrating the last 3 years' studies using a meta-analysis approach with Hedges' d effect size method. The current meta-analysis shows that organic dairy products contain significantly higher protein, ALA, total omega-3 fatty acid, cis-9,trans-11 conjugated linoleic acid, trans-11 vaccenic acid, eicosapentaenoic acid, and docosapentaenoic acid than conventional types, with cumulative effect sizes (± 95% confidence interval) of 0.56 ± 0.24, 1.74 ± 0.16, 0.84 ± 0.14, 0.68 ± 0.13, 0.51 ± 0.16, 0.42 ± 0.23, and 0.71 ± 0.3, respectively. It is also observed that organic dairy products have a significantly (P < 0.001) higher omega-3 to omega-6 ratio (0.42 vs. 0.23) and Δ9-desaturase index (0.28 vs. 0.27) than the conventional types. The current regulation on organic farming indeed drives organic farms to produce organic dairy products with nutritional qualities different from conventional ones. The difference in feeding regime between conventional and organic dairy production is suspected to be the reason behind this evidence. Similar meta-analyses may be applicable for summarizing comparisons between conventional and organic foodstuffs for other aspects and food categories. Copyright © 2012 Society of Chemical Industry.
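Hedges' d, the effect-size metric used above, is the standardized mean difference between treatment groups with a small-sample bias correction. A minimal sketch (the input numbers below are invented for illustration, not taken from the meta-analysis):

```python
import math

def hedges_d(m1, s1, n1, m2, s2, n2):
    """Hedges' d effect size: standardized mean difference between two
    groups, multiplied by the small-sample bias correction
    J = 1 - 3 / (4*(n1 + n2 - 2) - 1)."""
    s_pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                         / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    return j * d

# Illustrative: organic vs. conventional group means 10 vs. 8, SD = 2, n = 20 each
d = hedges_d(10.0, 2.0, 20, 8.0, 2.0, 20)
```

Per-study d values are then pooled (typically inverse-variance weighted) to produce cumulative effect sizes of the kind quoted above.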
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coolens, Catherine, E-mail: catherine.coolens@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario
2015-01-01
Objectives: Development of perfusion imaging as a biomarker requires more robust methodologies for quantification of tumor physiology that allow assessment of volumetric tumor heterogeneity over time. This study proposes a parametric method for automatically analyzing perfused tissue from volumetric dynamic contrast-enhanced (DCE) computed tomography (CT) scans and assesses whether this 4-dimensional (4D) DCE approach is more robust and accurate than conventional, region-of-interest (ROI)-based CT methods in quantifying tumor perfusion, with preliminary evaluation in metastatic brain cancer. Methods and Materials: Functional parameter reproducibility and analysis of sensitivity to imaging resolution and arterial input function were evaluated in image sets acquired from a 320-slice CT with a controlled flow phantom and patients with brain metastases, whose treatments were planned for stereotactic radiation surgery and who consented to a research ethics board-approved prospective imaging biomarker study. A voxel-based temporal dynamic analysis (TDA) methodology was used at baseline, at day 7, and at day 20 after treatment. The ability to detect changes in kinetic parameter maps in clinical data sets was investigated for both 4D TDA and conventional 2D ROI-based analysis methods. Results: A total of 7 brain metastases in 3 patients were evaluated over the 3 time points. The 4D TDA method showed improved spatial efficacy and accuracy of perfusion parameters compared to ROI-based DCE analysis (P<.005), with a reproducibility error of less than 2% when tested with DCE phantom data. Clinically, changes in the transfer constant from the blood plasma into the extracellular extravascular space (K{sub trans}) were seen when using TDA, with substantially smaller errors than the 2D method on both day 7 post radiation surgery (±13%; P<.05) and by day 20 (±12%; P<.04). Standard methods showed a decrease in K{sub trans} but with large uncertainty (111.6 ± 150.5)%.
Conclusions: Parametric voxel-based analysis of 4D DCE CT data resulted in greater accuracy and reliability in measuring changes in perfusion CT-based kinetic metrics, which have the potential to be used as biomarkers in patients with metastatic brain cancer.
NASA Astrophysics Data System (ADS)
Madhulatha, A.; Rajeevan, M.; Bhowmik, S. K. Roy; Das, A. K.
2018-01-01
The primary goal of the present study is to investigate the impact of assimilating conventional and satellite radiance observations in simulating the mesoscale convective system (MCS) that formed over southeast India. An assimilation methodology based on the Weather Research and Forecasting model three-dimensional variational data assimilation is considered. A few numerical experiments are carried out to examine the individual and combined impact of conventional and non-conventional (satellite radiance) observations. After the successful inclusion of the additional observations, strong analysis increments in the temperature and moisture fields are noticed, contributing to significant improvement in the model's initial fields. The resulting model simulations successfully reproduce the prominent synoptic features responsible for the initiation of the MCS. Among all the experiments, the final experiment, in which both conventional and satellite radiance observations were assimilated, showed considerable impact on the prediction of the MCS. The location, genesis, intensity, propagation, and development of rain bands associated with the MCS are simulated reasonably well. The biases of simulated temperature, moisture, and wind fields at the surface and at different pressure levels are reduced. The thermodynamic, dynamic, and vertical structure of convective cells associated with the passage of the MCS are well captured. The spatial distribution of rainfall is fairly well reproduced and comparable to TRMM observations. It is demonstrated that incorporation of conventional and satellite radiance observations improved the local and synoptic representation of temperature and moisture fields from the surface to different levels of the atmosphere. This study highlights the importance of assimilating conventional and satellite radiances in improving the model's initial conditions and the simulation of the MCS.
Towards Greater Recognition of the Right to Play: An Analysis of Article 31 of the UNCRC
ERIC Educational Resources Information Center
Davey, Ciara; Lundy, Laura
2011-01-01
Children's right to play is formally enshrined in Article 31 of the United Nations Convention on the Rights of the Child (UNCRC). However, few research studies have explored children's experiences of play from an explicit rights-based perspective. Using children's views to illustrate the multi-dimensional relationship Article 31 holds with other…
1994-02-01
appears with plots no. 1-30. This refers to the default convention to delete from analysis the data acquired in any bi-hourly acquisition window… [Recovered plot labels: frequency 35 MHz, based on observed noise measurements, vertical; Geophysics Lab Meteor Scatter Program.]
USDA-ARS?s Scientific Manuscript database
While conventionally grown poultry continues to dominate the U.S. poultry industry, there is an increasing demand for locally-grown, “all natural” alternatives. Unfortunately, limited research has been done on this type of poultry management practice, and thus many of these management effects on th...
ERIC Educational Resources Information Center
Hijab, Nadia
In accordance with UNICEF mandates requiring a situation analysis prior to preparing a new country program, this report examines causes and linkages between problems affecting women and children in Iraq, identifies necessary actions to realize the rights of women and children, and contributes to the country program strategy for priority…
Application of ECT inspection to the first wall of a fusion reactor with wavelet analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, G.; Yoshida, Y.; Miya, K.
1994-12-31
The first wall of a fusion reactor will be subjected to intensive loads during fusion operations. Since these loads may cause defects in the first wall, nondestructive evaluation techniques for the first wall should be developed. In this paper, we apply the eddy current testing (ECT) technique to the inspection of the first wall. A method based on the current vector potential and wavelet analysis is proposed. Owing to the use of wavelet analysis, a recently developed theory, the accuracy of the present method is shown to be better than that of a conventional one.
Yan, Xu; Zhou, Minxiong; Ying, Lingfang; Yin, Dazhi; Fan, Mingxia; Yang, Guang; Zhou, Yongdi; Song, Fan; Xu, Dongrong
2013-01-01
Diffusion kurtosis imaging (DKI) is a new method of magnetic resonance imaging (MRI) that provides non-Gaussian information that is not available in conventional diffusion tensor imaging (DTI). DKI requires data acquisition at multiple b-values for parameter estimation; this process is usually time-consuming. Therefore, fewer b-values are preferable to expedite acquisition. In this study, we carefully evaluated various acquisition schemas using different numbers and combinations of b-values. Acquisition schemas that sampled b-values that were distributed to two ends were optimized. Compared to conventional schemas using equally spaced b-values (ESB), optimized schemas require fewer b-values to minimize fitting errors in parameter estimation and may thus significantly reduce scanning time. Following a ranked list of optimized schemas resulted from the evaluation, we recommend the 3b schema based on its estimation accuracy and time efficiency, which needs data from only 3 b-values at 0, around 800 and around 2600 s/mm2, respectively. Analyses using voxel-based analysis (VBA) and region-of-interest (ROI) analysis with human DKI datasets support the use of the optimized 3b (0, 1000, 2500 s/mm2) DKI schema in practical clinical applications. PMID:23735303
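The DKI parameters are obtained per voxel by fitting the signal model ln S(b) = ln S0 − bD + (1/6) b² D² K at the sampled b-values, which is linear in the variables (ln S0, D, D²K). A minimal sketch using the 3b schema of b = 0, 1000, 2500 s/mm² (the synthetic S0, D, and K values are illustrative):

```python
import numpy as np

def fit_dki(bvals, signals):
    """Fit the DKI signal model  ln S(b) = ln S0 - b*D + (1/6) b^2 D^2 K
    by linear least squares in (ln S0, D, D^2*K); return S0, D, K."""
    b = np.asarray(bvals, float)
    y = np.log(np.asarray(signals, float))
    A = np.column_stack([np.ones_like(b), -b, b ** 2 / 6.0])
    ln_s0, diff, d2k = np.linalg.lstsq(A, y, rcond=None)[0]
    return np.exp(ln_s0), diff, d2k / diff ** 2

# Synthetic voxel at the 3b schema (b in s/mm^2)
b = [0, 1000, 2500]
D_true, K_true, S0_true = 1.0e-3, 0.8, 1000.0
S = [S0_true * np.exp(-bb * D_true + bb ** 2 * D_true ** 2 * K_true / 6)
     for bb in b]
s0, D, K = fit_dki(b, S)
```

With exactly three b-values the three unknowns are determined exactly, which is why a well-chosen 3b schema can recover D and K without the longer equally spaced acquisitions.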
Cytopathological image analysis using deep-learning networks in microfluidic microscopy.
Gopakumar, G; Hari Babu, K; Mishra, Deepak; Gorthi, Sai Siva; Sai Subrahmanyam, Gorthi R K
2017-01-01
Cytopathologic testing is one of the most critical steps in the diagnosis of diseases, including cancer. However, the task is laborious and demands skill, and the associated high cost and low throughput have drawn considerable interest to automating the testing process. Several neural network architectures have been designed to bring human expertise to machines. In this paper, we explore the feasibility of using deep-learning networks for cytopathologic analysis by classifying three important unlabeled, unstained leukemia cell lines (K562, MOLT, and HL60). The cell images used in the classification were captured using a low-cost, high-throughput cell imaging technique: microfluidics-based imaging flow cytometry. We demonstrate that, without any conventional fine segmentation followed by explicit feature extraction, the proposed deep-learning algorithms effectively classify the coarsely localized cell lines. We show that the designed deep belief network as well as the deeply pretrained convolutional neural network outperform conventionally used decision systems; this is important in the medical domain, where labeled data for training are limited. We hope that our work enables the development of a clinically significant high-throughput microfluidic microscopy-based tool for disease screening/triaging, especially in resource-limited settings.
Kelly, V; Sagili, K D; Satyanarayana, S; Reza, L W; Chadha, S S; Wilson, N C
2015-06-01
With support from the Stop TB Partnership's TB REACH Wave 2 Grant, diagnostic microscopy services for tuberculosis (TB) were upgraded from conventional Ziehl-Neelsen (ZN) sputum microscopy to light-emitting diode fluorescence microscopy (LED-FM) in 200 high-workload microscopy centres in India as a pilot intervention. The objective was to evaluate the cost-effectiveness of LED-FM over conventional ZN microscopy to inform further scale-up. A decision-tree model was constructed to assess the cost utility of LED-FM relative to ZN microscopy. The results were summarised using the incremental cost-effectiveness ratio (ICER); one-way and probabilistic sensitivity analyses were also conducted to address uncertainty within the model. Data were analysed from 200 medical colleges in 2011 and 2012, before and after the introduction of LED microscopes. A full costing analysis was carried out from the perspective of a national TB programme. The ICER was US$14.64 per disability-adjusted life-year, with an 82% probability of being cost-effective at a willingness-to-pay threshold equivalent to Indian gross domestic product per capita. LED-FM is a cost-effective intervention for detecting TB cases in high-workload medical college settings in India.
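The ICER reported above is simply the incremental cost divided by the incremental effect, judged against a willingness-to-pay threshold. A minimal sketch with made-up numbers (not the study's data; the threshold is an assumed illustrative value):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra
    unit of effect (here, per DALY averted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Illustrative numbers only: the new technology (LED-FM) costs more
# but averts more DALYs than the old one (ZN microscopy).
ratio = icer(cost_new=120_000.0, cost_old=90_000.0,
             effect_new=4_000.0, effect_old=1_951.0)

# The intervention is deemed cost-effective if the ratio falls below
# the willingness-to-pay threshold (e.g. GDP per capita).
assumed_wtp = 1_500.0
cost_effective = ratio < assumed_wtp
```

The probabilistic sensitivity analysis mentioned in the abstract repeats this calculation over many random draws of the model inputs and reports the fraction of draws in which `cost_effective` holds.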
Thermal Desorption Analysis of Effective Specific Soil Surface Area
NASA Astrophysics Data System (ADS)
Smagin, A. V.; Bashina, A. S.; Klyueva, V. V.; Kubareva, A. V.
2017-12-01
A new method of assessing the effective specific surface area, based on successive thermal desorption of water vapor at different temperature stages of sample drying, is analyzed in comparison with the conventional static adsorption method using a representative set of soil samples of different genesis and degrees of dispersion. The theory of the method uses the fundamental relationship between the thermodynamic water potential (Ψ) and the absolute temperature of drying (T): Ψ = Q - aT, where Q is the specific heat of vaporization and a is a physically based parameter related to the initial temperature and relative humidity of the air in the external thermodynamic reservoir (laboratory). From gravimetric data on the mass fraction of water (W) and the Ψ value, Polanyi potential curves (W(Ψ)) for the studied samples are plotted. Water sorption isotherms are then calculated, from which the monolayer capacity and the target effective specific surface area are determined using the BET theory. Comparative analysis shows that the new method agrees well with conventional estimation of the degree of dispersion by the BET and Kutilek methods over a wide range of specific surface area values, between 10 and 250 m2/g.
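The pipeline sketched above (drying temperature to potential via Ψ = Q - aT, then a BET fit for the monolayer capacity and specific surface area) can be illustrated as follows. All parameter values, and the synthetic isotherm, are illustrative assumptions, not the authors' data; the sign convention for Ψ follows the abstract's linear relation:

```python
import numpy as np

R_GAS = 8.314      # J/(mol K)
M_W = 0.018        # kg/mol, molar mass of water
N_A = 6.022e23     # 1/mol, Avogadro constant
SIGMA = 1.06e-19   # m^2, area covered by one adsorbed water molecule

def potential_from_temperature(T, Q=2.45e6, a=7.0e3):
    """Water potential (J/kg) at drying temperature T (K) via the
    linear relation Psi = Q - a*T. Q and a are assumed illustrative
    values of the vaporization heat and the air-state parameter."""
    return Q - a * T

def bet_surface_area(x, W):
    """Fit the linearized BET equation
    x / (W (1 - x)) = 1/(Wm C) + (C - 1)/(Wm C) * x
    to recover the monolayer capacity Wm (g water per g soil), then
    convert to specific surface area in m^2 per g of soil."""
    y = x / (W * (1.0 - x))
    slope, intercept = np.polyfit(x, y, 1)
    Wm = 1.0 / (slope + intercept)           # g/g, since slope+intercept = 1/Wm
    return Wm * N_A * SIGMA / (M_W * 1e3)    # m^2/g

# Potential becomes more negative (drier) as drying temperature rises.
psi = potential_from_temperature(np.array([360.0, 380.0, 400.0]))

# Synthetic isotherm generated from the BET equation itself.
C, Wm_true = 20.0, 0.03
x = np.linspace(0.05, 0.35, 7)               # relative vapor pressure
W = Wm_true * C * x / ((1.0 - x) * (1.0 + (C - 1.0) * x))
area = bet_surface_area(x, W)                # ~106 m^2/g, inside the 10-250 range
```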
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Mi-Ae; Moore, Stephen C.; McQuaid, Sarah J.
Purpose: The authors have previously reported the advantages of high-sensitivity single-photon emission computed tomography (SPECT) systems for imaging structures located deep inside the brain. DaTscan (ioflupane I-123) is a dopamine transporter (DaT) imaging agent that has shown potential for early detection of Parkinson disease (PD), as well as for monitoring progression of the disease. Realizing the full potential of DaTscan requires efficient estimation of striatal uptake from SPECT images. The authors evaluated two SPECT systems for imaging tasks related to PD: a conventional dual-head gamma camera with low-energy high-resolution collimators (conventional) and a dedicated high-sensitivity multidetector cardiac imaging system (dedicated). Methods: Cramer-Rao bounds (CRB) on the precision of estimates of striatal and background activity concentrations were calculated from high-count, separate acquisitions of the compartments (right striata, left striata, background) of a striatal phantom. CRB on striatal and background activity concentration were calculated from essentially noise-free projection datasets, synthesized by scaling and summing the compartment projection datasets, for a range of total detected counts. The authors also calculated variances of estimates of specific-to-nonspecific binding ratios (BR) and asymmetry indices from these values using propagation-of-error analysis, as well as the precision of measuring changes in BR on the order of the average annual decline in early PD. Results: Under typical clinical conditions, the conventional camera detected 2 M counts while the dedicated camera detected 12 M counts. Assuming a normal BR of 5, the standard deviation of BR estimates was 0.042 and 0.021 for the conventional and dedicated system, respectively. For an 8% decrease to BR = 4.6, the signal-to-noise ratios were 6.8 (conventional) and 13.3 (dedicated); for a 5% decrease, they were 4.2 (conventional) and 8.3 (dedicated).
Conclusions: This implies that PD can be detected earlier with the dedicated system than with the conventional system; therefore, earlier identification of PD progression should be possible with the high-sensitivity dedicated SPECT camera.
NASA Astrophysics Data System (ADS)
Nauleau, Pierre; Apostolakis, Iason; McGarry, Matthew; Konofagou, Elisa
2018-06-01
The stiffness of the arteries is known to be an indicator of the progression of various cardiovascular diseases. Clinically, the pulse wave velocity (PWV) is used as a surrogate for arterial stiffness. Pulse wave imaging (PWI) is a non-invasive, ultrasound-based imaging technique capable of mapping the motion of the vessel walls, allowing the local assessment of arterial properties. Conventionally, a distinctive feature of the displacement wave (e.g. the 50% upstroke) is tracked across the map to estimate the PWV. However, the presence of reflections, such as those generated at the carotid bifurcation, can bias the PWV estimation. In this paper, we propose a two-step cross-correlation based method to characterize arteries using the information available in the PWI spatio-temporal map. First, the area under the cross-correlation curve is proposed as an index for locating the regions of different properties. Second, a local peak of the cross-correlation function is tracked to obtain a less biased estimate of the PWV. Three series of experiments were conducted in phantoms to evaluate the capabilities of the proposed method compared with the conventional method. In the ideal case of a homogeneous phantom, the two methods performed similarly and correctly estimated the PWV. In the presence of reflections, the proposed method provided a more accurate estimate than conventional processing: e.g. for the soft phantom, biases of -0.27 and -0.71 m/s were observed. In a third series of experiments, the correlation-based method was able to locate two regions of different properties with an error smaller than 1 mm. It also provided more accurate PWV estimates than conventional processing (biases: -0.12 versus -0.26 m/s). Finally, the in vivo feasibility of the proposed method was demonstrated in eleven healthy subjects. The results indicate that the correlation-based method might be less precise in vivo but more accurate than the conventional method.
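A single-pair version of the correlation-based idea, estimating the transit time as the lag that maximizes the cross-correlation between wall-displacement waveforms recorded at two positions, can be sketched as follows. The signals and numbers are synthetic, and this is a simplification of, not the authors' two-step implementation:

```python
import numpy as np

def pwv_from_xcorr(sig1, sig2, dx, fs):
    """Estimate pulse wave velocity (m/s) from wall-displacement
    waveforms at two positions dx metres apart, sampled at fs Hz.
    The transit time is taken as the lag (in samples) that maximizes
    the full cross-correlation of the two waveforms."""
    corr = np.correlate(sig2, sig1, mode="full")
    lag = np.argmax(corr) - (len(sig1) - 1)   # > 0 when sig2 lags sig1
    return dx * fs / lag

# Synthetic pulse travelling at 5 m/s between two beams 10 mm apart.
fs = 10_000.0                                  # sampling rate (Hz)
t = np.arange(0, 0.05, 1.0 / fs)
pulse = np.exp(-((t - 0.01) / 0.002) ** 2)     # upstroke at position 1
delay = 0.010 / 5.0                            # dx / PWV = 2 ms
pulse_delayed = np.exp(-((t - 0.01 - delay) / 0.002) ** 2)
pwv = pwv_from_xcorr(pulse, pulse_delayed, dx=0.010, fs=fs)
```

Tracking a local correlation peak rather than a waveform feature is what makes the published method more robust to the reflected waves that distort the upstroke.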
Development of Mycoplasma synoviae (MS) core genome multilocus sequence typing (cgMLST) scheme.
Ghanem, Mostafa; El-Gazzar, Mohamed
2018-05-01
Mycoplasma synoviae (MS) is a poultry pathogen with reported increases in prevalence and virulence in recent years. MS strain identification is essential for prevention and control efforts and for epidemiological outbreak investigations. Multiple multilocus sequence typing (MLST) schemes have been developed for MS, yet their resolution can be limited for outbreak investigation. The cost of whole genome sequencing has become close to that of sequencing the seven MLST targets; however, there has been no standardized method for typing MS strains based on whole genome sequences. In this paper, we propose a core genome multilocus sequence typing (cgMLST) scheme as a standardized and reproducible method for typing MS from whole genome sequences. A diverse set of 25 MS whole genome sequences was used to identify 302 core genome genes as cgMLST targets (35.5% of the MS genome), and 44 whole genome sequences of MS isolates from six countries on four continents were typed with this scheme. cgMLST-based phylogenetic trees displayed a high degree of agreement with core genome SNP-based analysis and with available epidemiological information. cgMLST also allowed evaluation of two conventional MLST schemes for MS. The high discriminatory power of cgMLST allowed differentiation between samples of the same conventional MLST type. cgMLST represents a standardized, accurate, highly discriminatory, and reproducible method for differentiating MS isolates. Like conventional MLST, it provides a stable and expandable nomenclature, allowing typing results to be compared and shared between laboratories worldwide. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
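The core comparison in any cgMLST scheme is an allelic distance: each isolate gets a profile of allele numbers, one per core-gene target, and two isolates are compared by counting the targets at which their alleles differ. A toy sketch (the profiles, the 4-locus count, and the missing-locus handling are illustrative, not the published 302-target scheme):

```python
from itertools import combinations

def cgmlst_distances(profiles):
    """Pairwise allelic distances between cgMLST profiles.
    Each profile is a tuple of allele numbers, one per core-gene
    target. Distance = number of targets with differing alleles;
    loci missing in either profile (None) are skipped, a common
    convention in cgMLST comparisons."""
    dist = {}
    for a, b in combinations(sorted(profiles), 2):
        pa, pb = profiles[a], profiles[b]
        dist[(a, b)] = sum(
            1 for x, y in zip(pa, pb)
            if x is not None and y is not None and x != y
        )
    return dist

# Hypothetical isolates typed at 4 core-gene loci.
profiles = {
    "MS-A": (1, 1, 2, 5),
    "MS-B": (1, 1, 2, 5),     # identical profile to MS-A
    "MS-C": (1, 3, 2, None),  # one differing locus, one missing locus
}
d = cgmlst_distances(profiles)
```

A distance matrix like `d` is what feeds the clustering and minimum-spanning-tree views typically built from cgMLST data; the allele numbering itself is what gives the scheme its stable, shareable nomenclature.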
Fang, Tuan-Jen; Li, Hsueh-Yu; Liao, Chun-Ta; Chiang, Hui-Chen; Chen, I-How
2015-07-01
Narrow band imaging (NBI)-guided flexible laryngoscopy tissue sampling for laryngopharyngeal lesions is a novel technique. Patients undergo the procedure unsedated in an office-based setting, unlike the conventional technique, which is performed under direct laryngoscopy. Although the feasibility and effects of the procedure have been established, its financial impact on the institution and on the Taiwanese National Health Insurance program had not been determined. This is a retrospective case-control study. From May 2010 to April 2011, 20 consecutive patients who underwent NBI flexible laryngoscopy tissue sampling were recruited; during the same period, another 20 age-, sex-, and lesion-matched cases were enrolled as a control group. Procedure courses and financial outcomes were analyzed and compared between groups. The office-based NBI flexible laryngoscopy tissue sampling procedure took 27 minutes to complete, versus 191 minutes for the conventional technique. Average reimbursement per case was New Taiwan Dollar (NT$) 1264 for patients undergoing office-based NBI flexible laryngoscopy tissue sampling, versus NT$10,913 for those undergoing conventional direct laryngoscopy in the operating room (p < 0.001). The institution incurred a loss of at least NT$690 when performing NBI flexible laryngoscopy tissue sampling. Office-based NBI flexible laryngoscopy tissue sampling is thus a cost-saving procedure for patients and for the Taiwanese National Health Insurance program, and it also saves procedure time. However, the net financial loss for the institution and physician will limit its adoption unless reimbursement patterns are changed. Copyright © 2013. Published by Elsevier B.V.
Navidian, Ali; Mobaraki, Hajar; Shakiba, Mansour
2017-08-01
To determine the effect of education based on motivational interviewing on self-care behaviors in heart failure patients with depression, 82 patients suffering from heart failure whose depression had been confirmed were selected and divided into two groups. The Self-Care Heart Failure Index was used to evaluate self-care behavior. The intervention group received four sessions of self-care education based on the principles of motivational interviewing, while the control group received four sessions of conventional self-care education. Eight weeks after the interventions ended, the self-care behaviors of both groups were evaluated. Data were analyzed using paired and independent t-tests, chi-square, and analysis of covariance, as appropriate. The average increase in the overall score and in the three sub-scale scores of self-care behavior (maintenance, management, and confidence) among heart failure patients with depression was significantly greater after education based on motivational interviewing than after conventional self-care education (p < 0.05). Motivational interviewing thus had a significant positive effect on self-care behaviors in patients with heart failure and depression, and its use for education in depressed heart failure patients is recommended. Copyright © 2017 Elsevier B.V. All rights reserved.
Proposal for a new categorization of aseptic processing facilities based on risk assessment scores.
Katayama, Hirohito; Toda, Atsushi; Tokunaga, Yuji; Katoh, Shigeo
2008-01-01
Risk assessment of aseptic processing facilities was performed using two published risk assessment tools. Calculated risk scores were compared with experimental test results, including environmental monitoring and media fill run results, in three different types of facilities. The two risk assessment tools gave generally similar outcomes; however, depending on the tool used, variations were observed in the relative scores between facilities. For the facility yielding the lowest risk scores, the corresponding experimental test results showed no contamination, indicating that these ordinary testing methods are insufficient to evaluate this kind of facility. A conventional facility with acceptable aseptic processing lines gave relatively high risk scores, and it was for this facility that the conventional microbiological test methods proved useful. Considering the significant gaps observed between advanced and conventional facilities, both in calculated risk scores and in ordinary microbiological test results, we propose a facility categorization based on risk assessment. The most important risk factor in aseptic processing is human intervention. When human intervention is eliminated from the process by advanced hardware design, the aseptic processing facility can be classified into a new risk category that is better suited to assuring sterility based on a new set of criteria rather than on currently used microbiological analysis. To fully benefit from advanced technologies, we propose three risk categories for these aseptic facilities.
Oparaji, U; Tsai, Y H; Liu, Y C; Lee, K W; Patelli, E; Sheu, R J
2017-06-01
This paper presents improved and extended results of our previous study on corrections for conventional neutron dose meters used in environments with high-energy neutrons (En > 10 MeV). Conventional moderated-type neutron dose meters tend to underestimate the dose contribution of high-energy neutrons because dose conversion coefficients and detection efficiencies follow opposite trends as neutron energy increases. A practical correction scheme was proposed based on analysis of hundreds of neutron spectra in the IAEA-TRS-403 report. By comparing 252Cf-calibrated dose responses with reference values derived from fluence-to-dose conversion coefficients, this study provides recommendations for neutron field characterization and the corresponding dose correction factors. Further sensitivity studies confirm the appropriateness of the proposed scheme and indicate that (1) the spectral correction factors are nearly independent of the choice among three commonly used calibration sources, 252Cf, 241Am-Be, and 239Pu-Be; (2) the derived correction factors for Bonner spheres of various sizes (6"-9") are similar in trend; and (3) practical high-energy neutron indexes based on measurements can be established to facilitate the application of these correction factors in workplaces. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
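The kind of spectrum-dependent correction described above can be sketched as the ratio of the reference dose (fluence weighted by fluence-to-dose conversion coefficients) to the calibrated meter reading (fluence weighted by the meter's energy-dependent response). All numbers below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def correction_factor(phi, response, h_conv, cal_factor):
    """Spectrum-specific correction factor for a moderated-type
    neutron dose meter: reference ambient dose equivalent divided
    by the meter reading.
    phi: relative fluence per energy bin.
    response: meter detection response per bin (arbitrary units).
    h_conv: fluence-to-dose conversion coefficients per bin.
    cal_factor: reading per response unit, fixed by calibration
    against a source such as 252Cf."""
    reference = np.sum(phi * h_conv)              # "true" dose
    reading = cal_factor * np.sum(phi * response)  # indicated dose
    return reference / reading

# Toy 3-group spectrum: thermal, fast, and high-energy (>10 MeV) bins.
phi = np.array([0.3, 0.5, 0.2])           # relative fluence
h_conv = np.array([10.0, 400.0, 550.0])   # conversion coefficients (illustrative)
resp = np.array([8.0, 350.0, 200.0])      # response drops at high energy
cal = 1.15                                # assumed 252Cf-based calibration
cf = correction_factor(phi, resp, h_conv, cal)
# cf > 1 here, reflecting the meter's underestimation of the
# high-energy component of the spectrum.
```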
Luong, Emilie; Shayegan, Amir
2018-01-01
Aim: The aim of this study was to compare microleakage in class V cavities conditioned conventionally with acid etchant versus with an erbium-doped yttrium aluminum garnet (Er:YAG) laser, and to assess and compare the effectiveness of enamel surface treatment of occlusal pits and fissures by acid etching versus Er:YAG laser etching. Materials and methods: Seventy-two extracted third molars were used in this study. The samples were divided into two major groups: class V cavities, and pit and fissure sealants. Each group was subdivided into conventional acid etching, Er:YAG laser conditioning, and conventional acid etching combined with Er:YAG laser conditioning (n=12). The teeth were placed in 2% methylene blue dye solution, sectioned, and evaluated according to dye penetration criteria. Two samples per subgroup were chosen for scanning electron microscopic (SEM) analysis. Results: There was a significant difference between the occlusal and cervical margin groups. The laser-conditioned conventional composite cementum group showed higher microleakage values than the other groups. There was no significant difference among the occlusal margin groups; however, there was a significant difference in microleakage among the cervical margin groups. In the sealant groups, there was a significant difference in microleakage between the laser and the conventional with/without laser treatment groups. Conclusion: Based on the results reported in this study, application of the Er:YAG laser beneath resin composite, resin-modified glass ionomers (GIs), and fissure sealants may be an alternative enamel and dentin etching method to acid etching. PMID:29881311
Cooper, Jennifer A; Loomis, George W; Kalen, David V; Amador, Jose A
2015-05-01
Shallow narrow drainfields are assumed to provide better wastewater renovation than conventional drainfields and are used to protect surface and ground water. To test this assumption, we evaluated the water quality functions of two advanced onsite wastewater treatment system (OWTS) drainfields, shallow narrow (SND) and Geomat (GEO), and a conventional pipe-and-stone (P&S) drainfield over 12 mo using replicated (n = 3) intact soil mesocosms. The SND and GEO mesocosms received effluent from a single-pass sand filter, whereas the P&S received septic tank effluent. Between 97.1 and 100% of 5-d biochemical oxygen demand (BOD), fecal coliform bacteria, and total phosphorus (P) were removed in all drainfield types. Total nitrogen (N) removal averaged 12.0% for P&S, 4.8% for SND, and 5.4% for GEO. A mass balance analysis accounted for 95.1% (SND), 94.1% (GEO), and 87.6% (P&S) of N inputs. When the whole treatment train (excluding the septic tank) is considered, the advanced systems, including sand filter pretreatment and SND or GEO soil-based treatment, removed 99.8 to 99.9% of BOD, 100% of fecal coliform bacteria and P, and 26.0 to 27.0% of N. In contrast, the conventional system removed 99.4% of BOD and 100% of fecal coliform bacteria and P, but only 12.0% of N. All drainfield types performed similarly for most water quality functions despite differences in placement within the soil profile. However, inclusion of the pretreatment step in the advanced treatment trains results in better N removal than in the conventional system, despite the higher drainfield N removal rate of the latter. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Ronco, Guglielmo; Brezzi, Silvia; Carozzi, Francesca; Dalla Palma, Paolo; Giorgi-Rossi, Paolo; Minucci, Daria; Naldoni, Carlo; Segnan, Nereo; Zappa, Marco; Zorzi, Manuel; Cuzick, Jack
2007-10-01
To study the impact of different cervical cancer screening strategies including HPV testing, a randomised controlled trial was conducted with a conventional arm (conventional cytology) and an experimental arm in two phases (first, HPV testing plus conventional cytology; second, HPV testing alone). In phase one, different protocols were applied to different age groups (25-34 and 35-60). Published data on test accuracy during phase one of recruitment are summarised. 45,307 women were recruited in phase one (about 95,000 overall). In the age group 35-60, HPV testing (by Hybrid Capture 2) alone at the 2 RLU cut-off increased sensitivity vs. conventional cytology (relative sensitivity 1.41; 95% CI: 0.98-1.02) with a small loss in positive predictive value (PPV; relative PPV 0.75; 95% CI: 0.45-1.25). Adding liquid-based cytology as a screening test and referring to colposcopy women positive on either test only marginally increased sensitivity but strongly reduced PPV. In the age group 25-34, similar results (relative sensitivity vs. conventional cytology 1.58; 95% CI: 1.03-2.44; relative PPV 0.78; 95% CI: 0.72-1.16) were obtained, even though 14% of women were HPV positive, with a strategy based on HPV alone as the screening test, triaging HPV-positive women by cytology, directly referring those ASCUS+ to colposcopy, and repeating both tests after 1 year in those with normal cytology. HPV testing, if used as a screening test, should be applied alone, with cytology triage essential in younger women and preferable at all ages. Follow-up data will allow analysis of the safety of prolonging screening intervals and of the relative persistence of lesions detected with different methods.
NASA Astrophysics Data System (ADS)
Syifahayu
2017-02-01
The study was motivated by teaching and learning problems arising from the conventional method used in science instruction, which gave students few opportunities to develop their competence and thinking skills; as a result, students had little opportunity to improve their critical attitude and creative thinking skills. To address this problem, the study used a Project-Based Learning model through inquiry-based science education about the environment. It also used the "Saling Temas" approach (Sains Lingkungan dan Teknologi Masyarakat, i.e., environmental science and technology in society), which promoted local content in Lampung as a theme for integrated science teaching and learning. The study was quasi-experimental with a pretest-posttest control group design. Initially, the subjects were given a pre-test; the experimental group then received the inquiry learning method while the control group received conventional learning, and after the learning process both groups were given a post-test. Quantitative analysis was performed using the Mann-Whitney U-test, alongside qualitative description. Based on the results, the environmental literacy skills of students who received the inquiry learning strategy, with the project-based learning model on the theme of soil washing, showed significant differences: the experimental group outperformed the control group, with a two-tailed p-value below α = 0.05 and average N-gains of 34.72 for the experimental group versus 16.40 for the control group. The learning process also became more meaningful.
Task-based statistical image reconstruction for high-quality cone-beam CT
NASA Astrophysics Data System (ADS)
Dang, Hao; Webster Stayman, J.; Xu, Jennifer; Zbijewski, Wojciech; Sisniega, Alejandro; Mow, Michael; Wang, Xiaohui; Foos, David H.; Aygun, Nafi; Koliatsos, Vassilis E.; Siewerdsen, Jeffrey H.
2017-11-01
Task-based analysis of medical imaging performance underlies many ongoing efforts in the development of new imaging systems. In statistical image reconstruction, regularization is often formulated to encourage smoothness and/or sharpness (e.g. a linear, quadratic, or Huber penalty) but without explicit formulation of the task. We propose an alternative regularization approach in which a spatially varying penalty is determined that maximizes task-based imaging performance at every location in a 3D image. We apply the method to model-based image reconstruction (MBIR, viz. penalized weighted least-squares, PWLS) in cone-beam CT (CBCT) of the head, focusing on the task of detecting a small, low-contrast intracranial hemorrhage (ICH), and we test the performance of the algorithm in the context of a recently developed CBCT prototype for point-of-care imaging of brain injury. Theoretical predictions of local spatial resolution and noise are computed via an optimization in which the regularization (specifically, the quadratic penalty strength) is allowed to vary throughout the image to maximize the local task-based detectability index (d'). Simulation studies and test-bench experiments were performed using an anthropomorphic head phantom. Three PWLS implementations were tested: a conventional (constant) penalty; a certainty-based penalty derived to enforce a constant point-spread function (PSF); and the task-based penalty derived to maximize local detectability at each location. Conventional (constant) regularization exhibited a fairly strong degree of spatial variation in d', and the certainty-based method achieved a uniform PSF, but each exhibited a reduction in detectability compared to the task-based method, which improved detectability by up to ~15%. The improvement was strongest in areas of high attenuation (skull base), where the conventional and certainty-based methods tended to over-smooth the data.
The task-driven reconstruction method presents a promising regularization method in MBIR by explicitly incorporating task-based imaging performance as the objective. The results demonstrate improved ICH conspicuity and support the development of high-quality CBCT systems.
de Almeida, Marcos E; Koru, Ozgur; Steurer, Francis; Herwaldt, Barbara L; da Silva, Alexandre J
2017-01-01
Leishmaniasis in humans is caused by Leishmania spp. in the subgenera Leishmania and Viannia. Species identification often has clinical relevance. Until recently, our laboratory relied on conventional PCR amplification of the internal transcribed spacer 2 (ITS2) region (ITS2-PCR) followed by sequencing analysis of the PCR product to differentiate Leishmania spp. Here we describe a novel real-time quantitative PCR (qPCR) approach based on SYBR green technology (LSG-qPCR), which uses genus-specific primers that target the ITS1 region and amplify DNA from at least 10 Leishmania spp., followed by analysis of the melting temperature (Tm) of the amplicons on qPCR platforms (the Mx3000P qPCR system [Stratagene-Agilent] and the 7500 real-time PCR system [ABI Life Technologies]). We initially evaluated the assay by testing reference Leishmania isolates and comparing the results with those from the conventional ITS2-PCR approach. We then compared results from the real-time and conventional molecular approaches for clinical specimens from 1,051 patients submitted to the reference laboratory of the Centers for Disease Control and Prevention for Leishmania diagnostic testing. Specimens from 477 patients tested positive for Leishmania spp. with the LSG-qPCR assay; specimens from 465 of these 477 patients also tested positive with the conventional ITS2-PCR approach, and specimens from 10 of these 465 patients had positive results because of retesting prompted by LSG-qPCR positivity. On the basis of the Tm values of the LSG-qPCR amplicons from reference and clinical specimens, we were able to differentiate four groups of Leishmania parasites: the Viannia subgenus in aggregate; the Leishmania (Leishmania) donovani complex in aggregate; the species L. (L.) tropica; and the species L. (L.) mexicana, L. (L.) amazonensis, L. (L.) major, and L. (L.) aethiopica in aggregate. Copyright © 2016 American Society for Microbiology.
da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Senna, Kátia Marie Simões e.; Tura, Bernardo Rangel; Goulart, Marcelo Correia
2014-01-01
Objectives: The study performs a cost-effectiveness analysis of procedures for atrial septal defect occlusion, comparing conventional surgery to a percutaneous septal implant. Methods: An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature and validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defect occlusion at each branch was taken as the effectiveness outcome. Direct medical costs and probabilities for each event were entered into the model using data available from the Brazilian public sector database system and information extracted from the literature review, using a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. Results: The decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34, with a reduction in the probability of surgery occurrence in 93% of cases. The probability of atrial septal defect occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. Conclusions: The proposed decision model seeks to fill a void in the academic literature and includes the outcomes with the greatest impact on the overall costs of the procedure. Atrial septal defect occlusion using a percutaneous implant reduces physical and psychological distress for patients relative to conventional surgery, which represents intangible costs in the context of economic evaluation. PMID:25302806
Martínez Bueno, María Jesús; Díaz-Galiano, Francisco José; Rajski, Łukasz; Cutillas, Víctor; Fernández-Alba, Amadeo R
2018-04-20
In the last decade, the consumption of organic food has increased dramatically worldwide. However, the lack of reliable chemical markers to discriminate between organic and conventional products makes this market susceptible to food fraud in products labeled as "organic". The metabolomic fingerprinting approach has been demonstrated to be the best option for a full characterization of the metabolome occurring in plants, since its pattern may reflect the impact of both endogenous and exogenous factors. In the present study, advanced technologies based on high-performance liquid chromatography coupled to high-resolution accurate mass spectrometry (HPLC-HRAMS) were used to search for markers in organic and conventional tomatoes grown in a greenhouse under controlled agronomic conditions. The screening of unknown compounds comprised retrospective analysis of all tomato samples throughout the study period and data processing using databases (mzCloud, ChemSpider and PubChem). In addition, stable nitrogen isotope analysis (δ15N) was assessed as a possible indicator to support discrimination between the two production systems using crop/fertilizer correlations. Pesticide residue analyses were also applied as a well-established way to evaluate organic production. Finally, combined chemometric evaluation of the high-resolution accurate mass spectrometry (HRAMS) and δ15N data provided a robust classification model in accordance with the agricultural practices. Principal component analysis (PCA) showed sample clustering according to farming system, and significant differences in the sample profiles were observed for six bioactive components (L-tyrosyl-L-isoleucyl-L-threonyl-L-threonine, trilobatin, phloridzin, tomatine, phloretin and echinenone). Copyright © 2018 Elsevier B.V. All rights reserved.
Fuel Cell Thermal Management Through Conductive Cooling Plates
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.; Burke, Kenneth A.
2008-01-01
An analysis was performed to evaluate the concept of utilizing conductive cooling plates to remove heat from a fuel cell stack, as opposed to a conventional internal cooling loop. The potential advantages of this type of cooling system are reduced stack complexity and weight and increased reliability through the reduction of the number of internal fluid seals. The conductive cooling plates would extract heat from the stack, transferring it to an external coolant loop. The analysis was performed to determine the required thickness of these plates. The analysis was based on an energy balance between the thermal energy produced within the stack and the heat removal from the cooling plates. To accomplish the energy balance, the heat flow into and along the plates to the cooling fluid was modeled. Results were generated for various numbers of cells being cooled by a single cooling plate. The results provided cooling plate thickness, mass, and operating temperature of the plates. It was determined that utilizing high-conductivity pyrolytic graphite cooling plates can provide a specific cooling capacity (W/kg) equivalent to or potentially greater than a conventional internal cooling loop system.
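The core of the plate-sizing energy balance is Fourier's law for conduction along the plate: the waste heat Q of the cells served by one plate must flow through the plate cross-section (thickness t times width w) over the conduction path length L within an allowed temperature rise. The material and stack numbers below are illustrative assumptions, not values from the NASA analysis (pyrolytic graphite's in-plane conductivity of roughly 1500 W/m·K is the only cited material fact).

```python
# Solve Q = k * (t * w) * dT / L for the required plate thickness t.
# All stack parameters are assumed for illustration.

k_graphite = 1500.0   # W/(m*K), in-plane conductivity of pyrolytic graphite
q_per_cell = 50.0     # W of waste heat per cell (assumed)
cells_per_plate = 2   # cells cooled by a single plate (assumed)
L = 0.10              # m, conduction path from stack center to coolant edge
w = 0.10              # m, plate width
dT = 10.0             # K, allowed temperature rise along the path

Q = q_per_cell * cells_per_plate          # total heat carried by one plate
t = Q * L / (k_graphite * w * dT)         # required thickness, meters
t_mm = t * 1000.0                         # ~6.7 mm for these assumptions
```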
Local Instruction Theory (LIT) on spherical geometry for enhancement students’ strategic competence
NASA Astrophysics Data System (ADS)
Nuraida, I.; Kusumah, Y. S.; Kartasasmita, B. G.
2018-03-01
This research focused on the analysis of spherical geometry materials in an effort to enhance students' strategic competence and to produce a learning trajectory, because the materials currently used fail to convey the underlying concepts to students. Learning materials designed with a Local Instruction Theory (LIT) can enhance students' strategic competence. This research aims to study differences in achievement and in the improvement of strategic competence between students taught through Realistic Mathematics Education (RME) with LIT and students taught through conventional learning. This is a design research study with two cycles and three phases: 1) preparing for the experiment (preliminary); 2) teaching experiment; 3) retrospective analysis. The population was the ninth grade of Junior High School 1 Rajapolah, with classes IXg and IXj as samples. Analysis of the data shows that, grouped by Mathematical Prior Knowledge (MPK), students who received RME with LIT attained higher learning achievement and greater enhancement of mathematical strategic competence than students who received conventional learning.
ERIC Educational Resources Information Center
Di Mare, Lesley A.
1987-01-01
Examines Jesse Jackson's rhetorical strategy of functionalizing conflict among divisive Democrats during the 1984 national convention. Applies conflict theory to Jackson's convention address, which serves as the basis for this rhetorical analysis. (JD)
Modelling, design and stability analysis of an improved SEPIC converter for renewable energy systems
NASA Astrophysics Data System (ADS)
G, Dileep; Singh, S. N.; Singh, G. K.
2017-09-01
In this paper, a detailed modelling and analysis of a switched-inductor (SI)-based improved single-ended primary inductor converter (SEPIC) is presented. To increase the gain of the conventional SEPIC converter, the input- and output-side inductors are replaced with SI structures. Design and stability analysis for continuous conduction mode operation of the proposed SI-SEPIC converter is also presented. The state-space averaging technique is used to model the converter and carry out the stability analysis. Performance and stability of the closed-loop configuration are predicted by observing the open-loop behaviour using the Nyquist diagram and Nichols chart. The system was found to be stable and critically damped.
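For context, the conventional SEPIC voltage gain in continuous conduction mode is M(D) = D/(1 − D); the switched-inductor cells in the paper raise this baseline gain. The sketch below evaluates only the conventional baseline; the SI-SEPIC gain expression is derived in the paper and is not reproduced here.

```python
# Ideal CCM voltage gain of a conventional SEPIC converter,
# evaluated at a few duty cycles as a baseline for comparison.

def sepic_gain(duty):
    """M(D) = D / (1 - D) for 0 <= D < 1."""
    if not 0.0 <= duty < 1.0:
        raise ValueError("duty cycle must be in [0, 1)")
    return duty / (1.0 - duty)

gains = {d: sepic_gain(d) for d in (0.25, 0.5, 0.75)}
# Gain grows without bound as D -> 1, but parasitics limit this
# in practice, which motivates gain-extension cells such as SI.
```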
Kin Wong, Kenny; Chiu, Rose; Tang, Betty; Mak, Donald; Liu, Joanne; Chiu, Siu Ning
2008-01-01
Supported employment is an evidence-based practice that has proved to be consistently more effective than conventional vocational rehabilitation in helping people with severe mental illness find and sustain competitive employment. Most research on the effectiveness of supported employment comes from the United States. This study examined the effectiveness and applicability of a supported employment program based on the individual placement and support model in a Hong Kong setting. Ninety-two unemployed individuals with long-term mental illness who desired competitive employment were randomly assigned to either a supported employment program or a conventional vocational rehabilitation program and followed up for 18 months. Both vocational and nonvocational outcomes were measured. Over the 18-month study period, compared with participants in the conventional vocational rehabilitation program, those in the supported employment group were more likely to work competitively (70% versus 29%; odds ratio=5.63, 95% confidence interval=2.28-13.84), held a greater number of competitive jobs, earned more income, worked more days, and sustained longer job tenures. Repeated-measures analysis of variance found no substantive differences between participants in the two groups and no significant change from baseline over time for psychiatric symptoms and self-perceived quality of life. Consistent with previous research findings in the United States, the supported employment program was more effective than the conventional vocational rehabilitation program in helping individuals with long-term mental illness find and sustain competitive employment in a Hong Kong setting. The supported employment program based on the individual placement and support model can thus be recommended for wider use in local mental health practice.
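The reported effect size can be approximately reproduced from the abstract's own percentages (70% vs 29% competitive employment). The exact participant counts per arm are not given in the abstract, so this is an approximation from rounded proportions rather than the authors' exact calculation.

```python
# Odds ratio from the published proportions. Rounded inputs give
# ~5.7, consistent with the published OR of 5.63 (computed by the
# authors from the exact counts).

def odds(p):
    return p / (1.0 - p)

p_supported, p_conventional = 0.70, 0.29
odds_ratio = odds(p_supported) / odds(p_conventional)
```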
Frederix, Ines; Hansen, Dominique; Coninx, Karin; Vandervoort, Pieter; Vandijck, Dominique; Hens, Niel; Van Craenenbroeck, Emeline; Van Driessche, Niels; Dendale, Paul
2016-05-01
Notwithstanding the cardiovascular disease epidemic, current budgetary constraints do not allow for budget expansion of conventional cardiac rehabilitation programmes. Consequently, there is an increasing need for cost-effectiveness studies of alternative strategies such as telerehabilitation. The present study evaluated the cost-effectiveness of a comprehensive cardiac telerehabilitation programme. This multi-centre randomized controlled trial comprised 140 cardiac rehabilitation patients, randomized (1:1) to a 24-week telerehabilitation programme in addition to conventional cardiac rehabilitation (intervention group) or to conventional cardiac rehabilitation alone (control group). The incremental cost-effectiveness ratio was calculated based on intervention and health care costs (incremental cost), and the differential incremental quality adjusted life years (QALYs) gained. The total average cost per patient was significantly lower in the intervention group (€2156 ± €126) than in the control group (€2720 ± €276) (p = 0.01) with an overall incremental cost of €-564.40. Dividing this incremental cost by the baseline adjusted differential incremental QALYs (0.026 QALYs) yielded an incremental cost-effectiveness ratio of €-21,707/QALY. The number of days lost due to cardiovascular rehospitalizations in the intervention group (0.33 ± 0.15) was significantly lower than in the control group (0.79 ± 0.20) (p = 0.037). This paper shows the addition of cardiac telerehabilitation to conventional centre-based cardiac rehabilitation to be more effective and efficient than centre-based cardiac rehabilitation alone. These results are useful for policy makers charged with deciding how limited health care resources should best be allocated in the era of exploding need. © The European Society of Cardiology 2015.
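The reported incremental cost-effectiveness ratio can be reproduced directly from the abstract's own figures: the incremental cost of €−564.40 divided by the baseline-adjusted incremental gain of 0.026 QALYs.

```python
# ICER = incremental cost / incremental QALYs, using the values
# stated in the abstract above.

incremental_cost = -564.40    # euros, intervention minus control
incremental_qalys = 0.026     # baseline-adjusted QALY gain
icer = incremental_cost / incremental_qalys   # ~ -21,707 EUR/QALY

# A negative ICER with a positive QALY gain means the intervention
# "dominates": it is both cheaper and more effective than control.
```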
Neubauer, Aljoscha S; Langer, Julian; Liegl, Raffael; Haritoglou, Christos; Wolf, Armin; Kozak, Igor; Seidensticker, Florian; Ulbig, Michael; Freeman, William R; Kampik, Anselm; Kernt, Marcus
2013-01-01
The purpose of this study was to evaluate and compare clinical outcomes and retreatment rates using navigated macular laser versus conventional laser for the treatment of diabetic macular edema (DME). In this prospective, interventional pilot study, 46 eyes from 46 consecutive patients with DME were allocated to receive macular laser photocoagulation using navigated laser. Best corrected visual acuity and retreatment rate were evaluated for up to 12 months after treatment. The control group was drawn based on chart review of 119 patients treated by conventional laser at the same institutions during the same time period. Propensity score matching was performed with Stata, based on the nearest-neighbor method. Propensity score matching for age, gender, baseline visual acuity, and number of laser spots yielded 28 matched patients for the control group. Visual acuity after navigated macular laser improved from a mean 0.48 ± 0.37 logMAR by a mean +2.9 letters after 3 months, while the control group showed a mean -4.0 letters (P = 0.03). After 6 months, navigated laser maintained a mean visual gain of +3.3 letters, and the conventional laser group showed a slower mean increase to +1.9 letters versus baseline. Using Kaplan-Meier analysis, the laser retreatment rate showed separation of the survival curves after 2 months, with fewer retreatments in the navigated group than in the conventional laser group during the first 8 months (18% versus 31%, respectively, P = 0.02). The short-term results of this pilot study suggest that navigated macular photocoagulation is an effective technique and could be considered as a valid alternative to conventional slit-lamp laser for DME when focal laser photocoagulation is indicated. The observed lower retreatment rates with navigated retinal laser therapy in the first 8 months suggest a more durable treatment effect.
McHugh, Lauren E J; Politi, Ioanna; Al-Fodeh, Rami S; Fleming, Garry J P
2017-09-01
To assess the cuspal deflection of standardised large mesio-occluso-distal (MOD) cavities in third molar teeth restored using conventional resin-based composites (RBCs) or their bulk fill restorative counterparts, compared with the unbound condition, using a twin-channel deflection measuring gauge. Following thermocycling, the cervical microleakage of the restored teeth was assessed to determine marginal integrity. Standardised MOD cavities were prepared in forty-eight sound third molar teeth and randomly allocated to six groups. Restorations were placed with (and without) a universal bonding system, and resin restorative materials were irradiated with a light-emitting-diode light-curing-unit. The dependent variable was the restoration protocol: eight oblique increments for conventional RBCs or two horizontal increments for the bulk fill resin restoratives. The cumulative buccal and palatal cuspal deflections from a twin-channel deflection measuring gauge were summed, and the restored teeth were thermally fatigued, immersed in 0.2% basic fuchsin dye for 24h, sectioned and examined for cervical microleakage score. One-way analysis of variance (ANOVA) identified that third molar teeth restored using conventional RBC materials had significantly higher mean total cuspal deflection values compared with bulk fill resin restorative restorations (all p<0.0001). Among the conventional RBCs, Admira Fusion (bonded) restored teeth had the significantly lowest microleakage scores (all p<0.001), while the Admira Fusion x-tra (bonded) bulk fill resin restored teeth had the significantly lowest microleakage scores compared with Tetric EvoCeram Bulk Fill (bonded and non-bonded) teeth (all p<0.001). Not all conventional RBCs or bulk fill resin restoratives behave in a similar manner when used to restore standardised MOD cavities in third molar teeth.
It would appear that light irradiation of individual conventional RBCs or bulk fill resin restoratives may be problematic such that material selection is vital in the absence of clinical data. Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Using machine learning to assess covariate balance in matching studies.
Linden, Ariel; Yarnold, Paul R
2016-12-01
In order to assess the effectiveness of matching approaches in observational studies, investigators typically present summary statistics for each observed pre-intervention covariate, with the objective of showing that matching reduces the difference in means (or proportions) between groups to as close to zero as possible. In this paper, we introduce a new approach to distinguish between study groups based on their distributions of the covariates using a machine-learning algorithm called optimal discriminant analysis (ODA). Assessing covariate balance using ODA as compared with the conventional method has several key advantages: the ability to ascertain how individuals self-select based on optimal (maximum-accuracy) cut-points on the covariates; the application to any variable metric and number of groups; its insensitivity to skewed data or outliers; and the use of accuracy measures that can be widely applied to all analyses. Moreover, ODA accepts analytic weights, thereby extending the assessment of covariate balance to any study design where weights are used for covariate adjustment. By comparing the two approaches using empirical data, we are able to demonstrate that using measures of classification accuracy as balance diagnostics produces results highly consistent with those obtained via the conventional approach (in our matched-pairs example, ODA revealed a weak statistically significant relationship not detected by the conventional approach). Thus, investigators should consider ODA as a robust complement, or perhaps alternative, to the conventional approach for assessing covariate balance in matching studies. © 2016 John Wiley & Sons, Ltd.
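The core of the maximum-accuracy idea can be illustrated with a minimal sketch: exhaustively search one covariate for the cut-point that best separates two groups. The data are synthetic and the real ODA adds analytic weights, multiple metrics, and permutation-based p-values; this is only the central search step.

```python
# Minimal illustration of a maximum-accuracy cut-point search on a
# single covariate, in the spirit of optimal discriminant analysis.

def best_cutpoint(values, groups):
    """Return (cutpoint, accuracy) maximizing accuracy of the rule
    'predict group 1 when value > cutpoint' (or its reverse)."""
    n = len(values)
    best = (None, 0.0)
    for cut in sorted(set(values)):
        correct = sum((v > cut) == (g == 1) for v, g in zip(values, groups))
        acc = max(correct, n - correct) / n   # allow either direction
        if acc > best[1]:
            best = (cut, acc)
    return best

# Synthetic covariate: the treated group tends to have higher values.
values = [1.0, 1.5, 2.0, 2.2, 3.0, 3.1, 3.5, 4.0]
groups = [0,   0,   0,   1,   0,   1,   1,   1]
cut, acc = best_cutpoint(values, groups)
# High accuracy here signals imbalance on the covariate; accuracy
# near chance would indicate the groups are well balanced.
```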
Temporal Noise Analysis of Charge-Domain Sampling Readout Circuits for CMOS Image Sensors.
Ge, Xiaoliang; Theuwissen, Albert J P
2018-02-27
This paper presents a temporal noise analysis of charge-domain sampling readout circuits for Complementary Metal-Oxide Semiconductor (CMOS) image sensors. In order to address the trade-off between the low input-referred noise and high dynamic range, a Gm-cell-based pixel together with a charge-domain correlated-double sampling (CDS) technique has been proposed to provide a way to efficiently embed a tunable conversion gain along the read-out path. Such a readout topology, however, operates in a non-stationary large-signal regime, and the statistical properties of its temporal noise are a function of time. Conventional noise analysis methods for CMOS image sensors are based on steady-state signal models, and therefore cannot be readily applied to Gm-cell-based pixels. In this paper, we develop analysis models for both thermal noise and flicker noise in Gm-cell-based pixels by employing the time-domain linear analysis approach and the non-stationary noise analysis theory, which help to quantitatively evaluate the temporal noise characteristic of Gm-cell-based pixels. Both models were numerically computed in MATLAB using design parameters of a prototype chip, and compared with both simulation and experimental results. The good agreement between the theoretical and measurement results verifies the effectiveness of the proposed noise analysis models.
Lu, Yingjian; Gao, Boyan; Chen, Pei; Charles, Denys; Yu, Liangli (Lucy)
2014-01-01
Sweet basil, Ocimum basilicum, is one of the most important and widely used spices and has been shown to have antioxidant, antibacterial, and anti-diarrheal activities. In this study, high performance liquid chromatographic (HPLC) and flow-injection mass spectrometric (FIMS) fingerprinting techniques were used to differentiate organic and conventional sweet basil leaf samples. Principal component analysis (PCA) of the fingerprints indicated that both HPLC and FIMS fingerprints could effectively detect the chemical differences between the organic and conventional sweet basil leaf samples. This study suggested that the organic basil sample contained greater concentrations of almost all the major compounds than its conventional counterpart on a per same botanical weight basis. The FIMS method was able to rapidly differentiate the organic and conventional sweet basil leaf samples (1 min analysis time), whereas the HPLC fingerprints provided more information about the chemical composition of the basil samples with a longer analytical time. PMID:24518341
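The PCA step used in fingerprint studies like the one above can be sketched compactly: project mean-centered sample-by-feature intensity matrices onto their leading principal components and look for group clustering. The fingerprint matrix below is synthetic (samples × chemical features), standing in for HPLC or FIMS intensity profiles; it is not data from the study.

```python
# PCA via SVD of a mean-centered synthetic fingerprint matrix:
# 5 "organic" and 5 "conventional" samples over 20 features, with
# a concentration offset on every feature.

import numpy as np

rng = np.random.default_rng(0)
organic = rng.normal(loc=1.0, scale=0.1, size=(5, 20))
conventional = rng.normal(loc=0.6, scale=0.1, size=(5, 20))
X = np.vstack([organic, conventional])

Xc = X - X.mean(axis=0)                    # mean-center columns
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                     # first two PC scores

pc1_organic = scores[:5, 0].mean()
pc1_conventional = scores[5:, 0].mean()
separation = abs(pc1_organic - pc1_conventional)
# With a systematic offset, PC1 places the two farming systems on
# opposite sides of the origin, i.e. the clustering seen in PCA plots.
```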
Burkowitz, Jörg; Merzenich, Carina; Grassme, Kathrin; Brüggenjürgen, Bernd
2016-08-01
Insertable or implantable cardiac monitors (ICMs) continuously monitor the heart rhythm and record irregularities over 3 years, enabling the diagnosis of infrequent rhythm abnormalities associated with syncope and stroke. The enhanced recognition capabilities of recent ICM models are able to accurately detect atrial fibrillation (AF) and have led to new applications of ICMs for the detection and monitoring of AF. Based on a systematic literature search, two indications were identified for ICMs for which considerable evidence, including randomized studies, exists: diagnosing the underlying cardiac cause of unexplained recurrent syncope and detecting AF in patients after cryptogenic stroke (CS). Three randomized controlled trials (RCTs) were identified that compared the effectiveness of ICMs in diagnosing patients with unexplained syncope (n = 556) to standard of care. A meta-analysis was conducted in order to generate an overall effect size and confidence interval of the diagnostic yield of ICMs versus conventional monitoring. In the indication CS, one RCT and five observational studies were included in order to assess the performance of ICMs in diagnosing patients with AF (n = 1129). Based on these studies, there is strong evidence that ICMs provide a higher diagnostic yield for detecting arrhythmias in patients with unexplained syncope and for detection of AF in patients after CS compared to conventional monitoring. Prolonged monitoring with ICMs is an effective tool for diagnosing the underlying cardiac cause of unexplained syncope and for detecting AF in patients with CS. In all RCTs, ICMs have a superior diagnostic yield compared to conventional monitoring. © The European Society of Cardiology 2016.
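A meta-analysis of diagnostic yield like the one described typically pools per-study effects by inverse-variance weighting on the log scale. The study effects and standard errors below are synthetic placeholders, not the three syncope RCTs analyzed by the authors.

```python
# Fixed-effect inverse-variance pooling of log odds ratios,
# with a 95% CI on the pooled estimate. Inputs are hypothetical.

import math

studies = [   # (log OR, standard error), synthetic
    (1.10, 0.40),
    (0.85, 0.35),
    (1.30, 0.50),
]

weights = [1.0 / se**2 for _, se in studies]
pooled_log_or = sum(w * lor for (lor, _), w in zip(studies, weights)) / sum(weights)
pooled_or = math.exp(pooled_log_or)

se_pooled = (1.0 / sum(weights)) ** 0.5
ci = (math.exp(pooled_log_or - 1.96 * se_pooled),
      math.exp(pooled_log_or + 1.96 * se_pooled))
```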
Tofangchiha, Maryam; Adel, Mamak; Bakhshi, Mahin; Esfehani, Mahsa; Nazeman, Pantea; Ghorbani Elizeyi, Mojgan; Javadi, Amir
2013-01-01
Vertical root fracture (VRF) is a complication which is chiefly diagnosed radiographically. Recently, film-based radiography has been substituted with digital radiography. At the moment, there is a wide range of monitors available in the market for viewing digital images. The present study aims to compare the diagnostic accuracy, sensitivity and specificity of medical-grade and conventional monitors in the detection of vertical root fractures. In this in vitro study, 228 extracted single-rooted human teeth were endodontically treated. Vertical root fractures were induced in 114 samples. The teeth were imaged by a digital charge-coupled device radiography system using the parallel technique. The images were evaluated twice by a radiologist and an endodontist on medical-grade and conventional liquid-crystal display (LCD) monitors. A Z-test was used to analyze the sensitivity, accuracy and specificity of each monitor. The significance level was set at 0.05. Inter- and intra-observer agreements were calculated by Cohen's kappa. Accuracy, specificity and sensitivity for the conventional monitor were 67.5%, 72% and 62.5%, respectively; the corresponding values for the medical-grade monitor were 67.5%, 66.5% and 68%. Statistical analysis showed no significant differences in detecting VRF between the two techniques. Inter-observer agreement for the conventional and medical-grade monitors was 0.47 and 0.55, respectively (moderate). Intra-observer agreement was 0.78 for the medical-grade monitor and 0.87 for the conventional one (substantial). The type of monitor does not influence the diagnosis of vertical root fractures.
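The reported figures can be cross-checked from the study design: 228 teeth, 114 with an induced fracture (positives) and 114 without (negatives). The counts below are back-calculated from the reported percentages for the conventional monitor, so they are approximate; the published values come from the readers' actual scores.

```python
# Sensitivity, specificity, and accuracy from (approximate) counts
# back-calculated for the conventional monitor.

positives = negatives = 114
true_pos = round(0.625 * positives)   # sensitivity 62.5% -> ~71 detected
true_neg = round(0.72 * negatives)    # specificity 72%   -> ~82 ruled out

sensitivity = true_pos / positives
specificity = true_neg / negatives
accuracy = (true_pos + true_neg) / (positives + negatives)
# accuracy comes out ~0.67, matching the reported 67.5% to rounding.
```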
Mapping brain activity in gradient-echo functional MRI using principal component analysis
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Singh, Manbir; Don, Manuel
1997-05-01
The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noise. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. The technique is well suited to EPI image sequences, where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two PCA-based methods to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique in which a single image sequence with alternating on and off stages is subjected to principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis (CSF) technique. As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential in detecting true stimulus-correlated changes in the presence of other interfering signals.
Training Needs Analysis: Weaknesses in the Conventional Approach.
ERIC Educational Resources Information Center
Leat, Michael James; Lovel, Murray Jack
1997-01-01
Identification of the training and development needs of administrative support staff is not aided by conventional performance appraisal, which measures summary or comparative effectiveness. Meaningful diagnostic evaluation integrates three levels of analysis (organization, task, and individual), using behavioral expectation scales. (SK)
Wang, Pengfei; Wu, Siyu; Tian, Cheng; Yu, Guimei; Jiang, Wen; Wang, Guansong; Mao, Chengde
2016-10-11
Current tile-based DNA self-assembly produces simple repetitive or highly symmetric structures. In the case of 2D lattices, the unit cell often contains only one basic tile because the tiles often are symmetric (in terms of either the backbone or the sequence). In this work, we have applied retrosynthetic analysis to determine the minimal asymmetric units for complex DNA nanostructures. Such analysis guides us to break the intrinsic structural symmetries of the tiles to achieve high structural complexities. This strategy has led to the construction of several DNA nanostructures that are not accessible from conventional symmetric tile designs. Along with previous studies, herein we have established a set of four fundamental rules regarding tile-based assembly. Such rules could serve as guidelines for the design of DNA nanostructures.
Analysis of biofluids in aqueous environment based on mid-infrared spectroscopy.
Fabian, Heinz; Lasch, Peter; Naumann, Dieter
2005-01-01
In this study we describe a semiautomatic Fourier transform infrared spectroscopic methodology for the analysis of liquid serum samples, which combines simple sample introduction with high sample throughput. The applicability of this new infrared technology to the analysis of liquid serum samples from a cohort of cattle naturally infected with bovine spongiform encephalopathy and from controls was explored in comparison to the conventional approach based on transmission infrared spectroscopy of dried serum films. Artificial neural network analysis of the infrared data was performed to differentiate between bovine spongiform encephalopathy-negative controls and animals in the late stage of the disease. After training of artificial neural network classifiers, infrared spectra of sera from an independent external validation data set were analyzed. In this way, sensitivities between 90 and 96% and specificities between 84 and 92% were achieved, respectively, depending upon the strategy of data collection and data analysis. Based on these results, the advantages and limitations of the liquid sample technique and the dried film approach for routine analysis of biofluids are discussed. 2005 Society of Photo-Optical Instrumentation Engineers.
Al-mejrad, Lamya A.; Albarrag, Ahmed M.
2017-01-01
PURPOSE The goal of this study was to compare the adhesion of Candida albicans to the surfaces of CAD/CAM and conventionally fabricated complete denture bases. MATERIALS AND METHODS Twenty discs of acrylic resin poly (methyl methacrylate) were fabricated with CAD/CAM and conventional procedures (heat-polymerized acrylic resin). The specimens were divided into two groups: 10 discs were fabricated using the CAD/CAM procedure (Wieland Digital Denture, Ivoclar Vivadent), and 10 discs were fabricated using a conventional flasking and pressure-pack technique. Candida colonization was performed on all the specimens using four Candida albicans isolates. The difference in Candida albicans adhesion on the discs was evaluated. The number of adherent yeast cells was calculated by colony-forming unit (CFU) counts and by fluorescence microscopy. RESULTS There was a significant difference between the adhesion of Candida albicans to the complete denture bases created with CAD/CAM and the adhesion to those created with the conventional procedure. The CAD/CAM denture bases exhibited less adhesion of Candida albicans than did the denture bases created with the conventional procedure (P<.05). CONCLUSION The CAD/CAM procedure for fabricating complete dentures showed promising potential for reducing the adherence of Candida to the denture base surface. Clinical Implications. Complete dentures made with the CAD/CAM procedure might decrease the incidence of denture stomatitis compared with conventional dentures. PMID:29142649
8 CFR 204.306 - Classification as an immediate relative based on a Convention adoption.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Classification as an immediate relative based on a Convention adoption. 204.306 Section 204.306 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY IMMIGRATION REGULATIONS IMMIGRANT PETITIONS Intercountry Adoption of a Convention Adoptee § 204...
Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard
2017-01-01
Human epidermal growth factor receptor 2 gene- (HER2-) targeted therapy for breast cancer relies primarily on HER2 overexpression established by immunohistochemistry (IHC) with borderline cases being further tested for amplification by fluorescence in situ hybridization (FISH). Manual interpretation of HER2 FISH is based on a limited number of cells and rather complex definitions of equivocal, polysomic, and genetically heterogeneous (GH) cases. Image analysis (IA) can extract high-capacity data and potentially improve HER2 testing in borderline cases. We investigated statistically derived indicators of HER2 heterogeneity in HER2 FISH data obtained by automated IA of 50 IHC borderline (2+) cases of invasive ductal breast carcinoma. Overall, IA significantly underestimated the conventional HER2, CEP17 counts, and HER2/CEP17 ratio; however, it collected more amplified cells in some cases below the lower limit of GH definition by manual procedure. Indicators for amplification, polysomy, and bimodality were extracted by factor analysis and allowed clustering of the tumors into amplified, nonamplified, and equivocal/polysomy categories. The bimodality indicator provided independent cell diversity characteristics for all clusters. Tumors classified as bimodal only partially coincided with the conventional GH heterogeneity category. We conclude that automated high-capacity nonselective tumor cell assay can generate evidence-based HER2 intratumor heterogeneity indicators to refine GH definitions. PMID:28752092
Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard; Laurinavicius, Arvydas
2017-01-01
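The amplification call underlying HER2 FISH scoring is based on the HER2/CEP17 signal ratio. A minimal sketch of that classification step follows; the 2.0 ratio cutoff reflects common ASCO/CAP-style guidance and is an assumption here, not a parameter reported in this study, and the input counts are hypothetical:

```python
def her2_fish_status(mean_her2, mean_cep17, ratio_cutoff=2.0):
    """Classify a HER2 FISH result from mean signal counts per cell.

    The 2.0 cutoff is illustrative (common guideline value), not taken
    from the study above.
    """
    ratio = mean_her2 / mean_cep17
    status = "amplified" if ratio >= ratio_cutoff else "non-amplified"
    return status, ratio

# Hypothetical per-cell mean counts from an image-analysis run
status, ratio = her2_fish_status(mean_her2=4.6, mean_cep17=2.0)
```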
The limits of direct satellite tracking with the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Bertiger, W. I.; Yunck, T. P.
1988-01-01
Recent advances in high-precision differential Global Positioning System-based satellite tracking can be applied to the more conventional direct tracking of low Earth satellites. To properly evaluate the limiting accuracy of direct GPS-based tracking, it is necessary to account for the correlations between the a priori errors in GPS states, Y-bias, and solar pressure parameters. These can be obtained by careful analysis of the GPS orbit determination process. The analysis indicates that sub-meter accuracy can be readily achieved for a user above 1000 km altitude, even when the user solution is obtained with data taken 12 hours after the data used in the GPS orbit solutions.
Fast focus estimation using frequency analysis in digital holography.
Oh, Seungtaik; Hwang, Chi-Young; Jeong, Il Kwon; Lee, Sung-Keun; Park, Jae-Hyeung
2014-11-17
A novel fast frequency-based method to estimate the focus distance of a digital hologram for a single object is proposed. The focus distance is computed by analyzing the distribution of intersections of smoothed rays. The smoothed rays are determined by the directions of energy flow, which are computed from the local spatial frequency spectrum based on the windowed Fourier transform. Our method therefore uses only the intrinsic frequency information of the optical field on the hologram and does not require sequential numerical reconstructions or the focus detection techniques of conventional photography, both of which are essential parts of previous methods. To show the effectiveness of our method, numerical results and analysis are presented as well.
Impact of satellite-based data on FGGE general circulation statistics
NASA Technical Reports Server (NTRS)
Salstein, David A.; Rosen, Richard D.; Baker, Wayman E.; Kalnay, Eugenia
1987-01-01
The NASA Goddard Laboratory for Atmospheres (GLA) analysis/forecast system was run in two different parallel modes in order to evaluate the influence that data from satellites and other FGGE observation platforms can have on analyses of large scale circulation; in the first mode, data from all observation systems were used, while in the second only conventional upper air and surface reports were used. The GLA model was also integrated for the same period without insertion of any data; an independent objective analysis based only on rawinsonde and pilot balloon data is also performed. A small decrease in the vigor of the general circulation is noted to follow from the inclusion of satellite observations.
Single-energy intensity modulated proton therapy
NASA Astrophysics Data System (ADS)
Farace, Paolo; Righetto, Roberto; Cianchetti, Marco
2015-09-01
In this note, an intensity modulated proton therapy (IMPT) technique based on the use of high single-energy (SE-IMPT) pencil beams is described. The method uses only the highest system energy (226 MeV) and only lateral penumbra to produce dose gradient, as in photon therapy. In the study, after a preliminary analysis of the width of proton pencil beam penumbras at different depths, SE-IMPT was compared with conventional IMPT in a phantom containing titanium inserts and in a patient affected by a spinal chordoma with fixation rods. It was shown that SE-IMPT has the potential to produce a sharp dose gradient and that it is not affected by the uncertainties produced by metal implants crossed by the proton beams. Moreover, in the chordoma patient, target coverage and organ-at-risk sparing of the SE-IMPT plan were comparable to those of the less reliable conventional IMPT technique. Robustness analysis confirmed that SE-IMPT was not affected by range errors, which can drastically affect the IMPT plan. When accepting a low-dose spread as in modern photon techniques, SE-IMPT could be an option for the treatment of lesions (e.g. cervical bone tumours) where a steep dose gradient could improve curability and where range uncertainty, due for example to the presence of metal implants, hampers conventional IMPT.
Sobral, Guilherme Caiado; Vedovello, Mário; Degan, Viviane Veroni; Santamaria, Milton
2014-01-01
OBJECTIVE: By means of a photoelastic model, this study analyzed the stress caused on conventional and self-ligating brackets by expanded arch wires. METHOD: Standard brackets were adhered to artificial teeth and a photoelastic model was prepared using the Interlandi 19/12 diagram as a base. Successive activations were made with 0.014-in and 0.018-in round cross-section nickel-titanium (NiTi) wires and 0.019 x 0.025-in rectangular stainless steel wires, all of which were made on the 22/14 Interlandi diagram. The model was observed on a plane polariscope - in a dark-field configuration - and photographed at each exchange of wire. The brackets were then replaced by self-ligating brackets and the process was repeated. The analysis was qualitative and recorded the location and pattern of stress in both models. CONCLUSIONS: Results identified greater stress in the region of the apex of the premolars in both models. Upon comparing stress between the models, a greater amount of stress was found in the model with conventional brackets for all wires tested. Therefore, the present pilot study revealed that alignment wires in self-ligating brackets produced lower stress in periodontal tissues under expansive mechanics. PMID:25715719
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nafisi, Kourosh; Ranau, Werner; Hemminger, John C.
2001-01-01
We present a new ultrahigh vacuum (UHV) chamber for surface analysis and microscopy at controlled, variable temperatures. The new instrument allows surface analysis with Auger electron spectroscopy, low energy electron diffraction, a quadrupole mass spectrometer, an argon ion sputtering gun, and a variable temperature scanning tunneling microscope (VT-STM). In this system, we introduce a novel procedure for transferring a sample off a conventional UHV manipulator and onto a scanning tunneling microscope in the conventional "beetle" geometry, without disconnecting the heating or thermocouple wires. The microscope, a modified version of the Besocke beetle microscope, is mounted on a 2.75 in. outer diameter UHV flange and is directly attached to the base of the chamber. The sample is attached to a tripod sample holder that is held by the main manipulator. Under UHV conditions the tripod sample holder can be removed from the main manipulator and placed onto the STM. The VT-STM is capable of acquiring images in the temperature range of 180-500 K. The performance of the chamber is demonstrated here by producing an ordered array of island vacancy defects on a Pt(111) surface and obtaining STM images of these defects.
Single-energy intensity modulated proton therapy.
Farace, Paolo; Righetto, Roberto; Cianchetti, Marco
2015-10-07
de Almeida, Sandro Marco Steanini; Franca, Fabiana Mantovani Gomes; Florio, Flavia Martao; Ambrosano, Glaucia Maria Bovi; Basting, Roberta Tarkany
2013-07-01
Chemomechanical caries removal, when compared with removal using conventional rotary instruments, seems to preserve healthy tooth structure with less trauma to the patient. This study performed in vivo analysis of the total number of microorganisms in dentin after the use of conventional or chemomechanical (papain gel) caries removal methods. Analyses were performed before caries removal (baseline), immediately after caries removal, and 45 days after caries removal and temporary cavity sealing. Sixty patients were selected for this study, each with two mandibular molars (one on each side) with occlusal caries of moderate depth, for a total of 120 teeth. For each patient, the carious lesion of one tooth was removed by conventional methods using low speed drills (Group 1). For the other tooth, a chemomechanical method was used (Group 2). Dentin samples were collected at the three intervals and subjected to microbiological culture in blood agar. For the total number of microorganisms in both groups, ANOVA and Tukey tests (which considered the baseline values as a covariable) showed a higher microbial count immediately after the preparation of the cavity compared to the count at 45 days (P < 0.05). For both groups, the total count of microorganisms in dentin decreased 45 days after placing the temporary cavity sealing.
Walter Reed Army Medical Center's Internet-based electronic health portal.
Abbott, Kevin C; Boocks, Carl E; Sun, Zhengyi; Boal, Thomas R; Poropatich, Ronald K
2003-12-01
Use of the World Wide Web (WWW) and electronic media to facilitate medical care has been the subject of many reports in the popular press. However, few reports have documented the results of implementing electronic health portals for essential medical tasks, such as prescription refills and appointments. At Walter Reed Army Medical Center, "Search & Learn" medical information, Internet-based prescription refills and patient appointments were established in January 2001. A multiphase retrospective analysis was conducted to determine the use of the "Search & Learn" medical information and the relative number of prescription refills and appointments conducted via the WWW compared with conventional methods. From January 2001 to May 2002, there were 34,741 refills and 819 appointments made over the Internet compared with 2,275,112 refills and approximately 500,000 appointments made conventionally. WWW activity accounted for 1.52% of refills and 0.16% of appointments. There was a steady increase in this percentage over the time of the analysis. In April of 2002, the monthly average of online refills had risen to 4.57% and online appointments were at 0.27%. Online refills were projected to account for 10% of all prescriptions in 2 years. The "Search & Learn" medical information portion of our web site received 147,429 unique visits during this same time frame, which was an average of 326 visitors per day. WWW-based methods of conducting essential medical tasks accounted for a small but rapidly increasing percentage of total activity at Walter Reed Army Medical Center. Subsequent phases of analysis will assess demographic and geographic factors and aid in the design of future systems to increase use of the Internet-based systems.
Comparing the effectiveness of laser vs. conventional endoforehead lifting.
Chang, Cheng-Jen; Yu, De-Yi; Chang, Shu-Ying; Hsiao, Yen-Chang
2018-04-01
The objective of this study was to compare the efficacy and safety of laser versus conventional endoforehead lifting. Over a period of 12 years (January 2000-January 2012), a total of 110 patients with hyperactive muscles over the frontal region were collected for a retrospective study. The SurgiLase 150XJ CO2 laser system, in conjunction with the flexible FIBERLASE, was used. The endoscope was 4 mm in diameter with an angle of 30°. The primary efficacy measurement was the assessment of the final outcome of laser vs. conventional methods. Both groups were observed at three weeks, six weeks and six months after surgery. The most common complication in early convalescence (three weeks) was swelling, followed by local paraesthesia, ecchymosis, localized hematomas and scarring with alopecia. All these problems disappeared completely after the 6-month study period. Based on a chi-square analysis, there were clinically and statistically significant differences favouring laser endoforehead surgery in operative time and in early and late complications. All patients achieved significant improvement in the final outcome after both laser and conventional endoforehead surgery. However, the early and late complication rates showed a greater difference in favour of the laser group.
Direct ultrafiltration performance and membrane integrity monitoring by microbiological analysis.
Ferrer, O; Casas, S; Galvañ, C; Lucena, F; Bosch, A; Galofré, B; Mesa, J; Jofre, J; Bernat, X
2015-10-15
The feasibility of substituting a conventional pre-treatment, consisting of dioxi-chlorination, coagulation/flocculation, settling and sand filtration, of a drinking water treatment plant (DWTP) with direct ultrafiltration (UF) has been assessed from a microbiological standpoint. Bacterial indicators, viral indicators and human viruses were monitored in raw river, ultrafiltered and conventionally pre-treated water samples over two years. Direct UF has proven to remove bacterial indicators quite efficiently and to a greater extent than the conventional process does. Nevertheless, the removal of small viruses such as some small bacteriophages and human viruses (e.g. enteroviruses and noroviruses) is lower than with the current conventional pre-treatment. Membrane integrity was assessed over two years by means of tailored tests based on bacteriophages with different properties (MS-2, GA and PDR-1) and bacterial spores (Bacillus spores). Membrane integrity was not compromised despite the challenging conditions faced by directly treating raw river water. Bacteriophage PDR-1 appears to be a suitable microbe for testing membrane integrity, as its size is slightly larger than the membrane pore size considered. However, its implementation at full-scale plants is still challenging owing to the difficulty of obtaining enough phages for seeding. Copyright © 2015 Elsevier Ltd. All rights reserved.
Gasquoine, Philip Gerard; Gonzalez, Cassandra Dayanira
2012-05-01
Conventional neuropsychological norms developed for monolinguals likely overestimate normal performance in bilinguals on language but not visual-perceptual format tests. This was studied by comparing neuropsychological false-positive rates using the 50th percentile of conventional norms and individual comparison standards (Picture Vocabulary or Matrix Reasoning scores) as estimates of preexisting neuropsychological skill level against the number expected from the normal distribution for a consecutive sample of 56 neurologically intact, bilingual, Hispanic Americans. Participants were tested in separate sessions in Spanish and English in the counterbalanced order on La Bateria Neuropsicologica and the original English language tests on which this battery was based. For language format measures, repeated-measures multivariate analysis of variance showed that individual estimates of preexisting skill level in English generated the mean number of false positives most approximate to that expected from the normal distribution, whereas the 50th percentile of conventional English language norms did the same for visual-perceptual format measures. When using conventional Spanish or English monolingual norms for language format neuropsychological measures with bilingual Hispanic Americans, individual estimates of preexisting skill level are recommended over the 50th percentile.
Bacheler, N.M.; Buckel, J.A.; Hightower, J.E.; Paramore, L.M.; Pollock, K.H.
2009-01-01
A joint analysis of tag return and telemetry data should improve estimates of mortality rates for exploited fishes; however, the combined approach has thus far only been tested in terrestrial systems. We tagged subadult red drum (Sciaenops ocellatus) with conventional tags and ultrasonic transmitters over 3 years in coastal North Carolina, USA, to test the efficacy of the combined telemetry - tag return approach. There was a strong seasonal pattern to monthly fishing mortality rate (F) estimates from both conventional and telemetry tags; highest F values occurred in fall months and lowest levels occurred during winter. Although monthly F values were similar in pattern and magnitude between conventional tagging and telemetry, information on F in the combined model came primarily from conventional tags. The estimated natural mortality rate (M) in the combined model was low (estimated annual rate ± standard error: 0.04 ± 0.04) and was based primarily upon the telemetry approach. Using high-reward tagging, we estimated different tag reporting rates for state agency and university tagging programs. The combined telemetry - tag return approach can be an effective approach for estimating F and M as long as several key assumptions of the model are met.
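The instantaneous rates F and M estimated in such tag-return models relate to annual survival through the standard exponential mortality model. A minimal sketch follows; only M = 0.04 per year comes from the abstract, and the F value is hypothetical:

```python
import math

def annual_survival(F, M):
    """Annual survival under constant instantaneous fishing (F) and
    natural (M) mortality rates: S = exp(-(F + M))."""
    return math.exp(-(F + M))

# M = 0.04/yr as estimated above; F = 0.5/yr is a hypothetical value
S = annual_survival(F=0.5, M=0.04)
```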
Arnuntasupakul, Vanlapa; Van Zundert, Tom C R V; Vijitpavan, Amorn; Aliste, Julian; Engsusophon, Phatthanaphol; Leurcharusmee, Prangmalee; Ah-Kye, Sonia; Finlayson, Roderick J; Tran, De Q H
2016-01-01
Epidural waveform analysis (EWA) provides a simple confirmatory adjunct for loss of resistance (LOR): when the needle tip is correctly positioned inside the epidural space, pressure measurement results in a pulsatile waveform. In this randomized trial, we compared conventional and EWA-confirmed LOR in 2 teaching centers. Our research hypothesis was that EWA-confirmed LOR would decrease the failure rate of thoracic epidural blocks. One hundred patients undergoing thoracic epidural blocks for thoracic surgery, abdominal surgery, or rib fractures were randomized to conventional LOR or EWA-LOR. The operator was allowed as many attempts as necessary to achieve a satisfactory LOR (by feel) in the conventional group. In the EWA-LOR group, LOR was confirmed by connecting the epidural needle to a pressure transducer using a rigid extension tubing. Positive waveforms indicated that the needle tip was positioned inside the epidural space. The operator was allowed a maximum of 3 different intervertebral levels to obtain a positive waveform. If waveforms were still absent at the third level, the operator simply accepted LOR as the technical end point. However, the patient was retained in the EWA-LOR group (intent-to-treat analysis). After achieving a satisfactory tactile LOR (conventional group), positive waveforms (EWA-LOR group), or a third intervertebral level with LOR but no waveform (EWA-LOR group), the operator administered a 4-mL test dose of lidocaine 2% with epinephrine 5 μg/mL. Fifteen minutes after the test dose, a blinded investigator assessed the patient for sensory block to ice. Compared with LOR, EWA-LOR resulted in a lower rate of primary failure (2% vs 24%; P = 0.002). Subgroup analysis based on experience level reveals that EWA-LOR outperformed conventional LOR for novice (P = 0.001) but not expert operators. The performance time was longer in the EWA-LOR group (11.2 ± 6.2 vs 8.0 ± 4.6 minutes; P = 0.006). 
Both groups were comparable in terms of operator's level of expertise, depth of the epidural space, approach, and LOR medium. In the EWA-LOR group, operators obtained a pulsatile waveform with the first level attempted in 60% of patients. However, 40% of subjects required performance at a second or third level. Compared with its conventional counterpart, EWA-confirmed LOR results in a lower failure rate for thoracic epidural blocks (2% vs 24%) in our teaching centers. Confirmatory EWA provides significant benefits for inexperienced operators.
Synthesis of Monodisperse Chitosan Nanoparticles and in Situ Drug Loading Using Active Microreactor.
Kamat, Vivek; Marathe, Ila; Ghormade, Vandana; Bodas, Dhananjay; Paknikar, Kishore
2015-10-21
Chitosan nanoparticles are promising drug delivery vehicles. However, the conventional method of unregulated mixing during ionic gelation limits their application because of heterogeneity in size and physicochemical properties. Therefore, detailed theoretical simulations of conventional and active microreactor models were performed. This led to the design and fabrication of a polydimethylsiloxane microreactor with magnetic microneedles for the synthesis of monodisperse chitosan nanoparticles. Chitosan nanoparticles synthesized conventionally, using 0.5 mg/mL chitosan, were 250 ± 27 nm with a +29.8 ± 8 mV charge. Using similar parameters, the microreactor yielded smaller particles (154 ± 20 nm) at an optimized flow rate of 400 μL/min. Further optimization at 0.4 mg/mL chitosan concentration yielded particles (130 ± 9 nm) with higher charge (+39.8 ± 5 mV). The well-controlled microreactor-based mixing generated highly monodisperse particles with tunable properties, including antifungal drug entrapment (80%), release rate, and effective activity (MIC, 1 μg/mL) against Candida.
Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing
2018-01-01
Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescence, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric and thermochemolysis analysis, are emphasized for their application to bio-stability assessment in recent years. Their principles, pros and cons are critically discussed. These advanced techniques are found to be convenient in sample preparation and to supply diversified information. However, their viability as indicators of bio-stability ultimately depends on establishing their relationship with the conventional methods, especially those based on biotic response. Furthermore, some misuses in data interpretation should be noted. Copyright © 2017 Elsevier Ltd. All rights reserved.
Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.
2018-01-01
Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. PMID:28787673
Gao, Boyan; Qin, Fang; Ding, Tingting; Chen, Yineng; Lu, Weiying; Yu, Liangli Lucy
2014-08-13
Ultraperformance liquid chromatography mass spectrometry (UPLC-MS), flow injection mass spectrometry (FIMS), and headspace gas chromatography (headspace-GC) combined with multivariate data analysis techniques were examined and compared for differentiating organically grown oregano from that grown conventionally. This is the first report of headspace-GC fingerprinting being used to differentiate organically and conventionally grown spice samples. The results indicated that UPLC-MS, FIMS, and headspace-GC-FID fingerprints with OPLS-DA were able to effectively distinguish oregano grown under different conditions, whereas with PCA, only the FIMS fingerprint could differentiate the organically and conventionally grown oregano samples. UPLC fingerprinting provided detailed information about the chemical composition of oregano with a longer analysis time, whereas FIMS completed a sample analysis within 1 min. On the other hand, headspace GC-FID fingerprinting required no sample pretreatment, suggesting its potential as a high-throughput method for distinguishing organically and conventionally grown oregano samples. In addition, chemical components in oregano were identified by molecular weight using QTOF-MS and headspace-GC-MS.
Wu, Rongli; Watanabe, Yoshiyuki; Arisawa, Atsuko; Takahashi, Hiroto; Tanaka, Hisashi; Fujimoto, Yasunori; Watabe, Tadashi; Isohashi, Kayako; Hatazawa, Jun; Tomiyama, Noriyuki
2017-10-01
This study aimed to compare tumor volume definition using conventional magnetic resonance (MR) and 11C-methionine positron emission tomography (MET/PET) images for differentiating pre-operative glioma grade by whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) maps. Thirty-four patients with histopathologically proven primary brain low-grade gliomas (n = 15) and high-grade gliomas (n = 19) underwent pre-operative or pre-biopsy MET/PET, fluid-attenuated inversion recovery, dynamic susceptibility contrast perfusion-weighted magnetic resonance imaging, and contrast-enhanced T1-weighted imaging at 3.0 T. The histogram distribution derived from the nCBV maps was obtained by co-registering the whole tumor volume delineated on conventional MR or MET/PET images, and eight histogram parameters were assessed. The mean nCBV value had the highest AUC value (0.906) based on MET/PET images. Diagnostic accuracy improved significantly when the tumor volume was measured from MET/PET images rather than from conventional MR images for the mean, 50th, and 75th percentile nCBV values (p = 0.0246, 0.0223, and 0.0150, respectively). Whole-tumor histogram analysis of the CBV map provides more valuable histogram parameters and increases diagnostic accuracy in the differentiation of pre-operative cerebral gliomas when the tumor volume is derived from MET/PET images.
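The whole-tumor histogram parameters highlighted here (mean, 50th, and 75th percentile of nCBV) are straightforward to compute once the tumor voxels are delineated. A minimal sketch with toy voxel values, not data from the study:

```python
import numpy as np

def ncbv_histogram_parameters(ncbv_values):
    """Whole-tumor histogram parameters of a normalized CBV map:
    the mean and selected percentiles over all tumor voxels."""
    v = np.asarray(ncbv_values, dtype=float).ravel()
    return {
        "mean": float(v.mean()),
        "p50": float(np.percentile(v, 50)),
        "p75": float(np.percentile(v, 75)),
    }

# Toy nCBV values for voxels inside a delineated tumor volume
params = ncbv_histogram_parameters([1.0, 2.0, 2.0, 3.0, 4.0])
```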
You, Joyce H S; Lui, Grace; Kam, Kai Man; Lee, Nelson L S
2015-04-01
We examined, from a Hong Kong healthcare providers' perspective, the cost-effectiveness of rapid diagnosis with Xpert in patients hospitalized for suspected active pulmonary tuberculosis (PTB). A decision tree was designed to simulate outcomes of three diagnostic assessment strategies in adult patients hospitalized for suspected active PTB: a conventional approach, sputum smear plus Xpert for acid-fast bacilli (AFB) smear-negative cases, and a single sputum Xpert test. Model inputs were derived from the literature. Outcome measures were direct medical cost, one-year mortality rate, quality-adjusted life-years (QALYs) and incremental cost per QALY (ICER). In the base-case analysis, Xpert was more effective, with higher QALYs gained and a lower mortality rate when compared with smear plus Xpert, at an ICER of USD 99. A conventional diagnostic approach was the least preferred option, with the highest cost, lowest QALYs gained and highest mortality rate. Sensitivity analysis showed that Xpert would be the most cost-effective option if the sensitivity of sputum AFB smear microscopy was ≤74%. The probabilities of Xpert, smear plus Xpert and a conventional approach being cost-effective were 94.5%, 5.5% and 0%, respectively, in 10,000 Monte Carlo simulations. The Xpert sputum test appears to be a highly cost-effective diagnostic strategy for patients with suspected active PTB in an intermediate-burden area like Hong Kong. Copyright © 2015 The British Infection Association. Published by Elsevier Ltd. All rights reserved.
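The ICER used in this kind of analysis is simply the ratio of incremental cost to incremental QALYs between two strategies. A minimal sketch; the cost and QALY values below are hypothetical, chosen only to reproduce an ICER of USD 99 like the one reported:

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of strategy A vs. B:
    extra cost per extra QALY gained."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Hypothetical strategy outcomes (illustrative values only)
xpert            = {"cost": 1200.0, "qaly": 12.10}
smear_plus_xpert = {"cost": 1101.0, "qaly": 11.10}

ratio = icer(xpert["cost"], xpert["qaly"],
             smear_plus_xpert["cost"], smear_plus_xpert["qaly"])
```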
Enhancement of automated blood flow estimates (ENABLE) from arterial spin-labeled MRI.
Shirzadi, Zahra; Stefanovic, Bojana; Chappell, Michael A; Ramirez, Joel; Schwindt, Graeme; Masellis, Mario; Black, Sandra E; MacIntosh, Bradley J
2018-03-01
To validate a multiparametric automated algorithm, ENhancement of Automated Blood fLow Estimates (ENABLE), that identifies useful and poor arterial spin-labeled (ASL) difference images in multiple postlabeling delay (PLD) acquisitions and thereby improves clinical ASL. ENABLE is a sort/check algorithm that uses a linear combination of ASL quality features. ENABLE uses simulations to determine quality weighting factors based on an unconstrained nonlinear optimization. We acquired a set of 6-PLD ASL images with 1.5T or 3.0T systems among 98 healthy elderly and adults with mild cognitive impairment or dementia. We contrasted the signal-to-noise ratio (SNR) of cerebral blood flow (CBF) images obtained with ENABLE vs. conventional ASL analysis. In a subgroup, we validated our CBF estimates with single-photon emission computed tomography (SPECT) CBF images. ENABLE produced significantly increased SNR compared to a conventional ASL analysis (Wilcoxon signed-rank test, P < 0.0001). We also found the similarity between ASL and SPECT was greater when using ENABLE vs. conventional ASL analysis (n = 51, Wilcoxon signed-rank test, P < 0.0001), and this similarity was strongly related to ASL SNR (t = 24, P < 0.0001). These findings suggest that ENABLE improves CBF image quality from multiple PLD ASL in dementia cohorts at either 1.5T or 3.0T, achieved by multiparametric quality features that guided postprocessing of dementia ASL. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:647-655. © 2017 International Society for Magnetic Resonance in Medicine.
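The sort/check idea above, scoring each ASL difference image by a linear combination of quality features and keeping only the useful ones, can be sketched as follows. The feature names, weights, and threshold are illustrative assumptions; ENABLE derives its actual weights from simulation-based optimization.

```python
import numpy as np

def quality_score(features, weights):
    """Linear combination of quality features (higher = more useful image)."""
    return float(np.dot(features, weights))

def select_useful(per_image_features, weights, threshold):
    """Boolean mask of difference images whose score meets the threshold."""
    scores = np.array([quality_score(f, weights) for f in per_image_features])
    return scores >= threshold

# hypothetical features per difference image: [tSNR, similarity-to-mean, -motion]
feats = np.array([[5.0, 0.9, -0.1],
                  [1.0, 0.3, -0.8],   # noisy, motion-corrupted image
                  [4.0, 0.8, -0.2]])
w = np.array([0.5, 2.0, 1.0])         # illustrative weights, not ENABLE's
keep = select_useful(feats, w, threshold=3.0)
# → [True, False, True]: the poor middle image is discarded
```

The retained images would then be averaged (per PLD) before CBF model fitting, which is where the SNR gain over conventional analysis comes from.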
Complex network analysis of conventional and Islamic stock market in Indonesia
NASA Astrophysics Data System (ADS)
Rahmadhani, Andri; Purqon, Acep; Kim, Sehyun; Kim, Soo Yong
2015-09-01
The rising popularity of Islamic financial products in Indonesia has made them an interesting new topic for analysis. We introduce a complex network analysis to compare the conventional and Islamic stock markets in Indonesia, with Random Matrix Theory (RMT) added as a complementary reference to extend the analysis of the results. Both approaches are based on the cross-correlation matrix of logarithmic price returns, constructed from closing price data taken from June 2011 to July 2012. We also introduce a threshold value, chosen by a winner-take-all approach, to obtain the scale-free property of the network: nodes whose cross-correlation coefficient falls below the threshold are not connected by an edge. As a result, we obtain 0.5 as the threshold value for both stock markets. From the RMT analysis, we found only a market-wide effect in both stock markets; no clustering effect has been found yet. From the network analysis, both stock market networks are dominated by the mining sector. The closing price time series must be extended to obtain more valuable results and possibly reveal different behaviors of the system.
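The construction described above, cross-correlations of logarithmic returns thresholded winner-take-all into an adjacency matrix, can be sketched as follows; the toy price series are invented for illustration.

```python
import numpy as np

def correlation_network(prices, threshold=0.5):
    """Adjacency matrix from cross-correlations of logarithmic price returns.

    prices: (T, N) array of closing prices for N stocks over T days.
    An edge connects stocks i and j when corr(r_i, r_j) >= threshold
    (the winner-take-all approach), excluding self-loops.
    """
    returns = np.diff(np.log(prices), axis=0)   # logarithmic price returns
    corr = np.corrcoef(returns, rowvar=False)   # N x N cross-correlation matrix
    adj = (corr >= threshold) & ~np.eye(corr.shape[0], dtype=bool)
    return adj

# toy example: three series, the first two perfectly co-moving
t = np.arange(1, 101, dtype=float)
base = 100 + np.cumsum(np.sin(t))
prices = np.column_stack([base, base * 1.01, 100 + np.cumsum(np.cos(t))])
adj = correlation_network(prices, threshold=0.5)
```

Degree distributions and sector dominance (e.g. mining, as found above) can then be read off the resulting adjacency matrix with standard graph tools.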
Gold nanoparticle-based optical microfluidic sensors for analysis of environmental pollutants.
Lafleur, Josiane P; Senkbeil, Silja; Jensen, Thomas G; Kutter, Jörg P
2012-11-21
Conventional methods of environmental analysis can be significantly improved by the development of portable microscale technologies for direct in-field sensing at remote locations. This report demonstrates the vast potential of gold nanoparticle-based microfluidic sensors for the rapid, in-field detection of two important classes of environmental contaminants - heavy metals and pesticides. Using gold nanoparticle-based microfluidic sensors linked to a simple digital camera as the detector, detection limits as low as 0.6 μg L⁻¹ and 16 μg L⁻¹ could be obtained for the heavy metal mercury and the dithiocarbamate pesticide ziram, respectively. These results demonstrate that the attractive optical properties of gold nanoparticle probes combine synergistically with the inherent qualities of microfluidic platforms to offer simple, portable and sensitive sensors for environmental contaminants.
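Detection limits like those quoted above are commonly estimated from the blank-signal noise and the calibration sensitivity via the 3σ criterion; the sketch below uses that common convention with made-up camera readings, since the report's own derivation is not given here.

```python
import statistics

def detection_limit(blank_signals, slope):
    """Limit of detection via the common 3*sigma/slope criterion.

    blank_signals: replicate detector readings of a blank sample;
    slope: calibration sensitivity (signal units per ug/L of analyte).
    Illustrative only -- not the paper's stated method.
    """
    return 3 * statistics.stdev(blank_signals) / slope

# hypothetical camera intensity readings for blank microfluidic runs
lod_ug_per_L = detection_limit([10.1, 10.3, 9.9, 10.2, 10.0], slope=0.8)
```

A shallower calibration slope (lower sensitivity) or noisier blanks both raise the detection limit, which is why the colorimetric response of the gold nanoparticle probes matters so much.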
Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems
NASA Technical Reports Server (NTRS)
Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.
1992-01-01
The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
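One way to picture the probabilistic matrices mentioned above: replace the 0/1 entries of a conventional check matrix with detection probabilities, and derive fault-coverage estimates from them. The sketch below assumes independent checks, a simplifying assumption for illustration, not the paper's full model.

```python
import numpy as np

def detection_probability(check_matrix, fault):
    """Probability that at least one on-line check flags a given fault.

    check_matrix[i, j] is the probability that check i detects a fault
    in component j -- the probabilistic analogue of a 0/1 check matrix.
    Assumes checks fail to detect independently of one another.
    """
    p_miss = np.prod(1.0 - check_matrix[:, fault])   # all checks miss
    return 1.0 - p_miss

# hypothetical system: two checks over three components
C = np.array([[0.9, 0.0, 0.5],
              [0.0, 0.8, 0.5]])
p = detection_probability(C, fault=2)   # 1 - 0.5 * 0.5 = 0.75
```

Averaging such per-fault probabilities over a fault distribution gives an estimate of overall detection capability that is less pessimistic than the worst-case deterministic analysis.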
New preparation method of β″-alumina and application for AMTEC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nishi, Toshiro; Tsuru, Yasuhiko; Yamamoto, Hirokazu
1995-12-31
The Alkali Metal Thermo-Electric Converter (AMTEC) is an energy conversion system that converts heat to electrical energy with high efficiency. The β″-alumina solid electrolyte (BASE) is the most important component in the AMTEC system. In this paper, the relationship among the conduction property, the microstructure, and the amount of each chemical component of BASE is studied. Based on an analysis of the chemical reactions of each component, the authors established a new BASE preparation method in place of the conventional one. They also report the performance of an AMTEC cell using this electrolyte tube, on which a Mo or TiC electrode is deposited by the screen printing method. An electrochemical analysis and a heat-cycle test of the AMTEC cell are also presented.
NASA Astrophysics Data System (ADS)
Habas, Piotr A.; Kim, Kio; Chandramohan, Dharshan; Rousseau, Francois; Glenn, Orit A.; Studholme, Colin
2009-02-01
Recent advances in MR and image analysis allow for reconstruction of high-resolution 3D images from clinical in utero scans of the human fetal brain. Automated segmentation of tissue types from MR images (MRI) is a key step in the quantitative analysis of brain development. Conventional atlas-based methods for adult brain segmentation are limited in their ability to accurately delineate complex structures of developing tissues from fetal MRI. In this paper, we formulate a novel geometric representation of the fetal brain aimed at capturing the laminar structure of developing anatomy. The proposed model uses a depth-based encoding of tissue occurrence within the fetal brain and provides an additional anatomical constraint in a form of a laminar prior that can be incorporated into conventional atlas-based EM segmentation. Validation experiments are performed using clinical in utero scans of 5 fetal subjects at gestational ages ranging from 20.5 to 22.5 weeks. Experimental results are evaluated against reference manual segmentations and quantified in terms of Dice similarity coefficient (DSC). The study demonstrates that the use of laminar depth-encoded tissue priors improves both the overall accuracy and precision of fetal brain segmentation. Particular refinement is observed in regions of the parietal and occipital lobes where the DSC index is improved from 0.81 to 0.82 for cortical grey matter, from 0.71 to 0.73 for the germinal matrix, and from 0.81 to 0.87 for white matter.
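The Dice similarity coefficient (DSC) used above to quantify agreement with manual reference segmentations has a direct definition, twice the overlap divided by the total labeled volume. A minimal sketch with a toy binary example:

```python
import numpy as np

def dice(seg_a, seg_b):
    """Dice similarity coefficient between two binary segmentations:
    DSC = 2|A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1."""
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# toy 1D example: 5 voxels labeled in each segmentation, 4 overlapping
auto = np.array([1, 1, 1, 1, 1, 0])
manual = np.array([0, 1, 1, 1, 1, 1])
d = dice(auto, manual)   # 2*4 / (5+5) = 0.8
```

In the study, this measure is computed per tissue class (cortical grey matter, germinal matrix, white matter) between the automated EM segmentation and the manual reference.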