NASA Astrophysics Data System (ADS)
Yusa, Yasunori; Okada, Hiroshi; Yamada, Tomonori; Yoshimura, Shinobu
2018-04-01
A domain decomposition method for large-scale elastic-plastic problems is proposed. The proposed method is based on a quasi-Newton method in conjunction with a balancing domain decomposition preconditioner. The use of a quasi-Newton method overcomes two problems associated with the conventional domain decomposition method based on the Newton-Raphson method: (1) it avoids the double-loop iteration algorithm, which generally has large computational complexity, and (2) it accounts for the local concentration of nonlinear deformation that is observed in elastic-plastic problems with stress concentration. Moreover, the application of a balancing domain decomposition preconditioner ensures scalability. Using the conventional and proposed domain decomposition methods, several numerical tests, including weak scaling tests, were performed. The convergence performance of the proposed method is comparable to that of the conventional method. In particular, in elastic-plastic analysis, the proposed method exhibits better convergence performance than the conventional method.
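As a rough illustration of the single-loop structure described above (not the authors' implementation), the Python sketch below runs a Broyden-type quasi-Newton iteration on an invented nonlinear residual, with a simple Jacobi inverse standing in for the balancing domain decomposition preconditioner; every function, matrix, and parameter value here is an assumption for illustration.

```python
import numpy as np

# Hypothetical nonlinear residual R(u) = K u + alpha*u^3 - f, a stand-in for an
# assembled elastic-plastic finite-element residual (illustrative only).
def residual(u, K, f, alpha=0.01):
    return K @ u + alpha * u**3 - f

def preconditioned_quasi_newton(u, K, f, M_inv, tol=1e-10, max_iter=200):
    """Broyden ("good") quasi-Newton iteration. M_inv plays the role of the
    balancing domain decomposition preconditioner; here it is a crude Jacobi
    inverse, so this sketches the structure, not a scalable solver."""
    B_inv = M_inv.copy()                 # initial inverse-Jacobian approximation
    r = residual(u, K, f)
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        du = -B_inv @ r                  # single-loop step: no inner Krylov solve
        u = u + du
        r_new = residual(u, K, f)
        dr = r_new - r
        # Sherman-Morrison form of Broyden's rank-one update of the inverse
        B_inv += np.outer(du - B_inv @ dr, du @ B_inv) / (du @ B_inv @ dr)
        r = r_new
    return u

n = 20
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1D Laplacian stiffness
f = np.ones(n)
M_inv = np.diag(1.0 / np.diag(K))        # Jacobi stand-in for the BDD preconditioner
u = preconditioned_quasi_newton(np.zeros(n), K, f, M_inv)
print("final residual norm:", np.linalg.norm(residual(u, K, f)))
```

The point of the sketch is that each iteration applies the preconditioned update directly, avoiding the inner linear-solve loop of a conventional Newton-type domain decomposition solver.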
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakanishi, Hidetoshi, E-mail: nakanisi@screen.co.jp; Ito, Akira, E-mail: a.ito@screen.co.jp; Takayama, Kazuhisa, E-mail: takayama.k0123@gmail.com
2015-11-15
A laser terahertz emission microscope (LTEM) can be used for noncontact inspection to detect the waveforms of photoinduced terahertz emissions from material devices. In this study, we experimentally compared the performance of LTEM with conventional analysis methods, e.g., electroluminescence (EL), photoluminescence (PL), and laser beam induced current (LBIC), as an inspection method for solar cells. The results showed that LTEM was more sensitive to the characteristics of the depletion layer of the polycrystalline solar cell compared with EL, PL, and LBIC and that it could be used as a complementary tool to the conventional analysis methods for a solar cell.
Meta-analysis of Odds Ratios: Current Good Practices
Chang, Bei-Hung; Hoaglin, David C.
2016-01-01
Background Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more-reliable results. Methods We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results In the example, point estimates and confidence intervals for the overall log-odds-ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued. PMID:28169977
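For concreteness, this is what the conventional inverse-variance approach criticized above looks like in Python (the trial counts are hypothetical); the alternative methods the abstract advocates would instead work directly from the numbers of subjects and events.

```python
import numpy as np

# Events/totals in treatment and control arms of each trial (hypothetical numbers)
et = np.array([12, 5, 20]);  nt = np.array([100, 50, 150])
ec = np.array([18, 9, 30]);  nc = np.array([100, 50, 150])

# Conventional approach: per-trial log odds ratio and its large-sample (Woolf)
# variance, then an inverse-variance weighted average (fixed-effect model).
log_or = np.log((et * (nc - ec)) / (ec * (nt - et)))
var = 1/et + 1/(nt - et) + 1/ec + 1/(nc - ec)
w = 1 / var
pooled = np.sum(w * log_or) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
print(f"pooled OR = {np.exp(pooled):.3f}, "
      f"95% CI = ({np.exp(pooled - 1.96*se):.3f}, {np.exp(pooled + 1.96*se):.3f})")
```

The approximations the abstract warns about enter exactly here: the log-odds-ratio and its variance formula break down with small or zero cell counts.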
Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L
2016-02-10
Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure. Copyright © 2015 John Wiley & Sons, Ltd.
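The following sketch illustrates, under invented parameters, the mechanics of carrying multiple dose-vector realizations with shared and unshared errors through a dose-response fit; it is a caricature of the two-dimensional Monte Carlo setup, not the authors' Bayesian model averaging procedure.

```python
import numpy as np
rng = np.random.default_rng(0)

n_sub, n_real = 500, 200
true_dose = rng.gamma(2.0, 0.5, n_sub)           # hypothetical true doses (Gy)
beta = 0.8                                        # assumed excess-odds slope
p = np.clip(0.05 * (1 + beta * true_dose), 0, 1)
y = rng.binomial(1, p)                            # disease outcomes

# Multiple realizations of the dose vector: a shared (multiplicative, common to
# all subjects) error and unshared (per-subject) errors per realization
shared = rng.lognormal(0, 0.3, n_real)            # one draw per realization
unshared = rng.lognormal(0, 0.4, (n_real, n_sub))
dose_vectors = true_dose * shared[:, None] * unshared

def slope(d):
    """Linear-probability fit of outcome on dose (kept simple for the sketch)."""
    X = np.column_stack([np.ones_like(d), d])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Single best-estimate analysis vs. propagating all realizations
print("single-vector slope:", slope(dose_vectors[0]))
print("mean slope over realizations:", np.mean([slope(d) for d in dose_vectors]))
```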
Masood, Athar; Stark, Ken D; Salem, Norman
2005-10-01
Conventional sample preparation for fatty acid analysis is a complicated, multiple-step process, and gas chromatography (GC) analysis alone can require >1 h per sample to resolve fatty acid methyl esters (FAMEs). Fast GC analysis was adapted to human plasma FAME analysis using a modified polyethylene glycol column with smaller internal diameters, thinner stationary phase films, increased carrier gas linear velocity, and faster temperature ramping. Our results indicated that fast GC analyses were comparable to conventional GC in peak resolution. A conventional transesterification method based on Lepage and Roy was simplified to a one-step method with the elimination of the neutralization and centrifugation steps. A robotics-amenable method was also developed, with lower methylation temperatures and in an open-tube format using multiple reagent additions. The simplified methods produced results that were quantitatively similar and with similar coefficients of variation as compared with the original Lepage and Roy method. The present streamlined methodology is suitable for the direct fatty acid analysis of human plasma, is appropriate for research studies, and will facilitate large clinical trials and make possible population studies.
Tracking B-Cell Repertoires and Clonal Histories in Normal and Malignant Lymphocytes.
Weston-Bell, Nicola J; Cowan, Graeme; Sahota, Surinder S
2017-01-01
Methods for tracking B-cell repertoires and clonal history in normal and malignant B-cells based on immunoglobulin variable region (IGV) gene analysis have developed rapidly with the advent of massive parallel next-generation sequencing (mpNGS) protocols. mpNGS permits a depth of analysis of IGV genes not hitherto feasible, and presents challenges of bioinformatics analysis, which can be readily met by current pipelines. This strategy offers a potential resolution of B-cell usage at a depth that may capture fully the natural state, in a given biological setting. Conventional methods based on RT-PCR amplification and Sanger sequencing are also available where mpNGS is not accessible. Each method offers distinct advantages. Conventional methods for IGV gene sequencing are readily adaptable to most laboratories and provide an ease of analysis to capture salient features of B-cell use. This chapter describes two methods in detail for analysis of IGV genes, mpNGS and conventional RT-PCR with Sanger sequencing.
ERIC Educational Resources Information Center
Boden, Andrea; Archwamety, Teara; McFarland, Max
This review used meta-analytic techniques to integrate findings from 30 independent studies that compared programmed instruction to conventional methods of instruction at the secondary level. The meta-analysis demonstrated that programmed instruction resulted in higher achievement when compared to conventional methods of instruction (average…
Pongsachareonnont, Pear; Honglertnapakul, Worawalun; Chatsuwan, Tanittha
2017-02-21
Identification of bacterial pathogens in endophthalmitis is important to inform antibiotic selection and treatment decisions. Hemoculture bottles and polymerase chain reaction (PCR) analysis have been proposed to offer good detection sensitivity. This study compared the sensitivity and accuracy of a blood culture system, a PCR approach, and conventional culture methods for identification of causative bacteria in cases of acute endophthalmitis. Twenty-nine patients with a diagnosis of presumed acute bacterial endophthalmitis who underwent vitreous specimen collection at King Chulalongkorn Memorial Hospital were enrolled in this study. Forty-one specimens were collected. Each specimen was divided into three parts, and each part was analyzed using one of three microbial identification techniques: conventional plate culture, blood culture, and polymerase chain reaction and sequencing. The results of the three methods were then compared. Bacteria were identified in 15 of the 41 specimens (36.5%). Five (12.2%) specimens were positive by conventional culture methods, 11 (26.8%) were positive by hemoculture, and 11 (26.8%) were positive by PCR. Cohen's kappa analysis revealed p-values for conventional methods vs. hemoculture, conventional methods vs. PCR, and hemoculture vs. PCR of 0.057, 0.33, and 0.009, respectively. Higher detection rates of Enterococcus faecalis were observed for hemoculture and PCR than for conventional methods. Blood culture bottles and PCR detection may facilitate bacterial identification in cases of presumed acute endophthalmitis. These techniques should be used in addition to conventional plate culture methods because they provide a greater degree of sensitivity than conventional plate culture alone for the detection of specific microorganisms such as E. faecalis. Thai Clinical Trial Register No. TCTR20110000024.
Kawaguchi, Tomoya; Liu, Yihua; Reiter, Anthony; ...
2018-04-20
Here, a one-dimensional non-iterative direct method was employed for normalized crystal truncation rod analysis. The non-iterative approach, utilizing the Kramers–Kronig relation, avoids the ambiguities due to an improper initial model or incomplete convergence in the conventional iterative methods. The validity and limitations of the present method are demonstrated through both numerical simulations and experiments with Pt(111) in a 0.1 M CsF aqueous solution. The present method is compared with conventional iterative phase-retrieval methods.
Li, Xin; Xing, Pengfei; Du, Xinghong; Gao, Shuaibo; Chen, Chen
2017-09-01
In this paper, the ultrasound-assisted leaching of iron from boron carbide waste-scrap was investigated, and the optimization of different influencing factors was also performed. The factors investigated were acid concentration, liquid-solid ratio, leaching temperature, and ultrasonic power and frequency. The leaching of iron with the conventional method at various temperatures was also performed. The results show that the maximum iron leaching ratios are 87.4% for 80 min of leaching with the conventional method and 94.5% for 50 min of leaching with ultrasound assistance. The leaching of waste-scrap with the conventional method fits the chemical reaction-controlled model. The leaching with ultrasound assistance fits the chemical reaction-controlled model for the first stage and the diffusion-controlled model for the second stage. Compared with the conventional method, the assistance of ultrasound can greatly improve the iron leaching ratio, accelerate the leaching rate, shorten the leaching time and lower the residual iron. The advantages of ultrasound-assisted leaching were also confirmed by SEM-EDS analysis and elemental analysis of the raw material and leached solid samples. Copyright © 2017 Elsevier B.V. All rights reserved.
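As a sketch of how such rate-law fits are typically done (the data points and resulting rate constants below are invented, not the paper's measurements), one can linearize the standard shrinking-core forms against time:

```python
import numpy as np

t = np.array([10, 20, 30, 40, 50], float)        # leaching time, min (hypothetical)
x = np.array([0.30, 0.52, 0.68, 0.80, 0.88])     # iron leaching fraction (hypothetical)

# Shrinking-core models commonly used for acid leaching:
g_chem = 1 - (1 - x)**(1/3)                      # chemical-reaction control
g_diff = 1 - 3*(1 - x)**(2/3) + 2*(1 - x)        # product-layer diffusion control

for name, g in [("reaction", g_chem), ("diffusion", g_diff)]:
    k = np.sum(g * t) / np.sum(t * t)            # least-squares slope through origin
    r2 = 1 - np.sum((g - k*t)**2) / np.sum((g - g.mean())**2)
    print(f"{name}-controlled: k = {k:.4f} 1/min, R^2 = {r2:.3f}")
```

The model with the higher R^2 over a given stage is taken as rate-controlling for that stage, which is how the two-stage conclusion above would be reached.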
A Stirling engine analysis method based upon moving gas nodes
NASA Technical Reports Server (NTRS)
Martini, W. R.
1986-01-01
A Lagrangian nodal analysis method for Stirling engines (SEs) is described, validated, and applied to a conventional SE and an isothermalized SE (with fins in the hot and cold spaces). The analysis employs a constant-mass gas node (which moves with respect to the solid nodes during each time step) instead of the fixed gas nodes of Eulerian analysis. The isothermalized SE is found to have efficiency only slightly greater than that of a conventional SE.
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
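A minimal stand-in for an MED-style distance detector is sketched below; the water-quality parameters, the injected event, the window length and the alarm threshold are all invented for illustration.

```python
import numpy as np
rng = np.random.default_rng(1)

# Hypothetical multi-parameter water-quality series (pH, turbidity, chlorine)
baseline = np.array([7.2, 0.5, 1.0])
data = baseline + rng.normal(0, 0.02, (500, 3))
data[300:320] += np.array([0.3, 0.8, -0.4])      # injected "contamination" event

# MED-style detector: Euclidean distance of each sample from a moving
# background estimate; an alarm fires when the distance exceeds a threshold.
win, threshold = 50, 0.5
alarms = []
for i in range(win, len(data)):
    bg = data[i - win:i].mean(axis=0)
    if np.linalg.norm(data[i] - bg) > threshold:
        alarms.append(i)
print("first alarms at samples:", alarms[:5])
```

A gradual (non-spike) event would shift the background estimate along with the data, which is one intuition for why such detectors can miss real incidents.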
A Shot Number Based Approach to Performance Analysis in Table Tennis
Yoshida, Kazuto; Yamada, Koshi
2017-01-01
Abstract The current study proposes a novel approach that improves the conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, of each shot number. The improvements over the conventional method are as follows: better accuracy of the evaluation of skills and tactics of players, additional insights into scoring and returning skills and ease of understanding the results with a single criterion. The performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of the shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method. PMID:28210334
NAA For Human Serum Analysis: Comparison With Conventional Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Laura C.; Zamboni, Cibele B.; Medeiros, Jose A. G.
2010-08-04
Instrumental and Comparator methods of Neutron Activation Analysis (NAA) were applied to determine elements of clinical relevancy in serum samples of an adult population (Sao Paulo city, Brazil). A comparison with the conventional analyses, Colorimetric for calcium, Titrimetric for chlorine and Ion Specific Electrode for sodium and potassium determination, was also performed, permitting a discussion about the performance of NAA methods for clinical chemistry research.
Liu, Tao; Thibos, Larry; Marin, Gildas; Hernandez, Martha
2014-01-01
Conventional aberration analysis by a Shack-Hartmann aberrometer is based on the implicit assumption that an injected probe beam reflects from a single fundus layer. In fact, the biological fundus is a thick reflector and therefore conventional analysis may produce errors of unknown magnitude. We developed a novel computational method to investigate this potential failure of conventional analysis. The Shack-Hartmann wavefront sensor was simulated by computer software and used to recover by two methods the known wavefront aberrations expected from a population of normally-aberrated human eyes and bi-layer fundus reflection. The conventional method determines the centroid of each spot in the SH data image, from which wavefront slopes are computed for least-squares fitting with derivatives of Zernike polynomials. The novel 'global' method iteratively adjusted the aberration coefficients derived from conventional centroid analysis until the SH image, when treated as a unitary picture, optimally matched the original data image. Both methods recovered higher order aberrations accurately and precisely, but only the global algorithm correctly recovered the defocus coefficients associated with each layer of fundus reflection. The global algorithm accurately recovered Zernike coefficients for mean defocus and bi-layer separation with maximum error <0.1%. The global algorithm was robust for bi-layer separation up to 2 dioptres for a typical SH wavefront sensor design. For 100 randomly generated test wavefronts with 0.7 D axial separation, the retrieved mean axial separation was 0.70 D with standard deviations (S.D.) of 0.002 D. Sufficient information is contained in SH data images to measure the dioptric thickness of dual-layer fundus reflection. The global algorithm is superior since it successfully recovered the focus value associated with both fundus layers even when their separation was too small to produce clearly separated spots, while the conventional analysis misrepresents the defocus component of the wavefront aberration as the mean defocus for the two reflectors. Our novel global algorithm is a promising method for SH data image analysis in clinical and visual optics research for human and animal eyes. © 2013 The Authors Ophthalmic & Physiological Optics © 2013 The College of Optometrists.
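For orientation, the sketch below shows the conventional step the authors start from: least-squares fitting of Zernike derivative terms to measured wavefront slopes. Only two unnormalized terms (defocus and one astigmatism) are used and the slopes are synthesized, so this illustrates the fitting principle rather than the authors' global algorithm.

```python
import numpy as np
rng = np.random.default_rng(2)

# Random lenslet positions inside a unit pupil
n = 200
x, y = rng.uniform(-1, 1, (2, n))
m = x**2 + y**2 <= 1
x, y = x[m], y[m]

# d/dx and d/dy of [defocus 2(x^2+y^2)-1, astigmatism x^2-y^2] (unnormalized)
A = np.column_stack([np.concatenate([4*x, 4*y]),
                     np.concatenate([2*x, -2*y])])
true = np.array([0.5, -0.2])                      # coefficients to recover
slopes = A @ true + rng.normal(0, 0.01, A.shape[0])  # noisy centroid-derived slopes

coef, *_ = np.linalg.lstsq(A, slopes, rcond=None)
print(coef)   # should be close to [0.5, -0.2]
```

With a bi-layer fundus reflection, each spot is a superposition of two such slope fields, which is the situation the conventional single-layer fit misrepresents and the global image-matching algorithm is designed to resolve.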
Lee, Jung-Ju; Lee, Sang Kun; Choi, Jang Wuk; Kim, Dong-Wook; Park, Kyung Il; Kim, Bom Sahn; Kang, Hyejin; Lee, Dong Soo; Lee, Seo-Young; Kim, Sung Hun; Chung, Chun Kee; Nam, Hyeon Woo; Kim, Kwang Ki
2009-12-01
Ictal single-photon emission computed tomography (SPECT) is a valuable method for localizing the ictal onset zone in the presurgical evaluation of patients with intractable epilepsy. Conventional methods used to localize the ictal onset zone have problems with time lag from seizure onset to injection. To evaluate the clinical usefulness of a method that we developed, which involves an attachable automated injector (AAI), in reducing time lag and improving the ability to localize the zone of seizure onset. Patients admitted to the epilepsy monitoring unit (EMU) between January 1, 2003, and June 30, 2008, were included. The definition of ictal onset zone was made by comprehensive review of medical records, magnetic resonance imaging (MRI), data from video electroencephalography (EEG) monitoring, and invasive EEG monitoring if available. We comprehensively evaluated the time lag to injection and the image patterns of ictal SPECT using traditional visual analysis, statistical parametric mapping-assisted analysis, and subtraction ictal SPECT coregistered to MRI-assisted analysis. Image patterns were classified as localizing, lateralizing, and nonlateralizing. A total of 99 patients were included: 48 in the conventional group and 51 in the AAI group. The mean (SD) delay time to injection from seizure onset was 12.4 ± 12.0 s in the group injected by our AAI method and 40.4 ± 26.3 s in the group injected by the conventional method (P=0.000). The mean delay time to injection from seizure detection was 3.2 ± 2.5 s in the group injected by the AAI method and 21.4 ± 9.7 s in the group injected by the conventional method (P=0.000). The AAI method was superior to the conventional method in localizing the area of seizure onset (36 out of 51 with the AAI method vs. 21 out of 48 with the conventional method, P=0.009), especially in non-temporal lobe epilepsy (non-TLE) patients (17 out of 27 with the AAI method vs. 3 out of 13 with the conventional method, P=0.041), and in lateralizing the seizure onset hemisphere (47 out of 51 with the AAI method vs. 33 out of 48 with the conventional method, P=0.004). The AAI method was superior to the conventional method in reducing the time lag of tracer injection and in localizing and lateralizing the ictal onset zone, especially in patients with non-TLE.
Scanning Electron Microscope-Cathodoluminescence Analysis of Rare-Earth Elements in Magnets.
Imashuku, Susumu; Wagatsuma, Kazuaki; Kawai, Jun
2016-02-01
Scanning electron microscope-cathodoluminescence (SEM-CL) analysis was performed for neodymium-iron-boron (NdFeB) and samarium-cobalt (Sm-Co) magnets to analyze the rare-earth elements present in the magnets. We examined the advantages of SEM-CL analysis over conventional analytical methods such as SEM-energy-dispersive X-ray (EDX) spectroscopy and SEM-wavelength-dispersive X-ray (WDX) spectroscopy for elemental analysis of rare-earth elements in NdFeB magnets. Luminescence spectra of chloride compounds of elements in the magnets were measured by the SEM-CL method. Chloride compounds were obtained by the dropwise addition of hydrochloric acid on the magnets followed by drying in vacuum. Neodymium, praseodymium, terbium, and dysprosium were separately detected in the NdFeB magnets, and samarium was detected in the Sm-Co magnet by the SEM-CL method. In contrast, it was difficult to distinguish terbium and dysprosium in the NdFeB magnet with a dysprosium concentration of 1.05 wt% by conventional SEM-EDX analysis. Terbium with a concentration of 0.02 wt% in an NdFeB magnet was detected by SEM-CL analysis, but not by conventional SEM-WDX analysis. SEM-CL analysis is advantageous over conventional SEM-EDX and SEM-WDX analyses for detecting trace rare-earth elements in NdFeB magnets, particularly dysprosium and terbium.
NASA Astrophysics Data System (ADS)
Faizah Bawadi, Nor; Anuar, Shamilah; Rahim, Mustaqqim A.; Mansor, A. Faizal
2018-03-01
A seismic method for determining the ultimate pile bearing capacity was proposed and compared with conventional methods. The Spectral Analysis of Surface Waves (SASW) method, a non-destructive seismic technique that requires no drilling and sampling of soils, was used to determine the shear wave velocity (Vs) and damping (D) profiles of the soil. The soil strength was found to be directly proportional to Vs, and its value has been successfully applied to obtain shallow bearing capacity empirically. A method is proposed in this study to determine the pile bearing capacity using Vs and D measurements for the design of piles, and also as an alternative method to verify the bearing capacity obtained from other conventional methods of evaluation. The objectives of this study are to determine the Vs and D profiles through frequency response data from SASW measurements and to compare pile bearing capacities obtained from the proposed method and conventional methods. All SASW test arrays were conducted near the borehole and the location of conventional pile load tests. In obtaining skin and end bearing pile resistance, the Hardin and Drnevich equation was used with reference strains obtained from the method proposed by Abbiss. Back analysis results of pile bearing capacities from SASW were found to be 18981 kN and 4947 kN, compared with 18014 kN and 4633 kN from IPLT, with differences of 5% and 6% for the Damansara and Kuala Lumpur test sites, respectively. The results of this study indicate that the seismic method proposed in this study has the potential to be used in estimating the pile bearing capacity.
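The Hardin and Drnevich relation referred to above is a hyperbolic modulus-degradation law, G = G_max / (1 + γ/γ_r), with G_max obtained from the shear wave velocity. A minimal numeric sketch, with assumed soil density, velocity and strains (not the study's site values), is:

```python
# Hardin-Drnevich-type degradation sketch: shear modulus from Vs, reduced by strain
rho, vs = 1800.0, 220.0          # soil density (kg/m^3), shear wave velocity (m/s); assumed
g_max = rho * vs**2              # small-strain shear modulus G_max = rho * Vs^2 (Pa)
gamma, gamma_ref = 5e-4, 1e-3    # operating and reference shear strains; assumed
g = g_max / (1 + gamma / gamma_ref)
print(f"G_max = {g_max/1e6:.1f} MPa, degraded G = {g/1e6:.1f} MPa")
```

The degraded modulus then feeds the empirical skin and end bearing estimates that are compared against the instrumented pile load test results.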
Design sensitivity analysis using EAL. Part 1: Conventional design parameters
NASA Technical Reports Server (NTRS)
Dopker, B.; Choi, Kyung K.; Lee, J.
1986-01-01
A numerical implementation of design sensitivity analysis of builtup structures is presented, using the versatility and convenience of an existing finite element structural analysis code and its database management system. The finite element code used in the implementation presented is the Engineering Analysis Language (EAL), which is based on a hybrid method of analysis. It was shown that design sensitivity computations can be carried out using the database management system of EAL, without writing a separate program and a separate database. Conventional (sizing) design parameters such as cross-sectional area of beams or thickness of plates and plane elastic solid components are considered. Compliance, displacement, and stress functionals are considered as performance criteria. The method presented is being extended to implement shape design sensitivity analysis using a domain method and a design component method.
A Rapid Method for Measuring Strontium-90 Activity in Crops in China
NASA Astrophysics Data System (ADS)
Pan, Lingjing; Yu, Guobing; Wen, Deyun; Chen, Zhi; Sheng, Liusi; Liu, Chung-King; Xu, X. George
2017-09-01
A rapid method for measuring Sr-90 activity in crop ashes is presented. Liquid scintillation counting, combined with ion-exchange columns of 4,4'(5')-di-t-butylcyclohexano-18-crown-6, is used to determine the activity of Sr-90 in crops. The yields of the chemical procedure are quantified using gravimetric analysis. The conventional method, which uses ion-exchange resin with HDEHP, could not completely remove all the bismuth when comparatively large amounts of lead and bismuth exist in the samples. This is overcome by the rapid method. The chemical yield of this method is about 60% and the MDA for Sr-90 is found to be 2.32 Bq/kg. The whole procedure, together with using spectrum analysis to determine the activity, takes only about one day, which is a large improvement over the conventional method. A modified conventional method is also described here to verify the results of the rapid one. These two methods can meet the different needs of daily monitoring and emergency situations.
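A Currie-type minimum detectable activity (MDA) calculation of the kind underlying such a figure can be sketched as follows; all counting parameters below are assumptions, not the paper's values.

```python
import math

# Currie-type MDA sketch for liquid scintillation counting (assumed parameters)
b_rate, t_count = 1.2, 3600.0        # background count rate (counts/s), counting time (s)
eff, yield_, mass = 0.9, 0.60, 0.05  # counting efficiency, chemical yield, sample mass (kg)

B = b_rate * t_count                 # expected background counts
ld = 2.71 + 4.65 * math.sqrt(B)      # Currie detection limit (counts)
mda = ld / (eff * yield_ * t_count * mass)
print(f"MDA = {mda:.2f} Bq/kg")
```

The ~60% chemical yield quoted above enters this formula directly, which is why the gravimetric yield determination matters for the reported 2.32 Bq/kg.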
Machining and characterization of self-reinforced polymers
NASA Astrophysics Data System (ADS)
Deepa, A.; Padmanabhan, K.; Kuppan, P.
2017-11-01
This paper focuses on obtaining the mechanical properties of self-reinforced composite samples, assessing the effect of different machining techniques on them, and identifying the best machining method. The samples were fabricated by hot compaction and subjected to tensile and flexural tests, from which the corresponding loads were calculated. These composites are commonly machined using conventional methods because most industries lack advanced machinery. Here, advanced non-conventional methods such as abrasive water jet machining were used. These techniques yield better output for composite materials, with good mechanical properties compared to conventional methods, though non-conventional methods also change the workpiece and tool properties and are more economical than conventional methods. The study concludes by identifying the method best suited for producing these self-reinforced composites with and without defects, and by using scanning electron microscope (SEM) analysis to compare the microstructures of the PP and PE samples.
Pieterman, Elise D; Budde, Ricardo P J; Robbers-Visser, Daniëlle; van Domburg, Ron T; Helbing, Willem A
2017-09-01
Follow-up of right ventricular performance is important for patients with congenital heart disease. Cardiac magnetic resonance imaging is optimal for this purpose. However, the observer-dependency of manual analysis of right ventricular volumes limits its use. Knowledge-based reconstruction is a new semiautomatic analysis tool that uses a database including knowledge of right ventricular shape in various congenital heart diseases. We evaluated whether knowledge-based reconstruction is a good alternative to conventional analysis. To assess the inter- and intra-observer variability and agreement of knowledge-based versus conventional analysis of magnetic resonance right ventricular volumes, analysis was done by two observers in a mixed group of 22 patients with congenital heart disease affecting right ventricular loading conditions (dextro-transposition of the great arteries and right ventricle to pulmonary artery conduit) and a group of 17 healthy children. We used Bland-Altman analysis and the coefficient of variation. Comparison between the conventional method and the knowledge-based method showed systematically higher volumes for the latter. We found an overestimation of end-diastolic volume (bias -40 ± 24 mL, r = .956), end-systolic volume (bias -34 ± 24 mL, r = .943), and stroke volume (bias -6 ± 17 mL, r = .735), and an underestimation of ejection fraction (bias 7 ± 7%, r = .671) by knowledge-based reconstruction. The intra-observer variability of knowledge-based reconstruction varied, with a coefficient of variation of 9% for end-diastolic volume and 22% for stroke volume. The same trend was noted for inter-observer variability. A systematic difference (overestimation) was noted for right ventricular size as assessed with knowledge-based reconstruction compared with conventional methods of analysis. Observer variability for the new method was comparable to what has been reported for the right ventricle in children and congenital heart disease with conventional analysis. © 2017 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tejabhiram, Y., E-mail: tejabhiram@gmail.com; Pradeep, R.; Helen, A.T.
2014-12-15
Highlights: • Novel low temperature synthesis of nickel ferrite nanoparticles. • Comparison with two conventional synthesis techniques including hydrothermal method. • XRD results confirm the formation of crystalline nickel ferrites at 110 °C. • Superparamagnetic particles with applications in drug delivery and hyperthermia. • Magnetic properties superior to conventional methods found in new process. - Abstract: We report a simple, low temperature and surfactant-free co-precipitation method for the preparation of nickel ferrite nanostructures using ferrous sulfate as the iron precursor. The products obtained from this method were compared, in terms of their physical properties, with nickel ferrites produced through conventional co-precipitation and hydrothermal methods, which used ferric nitrate as the iron precursor. X-ray diffraction analysis confirmed the synthesis of single-phase inverse spinel nanocrystalline nickel ferrites at a temperature as low as 110 °C in the low temperature method. Electron microscopy analysis of the samples revealed the formation of nearly spherical nanostructures in the size range of 20–30 nm, which is comparable to other conventional methods. Vibrating sample magnetometer measurements showed the formation of superparamagnetic particles with a high saturation magnetization of 41.3 emu/g, which corresponds well with conventional synthesis methods. The spontaneous synthesis of the nickel ferrite nanoparticles by the low temperature method was attributed to the presence of 0.808 kJ mol⁻¹ of excess Gibbs free energy due to the ferrous sulfate precursor.
Mata, Gardênia Márcia Silva Campos; Martins, Evandro; Machado, Solimar Gonçalves; Pinto, Maximiliano Soares; de Carvalho, Antônio Fernandes; Vanetti, Maria Cristina Dantas
2016-01-01
The ability of pathogens to survive cheese ripening is a food-safety concern. Therefore, this study aimed to evaluate the performance of two alternative methods for the analysis of Listeria during the ripening of artisanal Minas cheese. These methods were tested and compared with the conventional method: the Lateral Flow System™, in cheeses produced on a laboratory scale using raw milk collected from different farms and inoculated with Listeria innocua; and VIDAS®-LMO, in cheese samples collected from different manufacturers in Serro, Minas Gerais, Brazil. These samples were also characterized in terms of lactic acid bacteria, coliforms and physical-chemical analysis. In the inoculated samples, L. innocua was detected by the Lateral Flow System™ method with 33% false-negative results and 68% accuracy. L. innocua was detected in the inoculated samples by the conventional method only at 60 days of cheese ripening. L. monocytogenes was not detected by either the conventional or the VIDAS®-LMO method in cheese samples collected from different manufacturers, which precluded evaluating the performance of this alternative method. We concluded that the conventional method provided a better recovery of L. innocua throughout cheese ripening, being able to detect L. innocua at 60 days, the aging period required by the current legislation. Copyright © 2016 Sociedade Brasileira de Microbiologia. Published by Elsevier Editora Ltda. All rights reserved.
Hall, Val; O’Neill, G. L.; Magee, J. T.; Duerden, B. I.
1999-01-01
Identification of Actinomyces spp. by conventional phenotypic methods is notoriously difficult and unreliable. Recently, the application of chemotaxonomic and molecular methods has clarified the taxonomy of the group and has led to the recognition of several new species. A practical and discriminatory identification method is now needed for routine identification of clinical isolates. Amplified 16S ribosomal DNA restriction analysis (ARDRA) was applied to reference strains (n = 27) and clinical isolates (n = 36) of Actinomyces spp. and other gram-positive rods. Clinical strains were identified initially to the species level by conventional biochemical tests. However, given the low degree of confidence in conventional methods, the findings obtained by ARDRA were also compared with those obtained by pyrolysis-mass spectrometry. The ARDRA profiles generated by the combination of HaeIII and HpaII endonuclease digestion differentiated all reference strains to the species or subspecies level. The profiles correlated well with the findings obtained by pyrolysis-mass spectrometry and by conventional tests and enabled the identification of 31 of 36 clinical isolates to the species level. ARDRA was shown to be a simple, rapid, cost-effective, and highly discriminatory method for routine identification of Actinomyces spp. of clinical origin. PMID:10364594
Mahmood, Hafiz Sultan; Hoogmoed, Willem B.; van Henten, Eldert J.
2013-01-01
Fine-scale spatial information on soil properties is needed to successfully implement precision agriculture. Proximal gamma-ray spectroscopy has recently emerged as a promising tool to collect fine-scale soil information. The objective of this study was to evaluate a proximal gamma-ray spectrometer to predict several soil properties using energy-windows and full-spectrum analysis methods in two differently managed sandy loam fields: conventional and organic. In the conventional field, both methods predicted clay, pH and total nitrogen with a good accuracy (R2 ≥ 0.56) in the top 0–15 cm soil depth, whereas in the organic field, only clay content was predicted with such accuracy. The highest prediction accuracy was found for total nitrogen (R2 = 0.75) in the conventional field in the energy-windows method. Predictions were better in the top 0–15 cm soil depths than in the 15–30 cm soil depths for individual and combined fields. This implies that gamma-ray spectroscopy can generally benefit soil characterisation for annual crops where the condition of the seedbed is important. Small differences in soil structure (conventional vs. organic) cannot be determined. As for the methodology, we conclude that the energy-windows method can establish relations between radionuclide data and soil properties as accurate as the full-spectrum analysis method. PMID:24287541
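As a sketch of the energy-windows idea (relating count rates in a few radionuclide energy windows to a soil property by regression), with entirely synthetic data and an invented clay relationship:

```python
import numpy as np
rng = np.random.default_rng(3)

# Hypothetical count rates in the 40K, 238U and 232Th energy windows vs. clay (%)
n = 60
counts = rng.normal([120, 40, 55], [10, 5, 6], (n, 3))
clay = 0.1 * counts[:, 0] + 0.3 * counts[:, 2] + rng.normal(0, 1.5, n)

# Multiple linear regression of the soil property on the window count rates
X = np.column_stack([np.ones(n), counts])
b, *_ = np.linalg.lstsq(X, clay, rcond=None)
pred = X @ b
r2 = 1 - np.sum((clay - pred)**2) / np.sum((clay - clay.mean())**2)
print(f"R^2 = {r2:.2f}")
```

Full-spectrum analysis differs in using the whole measured spectrum rather than a few windows, but the calibration-against-soil-samples step is analogous.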
[Role of BoBs technology in early missed abortion chorionic villi].
Li, Z Y; Liu, X Y; Peng, P; Chen, N; Ou, J; Hao, N; Zhou, J; Bian, X M
2018-05-25
Objective: To investigate the value of bacterial artificial chromosome-on-beads (BoBs) technology in the genetic analysis of early missed abortion chorionic villi. Methods: Early missed abortion chorionic villi were analyzed with both the conventional karyotyping method and BoBs technology at Peking Union Medical Hospital from July 2014 to March 2015. The results of BoBs were compared with conventional karyotyping analysis to evaluate the sensitivity, specificity and accuracy of this new method. Results: (1) A total of 161 samples were tested successfully with BoBs technology, and 131 samples were tested successfully with conventional karyotyping. (2) BoBs results were obtained in (2.7±0.6) days and conventional karyotyping results in (22.5±1.9) days, a statistically significant difference (t = 123.315, P < 0.01). (3) Of the 161 cases tested with BoBs, 85 (52.8%, 85/161) had abnormal chromosomes, including 79 cases of chromosome number abnormality, 4 cases of chromosome segment deletion, and 2 cases of mosaicism. Of the 131 cases tested successfully with conventional karyotyping, 79 (60.3%, 79/131) had abnormal chromosomes, including 62 cases of chromosome number abnormality and 17 cases of other chromosome abnormality; the rate of chromosome abnormality did not differ significantly between the two methods (P = 0.198). (4) With conventional karyotyping results serving as the gold standard, the accuracy of BoBs for abnormal chromosomes was 82.4% (108/131); when the analysis was restricted to the normal chromosomes (52 cases) and chromosome number abnormalities (62 cases) identified by conventional karyotyping, the accuracy of BoBs for chromosome number abnormality was 94.7% (108/114). Conclusion: BoBs is a rapid, reliable and easily operated method to test early missed abortion chorionic villi for chromosomal abnormalities.
Antoszewska-Smith, Joanna; Sarul, Michał; Łyczek, Jan; Konopka, Tomasz; Kawala, Beata
2017-03-01
The aim of this systematic review was to compare the effectiveness of orthodontic miniscrew implants, i.e., temporary intraoral skeletal anchorage devices (TISADs), in anchorage reinforcement during en-masse retraction in relation to conventional methods of anchorage. A search of PubMed, Embase, Cochrane Central Register of Controlled Trials, and Web of Science was performed. The keywords were orthodontic, mini-implants, miniscrews, miniplates, and temporary anchorage device. Relevant articles were assessed for quality according to Cochrane guidelines and the data extracted for statistical analysis. A meta-analysis of raw mean differences concerning anchorage loss, tipping of molars, retraction of incisors, tipping of incisors, and treatment duration was carried out. Initially, we retrieved 10,038 articles. The selection process finally resulted in 14 articles including 616 patients (451 female, 165 male) for detailed analysis. The quality of the included studies was assessed as moderate. The meta-analysis showed that the use of TISADs facilitates better anchorage reinforcement than conventional methods. On average, TISADs enabled 1.86 mm more anchorage preservation than did conventional methods (P <0.001). The results of the meta-analysis showed that TISADs are more effective than conventional methods of anchorage reinforcement. The average difference of 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies. More high-quality studies on this issue are necessary to enable drawing more reliable conclusions. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Salas, F.; Cabello, O.; Alarcon, F.; Ferrer, C.
1974-01-01
Multispectral analysis of ERTS-A images at scales of 1:1,000,000 and 1:500,000 has been conducted with conventional photointerpretation methods. Specific methods have been developed for the geomorphological analysis of the southern Maracaibo Lake Basin, which comprises part of the Venezuelan Andean Range, the Perija Range, the Tachira gap and the southern part of the Maracaibo Lake depression. A steplike analysis was conducted to separate macroforms, landscapes and relief units as well as drainage patterns and tectonic features, which permitted the delineation of tectonic provinces, stratigraphic units, geomorphologic units and geomorphologic positions. The geomorphologic synthesis obtained compares favorably with conventional analysis made of this area at 1:100,000-scale accuracy, and in some features with details obtained through conventional analysis at 1:15,000-scale accuracy and field work. Geomorphological units in the mountains were identified according to changes in tone, texture, form, orientation of interfluves and tectonic characteristics which control interfluvial dissymmetries.
Tang, Chuanning; Lew, Scott
2016-01-01
Abstract In vitro protein stability studies are commonly conducted via thermal or chemical denaturation/renaturation of protein. Conventional data analyses of protein unfolding/(re)folding require well-defined pre- and post-transition baselines to evaluate the Gibbs free-energy change associated with the unfolding/(re)folding. This evaluation becomes problematic when there are insufficient data for determining the pre- or post-transition baselines. In this study, fitting of such partial data obtained in protein chemical denaturation is established by introducing second-order differential (SOD) analysis to overcome the limitations of the conventional fitting method. By reducing the number of baseline-related fitting parameters, the SOD analysis can successfully fit incomplete chemical denaturation data sets in high agreement with the conventional evaluation of the equivalent complete data, in cases where the conventional fitting fails. This SOD fitting of abbreviated isothermal chemical denaturation data thus extends the data analysis methods available for the insufficient data sets encountered in the two prevalent types of protein stability studies. PMID:26757366
NASA Astrophysics Data System (ADS)
Rajalakshmi, M.; Shyju, T. S.; Indirajith, R.; Gopalakrishnan, R.
2012-02-01
A good-quality <1 0 0> benzil single crystal, 18 mm in diameter and 75 mm in length, was successfully grown from solution by the unidirectional Sankaranarayanan-Ramasamy (SR) growth method for the first time in the literature. The seed crystals were harvested from the conventional solution growth technique and subsequently used for unidirectional growth. The grown crystal was subjected to various characterization studies. The results of UV-vis spectral analysis, photoluminescence, etching and microhardness studies of the SR-grown crystal were compared with those of the conventional solution-grown crystal. The quality of the SR-grown benzil crystal is better than that of the conventional solution-grown crystal.
A refined method for multivariate meta-analysis and meta-regression.
Jackson, Daniel; Riley, Richard D
2014-02-20
Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects' standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. Copyright © 2013 John Wiley & Sons, Ltd.
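One well-known refined univariate method of this kind (the Hartung-Knapp/Sidik-Jonkman adjustment, which scales the pooled estimate's standard error and uses a t quantile) can be sketched as follows; the study effects and variances are invented, and this is the univariate case only, not the paper's multivariate extension.

```python
import numpy as np
from scipy import stats

y = np.array([0.32, 0.15, 0.48, 0.05])    # study effect estimates (hypothetical)
v = np.array([0.02, 0.05, 0.03, 0.04])    # within-study variances (hypothetical)
k = len(y)

# DerSimonian-Laird estimate of the between-study variance tau^2
w = 1.0 / v
mu_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - mu_fe) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled estimate with a scaled standard error and t quantile
w_re = 1.0 / (v + tau2)
mu = np.sum(w_re * y) / np.sum(w_re)
q = np.sum(w_re * (y - mu) ** 2) / (k - 1)      # scaling factor applied to SE^2
se = np.sqrt(q / np.sum(w_re))
t = stats.t.ppf(0.975, k - 1)
print(f"mu = {mu:.3f}, 95% CI = ({mu - t*se:.3f}, {mu + t*se:.3f})")
```

The conventional approach would instead use se = sqrt(1/sum(w_re)) with a normal quantile, which is what becomes unreliable when k is small.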
NASA Astrophysics Data System (ADS)
Kurnia, H.; Noerhadi, N. A. I.
2017-08-01
Three-dimensional digital study models were introduced following advances in digital technology. This study was carried out to assess the reliability of digital study models scanned by a laser scanning device newly assembled. The aim of this study was to compare the digital study models and conventional models. Twelve sets of dental impressions were taken from patients with mild-to-moderate crowding. The impressions were taken twice, one with alginate and the other with polyvinylsiloxane. The alginate impressions were made into conventional models, and the polyvinylsiloxane impressions were scanned to produce digital models. The mesiodistal tooth width and Little’s irregularity index (LII) were measured manually with digital calipers on the conventional models and digitally on the digital study models. Bolton analysis was performed on each study models. Each method was carried out twice to check for intra-observer variability. The reproducibility (comparison of the methods) was assessed using independent-sample t-tests. The mesiodistal tooth width between conventional and digital models did not significantly differ (p > 0.05). Independent-sample t-tests did not identify statistically significant differences for Bolton analysis and LII (p = 0.603 for Bolton and p = 0894 for LII). The measurements of the digital study models are as accurate as those of the conventional models.
Elzanfaly, Eman S; Hegazy, Maha A; Saad, Samah S; Salem, Maissa Y; Abd El Fattah, Laila E
2015-03-01
The introduction of sustainable development concepts to analytical laboratories has recently gained interest, however, most conventional high-performance liquid chromatography methods do not consider either the effect of the used chemicals or the amount of produced waste on the environment. The aim of this work was to prove that conventional methods can be replaced by greener ones with the same analytical parameters. The suggested methods were designed so that they neither use nor produce harmful chemicals and produce minimum waste to be used in routine analysis without harming the environment. This was achieved by using green mobile phases and short run times. Four mixtures were chosen as models for this study; clidinium bromide/chlordiazepoxide hydrochloride, phenobarbitone/pipenzolate bromide, mebeverine hydrochloride/sulpiride, and chlorphenoxamine hydrochloride/caffeine/8-chlorotheophylline either in their bulk powder or in their dosage forms. The methods were validated with respect to linearity, precision, accuracy, system suitability, and robustness. The developed methods were compared to the reported conventional high-performance liquid chromatography methods regarding their greenness profile. The suggested methods were found to be greener and more time- and solvent-saving than the reported ones; hence they can be used for routine analysis of the studied mixtures without harming the environment. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Droplet Microarray Based on Superhydrophobic-Superhydrophilic Patterns for Single Cell Analysis.
Jogia, Gabriella E; Tronser, Tina; Popova, Anna A; Levkin, Pavel A
2016-12-09
Single-cell analysis provides fundamental information on individual cell response to different environmental cues and is of growing interest in cancer and stem cell research. However, currently existing methods still face challenges in performing such analysis in a high-throughput manner whilst being cost-effective. Here we established the Droplet Microarray (DMA) as a miniaturized screening platform for high-throughput single-cell analysis. Using the method of limited dilution and varying cell density and seeding time, we optimized the distribution of single cells on the DMA. We established culturing conditions for single cells in individual droplets on DMA, obtaining survival of nearly 100% of single cells and a doubling time of single cells comparable with that of cells cultured in bulk cell population using conventional methods. Our results demonstrate that the DMA is a suitable platform for single-cell analysis, which carries a number of advantages compared with existing technologies, allowing for treatment, staining and spot-to-spot analysis of single cells over time using conventional analysis methods such as microscopy.
Strain gage measurement errors in the transient heating of structural components
NASA Technical Reports Server (NTRS)
Richards, W. Lance
1993-01-01
Significant strain-gage errors may exist in measurements acquired in transient thermal environments if conventional correction methods are applied. Conventional correction theory was modified and a new experimental method was developed to correct indicated strain data for errors created in radiant heating environments ranging from 0.6 °C/sec (1 °F/sec) to over 56 °C/sec (100 °F/sec). In some cases the new and conventional methods differed by as much as 30 percent. Experimental and analytical results were compared to demonstrate the new technique. For heating conditions greater than 6 °C/sec (10 °F/sec), the indicated strain data corrected with the developed technique compared much better to analysis than the same data corrected with the conventional technique.
Time-series analysis of sleep wake stage of rat EEG using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Shinba, Toshikazu; Mugishima, Go; Haraguchi, Hikaru; Inoue, Masayoshi
2008-05-01
We performed electroencephalography (EEG) for six male Wistar rats to clarify temporal behaviors at different levels of consciousness. Levels were identified both by conventional sleep analysis methods and by our novel entropy method. In our method, time-dependent pattern entropy is introduced, by which EEG is reduced to binary symbolic dynamics and the pattern of symbols in a sliding temporal window is considered. A high correlation was obtained between level of consciousness as measured by the conventional method and mean entropy in our entropy method. Mean entropy was maximal while awake (stage W) and decreased as sleep deepened. These results suggest that time-dependent pattern entropy may offer a promising method for future sleep research.
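A minimal version of the entropy computation described above might look like this in Python: the signal is binarized around its median, overlapping binary words are encoded, and Shannon entropy is computed per sliding window. The word length, window size and test signals are assumptions for illustration, not the authors' settings.

```python
import numpy as np

def pattern_entropy(signal, word_len=4, win=256):
    """Binarize around the median, then compute the Shannon entropy of
    binary words of length `word_len` within consecutive windows."""
    bits = (signal > np.median(signal)).astype(int)
    # Encode each overlapping word as an integer in [0, 2**word_len)
    codes = np.zeros(len(bits) - word_len + 1, dtype=int)
    for i in range(word_len):
        codes = codes * 2 + bits[i:i + len(codes)]
    ent = []
    for s in range(0, len(codes) - win + 1, win):
        p = np.bincount(codes[s:s + win], minlength=2**word_len) / win
        p = p[p > 0]
        ent.append(-np.sum(p * np.log2(p)))
    return np.array(ent)

rng = np.random.default_rng(4)
awake_like = rng.normal(size=4096)                          # irregular signal
sleep_like = np.sin(np.arange(4096) * 0.05) + 0.1 * rng.normal(size=4096)
print(pattern_entropy(awake_like).mean(), pattern_entropy(sleep_like).mean())
```

Consistent with the abstract's finding, the irregular signal yields entropy near the maximum while the slow, regular one yields much lower values.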
NASA Technical Reports Server (NTRS)
Richards, W. Lance
1996-01-01
Significant strain-gage errors may exist in measurements acquired in transient-temperature environments if conventional correction methods are applied. As heating or cooling rates increase, temperature gradients between the strain-gage sensor and substrate surface increase proportionally. These temperature gradients introduce strain-measurement errors that are currently neglected in both conventional strain-correction theory and practice. Therefore, the conventional correction theory has been modified to account for these errors. A new experimental method has been developed to correct strain-gage measurements acquired in environments experiencing significant temperature transients. The new correction technique has been demonstrated through a series of tests in which strain measurements were acquired for temperature-rise rates ranging from 1 to greater than 100 degrees F/sec. Strain-gage data from these tests have been corrected with both the new and conventional methods and then compared with an analysis. Results show that, for temperature-rise rates greater than 10 degrees F/sec, the strain measurements corrected with the conventional technique produced strain errors that deviated from analysis by as much as 45 percent, whereas results corrected with the new technique were in good agreement with analytical results.
Passaro, Antony D; Vettel, Jean M; McDaniel, Jonathan; Lawhern, Vernon; Franaszczuk, Piotr J; Gordon, Stephen M
2017-03-01
During an experimental session, behavioral performance fluctuates, yet most neuroimaging analyses of functional connectivity derive a single connectivity pattern. These conventional connectivity approaches assume that since the underlying behavior of the task remains constant, the connectivity pattern is also constant. We introduce a novel method, behavior-regressed connectivity (BRC), to directly examine behavioral fluctuations within an experimental session and capture their relationship to changes in functional connectivity. This method employs the weighted phase lag index (WPLI) applied to a window of trials with a weighting function. Using two datasets, the BRC results are compared to conventional connectivity results during two time windows: the one second before stimulus onset to identify predictive relationships, and the one second after onset to capture task-dependent relationships. In both tasks, we replicate the expected results for the conventional connectivity analysis, and extend our understanding of the brain-behavior relationship using the BRC analysis, demonstrating subject-specific BRC maps that correspond to both positive and negative relationships with behavior. Comparison with Existing Method(s): Conventional connectivity analyses assume a consistent relationship between behaviors and functional connectivity, but the BRC method examines performance variability within an experimental session to understand dynamic connectivity and transient behavior. The BRC approach examines connectivity as it covaries with behavior to complement the knowledge of underlying neural activity derived from conventional connectivity analyses. Within this framework, BRC may be implemented for the purpose of understanding performance variability both within and between participants. Published by Elsevier B.V.
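The WPLI itself has a standard trial-ensemble form, and the BRC idea of weighting trials (here, by an arbitrary per-trial weight that could in principle be derived from behavior) can be sketched as below; the signals and the 10 Hz coupling are synthetic, and this is an illustration of the estimator, not the authors' pipeline.

```python
import numpy as np

def wpli(x, y, weights=None):
    """Weighted phase lag index across trials for two channels.
    x, y: (n_trials, n_samples) arrays; weights: per-trial weights
    (e.g., behavior-derived, as in BRC) -- uniform by default."""
    X = np.fft.rfft(x, axis=1)
    Y = np.fft.rfft(y, axis=1)
    im = np.imag(X * np.conj(Y))                 # imaginary part of cross-spectrum
    if weights is None:
        weights = np.ones(x.shape[0])
    w = weights[:, None] / weights.sum()
    num = np.abs(np.sum(w * im, axis=0))
    den = np.sum(w * np.abs(im), axis=0)
    return num / np.maximum(den, 1e-12)          # one value per frequency bin

rng = np.random.default_rng(5)
fs, n = 256, 512
t = np.arange(n) / fs
trials = 100
phase = rng.uniform(0, 2*np.pi, trials)          # random phase per trial
x = np.sin(2*np.pi*10*t + phase[:, None]) + rng.normal(0, 1, (trials, n))
y = np.sin(2*np.pi*10*t + phase[:, None] + np.pi/4) + rng.normal(0, 1, (trials, n))
print(wpli(x, y)[20])   # bin 20 = 10 Hz: consistent lag, so WPLI is high
```

In the BRC setting, the uniform weights would be replaced by a behavior-based weighting function over a window of trials, yielding connectivity that covaries with performance.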
Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Roberts, Larry W.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Evaluating gull diets: A comparison of conventional methods and stable isotope analysis
Weiser, Emily L.; Powell, Abby N.
2011-01-01
Samples such as regurgitated pellets and food remains have traditionally been used in studies of bird diets, but these can produce biased estimates depending on the digestibility of different foods. Stable isotope analysis has been developed as a method for assessing bird diets that is not biased by digestibility. These two methods may provide complementary or conflicting information on diets of birds, but are rarely compared directly. We analyzed carbon and nitrogen stable isotope ratios of feathers of Glaucous Gull (Larus hyperboreus) chicks from eight breeding colonies in northern Alaska, and used a Bayesian mixing model to generate a probability distribution for the contribution of each food group to diets. We compared these model results with probability distributions from conventional diet samples (pellets and food remains) from the same colonies and time periods. Relative to the stable isotope estimates, conventional analysis often overestimated the contributions of birds and small mammals to gull diets and often underestimated the contributions of fish and zooplankton. Both methods gave similar estimates for the contributions of scavenged caribou, miscellaneous marine foods, and garbage to diets. Pellets and food remains therefore may be useful for assessing the importance of garbage relative to certain other foods in diets of gulls and similar birds, but are clearly inappropriate for estimating the potential impact of gulls on birds, small mammals, or fish. However, conventional samples provide more species-level information than stable isotope analysis, so a combined approach would be most useful for diet analysis and assessing a predator's impact on particular prey groups.
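A drastically simplified, non-Bayesian stand-in for such a mixing model is sketched below: two tracers, three food groups with assumed source signatures, and a grid search for the diet proportions that best reproduce a feather signature. All numbers are invented; the study's Bayesian model instead produces full probability distributions over the proportions.

```python
import numpy as np

# Two-tracer (d13C, d15N) linear mixing sketch with three food groups.
# Source signatures are hypothetical, with discrimination already applied.
sources = np.array([[-21.0, 14.0],   # fish
                    [-25.0,  8.0],   # birds/small mammals
                    [-18.0, 10.0]])  # garbage
feather = np.array([-21.5, 11.0])    # consumer (chick feather) signature

# Find f >= 0 with sum(f) = 1 minimizing ||f @ sources - feather|| on a grid
best, best_err = None, np.inf
for f1 in np.linspace(0, 1, 101):
    for f2 in np.linspace(0, 1 - f1, 101):
        f = np.array([f1, f2, 1 - f1 - f2])
        err = np.linalg.norm(f @ sources - feather)
        if err < best_err:
            best, best_err = f, err
print("diet proportions (fish, birds/mammals, garbage):", best.round(2))
```

With more sources than tracers the solution is generally non-unique, which is one reason probabilistic mixing models are preferred for real diet reconstruction.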
Application of photogrammetry for analysis of occlusal contacts.
Shigeta, Yuko; Hirabayashi, Rio; Ikawa, Tomoko; Kihara, Takuya; Ando, Eriko; Hirai, Shinya; Fukushima, Shunji; Ogawa, Takumi
2013-04-01
The conventional 2D-analysis methods for occlusal contacts provided limited information on tooth morphology. The present study aims to detect 3D positional information of occlusal contacts from 2D-photos via photogrammetry. We propose an image processing solution for analysis of occlusal contacts and facets via the black silicone method and a photogrammetric technique. The occlusal facets were reconstructed from a 2D-photograph data-set of inter-occlusal records into a 3D image via photogrammetry. The configuration of the occlusal surface was reproduced with polygons. In addition, the textures of the occlusal contacts were mapped to each polygon. DIFFERENCE FROM CONVENTIONAL METHODS: Constructing occlusal facets with 3D polygons from 2D-photos with photogrammetry was a defining characteristic of this image processing technique. It allowed us to better observe the findings of the black silicone method. Compared with conventional 3D analysis using a 3D scanner, our 3D models did not reproduce the detail of the anatomical configuration. However, by merging the findings of the inter-occlusal record, the deformation of the mandible and the displacement of the periodontal ligaments under occlusal force were reflected in our model. EFFECT OR PERFORMANCE: Through the use of polygons in the conversion of 2D images to 3D images, we were able to define the relation between the location and direction of the occlusal contacts and facets, which was difficult to detect via conventional methods. Through our method of making a 3D polygon model, the findings of inter-occlusal records, which reflected jaw and tooth behavior under occlusal force, could be observed 3-dimensionally. Copyright © 2012 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Shrestha, Rojeet; Miura, Yusuke; Hirano, Ken-Ichi; Chen, Zhen; Okabe, Hiroaki; Chiba, Hitoshi; Hui, Shu-Ping
2018-01-01
Fatty acid (FA) profiling of milk has important applications in human health and nutrition. Conventional methods for the saponification and derivatization of FA are time-consuming and laborious. We aimed to develop a simple, rapid, and economical method for the determination of FA in milk. We applied microwave-assisted saponification (MAS) of milk fats and microwave-assisted derivatization (MAD) of FA to its hydrazides, integrated with HPLC-based analysis. The optimal conditions for MAS and MAD were determined. Microwave irradiation significantly reduced the sample preparation time from 80 min in the conventional method to less than 3 min. We used three internal standards for the measurement of short-, medium-, and long-chain FA. The proposed method showed satisfactory analytical sensitivity, recovery, and reproducibility. There was a significant correlation in the milk FA concentrations between the proposed and conventional methods. Being quick, economical, and convenient, the proposed method for milk FA measurement can be a substitute for the conventional method.
Neck pain assessment in a virtual environment.
Sarig-Bahat, Hilla; Weiss, Patrice L Tamar; Laufer, Yocheved
2010-02-15
Neck-pain and control group comparative analysis of conventional and virtual reality (VR)-based assessment of cervical range of motion (CROM). To use a tracker-based VR system to compare CROM of individuals suffering from chronic neck pain with CROM of asymptomatic individuals; to compare VR system results with those obtained during conventional assessment; to present the diagnostic value of CROM measures obtained by both assessments; and to demonstrate the effect of a single VR session on CROM. Neck pain is a common musculoskeletal complaint with a reported annual prevalence of 30% to 50%. In the absence of a gold standard for CROM assessment, a variety of assessment devices and methodologies exist. Common to these methodologies, assessment of CROM is carried out by instructing subjects to move their head as far as possible. However, these elicited movements do not necessarily replicate functional movements which occur spontaneously in response to multiple stimuli. To achieve a more functional approach to cervical motion assessment, we have recently developed a VR environment in which electromagnetic tracking is used to monitor cervical motion while participants are involved in a simple yet engaging gaming scenario. CROM measures were collected from 25 symptomatic and 42 asymptomatic individuals using VR and conventional assessments. Analysis of variance was used to determine differences between groups and assessment methods. Logistic regression analysis, using a single predictor, compared the diagnostic ability of both methods. Results obtained by both methods demonstrated significant CROM limitations in the symptomatic group. The VR measures showed greater CROM and sensitivity while conventional measures showed greater specificity. A single session exposure to VR resulted in a significant increase in CROM. Neck pain is significantly associated with reduced CROM as demonstrated by both VR and conventional assessment methods. The VR method provides assessment of functional CROM and can be used for CROM enhancement. Assessment by VR has greater sensitivity than conventional assessment and can be used for the detection of true symptomatic individuals.
Rajalakshmi, M; Shyju, T S; Indirajith, R; Gopalakrishnan, R
2012-02-01
A good-quality <100> benzil single crystal with a diameter of 18 mm and a length of 75 mm was successfully grown from solution by the unidirectional Sankaranarayanan-Ramasamy (SR) growth method for the first time in the literature. The seed crystals were harvested from the conventional solution growth technique and subsequently used for unidirectional growth. The grown crystal was subjected to various characterization studies. The results of UV-vis spectral analysis, photoluminescence, etching, and microhardness studies were compared between the conventional solution-grown crystal and the SR-method-grown crystal. The quality of the SR-method-grown benzil crystal is better than that of the conventional solution-grown crystal. Copyright © 2011 Elsevier B.V. All rights reserved.
Cost effectiveness of conventional versus LANDSAT land use data for hydrologic modeling
NASA Technical Reports Server (NTRS)
George, T. S.; Taylor, R. S.
1982-01-01
Six case studies were analyzed to investigate the cost effectiveness of using land use data obtained from LANDSAT as opposed to conventionally obtained data. A procedure was developed to determine the relative effectiveness of the two alternative means of acquiring data for hydrologic modeling. The cost of conventionally acquired data ranged between $3,000 and $16,000 for the six test basins. Information based on LANDSAT imagery cost between $2,000 and $5,000. Results of the effectiveness analysis show that the differences between the two methods are insignificant. From the cost comparison and the fact that each method, conventional and LANDSAT, is shown to be equally effective in developing land use data for hydrologic studies, the cost effectiveness of the conventional or LANDSAT method is found to be a function of basin size for the six test watersheds analyzed. The LANDSAT approach is cost effective for areas containing more than 10 square miles.
Tsirogiannis, Panagiotis; Reissmann, Daniel R; Heydecke, Guido
2016-09-01
In existing published reports, some studies indicate the superiority of digital impression systems in terms of the marginal accuracy of ceramic restorations, whereas others show that the conventional method provides restorations with better marginal fit than fully digital fabrication. Which impression method provides the lowest mean values for marginal adaptation is inconclusive. The findings from those studies cannot be easily generalized, and in vivo studies that could provide valid and meaningful information are limited in the existing publications. The purpose of this study was to systematically review existing reports and evaluate the marginal fit of ceramic single-tooth restorations after either digital or conventional impression methods by combining the available evidence in a meta-analysis. The search strategy for this systematic review of the publications was based on a Population, Intervention, Comparison, and Outcome (PICO) framework. For the statistical analysis, the mean marginal fit values of each study were extracted and categorized according to the impression method to calculate the mean value, together with the 95% confidence intervals (CI) of each category, and to evaluate the impact of each impression method on the marginal adaptation by comparing digital and conventional techniques separately for in vitro and in vivo studies. Twelve studies were included in the meta-analysis from the 63 identified records after database searching. For the in vitro studies, where ceramic restorations were fabricated after conventional impressions, the mean value of the marginal fit was 58.9 μm (95% CI: 41.1-76.7 μm), whereas after digital impressions, it was 63.3 μm (95% CI: 50.5-76.0 μm). In the in vivo studies, the mean marginal discrepancy of the restorations after digital impressions was 56.1 μm (95% CI: 46.3-65.8 μm), whereas after conventional impressions, it was 79.2 μm (95% CI: 59.6-98.9 μm). No significant difference was observed regarding the marginal discrepancy of single-unit ceramic restorations fabricated after digital or conventional impressions. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
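For readers who want to reproduce this kind of pooling, the sketch below computes a fixed-effect, inverse-variance weighted mean with a 95% CI of the form reported above; the per-study values are placeholders, not the review's data, and the actual analysis may have used a random-effects model instead.

```python
import numpy as np

def pooled_mean_ci(means, ses, z=1.96):
    """Fixed-effect inverse-variance pooling of per-study mean values.
    means, ses: per-study mean marginal fit (in micrometers) and
    standard errors; returns pooled mean and 95% CI."""
    means = np.asarray(means, float)
    w = 1.0 / np.asarray(ses, float) ** 2   # inverse-variance weights
    mu = np.sum(w * means) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return mu, (mu - z * se, mu + z * se)

# Hypothetical per-study values, for illustration only
mu, ci = pooled_mean_ci(means=[55.0, 62.5, 70.1], ses=[6.0, 8.5, 7.2])
print(f"pooled mean = {mu:.1f} um, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f})")
```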
Groopman, Amber M.; Katz, Jonathan I.; Holland, Mark R.; Fujita, Fuminori; Matsukawa, Mami; Mizuno, Katsunori; Wear, Keith A.; Miller, James G.
2015-01-01
Conventional, Bayesian, and the modified least-squares Prony's plus curve-fitting (MLSP + CF) methods were applied to data acquired using 1 MHz center frequency, broadband transducers on a single equine cancellous bone specimen that was systematically shortened from 11.8 mm down to 0.5 mm for a total of 24 sample thicknesses. Due to overlapping fast and slow waves, conventional analysis methods were restricted to data from sample thicknesses ranging from 11.8 mm to 6.0 mm. In contrast, Bayesian and MLSP + CF methods successfully separated fast and slow waves and provided reliable estimates of the ultrasonic properties of fast and slow waves for sample thicknesses ranging from 11.8 mm down to 3.5 mm. Comparisons of the three methods were carried out for phase velocity at the center frequency and the slope of the attenuation coefficient for the fast and slow waves. Good agreement among the three methods was also observed for average signal loss at the center frequency. The Bayesian and MLSP + CF approaches were able to separate the fast and slow waves and provide good estimates of the fast and slow wave properties even when the two wave modes overlapped in both time and frequency domains making conventional analysis methods unreliable. PMID:26328678
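A minimal least-squares Prony fit, sketched below, conveys the core of the MLSP step: linear prediction gives the poles of a sum of damped exponentials, after which amplitudes follow from a linear solve; with p = 4 (two conjugate pole pairs) it can separate overlapping fast and slow waves. This is an illustrative reconstruction under our own assumptions, not the authors' code, and it omits the curve-fitting (CF) refinement.

```python
import numpy as np

def lsp_prony(signal, p, dt):
    """Least-squares Prony fit of p complex exponentials to a sampled
    waveform; returns frequencies (Hz), damping factors, and amplitudes."""
    N = len(signal)
    # Linear prediction: signal[n] = sum_k a_k * signal[n-1-k]
    A = np.column_stack([signal[p - 1 - k : N - 1 - k] for k in range(p)])
    b = signal[p:N]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Roots of the characteristic polynomial are the poles
    poles = np.roots(np.r_[1.0, -a])
    freqs = np.angle(poles) / (2 * np.pi * dt)
    damps = np.log(np.abs(poles)) / dt
    # Amplitudes from the Vandermonde system signal[n] = sum_j c_j z_j^n
    V = np.vander(poles, N, increasing=True).T
    amps, *_ = np.linalg.lstsq(V, signal.astype(complex), rcond=None)
    return freqs, damps, amps
```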
A Simple Deep Learning Method for Neuronal Spike Sorting
NASA Astrophysics Data System (ADS)
Yang, Kai; Wu, Haifeng; Zeng, Yu
2017-10-01
Spike sorting is one of the key techniques for understanding brain activity. With the development of modern electrophysiology technology, recent multi-electrode technologies have been able to record the activity of thousands of neuronal spikes simultaneously. Spike sorting in this case increases the computational complexity of conventional sorting algorithms. In this paper, we focus on how to reduce the complexity of spike sorting, and introduce a deep learning algorithm, the principal component analysis network (PCANet), for spike sorting. The introduced method starts from a conventional model and establishes a Toeplitz matrix. From the column vectors in the matrix, we train a PCANet, from which eigenvector features of spikes can be extracted. Finally, a support vector machine (SVM) is used to sort the spikes. In experiments, we choose two groups of simulated data from publicly available databases and compare the introduced method with conventional methods. The results indicate that the introduced method indeed has lower complexity with the same sorting errors as the conventional methods.
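The sketch below is not PCANet itself (which cascades PCA filter banks over patch vectors), but it shows the simpler backbone the abstract describes: PCA features of spike waveforms feeding an SVM classifier. The waveform and label arrays are random placeholders standing in for the simulated datasets.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Placeholder data: aligned spike snippets and ground-truth unit IDs
rng = np.random.default_rng(0)
spikes = rng.normal(size=(1000, 48))    # (n_spikes, n_samples per snippet)
labels = rng.integers(0, 3, size=1000)  # hypothetical unit assignments

X_tr, X_te, y_tr, y_te = train_test_split(
    spikes, labels, test_size=0.3, random_state=0)

# PCA compresses each waveform to a few features; SVM sorts the spikes
sorter = make_pipeline(PCA(n_components=5), SVC(kernel="rbf"))
sorter.fit(X_tr, y_tr)
print("sorting accuracy:", sorter.score(X_te, y_te))
```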
Autonomous control systems - Architecture and fundamental issues
NASA Technical Reports Server (NTRS)
Antsaklis, P. J.; Passino, K. M.; Wang, S. J.
1988-01-01
A hierarchical functional autonomous controller architecture is introduced. In particular, the architecture for the control of future space vehicles is described in detail; it is designed to ensure the autonomous operation of the control system, and it allows interaction with the pilot and crew/ground station, and with the systems on board the autonomous vehicle. The fundamental issues in autonomous control system modeling and analysis are discussed. It is proposed to utilize a hybrid approach to modeling and analysis of autonomous systems. This will incorporate conventional control methods based on differential equations and techniques for the analysis of systems described with a symbolic formalism. In this way, the theory of conventional control can be fully utilized. It is stressed that autonomy is the design requirement and that intelligent control methods appear, at present, to offer some of the necessary tools to achieve autonomy. A conventional approach may evolve and replace some or all of the `intelligent' functions. It is shown that in addition to conventional controllers, the autonomous control system incorporates planning, learning, and FDI (fault detection and identification).
Seok, Junhee; Seon Kang, Yeong
2015-01-01
Mutual information, a general measure of the relatedness between two random variables, has been actively used in the analysis of biomedical data. The mutual information between two discrete variables is conventionally calculated by their joint probabilities estimated from the frequency of observed samples in each combination of variable categories. However, this conventional approach is no longer efficient for discrete variables with many categories, which can be easily found in large-scale biomedical data such as diagnosis codes, drug compounds, and genotypes. Here, we propose a method to provide stable estimations for the mutual information between discrete variables with many categories. Simulation studies showed that the proposed method reduced the estimation errors 45-fold and improved the correlation coefficients with the true values 99-fold, compared with the conventional calculation of mutual information. The proposed method was also demonstrated through a case study for diagnostic data in electronic health records. This method is expected to be useful in the analysis of various biomedical data with discrete variables. PMID:26046461
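The conventional estimator that the authors improve on is the plug-in calculation from joint category frequencies, which the sketch below implements for paired discrete observations; the function name is ours.

```python
import numpy as np

def plugin_mutual_information(x, y):
    """Conventional plug-in estimate of I(X;Y) in nats from paired
    observations of two discrete variables; this is the estimator that
    degrades when the variables have many categories."""
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1.0)        # contingency table of counts
    p = joint / joint.sum()                # empirical joint probabilities
    px = p.sum(axis=1, keepdims=True)      # marginals
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0                             # skip empty cells
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))
```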
Tani, Kazuki; Mio, Motohira; Toyofuku, Tatsuo; Kato, Shinichi; Masumoto, Tomoya; Ijichi, Tetsuya; Matsushima, Masatoshi; Morimoto, Shoichi; Hirata, Takumi
2017-01-01
Spatial normalization is a significant image pre-processing operation in statistical parametric mapping (SPM) analysis. The purpose of this study was to clarify the optimal method of spatial normalization for improving diagnostic accuracy in SPM analysis of arterial spin-labeling (ASL) perfusion images. We evaluated the SPM results of five spatial normalization methods obtained by comparing patients with Alzheimer's disease or normal pressure hydrocephalus complicated with dementia and cognitively healthy subjects. We used the following methods: 3DT1-conventional, based on spatial normalization using anatomical images; 3DT1-DARTEL, based on spatial normalization with DARTEL using anatomical images; 3DT1-conventional template and 3DT1-DARTEL template, created by averaging cognitively healthy subjects spatially normalized using the above methods; and ASL-DARTEL template, created by averaging cognitively healthy subjects spatially normalized with DARTEL using ASL images only. Our results showed that the ASL-DARTEL template was smaller than the other two templates, and the SPM results obtained with the ASL-DARTEL template method were inaccurate. Also, there were no significant differences between the 3DT1-conventional and 3DT1-DARTEL template methods. In contrast, the 3DT1-DARTEL method showed higher detection sensitivity and precise anatomical localization. Our SPM results suggest that spatial normalization should be performed with DARTEL using anatomical images.
Identification of the isomers using principal component analysis (PCA) method
NASA Astrophysics Data System (ADS)
Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur
2016-03-01
In this work, we have carried out a detailed statistical analysis of experimental mass spectra from xylene isomers. Principal Component Analysis (PCA) was used to identify the isomers, which cannot be distinguished using conventional statistical methods for interpretation of their mass spectra. Experiments have been carried out using a linear TOF-MS coupled to a femtosecond laser system as an energy source for the ionisation processes. We have performed experiments and collected data which have been analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method to provide insight for distinguishing the isomers, which cannot be identified using conventional mass analysis obtained through dissociative ionisation processes on these molecules. The PCA results depending on the laser pulse energy and the background pressure in the spectrometer are presented in this work.
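A minimal PCA treatment of a spectra matrix of the kind described is sketched below with scikit-learn; the array shape is a hypothetical placeholder for the TOF-MS data, and the expectation is that isomer classes separate as clusters in the low-dimensional score plane.

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder matrix: one row per laser shot, one column per m/z bin
spectra = np.random.default_rng(1).normal(size=(90, 500))

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)     # PC1/PC2 scores per spectrum
print(pca.explained_variance_ratio_)    # variance captured by each PC
# Isomers that look identical in conventional mass analysis may form
# distinct clusters in the (PC1, PC2) score plane.
```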
Novel Method for Superposing 3D Digital Models for Monitoring Orthodontic Tooth Movement.
Schmidt, Falko; Kilic, Fatih; Piro, Neltje Emma; Geiger, Martin Eberhard; Lapatki, Bernd Georg
2018-04-18
Quantitative three-dimensional analysis of orthodontic tooth movement (OTM) is possible by superposition of digital jaw models made at different times during treatment. Conventional methods rely on surface alignment at palatal soft-tissue areas, which is applicable to the maxilla only. We introduce two novel numerical methods applicable to both the maxilla and the mandible. The OTMs from the initial phase of multi-bracket appliance treatment of ten pairs of maxillary models were evaluated and compared with four conventional methods. The median range of deviation of the OTMs for three users was 13-72% smaller for the novel methods than for the conventional methods, indicating greater inter-observer agreement. Total tooth translation and rotation were significantly different (ANOVA, p < 0.01) between the OTMs determined by the two numerical and the four conventional methods. Directional decomposition of the OTMs from the novel methods showed clinically acceptable agreement with reference results except for vertical translations (deviations of medians greater than 0.6 mm). The difference in vertical translational OTM can be explained by maxillary vertical growth during the observation period, which is additionally recorded by the conventional methods. The novel approaches are thus particularly suitable for evaluation of pure treatment effects, because growth-related changes are ignored.
AlBarakati, SF; Kula, KS; Ghoneima, AA
2012-01-01
Objective The aim of this study was to assess the reliability and reproducibility of angular and linear measurements of conventional and digital cephalometric methods. Methods A total of 13 landmarks and 16 skeletal and dental parameters were defined and measured on pre-treatment cephalometric radiographs of 30 patients. The conventional and digital tracings and measurements were performed twice by the same examiner with a 6-week interval between measurements. The reliability within each method was determined using Pearson's correlation coefficient (r2). The reproducibility between methods was calculated by paired t-test. The level of statistical significance was set at p < 0.05. Results All measurements for each method had an r2 above 0.90 (strong correlation) except maxillary length, which had a correlation of 0.82 for conventional tracing. Significant differences between the two methods were observed in most angular and linear measurements except for the ANB angle (p = 0.5), angle of convexity (p = 0.09), anterior cranial base (p = 0.3) and the lower anterior facial height (p = 0.6). Conclusion In general, both methods of conventional and digital cephalometric analysis are highly reliable. Although the reproducibility of the two methods showed some statistically significant differences, most differences were not clinically significant. PMID:22184624
A refined method for multivariate meta-analysis and meta-regression
Jackson, Daniel; Riley, Richard D
2014-01-01
Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects’ standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:23996351
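In the univariate case, the scaling-factor refinement described here corresponds to a Hartung-Knapp-style adjustment of the pooled effect's standard error combined with a t-quantile interval, sketched below under the simplifying assumption that the between-study variance tau2 is already estimated; the paper's contribution is the extension of this idea to the multivariate scenario.

```python
import numpy as np
from scipy import stats

def refined_univariate_meta(y, v, tau2):
    """Random-effects estimate with a scaling factor q applied to the
    standard error and a t-based 95% CI (illustrative univariate sketch).
    y: study effect estimates; v: within-study variances."""
    y = np.asarray(y, float)
    w = 1.0 / (np.asarray(v, float) + tau2)   # random-effects weights
    mu = np.sum(w * y) / np.sum(w)
    k = len(y)
    # Scaling factor from the weighted residual sum of squares
    q = np.sum(w * (y - mu) ** 2) / (k - 1)
    se = np.sqrt(q / np.sum(w))
    half = stats.t.ppf(0.975, k - 1) * se     # t quantile, k-1 df
    return mu, (mu - half, mu + half)
```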
Casting Control of Floating-films into Ribbon-shape Structure by modified Dynamic FTM
NASA Astrophysics Data System (ADS)
Tripathi, A.; Pandey, M.; Nagamatsu, S.; Pandey, S. S.; Hayase, S.; Takashima, W.
2017-11-01
We have developed a new method to obtain ribbon-shaped floating films via the dynamic floating-film casting and transfer method (dynamic-FTM). Dynamic-FTM is a unique, quick, and easy method to prepare oriented thin films of conjugated polymers (CPs). This method has several advantages compared with other conventional casting procedures for preparing oriented CP films. In conventional dynamic-FTM, the appearance of large-scale circular orientation poses difficulty not only for practical applications but also hinders detailed analysis of the orientation mechanism. In the present work, the pros and cons of this newly proposed ribbon-shaped floating film are discussed in detail and compared with those of the conventional floating film prepared by dynamic-FTM.
Han, Jubong; Lee, K B; Lee, Jong-Man; Park, Tae Soon; Oh, J S; Oh, Pil-Jei
2016-03-01
We discuss a new method to incorporate Type B uncertainty into least-squares procedures. The new method is based on an extension of the likelihood function from which a conventional least-squares function is derived. The extended likelihood function is the product of the original likelihood function with additional PDFs (probability density functions) that characterize the Type B uncertainties. The PDFs are considered to describe one's incomplete knowledge of correction factors, called nuisance parameters. We use the extended likelihood function to make point and interval estimates of parameters in basically the same way as the least-squares function is used in the conventional least-squares method. Since the nuisance parameters are not of interest and should be prevented from appearing in the final result, we eliminate them by using the profile likelihood. As an example, we present a case study for a linear regression analysis with a common component of Type B uncertainty. In this example we compare the analysis results obtained using our procedure with those from conventional methods. Copyright © 2015. Published by Elsevier Ltd.
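A toy version of the procedure, under our own assumptions about the model form: a linear regression with one common Type B offset theta whose PDF multiplies the likelihood, after which theta is profiled out before the remaining parameters are estimated. All names and numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Model: y_i = a + b*x_i + theta, theta ~ N(0, sigma_B^2) is a common
# Type B correction factor treated as a nuisance parameter.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 20)
sigma = 0.5        # Type A (statistical) uncertainty per point
sigma_B = 0.3      # Type B uncertainty of the common offset
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma, x.size) + 0.2

def neg_log_extended_likelihood(params):
    a, b, theta = params
    resid = y - (a + b * x + theta)
    # Original Gaussian likelihood times the PDF of the nuisance parameter
    return (0.5 * np.sum(resid ** 2) / sigma ** 2
            + 0.5 * theta ** 2 / sigma_B ** 2)

def profile_nll(a, b):
    # Profile likelihood: minimize over the nuisance parameter theta
    res = minimize(lambda t: neg_log_extended_likelihood([a, b, t[0]]),
                   x0=[0.0])
    return res.fun

fit = minimize(lambda p: profile_nll(p[0], p[1]), x0=[0.0, 1.0])
print("a, b =", fit.x)
```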
SPACE PROPULSION SYSTEM PHASED-MISSION PROBABILITY ANALYSIS USING CONVENTIONAL PRA METHODS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis Smith; James Knudsen
As part of a series of papers on the topic of advanced probabilistic methods, a benchmark phased-mission problem has been suggested. This problem consists of modeling a space mission using an ion propulsion system, where the mission consists of seven mission phases. The mission requires that the propulsion operate for several phases, where the configuration changes as a function of phase. The ion propulsion system itself consists of five thruster assemblies and a single propellant supply, where each thruster assembly has one propulsion power unit and two ion engines. In this paper, we evaluate the probability of mission failure using the conventional methodology of event tree/fault tree analysis. The event tree and fault trees are developed and analyzed using Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE). While the benchmark problem is nominally a "dynamic" problem, in our analysis the mission phases are modeled in a single event tree to show the progression from one phase to the next. The propulsion system is modeled in fault trees to account for the operation, or in this case the failure, of the system. Specifically, the propulsion system is decomposed into each of the five thruster assemblies and fed into the appropriate N-out-of-M gate to evaluate mission failure. A separate fault tree for the propulsion system is developed to account for the different success criteria of each mission phase. Common-cause failure modeling is treated using traditional (i.e., parametric) methods. As part of this paper, we discuss the overall results in addition to the positive and negative aspects of modeling dynamic situations with non-dynamic modeling techniques. One insight from the use of this conventional method for analyzing the benchmark problem is that it requires significant manual manipulation of the fault trees and how they are linked into the event tree. The conventional method also requires editing the resultant cut sets to obtain the correct results. While conventional methods may be used to evaluate a dynamic system like that in the benchmark, the level of effort required may preclude their use on real-world problems.
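The parametric heart of such an N-out-of-M gate evaluation, ignoring common-cause coupling and phase dependence, can be sketched in a few lines; the failure probability and success criterion below are illustrative, not the benchmark's values.

```python
from math import comb

def k_out_of_n_failure(n, k, p_fail):
    """Probability that at least k of n identical, independent thruster
    assemblies fail (a basic N-out-of-M gate evaluated parametrically;
    common-cause coupling is deliberately ignored in this sketch)."""
    return sum(comb(n, m) * p_fail ** m * (1 - p_fail) ** (n - m)
               for m in range(k, n + 1))

# e.g., a phase fails if 2 or more of the 5 assemblies fail
print(k_out_of_n_failure(n=5, k=2, p_fail=0.01))
```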
NASA Astrophysics Data System (ADS)
Ja'fari, Ahmad; Hamidzadeh Moghadam, Rasoul
2012-10-01
Routine core analysis provides useful information for petrophysical study of hydrocarbon reservoirs. Effective porosity and fluid conductivity (permeability) can be obtained from core analysis in the laboratory. However, coring hydrocarbon-bearing intervals and analyzing the obtained cores in the laboratory is expensive and time consuming. In this study, an improved method is proposed to establish a quantitative correlation between porosity and permeability obtained from cores and conventional well log data by integrating different artificial intelligence systems. The proposed method combines the results of adaptive neuro-fuzzy inference system (ANFIS) and neural network (NN) algorithms for overall estimation of core data from conventional well log data, multiplying the output of each algorithm by a weight factor. Simple averaging and weighted averaging were used for determining the weight factors; in the weighted averaging method, a genetic algorithm (GA) is used to determine the weight factors. The overall algorithm was applied in one of SW Iran's oil fields with two cored wells. One-third of the data were used as the test dataset and the rest were used for training the networks. Results show that the output of the GA averaging method provided the best mean square error and the best correlation coefficient with real core data.
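The weighted-averaging step can be sketched as follows; here SciPy's differential evolution stands in for the paper's genetic algorithm (an assumption of convenience, both being population-based global optimizers), and the prediction arrays are synthetic placeholders for the trained ANFIS and NN outputs.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Placeholder predictions from the two trained models on training wells
rng = np.random.default_rng(3)
core = rng.normal(size=200)                      # measured core values
anfis_pred = core + rng.normal(0.0, 0.3, 200)    # hypothetical ANFIS output
nn_pred = core + rng.normal(0.0, 0.4, 200)       # hypothetical NN output

def mse_of_weights(w):
    # Combine the two outputs with weights normalized to sum to one
    w0, w1 = w / w.sum()
    return np.mean((w0 * anfis_pred + w1 * nn_pred - core) ** 2)

res = differential_evolution(mse_of_weights,
                             bounds=[(1e-6, 1.0)] * 2, seed=0)
weights = res.x / res.x.sum()
print("ANFIS/NN ensemble weights:", weights)
```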
Daxini, S D; Prajapati, J M
2014-01-01
Meshfree methods are viewed as next-generation computational techniques. Given the evident limitations of conventional grid-based methods, like FEM, in dealing with problems of fracture mechanics, large deformation, and simulation of manufacturing processes, meshfree methods have gained much attention from researchers. A number of meshfree methods have been proposed to date for analyzing complex problems in various fields of engineering. The present work reviews recent developments and some earlier applications of well-known meshfree methods like EFG and MLPG to various types of structural mechanics and fracture mechanics applications like bending, buckling, free vibration analysis, sensitivity analysis and topology optimization, single and mixed mode crack problems, fatigue crack growth, and dynamic crack analysis, and some typical applications like vibration of cracked structures, thermoelastic crack problems, and failure transition in impact problems. Due to the complex nature of meshfree shape functions and the evaluation of domain integrals, meshless methods are computationally expensive compared with conventional mesh-based methods. Some improved versions of the original meshfree methods and other techniques suggested by researchers to improve the computational efficiency of meshfree methods are also reviewed here.
Baksi, B Güniz; Ermis, R Banu
2007-10-01
To compare the efficacy of conventional radiometry and indirect digital image analysis in assessing the relative radiopacity of dental cements used as liners or bases, relative to human enamel and dentin. Disks of 15 different dental cements, 5 mm in diameter and 2 mm thick, were exposed to radiation together with 2-mm-thick disks of enamel and dentin and an aluminum step wedge. Density was evaluated by digital transmission densitometry and with the histogram function of an image analysis program following digitization of the radiographs with a flatbed scanner. A higher number of dental cements were discriminated from both dentin and enamel with the conventional radiographic densitometer. All the cements examined, except Ionoseal (Voco) and Ionobond (Voco), were more radiopaque than dentin. With both methods, Chelon-Silver (3M ESPE) had the highest radiopacity and the glass-ionomer cements the lowest. The radiodensity of dental cements can be differentiated with high probability by the conventional radiometric method.
Operational modal analysis applied to the concert harp
NASA Astrophysics Data System (ADS)
Chomette, B.; Le Carrou, J.-L.
2015-05-01
Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem particularly interesting for investigating the modal basis of string instruments during operation, avoiding certain disadvantages of conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings, and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes that are particularly present in the instrument's response with good estimation accuracy, especially for modes close to the excitation frequency when the modified LSCE method is used.
Mino, Takuya; Maekawa, Kenji; Ueda, Akihiro; Higuchi, Shizuo; Sejima, Junichi; Takeuchi, Tetsuo; Hara, Emilio Satoshi; Kimura-Ono, Aya; Sonoyama, Wataru; Kuboki, Takuo
2015-04-01
The aim of this article was to investigate the accuracy of the reproduction of full-arch implant provisional restorations in final restorations with a 3D Scan/CAD/CAM technique versus the conventional method. We fabricated two final restorations for rehabilitation of maxillary and mandibular completely edentulous areas and performed a computer-based comparative analysis of the accuracy of the reproduction of the provisional restoration in the final restoration between a 3D scanning and CAD/CAM (Scan/CAD/CAM) technique and the conventional silicone-mold transfer technique. Final restorations fabricated either by the conventional or the Scan/CAD/CAM method were successfully installed in the patient. The total concave/convex volume discrepancy observed with the Scan/CAD/CAM technique was 503.50 mm(3) and 338.15 mm(3) for the maxillary and mandibular implant-supported prostheses (ISPs), respectively. On the other hand, the total concave/convex volume discrepancy observed with the conventional method was markedly higher (1106.84 mm(3) and 771.23 mm(3) for the maxillary and mandibular ISPs, respectively). The results of the present report suggest that the Scan/CAD/CAM method enables a more precise and accurate transfer of provisional restorations to final restorations compared with the conventional method. Copyright © 2014 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Lu, Yingjian; Gao, Boyan; Chen, Pei; Charles, Denys; Yu, Liangli (Lucy)
2014-01-01
Sweet basil, Ocimum basilicum, is one of the most important and widely used spices and has been shown to have antioxidant, antibacterial, and anti-diarrheal activities. In this study, high performance liquid chromatographic (HPLC) and flow-injection mass spectrometric (FIMS) fingerprinting techniques were used to differentiate organic and conventional sweet basil leaf samples. Principal component analysis (PCA) of the fingerprints indicated that both HPLC and FIMS fingerprints could effectively detect the chemical differences in the organic and conventional sweet basil leaf samples. This study suggested that the organic basil sample contained greater concentrations of almost all the major compounds than its conventional counterpart on a per same botanical weight basis. The FIMS method was able to rapidly differentiate the organic and conventional sweet basil leaf samples (1 min analysis time), whereas the HPLC fingerprints provided more information about the chemical composition of the basil samples with a longer analytical time. PMID:24518341
NASA Astrophysics Data System (ADS)
Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho
2015-01-01
Independent component analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding the statistical independence of signal mixtures, and it has been successfully applied to myriad fields such as medical science and image processing. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification in complex structures. In this study, a simple iterative algorithm based on conventional ICA is proposed to mitigate these problems. The proposed method extracts more stable source signals in a valid order by iterating and reordering the extracted mixing matrix until the source signals converge, using the magnitudes of the correlation coefficients between the intermediately separated signals and signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses were carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to real problems in complex structures, an experiment was carried out on a scaled submarine mockup. The results show that the proposed method can resolve the inherent problems of the conventional ICA technique.
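The reordering step can be illustrated as below: separate the mixtures with FastICA, then reorder and sign-correct the components by their correlation with signals measured near the sources. This minimal sketch omits the paper's iteration-to-convergence loop, and all names and data are our own placeholders.

```python
import numpy as np
from sklearn.decomposition import FastICA

def reorder_by_reference(sources, references):
    """Reorder (and sign-correct) ICA outputs by the magnitude of their
    correlation with signals measured on or near the sources."""
    order, signs = [], []
    for ref in references.T:
        corrs = [abs(np.corrcoef(ref, s)[0, 1]) for s in sources.T]
        j = int(np.argmax(corrs))
        order.append(j)
        signs.append(np.sign(np.corrcoef(ref, sources[:, j])[0, 1]))
    return sources[:, order] * np.array(signs)

# Synthetic example: two sources mixed into four measurement channels
rng = np.random.default_rng(4)
true = np.column_stack([np.sin(np.linspace(0, 40, 2000)),
                        np.sign(np.sin(np.linspace(0, 15, 2000)))])
mixtures = true @ rng.normal(size=(2, 4))
est = FastICA(n_components=2, random_state=0).fit_transform(mixtures)
aligned = reorder_by_reference(est, true)   # valid order, fixed signs
```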
A simple, less invasive stripper micropipetter-based technique for day 3 embryo biopsy.
Cedillo, Luciano; Ocampo-Bárcenas, Azucena; Maldonado, Israel; Valdez-Morales, Francisco J; Camargo, Felipe; López-Bayghen, Esther
2016-01-01
Preimplantation genetic screening (PGS) is an important procedure for in vitro fertilization (IVF). A key step of PGS, blastomere removal, is fraught with technical issues. The aim of this study was to compare a simpler procedure based on the Stripper Micropipetter, named S-biopsy, to the conventional aspiration method. On Day 3, 368 high-quality embryos (>7 cells on Day 3 with <10% fragmentation) were collected from 38 women. For each patient, the embryos were equally divided between the conventional method (n = 188) and the S-biopsy method (n = 180). The conventional method was performed using a standardized protocol. For the S-biopsy method, a laser was used to remove a significantly smaller portion of the zona pellucida. Afterwards, the complete embryo was aspirated with a Stripper Micropipetter, forcing the removal of the blastomere. Selected blastomeres went to PGS using CGH microarrays. Embryo integrity and blastocyst formation were assessed on Day 5. Differences between groups were assessed by either the Mann-Whitney test or Fisher's exact test. Both methods resulted in the removal of only one blastomere. The S-biopsy and the conventional method did not differ in terms of affecting embryo integrity (95.0% vs. 95.7%) or blastocyst formation (72.7% vs. 70.7%). PGS analysis indicated that aneuploidy rates were similar between the two methods (63.1% vs. 65.2%). However, the time required to perform the S-biopsy method (179.2 ± 17.5 s) was significantly shorter (5-fold) than that of the conventional method. The S-biopsy method is comparable to the conventional method used to remove a blastomere for PGS, but requires less time. Furthermore, due to its simplicity, the S-biopsy technique is better suited for IVF laboratories.
Modified Involute Helical Gears: Computerized Design, Simulation of Meshing, and Stress Analysis
NASA Technical Reports Server (NTRS)
Handschuh, Robert (Technical Monitor); Litvin, Faydor L.; Gonzalez-Perez, Ignacio; Carnevali, Luca; Kawasaki, Kazumasa; Fuentes-Aznar, Alfonso
2003-01-01
The computerized design, methods for generation, simulation of meshing, and enhanced stress analysis of modified involute helical gears is presented. The approaches proposed for modification of conventional involute helical gears are based on conjugation of a double-crowned pinion with a conventional helical involute gear. Double-crowning of the pinion means deviation of the cross-profile from an involute one and deviation in the longitudinal direction from a helicoid surface. Using the method developed, the pinion-gear tooth surfaces are in point-contact, the bearing contact is localized and oriented longitudinally, and edge contact is avoided. Also, the influence of errors of alignment on the shift of bearing contact, vibration, and noise are reduced substantially. The theory developed is illustrated with numerical examples that confirm the advantages of the gear drives of the modified geometry in comparison with conventional helical involute gears.
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
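A minimal version of such a workflow, sketched with the third-party MiniSom package (our choice of library, not necessarily the authors' implementation): each metabolite's abundance profile is mapped to a node of a self-organizing map, and co-located features form the correlated clusters inspected as heat maps.

```python
import numpy as np
from minisom import MiniSom  # third-party package: pip install minisom

# Placeholder matrix: normalized abundance profiles, one metabolite
# per row, one experimental condition per column
features = np.random.default_rng(5).normal(size=(500, 8))

som = MiniSom(10, 10, input_len=features.shape[1],
              sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(features, num_iteration=5000)

# Map each metabolite to its best-matching node; metabolites sharing
# a node (or neighborhood) form the correlated clusters to prioritize
node_of = [som.winner(f) for f in features]
```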
Win, Khin Thanda; Vegas, Juan; Zhang, Chunying; Song, Kihwan; Lee, Sanghyeob
2017-01-01
QTL mapping using NGS-assisted BSA was successfully applied to an F2 population for downy mildew resistance in cucumber. QTLs detected by NGS-assisted BSA were confirmed by conventional QTL analysis. Downy mildew (DM), caused by Pseudoperonospora cubensis, is one of the most destructive foliar diseases in cucumber. QTL mapping is a fundamental approach for understanding the genetic inheritance of DM resistance in cucumber. Recently, many studies have reported that a combination of bulked segregant analysis (BSA) and next-generation sequencing (NGS) can be a rapid and cost-effective way of mapping QTLs. In this study, we applied NGS-assisted BSA to QTL mapping of DM resistance in cucumber and confirmed the results by conventional QTL analysis. By sequencing two DNA pools, each consisting of ten individuals showing high resistance or susceptibility to DM from an F2 population, we identified single nucleotide polymorphisms (SNPs) between the two pools. We employed a statistical method for QTL mapping based on these SNPs. Five QTLs, dm2.2, dm4.1, dm5.1, dm5.2, and dm6.1, were detected, and dm2.2 showed the largest effect on DM resistance. Conventional QTL analysis using the F2 confirmed dm2.2 (R2 = 10.8-24%) and dm5.2 (R2 = 14-27.2%) as major QTLs and dm4.1 (R2 = 8%) as a minor QTL, but could not detect dm5.1 and dm6.1. A new QTL on chromosome 2, dm2.1 (R2 = 28.2%), was detected by the conventional QTL method using an F3 population. This study demonstrated the effectiveness of NGS-assisted BSA for mapping QTLs conferring DM resistance in cucumber and revealed the unique genetic inheritance of DM resistance in this population through two distinct major QTLs on chromosome 2 that mainly harbor DM resistance.
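The statistic typically scanned in NGS-assisted BSA is the difference in SNP-index between the two bulks; a sketch under assumed column names (hypothetical, not this paper's pipeline) follows.

```python
import pandas as pd

def delta_snp_index(df):
    """Per-SNP indices for resistant and susceptible bulks and their
    difference, the quantity scanned for QTL peaks in NGS-assisted BSA.
    Expects hypothetical columns: chrom, depth_res, alt_depth_res,
    depth_sus, alt_depth_sus."""
    df = df.copy()
    df["idx_res"] = df["alt_depth_res"] / df["depth_res"]
    df["idx_sus"] = df["alt_depth_sus"] / df["depth_sus"]
    df["delta_idx"] = df["idx_res"] - df["idx_sus"]
    # Smooth along each chromosome with a sliding window of SNPs;
    # sustained departures from zero suggest a linked QTL
    df["delta_smooth"] = (df.groupby("chrom")["delta_idx"]
                            .transform(lambda s: s.rolling(
                                50, center=True, min_periods=1).mean()))
    return df
```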
Adjoint Sensitivity Analysis for Scale-Resolving Turbulent Flow Solvers
NASA Astrophysics Data System (ADS)
Blonigan, Patrick; Garai, Anirban; Diosady, Laslo; Murman, Scott
2017-11-01
Adjoint-based sensitivity analysis methods are powerful design tools for engineers who use computational fluid dynamics. In recent years, these engineers have started to use scale-resolving simulations like large-eddy simulations (LES) and direct numerical simulations (DNS), which resolve more scales in complex flows with unsteady separation and jets than the widely-used Reynolds-averaged Navier-Stokes (RANS) methods. However, the conventional adjoint method computes large, unusable sensitivities for scale-resolving simulations, which unlike RANS simulations exhibit the chaotic dynamics inherent in turbulent flows. Sensitivity analysis based on least-squares shadowing (LSS) avoids the issues encountered by conventional adjoint methods, but has a high computational cost even for relatively small simulations. The following talk discusses a more computationally efficient formulation of LSS, ``non-intrusive'' LSS, and its application to turbulent flows simulated with a discontinuous-Galerkin spectral-element-method LES/DNS solver. Results are presented for the minimal flow unit, a turbulent channel flow with a limited streamwise and spanwise domain.
Analysis of biomedical time signals for characterization of cutaneous diabetic micro-angiopathy
NASA Astrophysics Data System (ADS)
Kraitl, Jens; Ewald, Hartmut
2007-02-01
Photo-plethysmography (PPG) is frequently used in research on the microcirculation of blood. It is a non-invasive procedure and takes minimal time to carry out. Usually PPG time series are analyzed by conventional linear methods, mainly Fourier analysis. These methods may not be optimal for the investigation of nonlinear effects of the heart circulation system like vasomotion, autoregulation, thermoregulation, breathing, heartbeat and vessels. The wavelet analysis of the PPG time series is a specific, sensitive nonlinear method for the in vivo identification of heart circulation patterns and human health status. This nonlinear analysis of PPG signals provides additional information which cannot be detected using conventional approaches. The wavelet analysis has been used to study healthy subjects and to characterize the health status of patients with a functional cutaneous microangiopathy associated with diabetic neuropathy. The non-invasive in vivo method is based on the radiation of monochromatic light through an area of skin on the finger. A Photometrical Measurement Device (PMD) has been developed. The PMD is suitable for non-invasive continuous online monitoring of one or more biologic constituent values and blood circulation patterns.
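A minimal continuous wavelet analysis of a PPG trace, sketched with the PyWavelets package and a Morlet wavelet; the sampling rate and the synthetic signal are placeholders for measured data.

```python
import numpy as np
import pywt  # PyWavelets

fs = 100.0                           # assumed PPG sampling rate (Hz)
t = np.arange(0.0, 60.0, 1.0 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t)    # placeholder for a measured trace

# Continuous wavelet transform: low frequencies cover slow rhythms
# (vasomotion, thermoregulation), higher ones breathing and heartbeat
scales = np.arange(1, 256)
coeffs, freqs = pywt.cwt(ppg, scales, "morl", sampling_period=1.0 / fs)
power = np.abs(coeffs) ** 2          # time-frequency energy map
```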
Development of a numerical model for vehicle-bridge interaction analysis of railway bridges
NASA Astrophysics Data System (ADS)
Kim, Hee Ju; Cho, Eun Sang; Ham, Jun Su; Park, Ki Tae; Kim, Tae Heon
2016-04-01
In the field of civil engineering, analyzing dynamic response has long been a main concern. The analysis methods can be divided into the moving load method and the moving mass method, and formulating separate equations of motion for the vehicle and the bridge has recently been studied. In this study, a numerical method is presented that can consider various train types and solve the equations of motion for a vehicle-bridge interaction analysis by a non-iterative procedure, through formulation of the coupled equations of motion. Also, an accurate 3-dimensional numerical model of the KTX vehicle was developed in order to analyze its dynamic response characteristics. The equations of motion for the conventional trains are derived, and the numerical models of the conventional trains are idealized by a set of linear springs and dashpots with 18 degrees of freedom. The bridge models are simplified using 3-dimensional space frame elements based on Euler-Bernoulli theory. The rail irregularities in the vertical and lateral directions are generated from PSD functions of the Federal Railroad Administration (FRA).
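Track irregularities of this kind are commonly synthesized from a PSD by superposing cosines with random phases (the spectral representation method); the sketch below uses an FRA-style one-sided PSD form with placeholder constants, since the actual class-dependent parameters are not given here.

```python
import numpy as np

def rail_irregularity(x, A_v=1.0e-6, omega_c=0.8245, n_waves=200,
                      omega_lo=0.01, omega_hi=3.0, seed=0):
    """Synthesize a vertical rail profile from an FRA-style PSD
    S(w) = A_v * w_c^2 / (w^2 * (w^2 + w_c^2)) by superposing cosines
    with random phases. A_v and the wavenumber range are placeholders
    that depend on the FRA track class."""
    rng = np.random.default_rng(seed)
    omega = np.linspace(omega_lo, omega_hi, n_waves)   # rad/m
    d_omega = omega[1] - omega[0]
    S = A_v * omega_c ** 2 / (omega ** 2 * (omega ** 2 + omega_c ** 2))
    amp = np.sqrt(2.0 * S * d_omega)                   # cosine amplitudes
    phase = rng.uniform(0.0, 2.0 * np.pi, n_waves)     # random phases
    return np.sum(amp[:, None]
                  * np.cos(np.outer(omega, x) + phase[:, None]), axis=0)

x = np.linspace(0.0, 500.0, 5001)   # track coordinate (m)
profile = rail_irregularity(x)      # vertical irregularity along x
```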
Erich, Sarah; Schill, Sandra; Annweiler, Eva; Waiblinger, Hans-Ulrich; Kuballa, Thomas; Lachenmeier, Dirk W; Monakhova, Yulia B
2015-12-01
The increased sales of organically produced food create a strong need for analytical methods that can authenticate organic and conventional products. Combined chemometric analysis of (1)H NMR and (13)C NMR spectroscopy data, stable-isotope data (IRMS), and α-linolenic acid content (gas chromatography) was used to differentiate organic and conventional milk. In total, 85 raw, pasteurized, and ultra-heat treated (UHT) milk samples (52 organic and 33 conventional) were collected between August 2013 and May 2014. The carbon isotope ratios of milk protein and milk fat as well as the α-linolenic acid content of these samples were determined. Additionally, the milk fat was analyzed by (1)H and (13)C NMR spectroscopy. The chemometric analysis of the combined data (IRMS, GC, NMR) resulted in more precise authentication of German raw and retail milk, with a considerably increased classification rate of 95% compared to 81% for NMR and 90% for IRMS using linear discriminant analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
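The combined-data classification step can be sketched as below: feature blocks from the different instruments are concatenated per sample and fed to linear discriminant analysis with cross-validation. The arrays are random placeholders; only the 85-sample shape mirrors the study, and the block sizes are our assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical per-sample feature blocks stacked column-wise
rng = np.random.default_rng(6)
X = np.hstack([rng.normal(size=(85, 40)),   # NMR bucket integrals
               rng.normal(size=(85, 2)),    # d13C of protein and fat
               rng.normal(size=(85, 1))])   # alpha-linolenic acid content
y = rng.integers(0, 2, size=85)             # 1 = organic, 0 = conventional

lda = LinearDiscriminantAnalysis()
print("CV classification rate:", cross_val_score(lda, X, y, cv=5).mean())
```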
Effectiveness of Video Demonstration over Conventional Methods in Teaching Osteology in Anatomy.
Viswasom, Angela A; Jobby, Abraham
2017-02-01
Technology and its applications are the most happening things in the world, and so it is in the field of medical education. This study evaluated whether conventional methods can stand the test of technology: a comparative study of the traditional method of teaching osteology in human anatomy with an innovative visually aided method. The study was conducted on 94 students admitted to the MBBS 2014 to 2015 batch of Travancore Medical College. The students were divided into two academically validated groups. They were taught using conventional and video demonstration techniques in a systematic manner. Post-evaluation tests were conducted. Analysis of the mark pattern revealed that the group taught using the traditional method scored better than the group taught using the visually aided method. Feedback analysis showed that the students were able to identify bony features better, with clear visualisation and a three-dimensional view, when taught using the video demonstration method. The students identified the visually aided method as the more interesting one for learning, which helped them in applying the knowledge gained. In most of the questions asked, the two methods of teaching were found to be comparable on the same scale. As the study ends, we find that no new technique can be substituted for time-tested techniques of teaching and learning; the ideal method would incorporate newer multimedia techniques into traditional classes.
Houshmand, Behzad; Janbakhsh, Noushin; Khalilian, Fatemeh; Talebi Ardakani, Mohammad Reza
2017-01-01
Introduction: Diode laser irradiation has recently shown promising results for the treatment of gingival pigmentation. This study sought to compare the efficacy of 2 diode laser irradiation protocols for the treatment of gingival pigmentation, namely the conventional method and the sieve method. Methods: In this split-mouth clinical trial, 15 patients with gingival pigmentation were selected and their pigmentation intensity was determined using Dummett's oral pigmentation index (DOPI) in different dental regions. A diode laser (980 nm wavelength, 2 W power) was applied in a stipple pattern (sieve method) on one side of the mouth and conventionally on the other side. Level of pain and satisfaction with the outcome (both patient and periodontist) were measured using a 0-10 visual analog scale (VAS) for both methods. Patients were followed up at 2 weeks, one month, and 3 months. Pigmentation levels were compared using repeated measures analysis of variance (ANOVA). The difference in level of pain and satisfaction between the 2 groups was analyzed by sample t test and a generalized estimating equation model. Results: No significant differences were found in the reduction of pigmentation scores or pain scores between the 2 groups. The difference in satisfaction with the results at the three time points was significant for both the conventional and sieve methods among patients (P = 0.001) and periodontists (P = 0.015). Conclusion: Diode laser irradiation by both methods successfully eliminated gingival pigmentation. The sieve method was comparable to the conventional technique, offering no additional advantage.
Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi
2011-01-01
AQCEL enables automatic reconstruction of single-photon emission computed tomography (SPECT) images without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemispheres of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). In the quantitative estimation using ACZ, the mean CBF values of the 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. When comparing the 24 3DSRT regions of the 12 patients, the squares of the correlation coefficients between the AQCEL and conventional methods were 0.973 and 0.986 for the normal and affected sides at rest, respectively, and 0.977 and 0.984 for the normal and affected sides after ACZ loading, respectively. The quality of images reconstructed using the application software AQCEL was superior to that obtained using the conventional method after ACZ loading, and high quantitative correlations were shown at rest and after ACZ loading. This software can be applied in clinical practice and is a useful tool for improving reproducibility and throughput.
Experimental study of geotextile as plinth beam in a pile group-supported modeled building frame
NASA Astrophysics Data System (ADS)
Ravi Kumar Reddy, C.; Gunneswara Rao, T. D.
2017-12-01
This paper presents the experimental results of static vertical load tests on a model building frame with geotextile as the plinth beam, supported by pile groups embedded in cohesionless soil (sand). The experimental results were compared with those obtained from nonlinear FEA and the conventional method of analysis. The results revealed that the conventional method of analysis gives a shear force about 53% higher, a bending moment at the top of the column about 17% higher, and a bending moment at the base of the column about 50-98% higher than those given by the nonlinear FEA for the frame with geotextile as the plinth beam.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harry, T; Yaddanapudi, S; Mutic, S
Purpose: New techniques and materials have recently been developed to expedite the conventional Linac Acceptance Testing Procedure (ATP). The new ATP method uses the Electronic Portal Imaging Device (EPID) for data collection and is presented separately. This new procedure is meant to be more efficient than conventional methods. While not clinically implemented yet, a prospective risk assessment is warranted for any new technique. The purpose of this work is to investigate the risks and establish the pros and cons of the conventional approach versus the new ATP method. Methods: ATP tests that were modified and performed with the EPID were analyzed. Five domain experts (medical physicists) comprised the core analysis team. Ranking scales were adopted from previous publications related to TG 100. The number of failure pathways for each ATP test procedure was compared, as was the number of risk priority numbers (RPNs) greater than 100. Results: There were fewer failure pathways with the new ATP than with the conventional one, 262 and 556, respectively. There were fewer RPNs > 100 in the new ATP than in the conventional one, 41 and 115. Failure pathways and RPNs > 100 for individual ATP tests were on average 2 and 3.5 times higher in the conventional ATP than in the new one, respectively. The pixel sensitivity map of the EPID was identified as a key hazard of the new ATP procedure, with an RPN of 288 for verifying beam parameters. Conclusion: The significant decrease in failure pathways and RPNs > 100 for the new ATP mitigates the possibility of a catastrophic error occurring. The pixel sensitivity map determining the response and inherent characteristics of the EPID is crucial, as all data, and hence results, depend on that process. Grant from Varian Medical Systems Inc.
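For reference, an RPN in this kind of FMEA is the product of severity, occurrence, and detectability rankings; the factor split below reproducing the reported 288 is one hypothetical combination, not the study's actual scores.

```python
def rpn(severity, occurrence, detectability):
    """Risk priority number used to rank failure pathways in an FMEA;
    pathways with RPN > 100 were flagged in this analysis."""
    return severity * occurrence * detectability

# One hypothetical factorization consistent with the reported RPN of 288
# for the EPID pixel sensitivity map hazard
print(rpn(severity=8, occurrence=6, detectability=6))  # 288
```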
Self-calibration method without joint iteration for distributed small satellite SAR systems
NASA Astrophysics Data System (ADS)
Xu, Qing; Liao, Guisheng; Liu, Aifei; Zhang, Juan
2013-12-01
The performance of distributed small satellite synthetic aperture radar systems degrades significantly due to unavoidable array errors, including gain, phase, and position errors, in real operating scenarios. In the conventional method proposed in IEEE T. Aero. Elec. Sys. 42:436-451, 2006, the spectrum components within one Doppler bin are considered as calibration sources. However, it is found in this article that the gain error estimation and the position error estimation in the conventional method can interact with each other. The conventional method may converge to suboptimal solutions under large position errors, since it requires joint iteration between gain-phase error estimation and position error estimation. In addition, it is also found that phase errors can be estimated well regardless of position errors when the zero Doppler bin is chosen. In this article, we propose a method obtained by modifying the conventional one, based on these two observations. In this modified method, gain errors are first estimated and compensated, which eliminates the interaction between gain error estimation and position error estimation. Then, using the zero Doppler bin data, the phase error estimation can be performed independently of position errors. Finally, position errors are estimated based on a Taylor-series expansion. Meanwhile, the joint iteration between gain-phase error estimation and position error estimation is not required. Therefore, the problem of suboptimal convergence, which occurs in the conventional method, can be avoided at low computational cost. The modified method has the merits of faster convergence and lower estimation error compared to the conventional one. Theoretical analysis and computer simulation results verified the effectiveness of the modified method.
Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel
2017-01-01
Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative visual analytics system for biologists, focusing on the relationship between the familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists show an average level of trust comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.
Fibre Optic Sensors for Selected Wastewater Characteristics
Chong, Su Sin; Abdul Aziz, A. R.; Harun, Sulaiman W.
2013-01-01
Demand for online and real-time measurement techniques to meet environmental regulation and treatment compliance is increasing. However, the conventional techniques, which involve scheduled sampling and chemical analysis, can be expensive and time consuming. Cheaper and faster alternatives to monitor wastewater characteristics are therefore required. This paper reviews existing conventional techniques and optical and fibre optic sensors for determining selected wastewater characteristics, namely colour, Chemical Oxygen Demand (COD), and Biological Oxygen Demand (BOD). The review confirms that, with appropriate configuration, calibration, and fibre features, these parameters can be determined with accuracy comparable to conventional methods. With more research in this area, the potential for using FOS for online and real-time measurement of more wastewater parameters across various types of industrial effluent is promising. PMID:23881131
Analysis and quality control of carbohydrates in therapeutic proteins with fluorescence HPLC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Kun; Huang, Jian; Center for Informational Biology, University of Electronic Science and Technology of China, Chengdu 610054
Conbercept is an Fc fusion protein with a very complicated carbohydrate profile that must be carefully monitored throughout the manufacturing process. Here, we introduce an optimized fluorescence-derivatization high-performance liquid chromatographic method for glycan mapping in conbercept. Compared with the conventional glycan analysis method, this method has much better resolution and higher reproducibility, making it well suited for product quality control.
Optimization and Validation of Rotating Current Excitation with GMR Array Sensors for Riveted
2016-09-16
distribution. Simulation results, using both an optimized coil and a conventional coil, are generated using the finite element method (FEM) model. The signal magnitude for an optimized coil is seen to be ... 4. Model Based Performance Analysis: A 3D finite element model (FEM) is used to analyze the performance of the optimized coil and ...
High-frequency surface waves method for agricultural applications
USDA-ARS?s Scientific Manuscript database
A high-frequency surface wave method has recently been developed to explore shallow soil in the vadose zone for agricultural applications. This method is a modification of the conventional multichannel analysis of surface waves (MASW) method, which explores near-surface soil properties from a couple ...
Comparative Analysis Between Guided Self-Instruction and Conventional Lectures in Neuroanatomy
ERIC Educational Resources Information Center
de Carvalho, Claudio A. F.; And Others
1977-01-01
A study using 60 first-year medical students in the Santo Amaro Faculty of Medicine, São Paulo, Brazil, found that self-instructional methods such as guided self-instruction or discussion groups are not superior to conventional classes. Self-instruction does have the advantages of low cost and easy applicability. (LBH)
Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad
2014-01-01
Abundant resources and techniques have been used for complete-coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. This study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment using conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using the conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites, and measurements of marginal gaps were documented for each. A master chart was prepared for all the data, which were analyzed using the Statistical Package for the Social Sciences (SPSS). Marginal gaps were then evaluated by t-test. Analysis of variance and post-hoc analysis were used to compare the two groups as well as the three subgroups. The measurements recorded showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with both the conventional and the accelerated casting technique. The accelerated casting technique could be a viable alternative to the time-consuming conventional casting technique. The marginal fit between the two casting techniques showed no statistical difference.
Gao, Boyan; Qin, Fang; Ding, Tingting; Chen, Yineng; Lu, Weiying; Yu, Liangli Lucy
2014-08-13
Ultraperformance liquid chromatography mass spectrometry (UPLC-MS), flow injection mass spectrometry (FIMS), and headspace gas chromatography (headspace-GC) combined with multivariate data analysis techniques were examined and compared for differentiating organically grown oregano from that grown conventionally. This is the first time that headspace-GC fingerprinting technology has been reported for differentiating organically and conventionally grown spice samples. The results indicated that UPLC-MS, FIMS, and headspace-GC-FID fingerprints with OPLS-DA were able to effectively distinguish oreganos grown under different conditions, whereas with PCA only the FIMS fingerprint could differentiate the organically and conventionally grown oregano samples. UPLC fingerprinting provided detailed information about the chemical composition of oregano at the cost of a longer analysis time, whereas FIMS finished a sample analysis within 1 min. On the other hand, headspace-GC-FID fingerprinting required no sample pretreatment, suggesting its potential as a high-throughput method for distinguishing organically and conventionally grown oregano samples. In addition, chemical components in oregano were identified by their molecular weight using QTOF-MS and headspace-GC-MS.
Géczi, Gábor; Horváth, Márk; Kaszab, Tímea; Alemany, Gonzalo Garnacho
2013-01-01
Extension of shelf life and preservation of products are both very important for the food industry. However, just as with other processes, speed and higher manufacturing performance are also beneficial. Although microwave heating is utilized in a number of industrial processes, there are many unanswered questions about its effects on foods. Here we analyze whether the effects of continuous-flow microwave heating are equivalent to those of traditional heat transfer methods. In our study, the effects of heating liquid foods by conventional and continuous-flow microwave heating were studied. Among other properties, we compared the stability of the liquid foods between the two heat treatments. Our goal was to determine whether continuous-flow microwave heating and conventional heating methods have the same effects on liquid foods and, therefore, whether microwave heat treatment can effectively replace conventional heat treatments. We compared the colour and separation phenomena of the samples treated by the different methods. For milk, we also monitored the total viable cell count and, for orange juice, the vitamin C content, in addition to evaluating the taste of the products by sensory analysis. The majority of the results indicate that the circulating-coil microwave method used here is equivalent to the conventional heating method based on thermal conduction and convection. However, some results from the analysis of the milk samples show clear differences between the heat transfer methods. According to our results, the colour parameters (lightness, red-green, and blue-yellow values) of the microwave-treated samples differed not only from the untreated control, but also from the traditionally heat-treated samples. The differences are visually undetectable; however, they become evident through analytical measurement with a spectrophotometer. This finding suggests that besides thermal effects, microwave-based food treatment can alter product properties in other ways as well. PMID:23341982
Gowda, Dhananjaya; Airaksinen, Manu; Alku, Paavo
2017-09-01
Recently, a quasi-closed phase (QCP) analysis of speech signals for accurate glottal inverse filtering was proposed. However, QCP analysis, which belongs to the family of temporally weighted linear prediction (WLP) methods, uses the conventional forward type of sample prediction. This may not be the best choice, especially when computing WLP models with a hard-limiting weighting function: a sample-selective minimization of the prediction error in WLP reduces the effective number of samples available within a given window frame. To counter this problem, a modified quasi-closed phase forward-backward (QCP-FB) analysis is proposed, wherein each sample is predicted from its past as well as its future samples, thereby utilizing the available samples more effectively. Formant detection and estimation experiments on synthetic vowels generated using a physical modeling approach, as well as on natural speech utterances, show that the proposed QCP-FB method yields statistically significant improvements over the conventional linear prediction and QCP methods.
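A minimal numerical sketch of forward-backward weighted linear prediction (assumed details, not the paper's implementation; the helper name and the joint forward-backward normal equations are our illustration of the idea that every sample contributes both a past-based and a future-based prediction error):

```python
import numpy as np

def wlp_forward_backward(x, order, weights):
    """Sketch: predict x[n] from its `order` past samples and, separately,
    from its `order` future samples, and minimize the temporally weighted
    sum of both squared prediction errors with one shared coefficient set."""
    N = len(x)
    R = np.zeros((order, order))
    r = np.zeros(order)
    for n in range(order, N - order):
        past = x[n - order:n][::-1]       # x[n-1], x[n-2], ..., x[n-order]
        future = x[n + 1:n + order + 1]   # x[n+1], x[n+2], ..., x[n+order]
        w = weights[n]                    # e.g., a hard-limiting weight
        R += w * (np.outer(past, past) + np.outer(future, future))
        r += w * x[n] * (past + future)
    return np.linalg.solve(R, r)          # combined prediction coefficients

# usage: with a weighting function that emphasizes the closed-phase region,
# the backward terms keep more samples effective than a forward-only fit
x = np.sin(0.3 * np.arange(400)) + 0.01 * np.random.default_rng(0).normal(size=400)
a = wlp_forward_backward(x, order=10, weights=np.ones(400))
```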
Ramadan, Wijdan H; Khreis, Noura A; Kabbara, Wissam K
2015-01-01
Background The aim of the study was to evaluate the simplicity, safety, patients' preference, and convenience of insulin administration using the pen device versus the conventional vial/syringe in patients with diabetes. Methods This observational study was conducted in multiple community pharmacies in Lebanon. The investigators interviewed patients with diabetes using an insulin pen or the conventional vial/syringe. A total of 74 questionnaires were filled over a period of 6 months. Answers were entered into the Statistical Package for Social Sciences (SPSS) software and an Excel spreadsheet. The t-test, logistic regression analysis, and correlation analysis were used to analyze the results. Results A higher percentage of patients in the insulin pen users group (95.2%) found the method easy to use compared with only 46.7% of the conventional users group (P = 0.001, relative risk [RR]: 2.041, 95% confidence interval [CI]: 1.178–3.535). Moreover, 61.9% of pen users and 26.7% of conventional users could read the scale easily (P = 0.037, RR: 2.321, 95% CI: 0.940–5.731), while 85.7% of pen users found it more convenient to shift to the pen and 86.7% of conventional users would want to shift to the pen if it had the same cost. Pain perception was statistically different between the groups: a much higher percentage (76.2%) of pen users reported no pain during injection compared with only 26.7% of conventional users (P = 0.003, RR: 2.857, 95% CI: 1.194–6.838). Conclusion The insulin pen was significantly easier to use and less painful than the conventional vial/syringe. Proper education on the methods of administration/storage and the disposal of needles/syringes is needed in both groups. PMID:25848231
[Enzymatic analysis of the quality of foodstuffs].
Kolesnov, A Iu
1997-01-01
Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.
NASA Astrophysics Data System (ADS)
Baek, Tae Hyun
Photoelasticity is one of the most widely used whole-field optical methods for stress analysis. The technique of birefringent coatings, also called the method of photoelastic coatings, extends the classical procedures of model photoelasticity to the measurement of surface strains in opaque models made of any structural material. The photoelastic phase-shifting method can be used to determine the phase values of isochromatics and isoclinics. In this paper, the photoelastic phase-shifting technique and the conventional Babinet-Soleil compensation method were used to analyze a specimen with a triangular hole and a circular hole under bending. The photoelastic phase-shifting technique is a whole-field measurement; the conventional compensation method, by contrast, is a point measurement. Three groups of results were obtained: by the phase-shifting method with a reflective polariscope arrangement, by the conventional compensation method, and by FEM simulation. The results from the first two methods agree with each other relatively well, considering experimental error. The advantage of the photoelastic phase-shifting method is that it can accurately measure the stress distribution close to the edges of the holes.
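As a generic illustration of the phase-shifting idea (a standard four-step scheme; the paper's own algorithm and polariscope equations may differ), four intensity images acquired at quarter-period phase shifts recover the wrapped phase pointwise:

\[
I_k = I_0\left[1 + \cos\!\left(\phi + \frac{k\pi}{2}\right)\right],\quad k = 0,1,2,3,
\qquad
\hat{\phi} = \arctan\!\left(\frac{I_3 - I_1}{I_0 - I_2}\right).
\]

Repeating this at every pixel yields a whole-field phase map, which is what distinguishes phase shifting from point-by-point compensation.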
Wang, Fang; Ouyang, Guang; Zhou, Changsong; Wang, Suiping
2015-01-01
A number of studies have explored the time course of Chinese semantic and syntactic processing. However, whether syntactic processing occurs earlier than semantic processing during Chinese sentence reading is still under debate. To further explore this issue, an event-related potentials (ERPs) experiment was conducted on 21 native Chinese speakers who read individually presented Chinese simple sentences (NP1+VP+NP2) word-by-word for comprehension and made semantic plausibility judgments. The transitivity of the verbs was manipulated to form three types of stimuli: congruent sentences (CON), sentences with a semantically violated NP2 following a transitive verb (semantic violation, SEM), and sentences with a semantically violated NP2 following an intransitive verb (combined semantic and syntactic violation, SEM+SYN). The ERPs evoked by the target NP2 were analyzed using the Residue Iteration Decomposition (RIDE) method to reconstruct the ERP waveform blurred by trial-to-trial variability, as well as using the conventional ERP method based on stimulus-locked averaging. The conventional ERP analysis showed that, compared with the critical words in CON, those in SEM and SEM+SYN elicited an N400-P600 biphasic pattern. The N400 effects in both violation conditions were of similar size and distribution, but the P600 in SEM+SYN was larger than that in SEM. Compared with the conventional ERP analysis, RIDE analysis revealed a larger N400 effect and an earlier P600 effect (in the time window of 500-800 ms instead of 570-810 ms). Overall, the combination of conventional ERP analysis and the RIDE method for compensating for trial-to-trial variability confirmed the non-significant difference between SEM and SEM+SYN in the earlier N400 time window. Converging with previous findings on other Chinese structures, the current study provides further precise evidence that syntactic processing in Chinese does not occur earlier than semantic processing.
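To illustrate why stimulus-locked averaging blurs components that jitter from trial to trial, here is a toy latency-corrected averaging routine. It conveys the general problem RIDE addresses (component latency variability) but is not the RIDE algorithm; the function name, iteration count, and alignment-by-cross-correlation scheme are all our own illustrative choices:

```python
import numpy as np

def latency_corrected_average(trials, max_shift=50, n_iter=5):
    """trials: (n_trials, n_samples) array. Align each single-trial waveform
    to the running template by cross-correlation, then re-average; iterate
    so the template sharpens instead of smearing jittered components."""
    template = trials.mean(axis=0)
    for _ in range(n_iter):
        aligned = []
        for trial in trials:
            lags = np.arange(-max_shift, max_shift + 1)
            corr = [np.dot(np.roll(trial, -lag), template) for lag in lags]
            best = lags[int(np.argmax(corr))]       # best-matching latency
            aligned.append(np.roll(trial, -best))
        template = np.mean(aligned, axis=0)
    return template
```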
Evaluation of bearing capacity of piles from cone penetration test data.
DOT National Transportation Integrated Search
2007-12-01
A statistical analysis and ranking criteria were used to compare the CPT methods and the conventional alpha design method. Based on the results, the de Ruiter/Beringen and LCPC methods showed the best capability in predicting the measured load carryi...
Ahn, J; Yun, I S; Yoo, H G; Choi, J-J; Lee, M
2017-01-01
Purpose To evaluate a progression-detecting algorithm for a new automated matched alternation flicker (AMAF) in glaucoma patients. Methods Open-angle glaucoma patients with a baseline visual field (VF) mean deviation > −6 dB were included in this longitudinal, retrospective study. Functional progression was detected by two VF progression criteria, and structural progression by both AMAF and conventional comparison methods using optic disc and retinal nerve fiber layer (RNFL) photography. The progression-detecting performance of AMAF and the conventional method was evaluated by the agreement between functional and structural progression criteria. RNFL thickness changes measured by optical coherence tomography (OCT) were compared between progressing and stable eyes as determined by each method. Results Among 103 eyes, 47 (45.6%), 21 (20.4%), and 32 (31.1%) eyes were evaluated as showing glaucoma progression using AMAF, the conventional method, and guided progression analysis (GPA) of the VF test, respectively. AMAF showed better agreement with GPA of the VF test than the conventional method did (κ=0.337; P<0.001 and κ=0.124; P=0.191, respectively). The rates of RNFL thickness decay on OCT were significantly different between progressing and stable eyes when progression was determined by AMAF (−3.49±2.86 μm per year vs −1.83±3.22 μm per year; P=0.007) but not by the conventional method (−3.24±2.42 μm per year vs −2.42±3.33 μm per year; P=0.290). Conclusions AMAF was better than the conventional comparison method at discriminating structural changes during glaucoma progression, and showed moderate agreement with functional progression criteria. PMID:27662466
Kaur, Ravinder; Dhakad, Megh Singh; Goyal, Ritu; Haque, Absarul; Mukhopadhyay, Gauranga
2016-01-01
Candida infection is a major cause of morbidity and mortality in immunocompromised patients; accurate and early identification is a prerequisite for effective patient management. The purpose of this study was to compare the conventional identification of Candida species with identification by the Vitek-2 system, and antifungal susceptibility testing (AST) by the broth microdilution method with the Vitek-2 AST system. A total of 172 Candida isolates were subjected to identification by conventional methods, the Vitek-2 system, restriction fragment length polymorphism, and random amplified polymorphic DNA analysis. AST was carried out as per the Clinical and Laboratory Standards Institute M27-A3 document and by the Vitek-2 system. Candida albicans (82.51%) was the most common Candida species, followed by Candida tropicalis (6.29%), Candida krusei (4.89%), Candida parapsilosis (3.49%), and Candida glabrata (2.79%). With the Vitek-2 system, 155 of the 172 Candida isolates were correctly identified, 13 were misidentified, and four were identified with low discrimination, whereas with conventional methods, 171 Candida isolates were correctly identified and only a single isolate of C. albicans was misidentified as C. tropicalis. The average measurement of agreement between the Vitek-2 system and conventional methods was >94%. Most of the isolates were susceptible to fluconazole (88.95%) and amphotericin B (97.67%). The measurement of agreement between the AST methods was >94% for fluconazole and >99% for amphotericin B, which was statistically significant (P < 0.01). The study confirmed the importance and reliability of conventional and molecular methods, and the acceptable agreement suggests the Vitek-2 system is an alternative method for speciation and sensitivity testing of Candida species.
Comparison of nutritional quality between conventional and organic dairy products: a meta-analysis.
Palupi, Eny; Jayanegara, Anuraga; Ploeger, Angelika; Kahl, Johannes
2012-11-01
As a contribution to the debate comparing the nutritional quality of conventional versus organic products, the present study provides new results on this issue, specifically for dairy products, by integrating the last 3 years' studies using a meta-analysis approach with Hedges' d effect size method. The current meta-analysis shows that organic dairy products contain significantly more protein, ALA, total omega-3 fatty acid, cis-9,trans-11 conjugated linoleic acid, trans-11 vaccenic acid, eicosapentanoic acid, and docosapentanoic acid than conventional types, with cumulative effect sizes (±95% confidence interval) of 0.56 ± 0.24, 1.74 ± 0.16, 0.84 ± 0.14, 0.68 ± 0.13, 0.51 ± 0.16, 0.42 ± 0.23, and 0.71 ± 0.3, respectively. It is also observed that organic dairy products have a significantly (P < 0.001) higher omega-3 to omega-6 ratio (0.42 vs. 0.23) and Δ9-desaturase index (0.28 vs. 0.27) than the conventional types. The current regulation on organic farming indeed drives organic farms toward production of dairy products with nutritional qualities different from conventional ones. The difference in feeding regime between conventional and organic dairy production is suspected to be the reason behind this evidence. Similar meta-analyses may be well suited for summarizing comparisons between conventional and organic foodstuffs for other aspects and food categories. Copyright © 2012 Society of Chemical Industry.
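For reference, Hedges' d, the effect-size metric named above, is the standardized mean difference with a small-sample bias correction; a standard formulation (our summary of the well-known statistic, not text from the paper) is:

\[
d = J \cdot \frac{\bar{x}_1 - \bar{x}_2}{s_p},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}},
\qquad
J \approx 1 - \frac{3}{4(n_1 + n_2 - 2) - 1},
\]

where \(s_p\) is the pooled standard deviation and \(J\) shrinks the estimate slightly to remove small-sample bias; cumulative effect sizes are then precision-weighted averages of the per-study \(d\) values.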
Simulation-based sensitivity analysis for non-ignorably missing data.
Yin, Peng; Shi, Jian Q
2017-01-01
Sensitivity analysis is popular in dealing with missing-data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) may depend on assumptions or parameters (input) about the missing data, i.e., the missing-data mechanism. We call models subject to this problem of uncertainty sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define simple and interpretable statistical quantities to assess the sensitivity models and enable evidence-based analysis. We propose a novel approach in this paper for investigating the plausibility of each missing-data-mechanism model assumption, by comparing the simulated datasets from various MNAR models with the observed data non-parametrically, using K-nearest-neighbour distances. Some asymptotic theory is also provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, to select plausible values and reject unlikely ones, instead of considering all proposed values of the sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, the analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
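A small sketch of the comparison step (assumed details, not the authors' code; the helper name and data are invented): simulate data under each candidate missing-data mechanism and score it against the observed data with K-nearest-neighbour distances, keeping the sensitivity-parameter values that produce the most observed-like data:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_discrepancy(observed, simulated, k=5):
    """Smaller mean distance from observed points to their k nearest
    simulated neighbours means the candidate mechanism is more plausible."""
    nn = NearestNeighbors(n_neighbors=k).fit(simulated)
    dist, _ = nn.kneighbors(observed)
    return dist.mean()

# usage: rank candidate sensitivity-parameter values by discrepancy
rng = np.random.default_rng(0)
observed = rng.normal(0.0, 1.0, size=(200, 1))
candidates = {d: rng.normal(d, 1.0, size=(200, 1)) for d in (0.0, 0.5, 2.0)}
scores = {d: knn_discrepancy(observed, sim) for d, sim in candidates.items()}
print(min(scores, key=scores.get))   # the most plausible candidate
```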
New materials through a variety of sintering methods
NASA Astrophysics Data System (ADS)
Jaworska, L.; Cyboroń, J.; Cygan, S.; Laszkiewicz-Łukasik, J.; Podsiadło, M.; Novak, P.; Holovenko, Y.
2018-03-01
New sintering techniques make it possible to obtain materials with special properties that are impossible to obtain by conventional sintering techniques. This issue is especially important for ceramic materials intended for application under extreme conditions. In line with the tendency to limit critical materials in manufacturing processes, the use of W, Si, B, Co, and Cr should also be limited. One of the cheapest and most widely available materials is aluminum oxide, which shows differences in phase composition, grain size, hardness, strain, and fracture toughness when the same type of powder is sintered via various methods. In this paper, alumina was sintered using the conventional free sintering process, microwave sintering, Spark Plasma Sintering (SPS), the high pressure-high temperature (HP-HT) method, and High Pressure Spark Plasma Sintering (HP SPS). Phase composition analysis, by X-ray diffraction, of the alumina materials sintered using the various methods was carried out. For the conventional sintering method, compacts are composed of α-Al2O3 and θ-Al2O3. For compacts sintered using the SPS, microwave, and HP-HT methods, χ-Al2O3 and γ-Al2O3 phases were additionally present. The mechanical and physical properties of the obtained materials were compared between the sintering methods. On the basis of scanning electron microscope images, quantitative analysis was performed to determine the degree of grain growth of the alumina after sintering.
Li, Mingchao; Wang, Zhengyun
2016-01-01
Objective To perform a meta-analysis of data from available published studies comparing laparoendoscopic single-site surgery varicocelectomy (LESSV) with conventional transperitoneal laparoscopic varicocele ligation. Methods A comprehensive data search was performed in PubMed and Embase to identify randomized controlled trials and comparative studies that compared the two surgical approaches for the treatment of varicoceles. Results Six studies were included in the meta-analysis. LESSV required a significantly longer operative time than conventional laparoscopic varicocelectomy but was associated with significantly less postoperative pain at 6 h and 24 h, a shorter recovery time, and greater patient satisfaction with the cosmetic outcome. There was no difference between the two surgical approaches in terms of postoperative semen quality or the incidence of complications. Conclusion These data suggest that LESSV offers a well-tolerated and efficient alternative to conventional laparoscopic varicocelectomy, with less pain, a shorter recovery time, and better cosmetic satisfaction. Further well-designed studies are required to confirm these findings and update the results of this meta-analysis. PMID:27688686
Freeman, Karoline; Tsertsvadze, Alexander; Taylor-Phillips, Sian; McCarthy, Noel; Mistry, Hema; Manuel, Rohini; Mason, James
2017-01-01
Multiplex gastrointestinal pathogen panel (GPP) tests simultaneously identify bacterial, viral, and parasitic pathogens from the stool samples of patients with suspected infectious gastroenteritis presenting in hospital or the community. We undertook a systematic review to compare the accuracy of GPP tests with standard microbiology techniques. Searches in Medline, Embase, Web of Science, and the Cochrane Library were undertaken from inception to January 2016. Eligible studies compared GPP tests with standard microbiology techniques in patients with suspected gastroenteritis. Quality assessment of the included studies used a tailored QUADAS-2. In the absence of a reference standard, we analysed test performance taking GPP tests and standard microbiology techniques in turn as the benchmark test, using random-effects meta-analysis of proportions. No study provided an adequate reference standard with which to compare the test accuracy of GPP and conventional tests. Ten studies informed a meta-analysis of positive and negative agreement. Positive agreement across all pathogens was 0.93 (95% CI 0.90 to 0.96) when conventional methods were the benchmark and 0.68 (95% CI 0.58 to 0.77) when GPP provided the benchmark. Negative agreement was high in both instances owing to the high proportion of negative cases. GPP testing produced a greater number of pathogen-positive findings than conventional testing; it is unclear whether these additional 'positives' are clinically important. GPP testing has the potential to simplify testing and accelerate reporting compared with conventional microbiology methods. However, the impact of GPP testing on the management, treatment, and outcomes of patients is poorly understood, and further studies are needed to evaluate the health economic impact of GPP testing compared with standard methods. The review protocol is registered with PROSPERO as CRD42016033320.
Fractal analysis of GPS time series for early detection of disastrous seismic events
NASA Astrophysics Data System (ADS)
Filatov, Denis M.; Lyubushin, Alexey A.
2017-03-01
A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start by analysing both methods from the physical point of view and demonstrate the difference between them, which results in a higher accuracy of the new method compared to the conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system, the Earth's crust, we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism has already been known from numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate the efficiency of the developed method in identifying critical states of the Earth's crust. Finally, we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events, and we provide a detailed discussion of the numerical results, which are shown to be consistent with the outcomes of other research on the topic.
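Since the new method is described as a modification of DFA, a compact sketch of the conventional DFA baseline may help (this is the standard algorithm, not the authors' modified version; scales and data here are illustrative):

```python
import numpy as np

def dfa(x, scales):
    """Conventional DFA: integrate the series, split the profile into
    windows of size n, remove a linear trend in each window, and return the
    fluctuation function F(n). The scaling exponent alpha is the slope of
    log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        ms = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend
            ms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

x = np.random.default_rng(1).normal(size=4096)
scales = np.array([16, 32, 64, 128, 256])
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(round(alpha, 2))   # close to 0.5 for uncorrelated noise
```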
Colello, Raymond J; Tozer, Jordan; Henderson, Scott C
2012-01-01
Photoconversion, the method by which a fluorescent dye is transformed into a stable, osmiophilic product that can be visualized by electron microscopy, is the most widely used method for enabling the ultrastructural analysis of fluorescently labeled cellular structures. Nevertheless, the conventional method of photoconversion using widefield fluorescence microscopy requires long reaction times and results in low-resolution cell targeting. Accordingly, we have developed a photoconversion method that ameliorates these limitations by adapting confocal laser scanning microscopy to the procedure. We have found that this method greatly reduces photoconversion times compared with conventional widefield microscopy. Moreover, the region-of-interest scanning capabilities of a confocal microscope facilitate the targeting of the photoconversion process to individual cellular or subcellular elements within a fluorescent field. This reduces the area of the cell exposed to light energy, thereby reducing the ultrastructural damage common to this process when widefield microscopes are employed. © 2012 by John Wiley & Sons, Inc.
Meta-Analysis inside and outside Particle Physics: Convergence Using the Path of Least Resistance?
ERIC Educational Resources Information Center
Jackson, Dan; Baker, Rose
2013-01-01
In this note, we explain how the method proposed by Hartung and Knapp provides a compromise between conventional meta-analysis methodology and "unconstrained averaging", as used by the Particle Data Group.
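For background, a standard statement of the Hartung-Knapp adjustment (our summary of the well-known method, not text from the note): the random-effects weighted mean \(\hat{\mu} = \sum_i w_i y_i / \sum_i w_i\) keeps its usual form, but its variance is estimated from the observed dispersion of the studies and inference uses a t-distribution with \(k - 1\) degrees of freedom:

\[
\widehat{\operatorname{Var}}(\hat{\mu}) = \frac{1}{k - 1}\,\frac{\sum_{i=1}^{k} w_i (y_i - \hat{\mu})^2}{\sum_{i=1}^{k} w_i},
\qquad
\hat{\mu} \pm t_{k-1,\,1-\alpha/2}\,\sqrt{\widehat{\operatorname{Var}}(\hat{\mu})}.
\]

Because the dispersion term is estimated from the data rather than assumed, the interval widens or narrows with the observed heterogeneity, which is the sense in which the method compromises between conventional weighting and unconstrained averaging.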
Exploratory Mediation Analysis via Regularization
Serang, Sarfaraz; Jacobucci, Ross; Brimhall, Kim C.; Grimm, Kevin J.
2017-01-01
Exploratory mediation analysis refers to a class of methods used to identify a set of potential mediators of a process of interest. Despite the exploratory nature of the task, conventional approaches are rooted in confirmatory traditions and as such have limitations in exploratory contexts. We propose a two-stage approach called exploratory mediation analysis via regularization (XMed) to better address these concerns. We demonstrate that this approach correctly identifies mediators more often than conventional approaches and that its estimates are unbiased. Finally, the approach is illustrated through an empirical example examining the relationship between college acceptance and enrollment. PMID:29225454
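One way to picture a two-stage regularized mediator search (a sketch under assumed details: lasso selection of nonzero mediation paths followed by an unpenalized refit on the selected set; the paper's actual XMed estimator may differ from this):

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 500, 20
X = rng.normal(size=n)                               # treatment/exposure
# only the first two of p candidate mediators carry signal (illustrative)
M = np.outer(X, np.r_[1.0, 0.8, np.zeros(p - 2)]) + rng.normal(size=(n, p))
Y = M[:, 0] + 0.5 * M[:, 1] + rng.normal(size=n)     # outcome

# stage 1: lasso-select mediators with nonzero a-paths (X -> M_j)
# and nonzero b-paths (M_j -> Y, controlling for X)
a = np.array([Lasso(alpha=0.1).fit(X[:, None], M[:, j]).coef_[0] for j in range(p)])
b = Lasso(alpha=0.1).fit(np.column_stack([M, X]), Y).coef_[:p]
selected = np.flatnonzero((a != 0) & (b != 0))

# stage 2: refit the selected set without a penalty so the retained
# path estimates are not shrunk
refit = LinearRegression().fit(np.column_stack([M[:, selected], X]), Y)
print(selected, refit.coef_[:len(selected)])
```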
Irimia, Andrei; Richards, William O; Bradshaw, L Alan
2009-11-01
In this study, we perform a comparative study of independent component analysis (ICA) and conventional filtering (CF) for the purpose of artifact reduction in simultaneous gastric EMG and magnetogastrography (MGG). EMG/MGG data were acquired from ten anesthetized pigs by obtaining simultaneous recordings with serosal electrodes (EMG) and with a superconducting quantum interference device biomagnetometer (MGG). The analysis of MGG waveforms using ICA and CF indicates that ICA is superior to the CF method in its ability to extract respiratory and cardiac artifacts from MGG recordings. A signal frequency analysis of ICA- and CF-processed data was also undertaken using waterfall plots, and it was determined that the two methods produce qualitatively comparable results. Through the use of simultaneous EMG/MGG, we were able to demonstrate the accuracy and trustworthiness of our results by comparison and cross-validation within the framework of a porcine model.
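A hedged sketch of what ICA-based artifact reduction can look like with an off-the-shelf library (an illustrative pipeline using scikit-learn's FastICA; the study's actual processing chain, component-selection rule, and reference signals are not specified here):

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_artifact(recordings, artifact_ref, threshold=0.6):
    """recordings: (n_samples, n_channels); artifact_ref: (n_samples,)
    reference trace for the artifact (e.g., a cardiac channel).
    Decompose into independent components, zero out those correlated with
    the reference, and reconstruct the cleaned multichannel signal."""
    ica = FastICA(n_components=recordings.shape[1], random_state=0)
    sources = ica.fit_transform(recordings)          # estimated components
    for j in range(sources.shape[1]):
        r = np.corrcoef(sources[:, j], artifact_ref)[0, 1]
        if abs(r) > threshold:                       # flag artifact component
            sources[:, j] = 0.0
    return ica.inverse_transform(sources)            # cleaned recordings
```

The design choice this illustrates is the one the abstract highlights: unlike a fixed band-pass filter, ICA can isolate artifacts that overlap the gastric signal band, at the cost of needing a rule for deciding which components are artifactual.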
Thalanany, Mariamma M; Mugford, Miranda; Hibbert, Clare; Cooper, Nicola J; Truesdale, Ann; Robinson, Steven; Tiruvoipati, Ravindranath; Elbourne, Diana R; Peek, Giles J; Clemens, Felicity; Hardy, Polly; Wilson, Andrew
2008-01-01
Background Extracorporeal Membrane Oxygenation (ECMO) is a technology used in treatment of patients with severe but potentially reversible respiratory failure. A multi-centre randomised controlled trial (CESAR) was funded in the UK to compare care including ECMO with conventional intensive care management. The protocol and funding for the CESAR trial included plans for economic data collection and analysis. Given the high cost of treatment, ECMO is considered an expensive technology for many funding systems. However, conventional treatment for severe respiratory failure is also one of the more costly forms of care in any health system. Methods/Design The objectives of the economic evaluation are to compare the costs of a policy of referral for ECMO with those of conventional treatment; to assess cost-effectiveness and the cost-utility at 6 months follow-up; and to assess the cost-utility over a predicted lifetime. Resources used by patients in the trial are identified. Resource use data are collected from clinical report forms and through follow up interviews with patients. Unit costs of hospital intensive care resources are based on parallel research on cost functions in UK NHS intensive care units. Other unit costs are based on published NHS tariffs. Cost effectiveness analysis uses the outcome: survival without severe disability. Cost utility analysis is based on quality adjusted life years gained based on the Euroqol EQ-5D at 6 months. Sensitivity analysis is planned to vary assumptions about transport costs and method of costing intensive care. Uncertainty will also be expressed in analysis of individual patient data. Probabilities of cost effectiveness given different funding thresholds will be estimated. Discussion In our view it is important to record our methods in detail and present them before publication of the results of the trial so that a record of detail not normally found in the final trial reports can be made available in the public domain. Trial Registrations The CESAR trial registration number is ISRCTN47279827. PMID:18447931
Kim, Yun-Jeong; Chae, Joon-Seok; Chang, Jun Keun; Kang, Seong Ho
2005-08-12
We have developed a novel method for the ultra-fast analysis of genetically modified organisms (GMOs) in soybeans by microchip capillary gel electrophoresis (MCGE) using programmed field strength gradients (PFSG) in a conventional glass double-T microchip. Under the programmed electric field strength and a 0.3% poly(ethylene oxide) sieving matrix, the GMO in soybeans was analyzed within only 11 s on the microchip. The MCGE-PFSG method, which changes the electric field strength programmatically during GMO analysis, was also applied to the ultra-fast analysis of PCR products. Compared with MCGE using a conventional, constantly applied electric field, the MCGE-PFSG analysis generated faster results without loss of resolving power or reproducibility for specific DNA fragments (100- and 250-bp DNA) of GM soybeans. The MCGE-PFSG technique may prove to be a new tool in GMO analysis due to its speed, simplicity, and high efficiency.
ERIC Educational Resources Information Center
Hong, Guanglei; Deutsch, Jonah; Hill, Heather D.
2015-01-01
Conventional methods for mediation analysis generate biased results when the mediator-outcome relationship depends on the treatment condition. This article shows how the ratio-of-mediator-probability weighting (RMPW) method can be used to decompose total effects into natural direct and indirect effects in the presence of treatment-by-mediator…
Bioactive Compounds in Potato Tubers: Effects of Farming System, Cooking Method, and Flesh Color
Czerko, Zbigniew; Zarzyńska, Krystyna; Borowska-Komenda, Monika
2016-01-01
We investigated the effect of cultivation system (conventional or organic), cooking method, and flesh color on the contents of ascorbic acid (AA) and total phenolics (TPs), and on total antioxidant activity (Trolox equivalents, TE), in Solanum tuberosum (potato) tubers. The research material, consisting of 4 potato cultivars, was grown in experimental fields, using organic and conventional systems, at the experimental station in 2012 and 2013. The analysis showed that organically grown potatoes with creamy, light yellow, and yellow flesh had significantly higher TPs than did potatoes grown conventionally. Flesh color and cooking method also affected AA. The greatest losses of AA occurred in yellow-fleshed potatoes grown conventionally and cooked in the microwave; such losses were not observed in potatoes grown organically. A dry cooking method (baking in a microwave) increased the TP contents in potatoes by about 30%, regardless of flesh color and production system. TE was significantly higher in organically grown potatoes (raw and cooked in a steamer) than in conventionally grown potatoes. TE and AA contents showed a significant positive correlation, but only in potatoes from the organic system (R² = 0.686). By contrast, the positive correlation between TE and TPs was observed regardless of the production system. We have thus identified the effects of farming system, cooking method, and flesh color on the contents of bioactive compounds in potato tubers. PMID:27139188
Conventional vs. e-learning in nursing education: A systematic review and meta-analysis.
Voutilainen, Ari; Saaranen, Terhi; Sormunen, Marjorita
2017-03-01
By and large, in health professions training, the direction of the effect of e-learning, positive or negative, strongly depends on the learning outcome in question as well as on the learning methods to which e-learning is compared. In nursing education, meta-analytically generated knowledge regarding comparisons between conventional learning and e-learning is scarce. The aim of this review is to estimate the size of the effect of e-learning on learning outcomes in nursing education and to assess the quality of studies in which e-learning has been compared to conventional learning. A systematic search of six electronic databases, PubMed, Ovid MEDLINE®, CINAHL (EBSCOhost), Cochrane Library, PsycINFO, and ERIC, was conducted to identify relevant peer-reviewed English-language articles published between 2011 and 2015. The quality of the included studies and the risk of bias in each study were assessed. A random-effects meta-analysis was performed to generate a pooled mean difference in the learning outcome. Altogether, 10 studies were eligible for the quality assessment and meta-analysis. Nine studies were evaluated as good-quality studies, but not without risk of bias; performance bias posed a high risk in nearly all the studies. In the meta-analysis, an e-learning method resulted in test scores that were, on average, five points higher than a conventional method on a 0-100 scale. Heterogeneity between the studies was very large. The size and direction of the effect of a learning method on learning outcomes appeared to be strongly situational. We suggest that meta-regressions be performed instead of basic meta-analyses in order to reveal factors that cause variation in the learning outcomes of nursing education. It might be necessary to perform separate meta-analyses for e-learning interventions aimed at improving nursing knowledge and those aimed at improving nursing skills. Copyright © 2016 Elsevier Ltd. All rights reserved.
This paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The paper also discusses field sampling and sample analysis considerations to ensu...
ERIC Educational Resources Information Center
Henaku, Christina Bampo; Pobbi, Michael Asamani
2017-01-01
Many researchers and educationists remain skeptical about the effectiveness of distance learning programs and have termed them second to conventional training methods. This perception is largely due to several challenges that exist within the management of distance learning programs across the country. The general aim of the study is to compare the…
Neither fixed nor random: weighted least squares meta-analysis.
Stanley, T D; Doucouliagos, Hristos
2015-06-15
This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects. Copyright © 2015 John Wiley & Sons, Ltd.
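A minimal sketch of the unrestricted WLS idea using statsmodels (the effect sizes and variances below are invented for illustration). The key point is that the multiplicative error variance is estimated from the data rather than fixed at 1, so the point estimate matches the fixed-effect average while the standard error adapts to excess heterogeneity:

```python
import numpy as np
import statsmodels.api as sm

y = np.array([0.30, 0.12, 0.25, 0.41, 0.05])    # study effect sizes (illustrative)
v = np.array([0.01, 0.02, 0.015, 0.03, 0.008])  # their sampling variances

# Unrestricted WLS: regress the effects on a constant with inverse-variance
# weights. statsmodels estimates the residual scale phi from the data, so
# the reported standard error is the fixed-effect SE multiplied by
# sqrt(phi) whenever phi > 1 (excess heterogeneity).
fit = sm.WLS(y, np.ones_like(y), weights=1.0 / v).fit()
print(fit.params[0], fit.bse[0])   # UWLS weighted average and its SE
```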
Holmes, Robert R.; Dunn, Chad J.
1996-01-01
A simplified method to estimate total-streambed scour was developed for application to bridges in the State of Illinois. Scour envelope curves, developed as empirical relations between calculated total scour and bridge-site characteristics for 213 State highway bridges in Illinois, are used in the method to estimate the 500-year flood scour. These 213 bridges, geographically distributed throughout Illinois, had previously been evaluated for streambed scour with the conventional hydraulic and scour-analysis methods recommended by the Federal Highway Administration. The bridge characteristics necessary for applying the simplified bridge scour-analysis method can be obtained from an office review of bridge plans, examination of topographic maps, and a reconnaissance-level site inspection. Estimates computed with the simplified method generally resulted in larger values of 500-year flood total-streambed scour than the more detailed conventional method. The simplified method was successfully verified with a separate data set of 106 State highway bridges, geographically distributed throughout Illinois, and 15 county highway bridges.
Single-tube analysis of DNA methylation with silica superparamagnetic beads.
Bailey, Vasudev J; Zhang, Yi; Keeley, Brian P; Yin, Chao; Pelosky, Kristen L; Brock, Malcolm; Baylin, Stephen B; Herman, James G; Wang, Tza-Huei
2010-06-01
DNA promoter methylation is a signature for the silencing of tumor suppressor genes. The most widely used methods to detect DNA methylation involve 3 separate, independent processes: DNA extraction, bisulfite conversion, and methylation detection via a PCR method such as methylation-specific PCR (MSP). This approach includes many disconnected steps with associated losses of material, potentially reducing the analytical sensitivity required for the analysis of challenging clinical samples. Methylation on beads (MOB) is a new technique that integrates DNA extraction, bisulfite conversion, and PCR in a single tube via the use of silica superparamagnetic beads (SSBs) as a common DNA carrier, facilitating cell debris removal and buffer exchange throughout the entire process. In addition, PCR buffer is used to directly elute bisulfite-treated DNA from the SSBs for subsequent target amplification. The diagnostic sensitivity of MOB was evaluated by methylation analysis of the CDKN2A [cyclin-dependent kinase inhibitor 2A (melanoma, p16, inhibits CDK4); also known as p16(INK4a)] promoter in serum DNA of lung cancer patients and compared with that of conventional methods. Methylation analysis consisting of DNA extraction followed by bisulfite conversion and MSP was successfully carried out within 9 h in a single tube. The median pre-PCR DNA yield was 6.61-fold higher with the MOB technique than with conventional techniques. Furthermore, MOB increased the diagnostic sensitivity in our analysis of the CDKN2A promoter in patient serum by successfully detecting methylation in 74% of cancer patients, vs the 45% detection rate obtained with conventional techniques. The MOB technique successfully combined 3 processes in a single tube, thereby allowing ease of handling and increased detection throughput. The increased pre-PCR yield of MOB allowed efficient, diagnostically sensitive methylation detection.
NASA Technical Reports Server (NTRS)
Waszak, M. R.; Schmidt, D. S.
1985-01-01
As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, the handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first is an open-loop model analysis technique that considers the effects of modal residue magnitudes in determining vehicle handling qualities. The second is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Volume 1 consists of the development and application of the two analysis methods described above.
Kawashima, Hiroki; Hayashi, Norio; Ohno, Naoki; Matsuura, Yukihiro; Sanada, Shigeru
2015-08-01
To evaluate the patient identification ability of radiographers, previous and current chest radiographs were assessed in an observer study using receiver operating characteristic (ROC) analysis. This study included portable and conventional chest radiographs forming 43 same-patient and 43 different-patient pairs. The dataset used in this study was divided into the following three groups: (1) a pair of portable radiographs, (2) a pair of conventional radiographs, and (3) a combination of each type of radiograph. Seven observers participated in this ROC study, which aimed to identify same or different patients using these datasets. ROC analysis was conducted to calculate the average area under the ROC curve obtained by each observer (AUCave), and statistical testing was performed using the multi-reader multi-case method. Comparable results were obtained with pairs of portable (AUCave: 0.949) and conventional radiographs (AUCave: 0.951); in these same-modality comparisons there were no significant differences. In contrast, the ability to identify patients by comparing a portable with a conventional radiograph (AUCave: 0.873) was lower than with the matched-modality datasets (p=0.002 and p=0.004, respectively). In conclusion, the use of different imaging modalities reduces radiographers' ability to identify their patients.
Johnson, R.G.; Wandless, G.A.
1984-01-01
A new method is described for determining carrier yield in the radiochemical neutron activation analysis of rare-earth elements in silicate rocks by group separation. The method involves the determination of the rare-earth elements present in the carrier by means of energy-dispersive X-ray fluorescence analysis, eliminating the need to re-irradiate samples in a nuclear reactor after the gamma-ray analysis is complete. Results from the analysis of USGS standards AGV-1 and BCR-1 compare favorably with those obtained using the conventional method. © 1984 Akadémiai Kiadó.
Untangling Brain-Wide Dynamics in Consciousness by Cross-Embedding
Tajima, Satohiro; Yanagawa, Toru; Fujii, Naotaka; Toyoizumi, Taro
2015-01-01
Brain-wide interactions generating complex neural dynamics are considered crucial for emergent cognitive functions. However, the irreducible nature of nonlinear and high-dimensional dynamical interactions challenges conventional reductionist approaches. We introduce a model-free method, based on embedding theorems in nonlinear state-space reconstruction, that permits a simultaneous characterization of complexity in local dynamics, directed interactions between brain areas, and how the complexity is produced by the interactions. We demonstrate this method in large-scale electrophysiological recordings from awake and anesthetized monkeys. The cross-embedding method captures structured interaction underlying cortex-wide dynamics that may be missed by conventional correlation-based analysis, demonstrating a critical role of time-series analysis in characterizing brain state. The method reveals a consciousness-related hierarchy of cortical areas, where dynamical complexity increases along with cross-area information flow. These findings demonstrate the advantages of the cross-embedding method in deciphering large-scale and heterogeneous neuronal systems, suggesting a crucial contribution by sensory-frontoparietal interactions to the emergence of complex brain dynamics during consciousness. PMID:26584045
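A toy sketch of the delay-embedding idea underlying cross-embedding (a convergent-cross-mapping flavor; the function names, parameters, and weighting scheme are all our assumptions, not the study's pipeline): reconstruct the state space of signal X with delay coordinates and test how well nearest neighbors in that space predict signal Y. Good prediction of Y from X's embedding indicates that Y's dynamics are recoverable from X:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def delay_embed(x, dim=3, tau=2):
    # rows are delay vectors [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]
    rows = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + rows] for i in range(dim)])

def cross_map_skill(x, y, dim=3, tau=2):
    E = delay_embed(x, dim, tau)
    y_target = y[(dim - 1) * tau:]                     # align y with embedding
    nn = NearestNeighbors(n_neighbors=dim + 1).fit(E)
    dist, idx = nn.kneighbors(E)                       # idx[:, 0] is self
    w = np.exp(-dist[:, 1:] / (dist[:, 1:2] + 1e-12))  # simplex-style weights
    y_hat = (w * y_target[idx[:, 1:]]).sum(axis=1) / w.sum(axis=1)
    return np.corrcoef(y_hat, y_target)[0, 1]          # cross-mapping skill
```

Computing the skill in both directions (X's embedding predicting Y, and Y's predicting X) gives the kind of directed-interaction picture described above, which correlation-based analysis cannot resolve.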
A new design of groundwater sampling device and its application.
Tsai, Yih-jin; Kuo, Ming-ching T
2005-01-01
Compounds in the atmosphere can contaminate samples of groundwater. An inexpensive and simple method for collecting groundwater samples was developed to prevent contamination when the background concentration of contaminants is high. This new design of groundwater sampling device involves a glass sampling bottle with a Teflon-lined valve at each end. A cleaned and dried sampling bottle was connected to a low flow-rate peristaltic pump with Teflon tubing and was filled with water. No headspace volume remained in the sampling bottle. The sample bottle was then packed in a PVC bag to prevent the target component from infiltrating into the water sample through the valves. In this study, groundwater was sampled at six wells using both the conventional method and the improved method. The analysis of trichlorofluoromethane (CFC-11) concentrations at these six wells indicates that all the groundwater samples obtained by the conventional sampling method were contaminated by CFC-11 from the atmosphere. The improved sampling method largely eliminated the problems of contamination, preservation, and quantitative analysis of natural water.
Koopman, Richelle J.; Kochendorfer, Karl M.; Moore, Joi L.; Mehr, David R.; Wakefield, Douglas S.; Yadamsuren, Borchuluun; Coberly, Jared S.; Kruse, Robin L.; Wakefield, Bonnie J.; Belden, Jeffery L.
2011-01-01
PURPOSE We compared use of a new diabetes dashboard screen with use of a conventional approach of viewing multiple electronic health record (EHR) screens to find data needed for ambulatory diabetes care. METHODS We performed a usability study, including a quantitative time study and qualitative analysis of information-seeking behaviors. While being recorded with Morae Recorder software and “think-aloud” interview methods, 10 primary care physicians first searched their EHR for 10 diabetes data elements using a conventional approach for a simulated patient, and then using a new diabetes dashboard for another. We measured time, number of mouse clicks, and accuracy. Two coders analyzed think-aloud and interview data using grounded theory methodology. RESULTS The mean time needed to find all data elements was 5.5 minutes using the conventional approach vs 1.3 minutes using the diabetes dashboard (P <.001). Physicians correctly identified 94% of the data requested using the conventional method, vs 100% with the dashboard (P <.01). The mean number of mouse clicks was 60 for conventional searching vs 3 clicks with the diabetes dashboard (P <.001). A common theme was that in everyday practice, if physicians had to spend too much time searching for data, they would either continue without it or order a test again. CONCLUSIONS Using a patient-specific diabetes dashboard improves both the efficiency and accuracy of acquiring data needed for high-quality diabetes care. Usability analysis tools can provide important insights into the value of optimizing physician use of health information technologies. PMID:21911758
Huang, Shu-Huan; Lin, Yi-Fang; Tsai, Ming-Han; Yang, Shuan; Liao, Mei-Ling; Chao, Shao-Wen; Hwang, Cheng-Cheng
2018-06-01
Conventional methods for identifying gastroenteritis pathogens are time consuming, prone to false-negative results, reliant on personnel with diagnostic expertise, and dependent on specimen status. Alternatively, molecular diagnostic methods permit the rapid, simultaneous detection of multiple pathogens with high sensitivity and specificity. The present study compared conventional methods with the Luminex xTAG Gastrointestinal Pathogen Panel (xTAG GPP) for the diagnosis of infectious gastroenteritis in northern Taiwan. From July 2015 to April 2016, 217 clinical fecal samples were collected from patients with suspected infectious gastroenteritis. All specimens were tested using conventional diagnostic techniques following physicians' orders as well as with the xTAG GPP. The multiplex polymerase chain reaction (PCR) approach detected significantly more positive samples with bacterial, viral, and/or parasitic infections than conventional analysis (55.8% vs 40.1%, respectively; P < .001). Moreover, multiplex PCR could detect Escherichia coli O157, enterotoxigenic E coli, Shiga-like toxin-producing E coli, Cryptosporidium, and Giardia, which were undetectable by conventional methods. Furthermore, 48 pathogens in 23 patients (10.6%) with coinfections were identified only using the multiplex PCR approach; of these, 82.6% were from pediatric patients. Because the detection rates using multiplex PCR are higher than those of conventional methods, and some pediatric pathogens could only be detected by multiplex PCR, this approach may be useful in rapidly diagnosing diarrheal disease in children and facilitating treatment initiation. Further studies are necessary to determine if multiplex PCR improves patient outcomes and reduces costs.
Intraoral distalizer effects with conventional and skeletal anchorage: a meta-analysis.
Grec, Roberto Henrique da Costa; Janson, Guilherme; Branco, Nuria Castello; Moura-Grec, Patrícia Garcia; Patel, Mayara Paim; Castanha Henriques, José Fernando
2013-05-01
The aims of this meta-analysis were to quantify and to compare the amounts of distalization and anchorage loss of conventional and skeletal anchorage methods in the correction of Class II malocclusion with intraoral distalizers. The literature was searched through 5 electronic databases, and inclusion criteria were applied. Articles that presented pretreatment and posttreatment cephalometric values were preferred. Quality assessments of the studies were performed. The averages and standard deviations of molar and premolar effects were extracted from the studies to perform a meta-analysis. After applying the inclusion and exclusion criteria, 40 studies were included in the systematic review. After the quality analysis, 2 articles were classified as high quality, 27 as medium quality, and 11 as low quality. For the meta-analysis, 6 studies were included, and they showed average molar distalization amounts of 3.34 mm with conventional anchorage and 5.10 mm with skeletal anchorage. The meta-analysis of premolar movement showed estimates of combined effects of 2.30 mm (mesialization) in studies with conventional anchorage and -4.01 mm (distalization) in studies with skeletal anchorage. There was scientific evidence that both anchorage systems are effective for distalization; however, with skeletal anchorage, there was no anchorage loss when direct anchorage was used. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
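To make the pooling step concrete, the following Python sketch applies the standard DerSimonian-Laird random-effects model to per-study mean distalization values. The study means, standard deviations, and sample sizes below are hypothetical placeholders, not figures extracted from the six included studies:

# Hedged sketch: DerSimonian-Laird random-effects pooling of study means.
import numpy as np

means = np.array([3.1, 3.6, 2.9, 3.8])   # hypothetical per-study means (mm)
sds   = np.array([1.0, 1.2, 0.8, 1.1])
ns    = np.array([20, 15, 25, 18])

var = sds**2 / ns                         # within-study variances
w_f = 1.0 / var                           # fixed-effect weights
mu_f = np.sum(w_f * means) / np.sum(w_f)

Q = np.sum(w_f * (means - mu_f) ** 2)     # heterogeneity statistic
c = np.sum(w_f) - np.sum(w_f**2) / np.sum(w_f)
tau2 = max(0.0, (Q - (len(means) - 1)) / c)   # between-study variance

w = 1.0 / (var + tau2)                    # random-effects weights
mu = np.sum(w * means) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled mean = {mu:.2f} mm, 95% CI ({mu - 1.96*se:.2f}, {mu + 1.96*se:.2f})")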
This report compares simultaneous results from three woodstove sampling methods and evaluates particulate emission rates of conventional and Oregon-certified catalytic and noncatalytic woodstoves in six Portland, OR, houses. EPA Methods 5G and 5H and the field emission sampler (A...
Prediction of Groundwater Level at Slope Areas using Electrical Resistivity Method
NASA Astrophysics Data System (ADS)
Baharuddin, M. F. T.; Hazreek, Z. A. M.; Azman, M. A. A.; Madun, A.
2018-04-01
Groundwater level plays an important role as an agent that triggers landslides. Commonly, groundwater level is monitored using a standpipe piezometer, a conventional method with several disadvantages related to cost, time and data coverage. The aim of this study is to determine the groundwater level in slope areas using the electrical resistivity method and to verify it against standpipe piezometer data for the study area. Data acquisition was performed using an ABEM Terrameter SAS4000, and RES2DINV and SURFER were used for data analysis and processing. The groundwater level was calibrated against standpipe piezometer readings based on electrical resistivity values (ERV).
Dual-energy x-ray image decomposition by independent component analysis
NASA Astrophysics Data System (ADS)
Jiang, Yifeng; Jiang, Dazong; Zhang, Feng; Zhang, Dengfu; Lin, Gang
2001-09-01
The spatial distributions of bone and soft tissue in the human body are separated by independent component analysis (ICA) of dual-energy x-ray images. The method is applicable because the dual-energy imaging model conforms to the ICA model: (1) the absorption in the body is mainly caused by photoelectric absorption and Compton scattering; (2) these take place simultaneously but are mutually independent; and (3) for monochromatic x-ray sources, the total attenuation is a linear combination of these two absorption processes. Compared with the conventional method, the proposed one requires no a priori information about the exact x-ray energies used for imaging, while the separation results agree well with those of the conventional method.
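A minimal sketch of the separation idea using scikit-learn's FastICA, with two synthetic 1-D signals standing in for the bone and soft-tissue distributions; the signals and the 2x2 mixing matrix (the two x-ray energies) are illustrative assumptions:

# Hedged sketch: recover two independent components from two linear mixtures.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n = 5000
bone = np.abs(np.sin(np.linspace(0, 40, n)))   # surrogate "bone" signal
soft = rng.uniform(0, 1, n)                    # surrogate "soft tissue" signal
S = np.column_stack([bone, soft])

A = np.array([[0.8, 0.3],                      # assumed mixing at two energies
              [0.4, 0.7]])
X = S @ A.T                                    # observed dual-energy data

S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)
# S_hat recovers the sources up to the usual ICA scale/order ambiguity.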
Development and evaluation of the impulse transfer function technique
NASA Technical Reports Server (NTRS)
Mantus, M.
1972-01-01
The development of the test/analysis technique known as the impulse transfer function (ITF) method is discussed. This technique, when implemented with proper data processing systems, should become a valuable supplement to conventional dynamic testing and analysis procedures that will be used in the space shuttle development program. The method can relieve many of the problems associated with extensive and costly testing of the shuttle for transient loading conditions. In addition, the time history information derived from impulse testing has the potential for being used to determine modal data for the structure under investigation. The technique could be very useful in determining the time-varying modal characteristics of structures subjected to thermal transients, where conventional mode surveys are difficult to perform.
Improving Arterial Spin Labeling by Using Deep Learning.
Kim, Ki Hwan; Choi, Seung Hong; Park, Sung-Hong
2018-05-01
Purpose To develop a deep learning algorithm that generates arterial spin labeling (ASL) perfusion images with higher accuracy and robustness by using a smaller number of subtraction images. Materials and Methods For ASL image generation from pair-wise subtraction, we used a convolutional neural network (CNN) as a deep learning algorithm. The ground truth perfusion images were generated by averaging six or seven pairwise subtraction images acquired with (a) conventional pseudocontinuous arterial spin labeling from seven healthy subjects or (b) Hadamard-encoded pseudocontinuous ASL from 114 patients with various diseases. CNNs were trained to generate perfusion images from a smaller number (two or three) of subtraction images and evaluated by means of cross-validation. CNNs from the patient data sets were also tested on 26 separate stroke data sets. CNNs were compared with the conventional averaging method in terms of mean square error and radiologic score by using a paired t test and/or Wilcoxon signed-rank test. Results Mean square errors were approximately 40% lower than those of the conventional averaging method for the cross-validation with the healthy subjects and patients and the separate test with the patients who had experienced a stroke (P < .001). Region-of-interest analysis in stroke regions showed that cerebral blood flow maps from CNN (mean ± standard deviation, 19.7 mL per 100 g/min ± 9.7) had smaller mean square errors than those determined with the conventional averaging method (43.2 ± 29.8) (P < .001). Radiologic scoring demonstrated that CNNs suppressed noise and motion and/or segmentation artifacts better than the conventional averaging method did (P < .001). Conclusion CNNs provided superior perfusion image quality and more accurate perfusion measurement compared with those of the conventional averaging method for generation of ASL images from pair-wise subtraction images. © RSNA, 2017.
Adaptive Sampling using Support Vector Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Mandelli; C. Smith
2012-11-01
Reliability/safety analysis of stochastic dynamic systems (e.g., nuclear power plants, airplanes, chemical plants) is currently performed through a combination of Event-Trees and Fault-Trees. However, these conventional methods suffer from certain drawbacks: the timing of events is not explicitly modeled, the ordering of events is preset by the analyst, and the modeling of complex accident scenarios is driven by expert judgment. For these reasons, there is currently increasing interest in the development of dynamic PRA methodologies, since they can address these deficiencies of the conventional methods.
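A minimal sketch of the adaptive-sampling idea with a support vector machine: run a few simulations, fit a classifier to the pass/fail labels, then place new samples where the classifier is least certain (near the decision boundary). The limit-state function and sampling ranges are placeholders:

# Hedged sketch: SVM-guided adaptive sampling of a simulator's failure region.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def simulator_fails(x):                 # placeholder limit-state function
    return (x[:, 0] ** 2 + x[:, 1] ** 2) > 1.0

X = rng.uniform(-2, 2, (40, 2))         # initial space-filling runs
y = simulator_fails(X)

for _ in range(5):                      # adaptive refinement loop
    svm = SVC(kernel="rbf").fit(X, y)
    cand = rng.uniform(-2, 2, (500, 2))
    # Pick candidates closest to the decision boundary (most informative).
    pick = cand[np.argsort(np.abs(svm.decision_function(cand)))[:10]]
    X = np.vstack([X, pick])
    y = np.concatenate([y, simulator_fails(pick)])
print(f"final training set: {len(X)} runs")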
DOT National Transportation Integrated Search
2011-12-01
Current AASHTO provisions for the conventional load rating of flat slab bridges rely on the equivalent strip method of analysis for determining live load effects, which is generally regarded as overly conservative by many professional engineers. A...
Kanlaya, Rattiyaporn; Thongboonkerd, Visith
2016-08-01
The conventional method to purify/concentrate dengue virus (DENV) is time-consuming and has a low virus recovery yield. Herein, we applied cellufine sulfate column chromatography to purify/concentrate DENV based on the mimicry between heparan sulfate and the DENV envelope protein. Comparative analysis demonstrated that this new method offered higher purity (as determined by less contamination with bovine serum albumin) and a higher recovery yield (as determined by greater infectivity). Moreover, the overall duration of cellufine sulfate column chromatography for purifying/concentrating DENV was approximately 1/20 of that of the conventional method. Therefore, cellufine sulfate column chromatography serves as a simple, rapid, and effective alternative method for DENV purification/concentration. Copyright © 2016 Elsevier B.V. All rights reserved.
Pickup, William; Bremer, Phil; Peng, Mei
2018-03-01
The extensive time and cost associated with conventional sensory profiling methods have spurred sensory researchers to develop rapid alternatives, such as Napping® with Ultra-Flash Profiling (UFP). Napping®-UFP generates sensory maps by requiring untrained panellists to separate samples based on perceived sensory similarities. Evaluations of this method have been restricted to manufactured/formulated food models, and predominantly structured as comparisons against the conventional descriptive method. The present study aims to extend the validation of Napping®-UFP (N = 72) to natural biological products, and to evaluate this method against Descriptive Analysis (DA; N = 8) with physicochemical measurements as an additional evaluative criterion. The results revealed that sample configurations generated by DA and Napping®-UFP were not significantly correlated (RV = 0.425, P = 0.077); however, both were correlated with the product map generated from the instrumental measures (P < 0.05). The findings also indicated that sample characterisations from DA and Napping®-UFP were driven by different sensory attributes, indicating potential structural differences between these two methods in configuring samples. Overall, these findings lend support for the extended use of Napping®-UFP for evaluations of natural biological products. Although DA was shown to be a better method for establishing sensory-instrumental relationships, Napping®-UFP exhibited strengths in generating informative sample configurations based on holistic perception of products. © 2017 Society of Chemical Industry.
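Because the RV coefficient is the statistic used above to compare sample configurations, here is a minimal Python sketch of its computation; the two configuration matrices are hypothetical stand-ins for the DA and Napping®-UFP maps of the same ten samples:

# Hedged sketch: RV coefficient between two column-centered configurations.
import numpy as np

def rv_coefficient(X, Y):
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sx, Sy = X @ X.T, Y @ Y.T          # sample-by-sample cross products
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))

rng = np.random.default_rng(4)
da_map = rng.normal(size=(10, 2))                   # hypothetical DA map
napping_map = (da_map @ rng.normal(size=(2, 2))
               + 0.5 * rng.normal(size=(10, 2)))    # related second map
print(f"RV = {rv_coefficient(da_map, napping_map):.3f}")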
Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing
NASA Technical Reports Server (NTRS)
Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.
2001-01-01
The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structural element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis and sensitivity analysis tools. SASDO is applied to a simple, isolated 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method, with some reduction in computational cost and without significant modifications to the analysis tools.
The relation between periods’ identification and noises in hydrologic series data
NASA Astrophysics Data System (ADS)
Sang, Yan-Fang; Wang, Dong; Wu, Ji-Chun; Zhu, Qing-Ping; Wang, Ling
2009-04-01
Identification of dominant periods is a typical and important issue in hydrologic series analysis, since it is the basis of building effective stochastic models, understanding complex hydrologic processes, etc. However, it remains a difficult task owing to the influence of many interrelated factors, such as noise in hydrologic series data. In this paper, the strong influence of noise on period identification is first analyzed. Then, based on two conventional methods of hydrologic series analysis, wavelet analysis (WA) and maximum entropy spectral analysis (MESA), a new method for period identification, main series spectral analysis (MSSA), is put forward; its main idea is to identify the periods of the main series after first reducing hydrologic noise. Various methods (including fast Fourier transform (FFT), MESA and MSSA) were applied to both synthetic and observed hydrologic series. Results show that the conventional methods (FFT and MESA) do not perform as well as expected, owing to the strong influence of noise, whereas this influence is much weaker for the new MSSA method. In addition, the results are more reasonable when using the new de-noising method proposed in this paper, which is suitable for both normal and skew noise, since noise separated from hydrologic series data generally follows skew probability distributions. In conclusion, the comprehensive analyses indicate that the proposed MSSA method can improve period identification by effectively reducing the influence of hydrologic noise.
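A minimal sketch of the core idea, identifying a dominant period from a periodogram before and after a crude de-noising step. This is a simplified stand-in for MSSA (the de-noising here is plain Fourier thresholding, not the paper's method), and the series is synthetic:

# Hedged sketch: period identification with and without de-noising.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(5)
n = 1024
t = np.arange(n)
series = np.sin(2 * np.pi * t / 64) + 1.5 * rng.normal(size=n)  # noisy 64-step cycle

f_raw, p_raw = periodogram(series)

F = np.fft.rfft(series)
F[np.abs(F) < 0.5 * np.abs(F).max()] = 0     # crude spectral thresholding
f_dn, p_dn = periodogram(np.fft.irfft(F, n))

print("dominant period, raw:     ", 1 / f_raw[np.argmax(p_raw[1:]) + 1])
print("dominant period, denoised:", 1 / f_dn[np.argmax(p_dn[1:]) + 1])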
Taylor, Vivien F; Toms, Andrew; Longerich, Henry P
2002-01-01
The application of open-vessel focused microwave acid digestion is described for the preparation of geological and environmental samples for analysis by inductively coupled plasma-mass spectrometry (ICP-MS). The method is compared to conventional closed-vessel high-pressure methods, which are limited in the use of HF to break down silicates. Open-vessel acid digestion more conveniently enables the use of HF to remove Si from geological and plant samples as volatile SiF4, as well as evaporation to dryness and sequential acid addition during the procedure. Rock reference materials (G-2 granite, MRG-1 gabbro, SY-2 syenite, JA-1 andesite, and JB-2 and SRM-688 basalts) and plant reference materials (BCR and IAEA lichens, peach leaves, apple leaves, durum wheat flour, and pine needles) were digested with results comparable to conventional hotplate digestion. The microwave digestion method gave poor results for granitic samples containing refractory minerals; for these samples, fusion was the preferred method of preparation. Sample preparation time was reduced from several days with the conventional hotplate digestion method to one hour per sample with our microwave method.
NASA Astrophysics Data System (ADS)
Shao, Xupeng
2017-04-01
Glutenite bodies are widely developed in the northern Minfeng zone of the Dongying Sag. Their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage over conventional methods of dividing sequence stratigraphy quantitatively. On the basis of conventional sequence research methods, this paper used the above techniques to divide the fourth-order sequences of the upper Es4 in the northern Minfeng zone of the Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent, as both divide sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has a high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but a low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of the Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with fining-upward depositional characteristics. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
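As a concrete illustration of the quantitative, frequency-domain division described above, the following sketch applies a continuous wavelet transform to a synthetic well log using PyWavelets; the log, cycle lengths, and sampling interval are placeholders, not Minfeng-zone data:

# Hedged sketch: CWT of a synthetic gamma-ray-like log to expose cyclicity.
import numpy as np
import pywt

rng = np.random.default_rng(6)
depth = np.arange(0, 500, 0.5)                    # 0.5 m sampling
log = (np.sin(2 * np.pi * depth / 40)             # ~40 m cycles
       + 0.5 * np.sin(2 * np.pi * depth / 10)     # ~10 m cycles
       + 0.3 * rng.normal(size=depth.size))

scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(log, scales, "mexh", sampling_period=0.5)
dominant = scales[np.argmax(np.abs(coefs).mean(axis=1))]
print("dominant wavelet scale (samples):", dominant)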
Application of the Radon-FCL approach to seismic random noise suppression and signal preservation
NASA Astrophysics Data System (ADS)
Meng, Fanlei; Li, Yue; Liu, Yanping; Tian, Yanan; Wu, Ning
2016-08-01
The fractal conservation law (FCL) is a linear partial differential equation that is modified by an anti-diffusive term of lower order. Analysis indicates that this algorithm can eliminate high frequencies and preserve or amplify low- and medium-frequency components; the method is thus well suited to simultaneous noise suppression and enhancement or preservation of seismic signals. However, the conventional FCL filters seismic data only along the time direction, thereby ignoring the spatial coherence between neighbouring traces, which leads to the loss of directional information. Therefore, we extend the conventional FCL into the time-space domain and propose a Radon-FCL approach. In this article, a Radon transform is applied to implement the FCL method; performing FCL filtering in the Radon domain achieves a higher level of noise attenuation. Using this method, seismic reflection events can be recovered with the sacrifice of fewer frequency components while effectively attenuating more random noise than conventional FCL filtering. Experiments using both synthetic and common-shot-point data demonstrate the advantages of the Radon-FCL approach over the conventional FCL method with regard to both random noise attenuation and seismic signal preservation.
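A minimal sketch of the domain-transfer idea with scikit-image: map the data to the Radon domain, filter along the projections (simple smoothing here, a crude stand-in for FCL filtering), and invert. A phantom image stands in for a seismic section:

# Hedged sketch: filtering in the Radon domain instead of the time direction.
import numpy as np
from scipy.ndimage import uniform_filter1d
from skimage.data import shepp_logan_phantom
from skimage.transform import iradon, radon, rescale

image = rescale(shepp_logan_phantom(), 0.25)
noisy = image + 0.1 * np.random.default_rng(7).normal(size=image.shape)

theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
sino = radon(noisy, theta=theta)                  # forward Radon transform
sino = uniform_filter1d(sino, size=5, axis=0)     # smooth along projections
recon = iradon(sino, theta=theta)                 # back to the data domain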
NASA Astrophysics Data System (ADS)
Yang, Lei; Yan, Hongyong; Liu, Hong
2017-03-01
The implicit staggered-grid finite-difference (ISFD) scheme is competitive for its great accuracy and stability, whereas its coefficients are conventionally determined by the Taylor-series expansion (TE) method, leading to a loss in numerical precision. In this paper, we modify the TE method using the minimax approximation (MA) and propose a new optimal ISFD scheme based on the modified TE (MTE) with MA method. The new ISFD scheme takes advantage of the TE method's guarantee of great accuracy at small wavenumbers, while retaining the MA method's property of keeping numerical errors within a limited bound. It thus yields high accuracy in the numerical solution of the wave equations. We derive the optimal ISFD coefficients by applying the new method to the construction of the objective function and using a Remez algorithm to minimize its maximum. Numerical analysis in comparison with the conventional TE-based ISFD scheme indicates that the MTE-based ISFD scheme with appropriate parameters can widen the wavenumber range with high accuracy and achieve greater precision than the conventional scheme. The numerical modeling results also demonstrate that the MTE-based ISFD scheme performs well in elastic wave simulation and is more efficient than the conventional ISFD scheme for elastic modeling.
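For orientation, the sketch below computes conventional Taylor-series coefficients for an explicit staggered-grid first derivative and their wavenumber-domain error, the quantity that minimax-style optimization seeks to bound. It illustrates the baseline only, not the paper's MTE algorithm (which optimizes implicit-scheme coefficients with a Remez iteration):

# Hedged sketch: Taylor staggered-grid FD coefficients and dispersion error.
import numpy as np

M = 4                                             # half-stencil length
# Match odd-order Taylor terms of the exact derivative:
A = np.array([[(m - 0.5) ** (2 * k - 1) for m in range(1, M + 1)]
              for k in range(1, M + 1)])
b = np.zeros(M)
b[0] = 0.5                                        # normalization for d/dx
c = np.linalg.solve(A, b)

kh = np.linspace(0.01, np.pi, 200)                # normalized wavenumber
resp = 2 * sum(c[m - 1] * np.sin((m - 0.5) * kh) for m in range(1, M + 1))
print("coefficients:", np.round(c, 6))
print("max |dispersion error|:", np.abs(resp - kh).max())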
NASA Astrophysics Data System (ADS)
Mudra, E.; Streckova, M.; Pavlinak, D.; Medvecka, V.; Kovacik, D.; Kovalcikova, A.; Zubko, P.; Girman, V.; Dankova, Z.; Koval, V.; Duzsa, J.
2017-09-01
In this paper, the electrospinning method was used to prepare α-Al2O3 microfibers from a PAN/Al(NO3)3 precursor solution. The precursor fibers were thermally treated either by the conventional method in a furnace or by a low-temperature plasma-induced surface sintering method in ambient air. Four different temperatures were chosen for the formation of the α-Al2O3 phase from the PAN/Al(NO3)3 precursors by conventional sintering, according to the transition features observed in TG/DSC analysis. In comparison, low-temperature plasma treatment at atmospheric pressure was used as an alternative sintering method at exposure times of 5, 10 and 30 min. FTIR analysis was used to evaluate the residual polymer after plasma-induced calcination and to study the mechanism of polymer degradation. Polycrystalline alumina fibers composed of nanoparticles were created continuously throughout the whole volume of the sample. In contrast, the low-temperature approach, with the high density of reactive species and high power density of the atmospheric-pressure plasma source, allowed rapid removal of polymer preferentially from the surface of the fibers, leading to the formation of composite ceramic/polymer fibers. This plasma-induced sintering of PAN/Al(NO3)3 may be of clear importance in industrial applications where a ceramic surface character combined with higher fiber toughness is required.
NASA Technical Reports Server (NTRS)
Bartels, Robert E.
1998-01-01
Flow and turbulence models applied to the problem of shock buffet onset are studied. The accuracy of the interactive boundary layer and the thin-layer Navier-Stokes equations solved with recent upwind techniques using similar transport field equation turbulence models is assessed for standard steady test cases, including conditions having significant shock separation. The two methods are found to compare well in the shock buffet onset region of a supercritical airfoil that involves strong trailing-edge separation. A computational analysis using the interactive-boundary layer has revealed a Reynolds scaling effect in the shock buffet onset of the supercritical airfoil, which compares well with experiment. The methods are next applied to a conventional airfoil. Steady shock-separated computations of the conventional airfoil with the two methods compare well with experiment. Although the interactive boundary layer computations in the shock buffet region compare well with experiment for the conventional airfoil, the thin-layer Navier-Stokes computations do not. These findings are discussed in connection with possible mechanisms important in the onset of shock buffet and the constraints imposed by current numerical modeling techniques.
Mohamed, Heba M; Lamie, Nesrine T
2016-09-01
In the past few decades the analytical community has focused on eliminating or reducing the use of hazardous chemicals and solvents in analytical methodologies that have been ascertained to be extremely dangerous to human health and the environment. In this context, environmentally friendly, green, or clean practices have been implemented in different research areas. This study presents a greener alternative to conventional RP-HPLC methods for the simultaneous determination and quantitative analysis of a pharmaceutical ternary mixture composed of telmisartan, hydrochlorothiazide, and amlodipine besylate, using an ecofriendly mobile phase and a short run time with minimal waste production. This solvent-replacement approach was feasible without compromising method performance criteria, such as separation efficiency, peak symmetry, and chromatographic retention. The greenness profile of the proposed method was assessed and compared with reported conventional methods using the analytical Eco-Scale as an assessment tool. The proposed method was found to be greener in terms of usage of hazardous chemicals and solvents, energy consumption, and production of waste. The proposed method can be safely used for the routine analysis of the studied pharmaceutical ternary mixture with a minimal detrimental impact on human health and the environment.
Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing
2018-01-01
Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescent, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric and thermochemolysis analysis, are emphasized for their application in bio-stability assessment in recent years. Their principles, pros and cons are critically discussed. These advanced techniques are found to be convenient in sample preparation and to supply diversified information. However, the viability of these techniques as potential indicators for bio-stability assessment ultimately lies in the establishment of the relationship of advanced ones with the conventional methods, especially with the methods based on biotic response. Furthermore, some misuses in data explanation should be noted. Copyright © 2017 Elsevier Ltd. All rights reserved.
Basaki, Kinga; Alkumru, Hasan; De Souza, Grace; Finer, Yoav
To assess the three-dimensional (3D) accuracy and clinical acceptability of implant definitive casts fabricated using a digital impression approach and to compare the results with those of a conventional impression method in a partially edentulous condition. A mandibular reference model was fabricated with implants in the first premolar and molar positions to simulate a patient with bilateral posterior edentulism. Ten implant-level impressions per method were made using either an intraoral scanner with scanning abutments for the digital approach or an open-tray technique and polyvinylsiloxane material for the conventional approach. 3D analysis and comparison of implant location on the resultant definitive casts were performed using a laser scanner and quality-control software. The inter-implant distances and inter-implant angulations for each implant pair were measured for the reference model and for each definitive cast (n = 20 per group); these measurements were compared to calculate the magnitude of error in 3D for each definitive cast. The influence of implant angulation on definitive cast accuracy was evaluated for both digital and conventional approaches. Statistical analysis was performed using the t test (α = .05) for implant position and angulation. Clinical qualitative assessment of accuracy was done via assessment of the passivity of a master verification stent for each implant pair, and significance was analyzed using the chi-square test (α = .05). A 3D error of implant positioning was observed for the two impression techniques vs the reference model, with mean ± standard deviation (SD) errors of 116 ± 94 μm and 56 ± 29 μm for the digital and conventional approaches, respectively (P = .01). In contrast, the inter-implant angulation errors were not significantly different between the two techniques (P = .83). Implant angulation did not have a significant influence on definitive cast accuracy within either technique (P = .64). The verification stent demonstrated an acceptable passive fit for 11 of 20 casts and 18 of 20 casts for the digital and conventional methods, respectively (P = .01). Definitive casts fabricated using the digital impression approach were less accurate than those fabricated from the conventional impression approach for this simulated clinical scenario. A significant number of definitive casts generated by the digital technique did not meet clinically acceptable accuracy for the fabrication of a multiple implant-supported restoration.
Howard, Elizabeth J; Harville, Emily; Kissinger, Patricia; Xiong, Xu
2013-07-01
There is growing interest in the application of propensity scores (PS) in epidemiologic studies, especially within the field of reproductive epidemiology. This retrospective cohort study assesses the impact of a short interpregnancy interval (IPI) on preterm birth and compares the results of conventional logistic regression analysis with analyses utilizing a PS. The study included 96,378 singleton infants from Louisiana birth certificate data (1995-2007). Five regression models designed for methods comparison are presented. Ten percent (10.17%) of all births were preterm; 26.83% of births were from a short IPI. The PS-adjusted model produced a more conservative estimate of the exposure variable than the conventional logistic regression method (β-coefficient: 0.21 vs 0.43), as well as a smaller standard error (0.024 vs 0.028) and smaller odds ratio and 95% confidence intervals [1.15 (1.09, 1.20) vs 1.23 (1.17, 1.30)]. The inclusion of more covariate and interaction terms in the PS did not change the estimates of the exposure variable. This analysis indicates that PS-adjusted regression may be appropriate for validating conventional methods in a large dataset with a fairly common outcome. PSs may be beneficial in producing more precise estimates, especially for models with many confounders and effect modifiers and where conventional adjustment with logistic regression is unsatisfactory. Short intervals between pregnancies are associated with preterm birth in this population, according to either technique. Birth spacing is an issue that women have some control over. Educational interventions, including birth control, should be applied during prenatal visits and following delivery.
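A minimal sketch of the comparison in Python with statsmodels, fitting a propensity-score model for the exposure and then using the estimated score as a covariate in the outcome model; the simulated data and coefficients are placeholders, not the Louisiana birth records:

# Hedged sketch: PS-adjusted vs conventional logistic regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 5000
age = rng.normal(27, 6, n)                                    # confounder
p_exp = 1 / (1 + np.exp(-(25 - age) / 5))
exposure = (rng.uniform(size=n) < p_exp).astype(float)        # short IPI
p_out = 1 / (1 + np.exp(-(-2.5 + 0.4 * exposure - 0.02 * (age - 27))))
outcome = (rng.uniform(size=n) < p_out).astype(float)         # preterm birth

ps = sm.Logit(exposure, sm.add_constant(age)).fit(disp=0).predict()

X_ps = sm.add_constant(np.column_stack([exposure, ps]))       # PS adjustment
X_cv = sm.add_constant(np.column_stack([exposure, age]))      # conventional
print("PS-adjusted beta:  ", sm.Logit(outcome, X_ps).fit(disp=0).params[1])
print("conventional beta: ", sm.Logit(outcome, X_cv).fit(disp=0).params[1])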
[Evaluation of Wits appraisal with superimposition method].
Xu, T; Ahn, J; Baumrind, S
1999-07-01
To compare the conventional Wits appraisal with the superimposed Wits appraisal in evaluating the change in sagittal jaw relationship between pre- and post-orthodontic treatment. The sample consisted of pre- and post-treatment lateral head films from 48 cases. Computerized digitizing was used to obtain the cephalometric landmarks and to measure the conventional Wits value, the superimposed Wits value and the ANB angle. The correlation analysis among these three measures was performed with the SAS statistical package. The change in ANB angle correlated more strongly with the change in superimposed Wits than with the change in conventional Wits; the r-value was as high as 0.849 (P < 0.001). The superimposed Wits appraisal reflects the change in sagittal jaw relationship more objectively than the conventional one.
von Peter, Sebastian; Bieler, Patrick
2017-01-01
The Convention on the Rights of Persons with Disabilities (CRPD) has received considerable attention internationally. The Convention's main arguments are conceptually analyzed, and implications for the development of research designs are elaborated upon. The Convention entails both a human rights and a sociopolitical dimension. Advancing a relational notion of disability, it enters terrain rather foreign to the medical sciences, and research designs have to be changed accordingly. Research designs in accordance with the CRPD should employ and further develop context-sensitive research strategies and interdisciplinary collaboration. Complex designs that allow for a relational analysis of personalized effects have to be established and evaluated, thereby systematically integrating qualitative methods.
Use of Latent Profile Analysis in Studies of Gifted Students
ERIC Educational Resources Information Center
Mammadov, Sakhavat; Ward, Thomas J.; Cross, Jennifer Riedl; Cross, Tracy L.
2016-01-01
To date, in gifted education and related fields various conventional factor analytic and clustering techniques have been used extensively for investigation of the underlying structure of data. Latent profile analysis is a relatively new method in the field. In this article, we provide an introduction to latent profile analysis for gifted education…
Hohmann, Monika; Monakhova, Yulia; Erich, Sarah; Christoph, Norbert; Wachter, Helmut; Holzgrabe, Ulrike
2015-11-04
Because the basic suitability of proton nuclear magnetic resonance spectroscopy (¹H NMR) to differentiate organic versus conventional tomatoes was recently proven, the approach of optimizing ¹H NMR classification models (comprising 205 authentic tomato samples overall) by including additional data from isotope ratio mass spectrometry (IRMS; δ¹³C, δ¹⁵N, and δ¹⁸O) and mid-infrared (MIR) spectroscopy was assessed. Both individual and combined analytical methods (¹H NMR + MIR, ¹H NMR + IRMS, MIR + IRMS, and ¹H NMR + MIR + IRMS) were examined using principal component analysis (PCA), partial least squares discriminant analysis (PLS-DA), linear discriminant analysis (LDA), and common components and specific weight analysis (ComDim). With regard to classification abilities, fused data of ¹H NMR + MIR + IRMS yielded better validation results (ranging between 95.0 and 100.0%) than the individual methods (¹H NMR, 91.3-100%; MIR, 75.6-91.7%), suggesting that the combined examination of analytical profiles enhances authentication of organically produced tomatoes.
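A minimal sketch of the low-level data-fusion step: standardize each analytical block, concatenate, and classify with LDA. The block sizes, class effects, and data are synthetic stand-ins for the NMR, MIR, and IRMS measurements:

# Hedged sketch: block-scaled concatenation ("fused data") plus LDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
n = 205                                        # sample count, as in the study
y = rng.integers(0, 2, n)                      # 0 = conventional, 1 = organic
nmr = rng.normal(size=(n, 50)) + 0.4 * y[:, None]    # toy feature blocks
mir = rng.normal(size=(n, 30)) + 0.2 * y[:, None]
irms = rng.normal(size=(n, 3)) + 0.6 * y[:, None]

X = np.hstack([StandardScaler().fit_transform(b) for b in (nmr, mir, irms)])
print(cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean())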
A comparative study of conventionally sintered and microwave sintered nickel zinc ferrite
NASA Astrophysics Data System (ADS)
Rani, Rekha; Juneja, J. K.; Raina, K. K.; Kotnala, R. K.; Prakash, Chandra
2014-04-01
For the present work, nickel zinc ferrite with the compositional formula Ni0.8Zn0.2Fe2O4 was synthesized by the conventional solid state method and sintered in conventional and microwave furnaces. Pellets were sintered with a very short soaking time of 10 min at 1150 °C in the microwave furnace, whereas a soaking time of 4 h was used for conventional sintering at 1200 °C. Phase formation was confirmed by X-ray diffraction analysis. Scanning electron micrographs were taken for microstructural study. Dielectric properties were studied as a function of temperature. To study magnetic behavior, M-H hysteresis loops were recorded for both samples. It is observed that the microwave-sintered sample attained properties comparable to those of the conventionally sintered one, with a shorter soaking time and a lower sintering temperature.
A Direct Cell Quenching Method for Cell-Culture Based Metabolomics
A crucial step in metabolomic analysis of cellular extracts is the cell quenching process. The conventional method first uses trypsin to detach cells from their growth surface. This inevitably changes the profile of cellular metabolites since the detachment of cells from the extr...
Tickner, James; Ganly, Brianna; Lovric, Bojan; O'Dwyer, Joel
2017-04-01
Mining companies rely on chemical analysis methods to determine concentrations of gold in mineral ore samples. As gold is often mined commercially at concentrations around 1 part-per-million, it is necessary for any analysis method to provide good sensitivity as well as high absolute accuracy. We describe work to improve both the sensitivity and accuracy of the gamma activation analysis (GAA) method for gold. We present analysis results for several suites of ore samples and discuss the design of a GAA facility designed to replace conventional chemical assay in industrial applications. Copyright © 2017. Published by Elsevier Ltd.
FSILP: fuzzy-stochastic-interval linear programming for supporting municipal solid waste management.
Li, Pu; Chen, Bing
2011-04-01
Although many studies on municipal solid waste (MSW) management have been conducted under coexisting fuzzy, stochastic, and interval uncertainties, conventional linear programming approaches that integrate the fuzzy method with the other two have been inefficient. In this study, a fuzzy-stochastic-interval linear programming (FSILP) method is developed by integrating Nguyen's method with conventional linear programming to support municipal solid waste management. Nguyen's method was used to convert the fuzzy and fuzzy-stochastic linear programming problems into conventional linear programs by measuring the attainment values of fuzzy numbers and/or fuzzy random variables, as well as the superiority and inferiority between triangular fuzzy numbers/triangular fuzzy-stochastic variables. The developed method can effectively tackle uncertainties described in terms of probability density functions, fuzzy membership functions, and discrete intervals. Moreover, the method improves upon the conventional interval fuzzy programming and two-stage stochastic programming approaches, achieving its capabilities with fewer constraints and significantly reduced computation time. The developed model was applied to a case study of a municipal solid waste management system in a city. The results indicated that reasonable solutions were generated. The solution can help quantify the relationship between changes in system cost and the uncertainties, which could support further analysis of tradeoffs between waste management cost and system failure risk. Copyright © 2010 Elsevier Ltd. All rights reserved.
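To illustrate just the interval part of such formulations (not Nguyen's fuzzy conversion), the sketch below solves a toy waste-allocation LP at the two bounds of an interval cost coefficient with SciPy; all numbers are invented:

# Hedged sketch: bounding an LP solution under an interval cost coefficient.
from scipy.optimize import linprog

# Minimize cost of routing 100 tonnes/day to two facilities; facility 2's
# unit cost is known only as the interval [4, 6] $/tonne.
for c2 in (4.0, 6.0):
    res = linprog(c=[5.0, c2],
                  A_ub=[[-1.0, -1.0]],        # -(x1 + x2) <= -demand
                  b_ub=[-100.0],
                  bounds=[(0, 80), (0, 60)])  # facility capacities
    print(f"c2 = {c2}: cost = {res.fun:.1f}, allocation = {res.x}")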
Linear discriminant analysis based on L1-norm maximization.
Zhong, Fujin; Zhang, Jiashu
2013-08-01
Linear discriminant analysis (LDA) is a well-known dimensionality reduction technique that is widely used for many purposes. However, conventional LDA is sensitive to outliers because its objective function is based on a distance criterion using the L2-norm. This paper proposes a simple but effective robust LDA version based on L1-norm maximization, which learns a set of locally optimal projection vectors by maximizing the ratio of the L1-norm-based between-class dispersion to the L1-norm-based within-class dispersion. The proposed method is theoretically proved to be feasible and robust to outliers while overcoming the singularity problem of the within-class scatter matrix in conventional LDA. Experiments on artificial datasets, standard classification datasets and three popular image databases demonstrate the efficacy of the proposed method.
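A simplified single-projection sketch of this class of algorithm: subgradient ascent on the ratio of L1 between-class to L1 within-class dispersion. It follows the stated objective but is not the authors' exact procedure, and the toy data include a gross outlier to illustrate the robustness motivation:

# Hedged sketch: one L1-LDA projection via subgradient ascent on B(w)/W(w).
import numpy as np

def l1_lda_direction(X, y, iters=200, lr=0.05, seed=0):
    classes, counts = np.unique(y, return_counts=True)
    mu = X.mean(axis=0)
    D = np.array([X[y == c].mean(axis=0) - mu for c in classes])
    E = np.vstack([X[y == c] - X[y == c].mean(axis=0) for c in classes])

    w = np.random.default_rng(seed).normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        B = np.sum(counts * np.abs(D @ w))     # L1 between-class dispersion
        W = np.sum(np.abs(E @ w))              # L1 within-class dispersion
        gB = (counts * np.sign(D @ w)) @ D     # subgradients of B and W
        gW = np.sign(E @ w) @ E
        w += lr * (gB * W - B * gW) / W**2     # ascend the ratio
        w /= np.linalg.norm(w)
    return w

rng = np.random.default_rng(10)
X = np.vstack([rng.normal([0, 0], 1, (50, 2)), rng.normal([4, 1], 1, (50, 2))])
X[0] = [40, -40]                               # gross outlier
y = np.repeat([0, 1], 50)
print("projection direction:", l1_lda_direction(X, y))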
BAYESIAN META-ANALYSIS ON MEDICAL DEVICES: APPLICATION TO IMPLANTABLE CARDIOVERTER DEFIBRILLATORS
Youn, Ji-Hee; Lord, Joanne; Hemming, Karla; Girling, Alan; Buxton, Martin
2012-01-01
Objectives: The aim of this study is to describe and illustrate a method to obtain early estimates of the effectiveness of a new version of a medical device. Methods: In the absence of empirical data, expert opinion may be elicited on the expected difference between the conventional and modified devices. Bayesian Mixed Treatment Comparison (MTC) meta-analysis can then be used to combine this expert opinion with existing trial data on earlier versions of the device. We illustrate this approach for a new four-pole implantable cardioverter defibrillator (ICD) compared with conventional ICDs, Class III anti-arrhythmic drugs, and conventional drug therapy for the prevention of sudden cardiac death in high risk patients. Existing RCTs were identified from a published systematic review, and we elicited opinion on the difference between four-pole and conventional ICDs from experts recruited at a cardiology conference. Results: Twelve randomized controlled trials were identified. Seven experts provided valid probability distributions for the new ICDs compared with current devices. The MTC model resulted in estimated relative risks of mortality of 0.74 (0.60–0.89) (predictive relative risk [RR] = 0.77 [0.41–1.26]) and 0.83 (0.70–0.97) (predictive RR = 0.84 [0.55–1.22]) with the new ICD therapy compared to Class III anti-arrhythmic drug therapy and conventional drug therapy, respectively. These results showed negligible differences from the preliminary results for the existing ICDs. Conclusions: The proposed method incorporating expert opinion to adjust for a modification made to an existing device may play a useful role in assisting decision makers to make early informed judgments on the effectiveness of frequently modified healthcare technologies. PMID:22559753
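A minimal Monte Carlo sketch of the combination idea: trial-based evidence for the conventional device is summarized as a normal distribution on the log relative risk, and the elicited expert opinion enters as an additive shift for the device modification. All numerical values below are illustrative assumptions, not the elicited distributions from the study:

# Hedged sketch: combining trial evidence with an elicited prior on the
# effect of a device modification, on the log relative-risk scale.
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
log_rr_conv = rng.normal(np.log(0.83), 0.08, n)   # trial-based (assumed)
expert_shift = rng.normal(-0.05, 0.15, n)         # elicited opinion (assumed)

rr_new = np.exp(log_rr_conv + expert_shift)       # new device vs drug therapy
lo, hi = np.percentile(rr_new, [2.5, 97.5])
print(f"RR(new vs drugs): median {np.median(rr_new):.2f}, 95% CrI ({lo:.2f}, {hi:.2f})")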
High-Accuracy Ultrasound Contrast Agent Detection Method for Diagnostic Ultrasound Imaging Systems.
Ito, Koichi; Noro, Kazumasa; Yanagisawa, Yukari; Sakamoto, Maya; Mori, Shiro; Shiga, Kiyoto; Kodama, Tetsuya; Aoki, Takafumi
2015-12-01
An accurate method for detecting contrast agents using diagnostic ultrasound imaging systems is proposed. Contrast agents, such as microbubbles, passing through a blood vessel during ultrasound imaging are detected as blinking signals along the temporal axis, because their intensity values fluctuate continuously. Ultrasound contrast agents are detected by evaluating the intensity variation of a pixel along the temporal axis. Conventional methods are based on simple subtraction of ultrasound images to detect ultrasound contrast agents, so even a slight movement of the subject introduces significant error. In contrast, the proposed technique employs spatiotemporal analysis of the pixel intensity variation over several frames. Experiments visualizing blood vessels in the mouse tail showed that the proposed method performs efficiently compared with conventional approaches. We also report that the new technique is useful for observing temporal changes in microvessel density in subiliac lymph nodes containing tumors. The results are compared with those of contrast-enhanced computed tomography. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
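A minimal sketch contrasting the two ideas on a synthetic frame stack: simple frame subtraction versus a per-pixel temporal statistic over several frames (standard deviation here, a crude stand-in for the paper's spatiotemporal analysis):

# Hedged sketch: detecting blinking pixels from temporal intensity variation.
import numpy as np

rng = np.random.default_rng(12)
frames = rng.normal(100, 2, (30, 64, 64))      # 30 frames of tissue background
frames[:, 32, 20] += 15 * rng.random(30)       # one blinking bubble pixel

diff = np.abs(frames[1] - frames[0])           # conventional: pairwise subtraction

temporal_std = frames.std(axis=0)              # proposed-style: temporal statistic
mask = temporal_std > temporal_std.mean() + 5 * temporal_std.std()
print("detected pixels:", np.argwhere(mask))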
Lee, Ki Song; Choe, Young Chan; Park, Sung Hee
2015-10-01
This study examined the structural variables affecting the environmental effects of organic farming compared to those of conventional farming. A meta-analysis based on 107 studies and 360 observations published from 1977 to 2012 compared energy efficiency (EE) and greenhouse gas emissions (GHGE) for organic and conventional farming. The meta-analysis systematically analyzed the results of earlier comparative studies and used logistic regression to identify the structural variables that contributed to differences in the effects of organic and conventional farming on the environment. The statistical evidence identified characteristics that differentiated the environmental effects of organic and conventional farming, which is controversial. The results indicated that data sources, sample size and product type significantly affected EE, whereas product type, cropping pattern and measurement unit significantly affected the GHGE of organic farming compared to conventional farming. Superior effects of organic farming on the environment were more likely to appear for larger samples, primary data rather than secondary data, monocropping rather than multicropping, and crops other than fruits and vegetables. The environmental effects of organic farming were not affected by the study period, geographic location, farm size, cropping pattern, or measurement method. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Hayashi, Nobuhide; Saegusa, Jun; Uto, Kenichi; Oyabu, Chinami; Saito, Toshiharu; Sato, Itsuko; Kawano, Seiji; Kumagai, Shunichi
2016-02-01
Antinuclear antibody (ANA) testing is indispensable for diagnosing and understanding clinical conditions of autoimmune diseases. The indirect immunofluorescence assay (IFA) is the gold standard for ANA screening, and it can detect more than 100 different antibodies, such as anti-PCNA as well as anti-cytoplasmic antibodies. However, complicated procedures of conventional IFA and visual interpretation require highly skilled laboratory staff. This study evaluates the capability, characteristics, and applicability of the recently developed ANA detection system (EUROPattern Cosmic IFA System, EPA) using HEp20-10 cells and the automated pattern recognition microscope. Findings using EPA and conventional methods were compared in 282 sera obtained from connective tissue disease patients and 250 sera from healthy individuals. The concordance of the positivity rate, antibody titer (within +/- 1 tube difference), and the accurate recognition rate of ANA patterns between the automated EPA method and the microscopic judgement of the EPA image by eye was 98.9, 97.4, and 55.3%, respectively. The EPA method showed concordance of the positivity rate as high as 93.3% and concordance of the antibody titer as high as 94.0% (within +/- 1 titer) compared with the conventional method. Regarding the four typical patterns of ANA (homogeneous, speckled, nucleolar, and centromere), large differences between the EPA and conventional methods were not observed, and the rate of concordance between the final EPA result and the conventional method was from 94.1 to 100%. The positivity rate of ANA using the EPA and conventional methods showed marked agreement among the six connective tissue diseases (SLE, MCTD, SSc, PM/DM, and SS) and healthy individuals. Although the EPA system is not considered a complete system and laboratory staff should verify the results, it is a useful system for routine ANA analysis because it contributes to ANA standardization and an efficient workflow.
Generation of Protein Crystals Using a Solution-Stirring Technique
NASA Astrophysics Data System (ADS)
Adachi, Hiroaki; Niino, Ai; Matsumura, Hiroyoshi; Takano, Kazufumi; Kinoshita, Takayoshi; Warizaya, Masaichi; Inoue, Tsuyoshi; Mori, Yusuke; Sasaki, Takatomo
2004-06-01
Crystals of bovine adenosine deaminase (ADA) were grown over a two-week period in the presence of an inhibitor, whereas ADA crystals did not form using conventional crystallization methods when the inhibitor was excluded. To obtain ADA crystals in the absence of the inhibitor, a solution-stirring technique was used. The crystals obtained using this technique were found to be of high quality and were shown to give high structural resolution in X-ray diffraction analysis. The results of this study indicate that the stirring technique is a useful method for obtaining crystals of proteins that do not crystallize using conventional techniques.
Wu, Zheng; Zeng, Li-bo; Wu, Qiong-shui
2016-02-01
Conventional cervical cancer screening methods mainly include the TBS (The Bethesda System) classification method and quantitative cellular DNA analysis. However, achieving both methods on a single cell slide using a multiple staining method, in which the cytoplasm is stained with Papanicolaou reagent and the nucleus with Feulgen reagent, has not previously been studied. The difficulty of this multiple staining method is that the absorbance of non-DNA material may interfere with the absorbance of DNA. This paper therefore sets up a multi-spectral imaging system and establishes an absorbance unmixing model using multiple linear regression, based on the linear superposition of absorbances, to strip out the absorbance of DNA for quantitative DNA analysis, thereby combining the two conventional screening methods. A series of experiments showed no statistically significant difference, at the 1% test level, between the DNA absorbance calculated by the unmixing model and the measured DNA absorbance. In practical application, the 99% confidence interval of the DNA index of tetraploid cells screened by this method did not intersect the DNA-index interval used to identify cancer cells. The accuracy and feasibility of quantitative DNA analysis with the multiple staining method are thus verified, and this analytical method has broad application prospects and considerable market potential in the early diagnosis of cervical cancer and other cancers.
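A minimal sketch of the unmixing step: per-pixel absorbances at several wavelengths are modeled as a linear superposition of the two stain spectra, and the DNA (Feulgen) contribution is recovered by least-squares regression. The spectra and loadings are illustrative placeholders:

# Hedged sketch: absorbance unmixing by multiple linear regression.
import numpy as np

feulgen      = np.array([0.9, 0.7, 0.4, 0.2, 0.1, 0.05])  # assumed DNA spectrum
papanicolaou = np.array([0.1, 0.3, 0.6, 0.8, 0.7, 0.4])   # assumed cytoplasm
S = np.column_stack([feulgen, papanicolaou])              # spectra matrix

rng = np.random.default_rng(13)
pixel = S @ [0.8, 0.5] + 0.01 * rng.normal(size=6)        # measured absorbances

coeffs, *_ = np.linalg.lstsq(S, pixel, rcond=None)        # regression unmixing
dna_only = coeffs[0] * feulgen                            # stripped DNA signal
print("estimated DNA loading:", round(float(coeffs[0]), 3))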
Analysis of titanium content in titanium tetrachloride solution
NASA Astrophysics Data System (ADS)
Bi, Xiaoguo; Dong, Yingnan; Li, Shanshan; Guan, Duojiao; Wang, Jianyu; Tang, Meiling
2018-03-01
Strontium titanate, barium titanate and lead titanate are new types of functional ceramic material with good prospects, exhibiting excellent electrochemical performance and ferroelectric temperature-coefficient effects, and titanium tetrachloride is commonly used in their production. In this article, samples of titanium tetrachloride solution are calibrated by three methods: back titration, replacement titration and gravimetric analysis. The results show that the back titration method has several advantages, including relatively simple operation, an easily judged titration end point, and good accuracy and precision of the analytical results, with a relative standard deviation of no more than 0.2%. It is therefore the preferred conventional analysis method for mass production.
Vojdani, M; Torabi, K; Farjood, E; Khaledi, AAR
2013-01-01
Statement of Problem: Metal-ceramic crowns are the most commonly used complete-coverage restorations in daily clinical practice. Disadvantages of conventional hand-made wax patterns have prompted alternative approaches based on CAD/CAM technologies. Purpose: This study compares the marginal and internal fit of copings cast from CAD/CAM and conventionally fabricated wax patterns. Materials and Method: Twenty-four standardized brass dies were prepared and randomly divided into 2 groups according to the wax-pattern fabrication method (CAD/CAM technique and conventional method) (n=12). All wax patterns were fabricated in a standard fashion with respect to contour, thickness and internal relief (M1-M12: CAD/CAM group; C1-C12: conventional group). A CAD/CAM milling machine (Cori TEC 340i; imes-icore GmbH, Eiterfeld, Germany) was used to fabricate the CAD/CAM group wax patterns. The copings cast from the 24 wax patterns were cemented to the corresponding dies. For all coping-die assemblies, a cross-sectional technique was used to evaluate the marginal and internal fit at 15 points. Student's t-test was used for statistical analysis (α=0.05). Results: The overall mean (SD) absolute marginal discrepancy (AMD) was 254.46 (25.10) μm for the CAD/CAM group and 88.08 (10.67) μm for the conventional (control) group. The overall mean internal gap total (IGT) was 110.77 (5.92) μm for the CAD/CAM group and 76.90 (10.17) μm for the conventional group. Student's t-test revealed significant differences between the 2 groups: marginal and internal gaps were significantly larger at all measured areas in the CAD/CAM group than in the conventional group (p< 0.001). Conclusion: Within the limitations of this study, the conventional method of wax-pattern fabrication produced copings with significantly better marginal and internal fit than the CAD/CAM (machine-milled) technique. All factors were standardized between the 2 groups except the wax-pattern fabrication technique; therefore, only the conventional group resulted in copings with clinically acceptable margins of less than 120 μm. PMID:24724133
Chen, Minghao; Wei, Shiyou; Hu, Junyan; Yuan, Jing; Liu, Fenghua
2017-01-01
The present study aimed to review the available evidence on whether time-lapse imaging (TLI) yields more favorable outcomes for embryo incubation and selection than conventional methods in clinical in vitro fertilization (IVF). PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI with conventional methods. Studies that randomized either women or oocytes were included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Ten RCTs were included, four randomizing oocytes and six randomizing women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group in blastocyst rate (relative risk (RR) 1.08, 95% CI 0.94-1.25, I² = 0%, two studies, including 1154 embryos). The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study provided the live birth rate (RR 1.23, 95% CI 1.06-1.44, I² N/A, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80-1.36, I² = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Currently there is insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. In view of the limitations and flaws of the included studies, more well-designed RCTs are needed to comprehensively evaluate the effectiveness of clinical TLI use.
Formal hardware verification of digital circuits
NASA Technical Reports Server (NTRS)
Joyce, J.; Seger, C.-J.
1991-01-01
The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
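As a toy illustration of the symbolic approach (checking a circuit against its specification without exhaustively simulating input vectors), here is a minimal sketch using SymPy's boolean logic; the NAND-based XOR circuit is an invented example, not one from the paper.

```python
from sympy import symbols
from sympy.logic.boolalg import And, Not, Xor
from sympy.logic.inference import satisfiable

# Hypothetical one-bit example: verify that a NAND-only implementation
# of XOR matches its specification without enumerating input vectors.
a, b = symbols('a b')
nand = lambda x, y: Not(And(x, y))

t = nand(a, b)
impl = nand(nand(a, t), nand(b, t))  # XOR built from four NAND gates
spec = Xor(a, b)

# The miter (impl XOR spec) is satisfiable iff some input distinguishes
# the two circuits; False here means they are formally equivalent.
print(satisfiable(Xor(impl, spec)))  # -> False
```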
Global-Local Finite Element Analysis for Thermo-Mechanical Stresses in Bonded Joints
NASA Technical Reports Server (NTRS)
Shkarayev, S.; Madenci, Erdogan; Camarda, C. J.
1997-01-01
An analysis of adhesively bonded joints using conventional finite elements does not capture the singular behavior of the stress field in regions where two or three dissimilar materials form a junction with or without free edges. However, these regions are characteristic of the bonded joints and are prone to failure initiation. This study presents a method to capture the singular stress field arising from the geometric and material discontinuities in bonded composites. It is achieved by coupling the local (conventional) elements with global (special) elements whose interpolation functions are constructed from the asymptotic solution.
NASA Astrophysics Data System (ADS)
Lisitsa, Y. V.; Yatskou, M. M.; Apanasovich, V. V.; Apanasovich, T. V.
2015-09-01
We have developed an algorithm for segmentation of cancer cell nuclei in three-channel luminescent images of microbiological specimens. The algorithm is based on using a correlation between fluorescence signals in the detection channels for object segmentation, which permits complete automation of the data analysis procedure. We have carried out a comparative analysis of the proposed method and conventional algorithms implemented in the CellProfiler and ImageJ software packages. Our algorithm has an object localization uncertainty which is 2-3 times smaller than for the conventional algorithms, with comparable segmentation accuracy.
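The core idea, segmenting by the local correlation between fluorescence channels, can be sketched as follows; the window size, threshold and channel names are illustrative assumptions, not the published algorithm.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_correlation(ch1, ch2, size=7):
    """Pearson correlation between two fluorescence channels in a
    sliding window; a rough sketch of correlation-driven segmentation,
    not the authors' published algorithm."""
    ch1, ch2 = ch1.astype(float), ch2.astype(float)
    m1, m2 = uniform_filter(ch1, size), uniform_filter(ch2, size)
    cov = uniform_filter(ch1 * ch2, size) - m1 * m2
    v1 = uniform_filter(ch1**2, size) - m1**2
    v2 = uniform_filter(ch2**2, size) - m2**2
    return cov / np.sqrt(np.clip(v1 * v2, 1e-12, None))

# Pixels where the channels co-vary strongly are kept as nucleus candidates:
# corr = local_correlation(channel_a, channel_b); mask = corr > 0.6
```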
Effects of organic and conventional cultivation methods on composition of eggplant fruits.
Raigón, María D; Rodríguez-Burruezo, Adrián; Prohens, Jaime
2010-06-09
Organic food is associated by the general public with improved nutritional properties, and this has led to increasing demand for organic vegetables. The effects of organic and conventional cultivation methods on dry matter, protein, minerals, and total phenolic content have been studied for two successive years in two landraces and one commercial hybrid of eggplant. In the first year, organically produced eggplants had higher mean contents (expressed on a fresh weight basis) of K (196 vs 171 mg 100 g(-1)), Ca (11.1 vs 8.7 mg 100 g(-1)), Mg (6.0 vs 4.6 mg 100 g(-1)), and total phenolics (49.8 vs 38.2 mg 100 g(-1)) than conventionally grown eggplants. In the second year, in which matched plots having a history of organic management were cultivated following organic or conventional fertilization practices, organically produced eggplants still had higher contents of K (272 vs 249 mg 100 g(-1)) and Mg (8.8 vs 7.6 mg 100 g(-1)), as well as of Cu (0.079 vs 0.065 mg 100 g(-1)), than conventionally fertilized eggplants. Conventionally cultivated eggplants had a higher polyphenol oxidase activity than organically cultivated ones (3.19 vs 2.17 enzyme activity units), although no differences in browning were observed. Important differences in mineral concentrations between years were detected, which resulted in many correlations among mineral contents being significant. The first component of the principal component analysis separates the eggplants according to year, whereas the second component separates them according to the cultivation method (organic or conventional). Overall, the results show that organic management and fertilization have a positive effect on the accumulation of certain beneficial minerals and phenolic compounds in eggplant and that organically and conventionally produced eggplants might be distinguished according to their composition profiles.
Leaf flavonoids of Albizia lebbeck.
el-Mousallamy, A M
1998-06-01
Two new flavonol tri-O-glycosides, kaempferol and quercetin 3-O-alpha-rhamnopyranosyl(1-->6)-beta-glucopyranosyl(1-->6)-beta-galactopyranosides, were identified from the leaves of Albizia lebbeck. Their structures were established by conventional methods of analysis and confirmed by ESI-MS and 1H and 13C NMR spectral analysis.
Automated Analysis of Child Phonetic Production Using Naturalistic Recordings
ERIC Educational Resources Information Center
Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill
2014-01-01
Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…
Baek, Hyun Jae; Shin, JaeWook; Jin, Gunwoo; Cho, Jaegeol
2017-10-24
Photoplethysmographic signals are useful for heart rate variability analysis in practical ambulatory applications. While reducing the sampling rate of signals is an important consideration for modern wearable devices that enable 24/7 continuous monitoring, few studies have investigated how to compensate for the low timing resolution of low-sampling-rate signals to achieve accurate heart rate variability analysis. In this study, we evaluated the parabola approximation method against the conventional cubic spline interpolation method for the time-, frequency-, and nonlinear-domain variables of heart rate variability. For each parameter, the intra-class correlation, standard error of measurement, Bland-Altman 95% limits of agreement and root mean squared relative error are presented. The elapsed time required to compute each interpolation algorithm was also investigated. The results indicated that parabola approximation is a simple, fast, and accurate method for compensating for the low timing resolution of pulse beat intervals, with performance comparable to the conventional cubic spline interpolation method. Even though the absolute values of the heart rate variability variables calculated from a signal sampled at 20 Hz did not exactly match those calculated from a reference signal sampled at 250 Hz, the parabola approximation method remains a good interpolation method for assessing trends in HRV measurements for low-power wearable applications.
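For reference, the standard three-point parabolic interpolation that underlies a parabola-approximation approach can be sketched as follows; this is a generic formulation assumed to be comparable to the method studied, not the authors' code.

```python
import numpy as np

def refine_peak(y, i):
    """Fit a parabola through samples (i-1, i, i+1) around a detected
    pulse peak and return the sub-sample peak position. Generic
    three-point parabolic interpolation."""
    denom = y[i - 1] - 2 * y[i] + y[i + 1]
    if denom == 0:
        return float(i)
    delta = 0.5 * (y[i - 1] - y[i + 1]) / denom  # vertex offset in samples
    return i + delta

# With a 20 Hz PPG signal, beat intervals built from refined peak
# positions recover timing resolution well below the 50 ms sample period.
```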
Interpreting findings from Mendelian randomization using the MR-Egger method.
Burgess, Stephen; Thompson, Simon G
2017-05-01
Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption, the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
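A minimal sketch of the MR-Egger regression, a weighted regression of variant-outcome on variant-exposure associations with a free intercept, is given below; the inputs are hypothetical per-variant summary statistics.

```python
import numpy as np

def mr_egger(bx, by, se_by):
    """Weighted regression of variant-outcome associations (by) on
    variant-exposure associations (bx), with intercept, weighted by
    1/se_by**2. A minimal sketch, not a validated implementation."""
    w = 1.0 / se_by**2
    X = np.column_stack([np.ones_like(bx), bx])
    W = np.diag(w)
    # Weighted least squares: (X'WX)^-1 X'Wy
    intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ by)
    return intercept, slope  # intercept far from 0 suggests directional pleiotropy

# Contrast with conventional IVW, which forces the line through the origin:
# slope_ivw = np.sum(w * bx * by) / np.sum(w * bx**2)
```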
Koopman, Richelle J; Kochendorfer, Karl M; Moore, Joi L; Mehr, David R; Wakefield, Douglas S; Yadamsuren, Borchuluun; Coberly, Jared S; Kruse, Robin L; Wakefield, Bonnie J; Belden, Jeffery L
2011-01-01
We compared use of a new diabetes dashboard screen with use of a conventional approach of viewing multiple electronic health record (EHR) screens to find data needed for ambulatory diabetes care. We performed a usability study, including a quantitative time study and qualitative analysis of information-seeking behaviors. While being recorded with Morae Recorder software and "think-aloud" interview methods, 10 primary care physicians first searched their EHR for 10 diabetes data elements using a conventional approach for a simulated patient, and then using a new diabetes dashboard for another. We measured time, number of mouse clicks, and accuracy. Two coders analyzed think-aloud and interview data using grounded theory methodology. The mean time needed to find all data elements was 5.5 minutes using the conventional approach vs 1.3 minutes using the diabetes dashboard (P <.001). Physicians correctly identified 94% of the data requested using the conventional method, vs 100% with the dashboard (P <.01). The mean number of mouse clicks was 60 for conventional searching vs 3 clicks with the diabetes dashboard (P <.001). A common theme was that in everyday practice, if physicians had to spend too much time searching for data, they would either continue without it or order a test again. Using a patient-specific diabetes dashboard improves both the efficiency and accuracy of acquiring data needed for high-quality diabetes care. Usability analysis tools can provide important insights into the value of optimizing physician use of health information technologies.
Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) for a 3-D Flexible Wing
NASA Technical Reports Server (NTRS)
Gumbert, Clyde R.; Hou, Gene J.-W.
2001-01-01
The formulation and implementation of an optimization method called Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) are extended from single-discipline analysis (aerodynamics only) to multidisciplinary analysis - in this case, static aero-structural analysis - and applied to a simple 3-D wing problem. The method aims to reduce the computational expense incurred in performing shape optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, Finite Element Method (FEM) structural analysis and sensitivity analysis tools. Results for this small problem show that the method reaches the same local optimum as conventional optimization. However, unlike its application to the wing (single-discipline analysis), the method, as implemented here, may not show a significant reduction in computational cost. Similar reductions were seen in the two-design-variable (DV) problem results but not in the 8-DV results given here.
PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURE ON ΔB METHOD
NASA Astrophysics Data System (ADS)
Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao
Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes it easier to understand stability. The ΔB method (simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability computational software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures, using the distribution of the restraining force found along survey lines, which is based on the distribution of survey-line safety factors derived from the above analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization on the basis of the example analysis.
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems are difficult but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
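As one concrete example of a scheduling-theory test for hard-real-time response (not necessarily the analysis used in the paper), the classic Liu and Layland rate-monotonic utilization bound can be sketched as follows; the task parameters are hypothetical.

```python
def rm_schedulable(tasks):
    """Liu & Layland sufficient test for rate-monotonic scheduling:
    n periodic tasks are schedulable if total utilization does not
    exceed n(2^(1/n) - 1). tasks: list of (C_i, T_i) pairs, i.e.
    worst-case execution time and period; values here are hypothetical."""
    n = len(tasks)
    u = sum(c / t for c, t in tasks)
    return u <= n * (2 ** (1 / n) - 1)

# Three hypothetical control loops: (C, T) in milliseconds
print(rm_schedulable([(10, 50), (15, 100), (20, 200)]))  # True (U = 0.45 <= 0.78)
```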
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
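A minimal sketch of the OWA-plus-Monte-Carlo idea follows; the criterion values, order weights and Dirichlet prior over criteria weights are illustrative assumptions, not the study's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def owa(values, order_weights):
    """Ordered weighted averaging: the order weights apply to the sorted
    (descending) values, encoding the analyst's risk attitude."""
    return np.sort(values)[::-1] @ order_weights

# Hypothetical raster cell with six standardized susceptibility criteria
cell = np.array([0.8, 0.3, 0.6, 0.9, 0.4, 0.7])
order_w = np.array([0.10, 0.15, 0.20, 0.20, 0.20, 0.15])  # near-neutral attitude

# Monte Carlo over uncertain criteria weights (a Dirichlet draw keeps them
# non-negative and summing to one), as in the probabilistic ensemble idea.
scores = [owa(cell * rng.dirichlet(np.ones(6)), order_w) for _ in range(1000)]
print(np.mean(scores), np.std(scores))  # cell score and its uncertainty
```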
Magnetic Field Analysis of Lorentz Motors Using a Novel Segmented Magnetic Equivalent Circuit Method
Qian, Junbing; Chen, Xuedong; Chen, Han; Zeng, Lizhan; Li, Xiaoqing
2013-01-01
A simple and accurate method based on the magnetic equivalent circuit (MEC) model is proposed in this paper to predict the magnetic flux density (MFD) distribution of the air-gap in a Lorentz motor (LM). In conventional MEC methods, the permanent magnet (PM) is treated as one common source and all branches of the MEC are coupled together into a MEC network. In our proposed method, every PM flux source is divided into three sub-sections (the outer, the middle and the inner), and the MEC of the LM is divided correspondingly into three independent sub-loops. As the middle sub-MEC is small enough, it can be treated as an ideal MEC and solved accurately. Combined with the decoupled analysis of the outer and inner MECs, the MFD distribution in the air-gap can be approximated by a quadratic curve, and the complex calculation of reluctances in MECs can be avoided. The segmented magnetic equivalent circuit (SMEC) method is used to analyze an LM, and its effectiveness is demonstrated by comparison with FEA, conventional MEC and experimental results. PMID:23358368
Avalos, Marta; Adroher, Nuria Duran; Lagarde, Emmanuel; Thiessard, Frantz; Grandvalet, Yves; Contrand, Benjamin; Orriols, Ludivine
2012-09-01
Large data sets with many variables provide particular challenges when constructing analytic models. Lasso-related methods provide a useful tool, although one that remains unfamiliar to most epidemiologists. We illustrate the application of lasso methods in an analysis of the impact of prescribed drugs on the risk of a road traffic crash, using a large French nationwide database (PLoS Med 2010;7:e1000366). In the original case-control study, the authors analyzed each exposure separately. We use the lasso method, which can simultaneously perform estimation and variable selection in a single model. We compare point estimates and confidence intervals using (1) a separate logistic regression model for each drug with a Bonferroni correction and (2) lasso shrinkage logistic regression analysis. Shrinkage regression had little effect on (bias-corrected) point estimates, but led to less conservative results, noticeably for drugs with moderate levels of exposure. Carbamates, carboxamide-derivative and fatty-acid-derivative antiepileptics, drugs used in opioid dependence, and mineral supplements of potassium showed stronger associations. Lasso is a relevant method for the analysis of databases with a large number of exposures and can be recommended as an alternative to conventional strategies.
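A minimal sketch of lasso (L1-penalized) logistic regression for many binary drug-exposure indicators follows, using scikit-learn; the simulated design matrix, effect sizes and penalty strength are illustrative, not the French database or the authors' tuning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical design: rows are drivers, columns are exposure indicators
# for many prescribed drug classes (data are illustrative only).
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(5000, 200)).astype(float)
beta = np.zeros(200)
beta[:3] = [0.8, 0.5, 0.4]                       # a few true signals
p = 1 / (1 + np.exp(-(X @ beta - 2.0)))
y = rng.binomial(1, p)

# The L1 penalty performs estimation and variable selection in one model,
# replacing 200 separate Bonferroni-corrected regressions.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
lasso.fit(X, y)
print((lasso.coef_ != 0).sum(), "exposures retained")
```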
Joachimsthal, Eva L; Ivanov, Volodymyr; Tay, Joo-Hwa; Tay, Stephen T-L
2003-03-01
Conventional methods for bacteriological testing of water quality take long periods of time to complete. This makes them inappropriate for a shipping industry that is attempting to comply with the International Maritime Organization's anticipated regulations for ballast water discharge. Flow cytometry for the analysis of marine and ship's ballast water is a comparatively fast and accurate method. Compared with a 5% standard error for flow cytometry analysis, the standard methods of culturing and epifluorescence analysis have errors of 2-58% and 10-30%, respectively. Also, unlike culturing methods, flow cytometry is capable of detecting both non-viable and viable-but-non-culturable microorganisms, which can still pose health risks. The great variability in both cell concentrations and microbial content for the samples tested is an indication of the difficulties facing microbial monitoring programmes. The concentration of microorganisms in the ballast tank was generally lower than in local seawater. The proportion of aerobic, microaerophilic, and facultative anaerobic microorganisms present appeared to be influenced by conditions in the ballast tank. The gradual creation of anaerobic conditions in a ballast tank could lead to the accumulation of facultative anaerobic microorganisms, which might represent a potential source of pathogenic species.
ERIC Educational Resources Information Center
Dhingra, Sunita; Angrish, Chetna
2011-01-01
Qualitative organic analysis of an unknown compound is an integral part of the university chemistry laboratory curriculum. This type of training is essential as students learn to approach a problem systematically and to interpret the results logically. However, considerable quantities of waste are generated by using conventional methods of…
Comparative study of Sperm Motility Analysis System and conventional microscopic semen analysis
KOMORI, KAZUHIKO; ISHIJIMA, SUMIO; TANJAPATKUL, PHANU; FUJITA, KAZUTOSHI; MATSUOKA, YASUHIRO; TAKAO, TETSUYA; MIYAGAWA, YASUSHI; TAKADA, SHINGO; OKUYAMA, AKIHIKO
2006-01-01
Background and Aim: Conventional manual sperm analysis still shows variations in structure, process and outcome although World Health Organization (WHO) guidelines present an appropriate method for sperm analysis. In the present study a new system for sperm analysis, Sperm Motility Analysis System (SMAS), was compared with manual semen analysis based on WHO guidelines. Materials and methods: Samples from 30 infertility patients and 21 healthy volunteers were subjected to manual microscopic analysis and SMAS analysis, simultaneously. We compared these two methods with respect to sperm concentration and percent motility. Results: Sperm concentrations obtained by SMAS (Csmas) and manual microscopic analyses on WHO guidelines (Cwho) were strongly correlated (Cwho = 1.325 × Csmas; r = 0.95, P < 0.001). If we excluded subjects with Csmas values >30 × 106 sperm/mL, the results were more similar (Cwho = 1.022 × Csmas; r = 0.81, P < 0.001). Percent motility obtained by SMAS (Msmas) and manual analysis on WHO guidelines (Mwho) were strongly correlated (Mwho = 1.214 × Msmas; r = 0.89, P < 0.001). Conclusions: The data indicate that the results of SMAS and those of manual microscopic sperm analyses based on WHO guidelines are strongly correlated. SMAS is therefore a promising system for sperm analysis. (Reprod Med Biol 2006; 5: 195–200) PMID:29662398
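The agreement statistics reported above (a no-intercept slope such as Cwho = 1.325 × Csmas and a Pearson r) can be reproduced with a short sketch; the inputs are assumed to be hypothetical paired measurements.

```python
import numpy as np

def compare_methods(c_smas, c_who):
    """Slope of the no-intercept fit C_who = k * C_smas plus Pearson r;
    a sketch of the agreement analysis reported (e.g. k = 1.325, r = 0.95)
    using hypothetical measurement pairs."""
    k = np.sum(c_smas * c_who) / np.sum(c_smas**2)  # least squares through origin
    r = np.corrcoef(c_smas, c_who)[0, 1]
    return k, r
```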
The 21st century skills with model eliciting activities on linear program
NASA Astrophysics Data System (ADS)
Handajani, Septriana; Pratiwi, Hasih; Mardiyana
2018-04-01
Human resources in the 21st century are required to master various skills, including critical thinking and problem solving. Teaching for the 21st century integrates literacy, knowledge, skills, attitudes, and mastery of ICT. This study aims to determine whether there are differences in the effect on learning outcomes of applying Model-Eliciting Activities (MEAs) that integrate the 21st century skills known as the 4Cs versus conventional learning. The research was conducted at a vocational high school in the odd semester of 2017 and used an experimental method. The experimental class was taught with MEAs integrating the 4C skills and the control class was given conventional learning. Data were collected using documentation and test methods from the experimental and control classes, and analyzed with a Z-test. The results showed a difference in the effect on learning outcomes between MEAs integrating the 4C skills and conventional learning: classes taught with MEAs integrating the 4C skills achieved better learning outcomes than conventional classes. This is because MEAs that integrate the 4C skills can improve creativity, communication, collaboration, and problem-solving skills.
Han, Yang; Hou, Shao-Yang; Ji, Shang-Zhi; Cheng, Juan; Zhang, Meng-Yue; He, Li-Juan; Ye, Xiang-Zhong; Li, Yi-Min; Zhang, Yi-Xuan
2017-11-15
A novel method, real-time reverse transcription PCR (real-time RT-PCR) coupled with probe-melting curve analysis, has been established to detect two kinds of samples within one fluorescence channel. Besides a conventional TaqMan probe, the method employs a specially designed melting-probe with a 5' terminus modification, labeled with the same fluorescent group. By using an asymmetric PCR method, the melting-probe is able to detect an extra sample in the melting stage while having little influence on the amplification detection. This method therefore allows both the amplification stage and the melting stage to be used for detecting samples in one reaction. A demonstration of the simultaneous detection of human immunodeficiency virus (HIV) and hepatitis C virus (HCV) in one channel as a model system is presented. The sensitivity of detection by real-time RT-PCR coupled with probe-melting analysis was shown to be equal to that of conventional real-time RT-PCR. Because real-time RT-PCR coupled with probe-melting analysis can double the detection throughput within one fluorescence channel, it is expected to be a good solution to the low throughput of current real-time PCR.
Adventitious sounds identification and extraction using temporal-spectral dominance-based features.
Jin, Feng; Krishnan, Sridhar Sri; Sattar, Farook
2011-11-01
Respiratory sound (RS) signals carry significant information about the underlying functioning of the pulmonary system through the presence of adventitious sounds (ASs). Although many studies have addressed the problem of pathological RS classification, only a limited number of scientific works have focused on the analysis of the evolution of symptom-related signal components in the joint time-frequency (TF) plane. This paper proposes a new signal identification and extraction method for various ASs based on instantaneous frequency (IF) analysis. The presented TF decomposition method produces a noise-resistant, high-definition TF representation of RS signals compared with conventional linear TF analysis methods, yet preserves the low computational complexity relative to quadratic TF analysis methods. The phase information discarded in the conventional spectrogram is adopted for the estimation of IF and group delay, and a temporal-spectral dominance spectrogram is then constructed by investigating the TF spreads of the computed time-corrected IF components. The proposed dominance measure enables the extraction of signal components corresponding to ASs from noisy RS signals at high noise levels. A new set of TF features is also proposed to quantify the shapes of the obtained TF contours, which strongly enhances the identification of multicomponent signals such as polyphonic wheezes. An overall accuracy of 92.4±2.9% for the classification of real RS recordings shows the promising performance of the presented method.
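As background on the IF idea (though not the authors' dominance spectrogram), a basic analytic-signal IF estimate can be sketched as follows; the band-pass step and sampling rate are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(x, fs):
    """IF of a (roughly monocomponent) band of a respiratory sound via
    the analytic signal: IF(t) = (1/(2*pi)) * d(phase)/dt. Illustrates
    the general IF concept only, not the dominance spectrogram."""
    phase = np.unwrap(np.angle(hilbert(x)))
    return np.diff(phase) * fs / (2 * np.pi)

# A wheeze band-passed around 400 Hz would yield an IF track near 400 Hz
# whose contour shape can then be quantified as a classification feature.
```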
Fatania, Nita; Fraser, Mark; Savage, Mike; Hart, Jason; Abdolrasouli, Alireza
2015-12-01
Performance of matrix-assisted laser desorption ionisation-time of flight mass spectrometry (MALDI-TOF MS) was compared in a side-by-side analysis with the conventional phenotypic methods currently in use in our laboratory for identification of yeasts in a routine diagnostic setting. A diverse collection of 200 clinically important yeasts (19 species, five genera) was identified by both methods using standard protocols. Discordant or unreliable identifications were resolved by sequencing of the internal transcribed spacer region of the rRNA gene. MALDI-TOF and conventional methods were in agreement for 182 isolates (91%), with correct identification to species level. Eighteen discordant results (9%) were due to rarely encountered species, hence the difficulty in their identification using traditional phenotypic methods. MALDI-TOF MS enabled rapid, reliable and accurate identification of clinically important yeasts in a routine diagnostic microbiology laboratory. Isolates with rare, unusual or low-probability identifications should be confirmed using robust molecular methods.
Lift-Shape Construction, An EFL Project Report.
ERIC Educational Resources Information Center
Evans, Ben H.
Research development of a construction system is detailed in terms of--(1) design and analysis, (2) construction methods, (3) testing, (4) cost analysis, and (5) architectural potentials. The system described permits construction of unusual shapes without the use of conventional concrete formwork. The concrete involves development of a structural…
Vaughan, Patrick E; Orth, Michael W; Haut, Roger C; Karcher, Darrin M
2016-01-01
While conventional mechanical testing has been regarded as a gold standard for the evaluation of bone health in numerous studies, with recent advances in medical imaging, virtual methods of biomechanics are rapidly evolving in the human literature. The objective of the current study was to evaluate the feasibility of determining the elastic and failure properties of poultry long bones using established methods of analysis from the human literature. In order to incorporate a large range of bone sizes and densities, a small number of specimens were utilized from an ongoing study of Regmi et al. (2016) that involved humeri and tibiae from 3 groups of animals (10 from each), including aviary, enriched, and conventional housing systems. Half the animals from each group were used for 'training', which involved the development of a regression equation relating bone density and geometry to bending properties from conventional mechanical tests. The remaining specimens from each group were used for 'testing', in which the mechanical properties from conventional tests were compared to those predicted by the regression equations. Based on the regression equations, the coefficients of determination for the 'test' set of data were 0.798 for bending bone stiffness and 0.901 for the yield (or failure) moment of the bones. All regression slopes and intercepts for the test-versus-predicted plots were not significantly different from 1 and 0, respectively. The study showed the feasibility of developing future methods of virtual biomechanics for the evaluation of poultry long bones. With further development, virtual biomechanics may have utility in future in vivo studies to assess laying hen bone health over time without the need to sacrifice large groups of animals at each time point.
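A minimal sketch of the train/test regression workflow described above follows; the predictors, sample sizes and noise level are invented placeholders, not the study's measurements.

```python
import numpy as np

# Hypothetical 'training' step: relate bone density and a geometric proxy
# to measured bending stiffness, then 'test' on held-out bones.
rng = np.random.default_rng(2)
density = rng.uniform(0.8, 1.4, 30)             # g/cm^3, illustrative
geometry = rng.uniform(20, 60, 30)              # section-modulus proxy, illustrative
stiffness = 50 * density * geometry + rng.normal(0, 40, 30)

X = np.column_stack([np.ones(30), density * geometry])
train, test = slice(0, 15), slice(15, 30)
coef, *_ = np.linalg.lstsq(X[train], stiffness[train], rcond=None)

# Coefficient of determination on the held-out 'test' specimens
pred = X[test] @ coef
ss_res = np.sum((stiffness[test] - pred) ** 2)
ss_tot = np.sum((stiffness[test] - stiffness[test].mean()) ** 2)
print("test R^2:", 1 - ss_res / ss_tot)         # cf. 0.798 for stiffness in the study
```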
Atalay, Altay; Koc, Ayse Nedret; Suel, Ahmet; Sav, Hafize; Demir, Gonca; Elmali, Ferhan; Cakir, Nuri; Seyedmousavi, Seyedmojtaba
2016-09-01
Aspergillus species cause a wide range of diseases in humans, including allergies, localized infections, and fatal disseminated diseases. Rapid detection and identification of Aspergillus spp. facilitate effective patient management. In the current study we compared conventional morphological methods with PCR sequencing, rep-PCR, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for the identification of Aspergillus strains. A total of 24 consecutive clinical isolates of Aspergillus were collected during 2012-2014. Conventional morphology and rep-PCR were performed in our Mycology Laboratory. The identification, evaluation, and reporting of strains using MALDI-TOF-MS were performed by BioMérieux Diagnostic, Inc. in Istanbul. DNA sequence analysis of the clinical isolates was performed by the BMLabosis laboratory in Ankara. Samples consisted of 18 (75%) lower respiratory tract specimens, 3 (12.5%) ear tissues from otomycosis, 1 sample from keratitis, and 1 sample from a cutaneous wound. According to DNA sequence analysis, 12 (50%) specimens were identified as A. fumigatus, 8 (33.3%) as A. flavus, 3 (12.5%) as A. niger, and 1 (4.2%) as A. terreus. Statistically, there was good agreement between the sequencing method and the conventional morphology, rep-PCR, and MALDI-TOF methods; kappa values were κ = 0.869, 0.871, and 0.916, respectively (P < 0.001). The good level of agreement between the methods included in the present study and the sequencing method could be due to the identification of commonly encountered Aspergillus strains. Therefore, studies conducted with a higher number of isolates, including other Aspergillus species, are required.
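The kappa values quoted above measure chance-corrected agreement between identification methods; a generic Cohen's kappa sketch (with invented labels) is shown below, not the study's statistical software.

```python
import numpy as np

def cohens_kappa(labels_a, labels_b):
    """Agreement beyond chance between two identification methods,
    e.g. conventional morphology vs. sequencing; generic formula."""
    labels_a, labels_b = np.asarray(labels_a), np.asarray(labels_b)
    cats = np.union1d(labels_a, labels_b)
    po = np.mean(labels_a == labels_b)                      # observed agreement
    pe = sum(np.mean(labels_a == c) * np.mean(labels_b == c) for c in cats)
    return (po - pe) / (1 - pe)

# Invented example: two methods agreeing on 9 of 10 isolates
print(cohens_kappa(["fumigatus"] * 9 + ["flavus"],
                   ["fumigatus"] * 8 + ["flavus", "flavus"]))  # ~0.62
```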
Gajjar, Ketan; Ahmadzai, Abdullah A.; Valasoulis, George; Trevisan, Júlio; Founta, Christina; Nasioutziki, Maria; Loufopoulos, Aristotelis; Kyrgiou, Maria; Stasinou, Sofia Melina; Karakitsos, Petros; Paraskevaidis, Evangelos; Da Gama-Rose, Bianca; Martin-Hirsch, Pierre L.; Martin, Francis L.
2014-01-01
Background Subjective visual assessment of cervical cytology is flawed, and this can manifest itself in inter- and intra-observer variability, resulting ultimately in a degree of discordance between the grading categorisation of samples in screening and the representative histology. Biospectroscopy methods have been suggested as sensor-based tools that can deliver objective assessments of cytology. However, studies to date have been apparently flawed by a corresponding lack of diagnostic efficiency when samples have previously been classed using cytology screening. This raises the question as to whether categorisation of cervical cytology based on imperfect conventional screening reduces the diagnostic accuracy of biospectroscopy approaches; are these latter methods more accurate at diagnosing underlying disease? The purpose of this study was to compare the objective accuracy of infrared (IR) spectroscopy of cervical cytology samples using conventional cytology-based vs. histology-based categorisation. Methods Within a typical clinical setting, a total of n = 322 liquid-based cytology samples were collected immediately before biopsy. Of these, it was possible to acquire subsequent histology for n = 154. Cytology samples were categorised according to conventional screening methods and subsequently interrogated employing attenuated total reflection Fourier-transform IR (ATR-FTIR) spectroscopy. IR spectra were pre-processed and analysed using linear discriminant analysis. Dunn's test was applied to identify the differences in spectra. Within the diagnostic categories, histology allowed us to determine the comparative efficiency of conventional screening vs. biospectroscopy to correctly identify either true atypia or underlying disease. Results Conventional cytology-based screening results in poor sensitivity and specificity. IR spectra derived from cervical cytology do not appear to discriminate in a diagnostic fashion when categories are based on conventional screening: scores plots of IR spectra exhibit marked crossover of spectral points between different cytological categories. Although significant differences between spectral bands in different categories are noted, crossover samples point to the potential for poor specificity and hamper the development of biospectroscopy as a diagnostic tool. However, when histology-based categories are used to conduct the analyses, the scores plots of IR spectra exhibit markedly better segregation. Conclusions Histology demonstrates that ATR-FTIR spectroscopy of liquid-based cytology identifies the presence of underlying atypia or disease missed in conventional cytology screening. This study points to an urgent need for a future biospectroscopy study in which categories are based on such histology; this will allow validation of the approach as a screening tool. PMID:24404130
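A minimal sketch of the LDA step on pre-processed spectra follows, using scikit-learn; the matrix dimensions mirror the study (154 samples) but the data and labels are random placeholders, so the printed accuracy is chance-level rather than the study's result.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical matrix of pre-processed ATR-FTIR spectra (rows: samples,
# columns: absorbance at wavenumbers) with histology-based labels.
rng = np.random.default_rng(3)
spectra = rng.normal(size=(154, 235))           # illustrative dimensions
labels = rng.integers(0, 2, 154)                # 0 = normal, 1 = atypia/disease

lda = LinearDiscriminantAnalysis()
# Cross-validated accuracy; ~0.5 on this synthetic placeholder, whereas real
# spectra grouped by histology should show better segregation in LDA scores.
print(cross_val_score(lda, spectra, labels, cv=5).mean())
```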
de Almeida, Sandro Marco Steanini; Franca, Fabiana Mantovani Gomes; Florio, Flavia Martao; Ambrosano, Glaucia Maria Bovi; Basting, Roberta Tarkany
2013-07-01
Chemomechanical caries removal, when compared with removal using conventional rotary instruments, seems to preserve healthy tooth structure with less trauma to the patient. This study performed in vivo analysis of the total number of microorganisms in dentin after the use of conventional or chemomechanical (papain gel) caries removal methods. Analyses were performed before caries removal (baseline), immediately after caries removal, and 45 days after caries removal and temporary cavity sealing. Sixty patients were selected for this study, each with two mandibular molars (one on each side) with occlusal caries of moderate depth, for a total of 120 teeth. For each patient, the carious lesion of one tooth was removed by conventional methods using low speed drills (Group 1). For the other tooth, a chemomechanical method was used (Group 2). Dentin samples were collected at the three intervals and subjected to microbiological culture in blood agar. For the total number of microorganisms in both groups, ANOVA and Tukey tests (which considered the baseline values as a covariable) showed a higher microbial count immediately after the preparation of the cavity compared to the count at 45 days (P < 0.05). For both groups, the total count of microorganisms in dentin decreased 45 days after placing the temporary cavity sealing.
Feng, Jingwen; Lin, Jie; Zhang, Pengquan; Yang, Songnan; Sa, Yu; Feng, Yuanming
2017-08-29
High-content screening is commonly used in studies of the DNA damage response. The double-strand break (DSB) is one of the most harmful types of DNA damage lesions. The conventional method used to quantify DSBs is γH2AX foci counting, which requires manual adjustment and preset parameters and is usually regarded as imprecise, time-consuming, poorly reproducible, and inaccurate. Therefore, a robust automatic alternative method is highly desired. In this manuscript, we present a new method for quantifying DSBs which involves automatic image cropping, automatic foci-segmentation and fluorescent intensity measurement. Furthermore, an additional function was added for standardizing the measurement of DSB response inhibition based on co-localization analysis. We tested the method with a well-known inhibitor of DSB response. The new method requires only one preset parameter, which effectively minimizes operator-dependent variations. Compared with conventional methods, the new method detected a higher percentage difference of foci formation between different cells, which can improve measurement accuracy. The effects of the inhibitor on DSB response were successfully quantified with the new method (p = 0.000). The advantages of this method in terms of reliability, automation and simplicity show its potential in quantitative fluorescence imaging studies and high-content screening for compounds and factors involved in DSB response.
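A minimal sketch of the segmentation-plus-intensity idea follows, using scikit-image; Otsu thresholding and these measurements are illustrative stand-ins, not the authors' published pipeline.

```python
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def quantify_foci(nucleus_img):
    """Segment gamma-H2AX foci inside a cropped nucleus image and sum
    their fluorescence; a sketch of the automatic segmentation plus
    intensity-measurement idea, not the published code."""
    mask = nucleus_img > threshold_otsu(nucleus_img)       # single data-driven cut
    regions = regionprops(label(mask), intensity_image=nucleus_img)
    total_intensity = sum(r.intensity_mean * r.area for r in regions)
    return len(regions), total_intensity                   # focus count, summed signal
```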
van Bochove, J A; van Amerongen, W E
2006-03-01
The aim was to investigate possible differences in discomfort during treatment with the atraumatic restorative treatment (ART) or the conventional restorative method, each with and without local analgesia (LA). The study group consisted of 6- and 7-year-old children with no dental experience (mean age 6.98 years, SD ± 0.52), randomly divided into four treatment groups: the conventional method with and without LA and ART with and without LA. One or two proximal lesions in primary molars were treated. Heart rate and behaviour (Venham score) were measured. Statistical analysis was performed in SPSS version 10.0. In the first session 300 children were treated; 109 children were treated a second time in the same way as at the first visit. During the first session ART without LA gave the least discomfort, while the conventional method without LA gave the most discomfort. During the second treatment the least discomfort was observed with ART without LA and the most discomfort with the conventional method with LA. There was a constant preference for hand instruments; the bur was increasingly accepted. The experience with LA was the reverse.
Keystroke dynamics in the pre-touchscreen era
Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.
2013-01-01
Biometric authentication seeks to measure an individual's unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints, signatures, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568
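The timing variables mentioned above (intervals between key presses and releases) reduce to dwell and flight times; a minimal sketch with an assumed event format follows.

```python
from typing import List, Tuple

Event = Tuple[str, float, float]  # (key, press_time, release_time), seconds; assumed format

def typing_signature(events: List[Event]):
    """Dwell and flight times, the basic keystroke-dynamics variables
    described above; field names are illustrative."""
    dwell = [rel - prs for _, prs, rel in events]            # hold duration per key
    flight = [events[i + 1][1] - events[i][2]                # release-to-next-press gap
              for i in range(len(events) - 1)]
    return dwell, flight

# A verifier would compare these timing vectors against a stored template,
# e.g. with a distance threshold or a trained classifier.
```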
Rainer, Matthias; Qureshi, Muhammad Nasimullah; Bonn, Günther Karl
2011-06-01
The application of matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) for the analysis of low molecular weight (LMW) compounds, such as pharmacologically active constituents or metabolites, is usually hampered when employing conventional MALDI matrices, owing to interferences caused by matrix molecules below 700 Da. As a consequence, interpretation of mass spectra remains challenging, although matrix suppression can be achieved under certain conditions. Unlike conventional MALDI methods, which usually suffer from background signals, matrix-free techniques have become more and more popular for the analysis of LMW compounds. In this review we describe recently introduced materials for laser desorption/ionization (LDI) as alternatives to conventionally applied MALDI matrices. In particular, we highlight a new method for LDI referred to as matrix-free material-enhanced LDI (MELDI). For matrix-free MELDI it could be clearly shown that, besides chemical functionalities, the material's morphology plays a crucial role in its energy-transfer capabilities. It is therefore of great interest to investigate parameters such as particle size and porosity to study their impact on the LDI process. Nanomaterials such as diamond-like carbon, C(60) fullerenes and nanoparticulate silica beads in particular were found to be excellent energy-absorbing materials in matrix-free MELDI.
Comparative analysis of stress in a new proposal of dental implants.
Valente, Mariana Lima da Costa; de Castro, Denise Tornavoi; Macedo, Ana Paula; Shimano, Antonio Carlos; Dos Reis, Andréa Cândido
2017-08-01
The purpose of this study was to compare, through photoelastic analysis, the stress distribution around conventional and modified external hexagon (EH) and morse taper (MT) dental implant connections. Four photoelastic models were prepared (n=1): Model 1 - conventional EH cylindrical implant (Ø 4.0mm×11mm - Neodent®), Model 2 - modified EH cylindrical implant, Model 3 - conventional MT conical implant (Ø 4.3mm×10mm - Neodent®) and Model 4 - modified MT conical implant. Axial and oblique (30° tilt) loads of 100 and 150N were applied to the devices coupled to the implants. A plane transmission polariscope was used in the analysis of fringes and each position of interest was recorded by a digital camera. The Tardy method was used to quantify the fringe order (N), from which the maximum shear stress (τ) at each selected point was calculated. The results showed a lower stress concentration in the modified cylindrical implant (EH) than in the conventional model under 150N axial and 100N oblique loads. Lower stress was also observed for the modified conical (MT) implant under 100 and 150N oblique loads, which was not observed for the conventional implant model. The comparative analysis of the models showed that the new design proposal generates good stress distribution, especially in the cervical third, suggesting the preservation of bone tissue in the bone crest region.
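For context, the stress-optic law that is assumed to connect the measured fringe order to the maximum shear stress can be written as follows; the symbols follow common photoelasticity usage rather than the paper's notation.

```latex
% Standard stress-optic law assumed to underlie the fringe-order readings:
% N = fringe order, f_sigma = material fringe value, t = model thickness.
\[
  \sigma_1 - \sigma_2 = \frac{N f_\sigma}{t},
  \qquad
  \tau_{\max} = \frac{\sigma_1 - \sigma_2}{2} = \frac{N f_\sigma}{2t}
\]
```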
Laser fluorometric analysis of plants for uranium exploration
Harms, T.F.; Ward, F.N.; Erdman, J.A.
1981-01-01
A preliminary test of biogeochemical exploration for locating uranium occurrences in the Marfa Basin, Texas, was conducted in 1978. Only 6 of 74 plant samples (mostly catclaw mimosa, Mimosa biuncifera) contained uranium in amounts above the detection limit (0.4 ppm in the ash) of the conventional fluorometric method. The samples were then analyzed using a Scintrex UA-3 uranium analyzer (use of trade names in this paper is for descriptive purposes only and does not constitute endorsement by the U.S. Geological Survey), an instrument designed for direct analysis of uranium in water, which can conveniently be used in a mobile field laboratory. The detection limit for uranium in plant ash (0.05 ppm) by this method is almost an order of magnitude lower than that of the conventional fluorometric method. Only 1 of the 74 samples contained uranium below the detection limit of the new method. Accuracy and precision were determined to be satisfactory. Samples of plants growing on mineralized soils and on nonmineralized soils show a 15-fold difference in uranium content, whereas the soils themselves (analyzed by delayed neutron activation analysis) show only a 4-fold difference. The method involves acid digestion of ashed tissue, extraction of uranium into ethyl acetate, destruction of the ethyl acetate, dissolution of the residue in 0.005% nitric acid, and measurement. © 1981.
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. The Bayesian MCMC method may therefore be more favorable for uncertainty analysis and risk management.
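A minimal sketch of a random-walk Metropolis-Hastings sampler follows; the log-posterior, step size and iteration count are placeholders, since the paper's flow-rate model is not shown in the abstract.

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_iter=10000, step=0.1, seed=0):
    """Random-walk Metropolis-Hastings: a minimal sketch of the sampler
    class used for such models; log_post is the unnormalized
    log-posterior of the model parameters (model itself not shown)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal(size=theta.shape)  # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:            # accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

# Posterior quantiles of the chain give the credible intervals that the
# paper contrasts with MLE confidence intervals.
```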
Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.
Zou, L; Bloebaum, R D; Bachus, K N
1997-01-01
Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically, bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method to a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and helium methods were attempts to improve fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
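For reference, the Archimedes-based volume computation that underlies the water methods can be written as follows; the notation is generic rather than the paper's.

```latex
% Volume by Archimedes' principle: the specimen's apparent loss of weight
% when submerged equals the weight of the displaced fluid.
\[
  V = \frac{m_{\mathrm{air}} - m_{\mathrm{submerged}}}{\rho_{\mathrm{fluid}}},
  \qquad
  \text{volume fraction} = \frac{V_{\mathrm{bone\ material}}}{V_{\mathrm{total}}}
\]
```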
Study of Burn Scar Extraction Automatically Based on Level Set Method using Remote Sensing Data
Liu, Yang; Dai, Qin; Liu, JianBo; Liu, ShiBin; Yang, Jin
2014-01-01
Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methodologies perform poorly on burn scar images with blurred and irregular edges. To address these issues, this paper proposes an automatic method to extract burn scars based on the Level Set Method (LSM). The method exploits the advantages of different features in remote sensing images and considers the practical need to extract the burn scar rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional Level Set Chan-Vese (C-V) model with a new initial curve derived from a binary image obtained by applying the K-means method to the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the Otsu algorithm and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline curve of the fire burn scar effectively and exactly, with higher extraction accuracy and lower algorithmic complexity than the conventional C-V model. PMID:24503563
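The NBR differencing that helps seed the level-set curve can be sketched as follows; the band names follow Landsat 8 OLI conventions and the thresholding details are assumptions.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from near-infrared and shortwave-infrared
    bands (e.g. Landsat 8 OLI bands 5 and 7); eps avoids division by zero."""
    nir, swir = nir.astype(float), swir.astype(float)
    return (nir - swir) / (nir + swir + 1e-12)

# The pre/post-fire difference highlights the burn scar; a thresholded
# dNBR (together with CVA and NDVI change) can seed the level-set curve:
# dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
```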
Monitoring beach changes using GPS surveying techniques
Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.
1993-01-01
The adaptation of Global Positioning System (GPS) surveying techniques to beach monitoring activities is a promising response to this challenge. An experiment that employed both GPS and conventional beach surveying was conducted, and a new beach monitoring method employing kinematic GPS surveys was devised. This new method involves the collection of precise shore-parallel and shore-normal GPS positions from a moving vehicle so that an accurate two-dimensional beach surface can be generated. Results show that the GPS measurements agree with conventional shore-normal surveys at the 1 cm level, and repeated GPS measurements employing the moving vehicle demonstrate a precision of better than 1 cm. In addition, the nearly continuous sampling and increased resolution provided by the GPS surveying technique reveals alongshore changes in beach morphology that are undetected by conventional shore-normal profiles. The application of GPS surveying techniques combined with the refinement of appropriate methods for data collection and analysis provides a better understanding of beach changes, sediment transport, and storm impacts.
Nauleau, Pierre; Apostolakis, Iason; McGarry, Matthew; Konofagou, Elisa
2018-05-29
The stiffness of the arteries is known to be an indicator of the progression of various cardiovascular diseases. Clinically, the pulse wave velocity (PWV) is used as a surrogate for arterial stiffness. Pulse wave imaging (PWI) is a non-invasive, ultrasound-based imaging technique capable of mapping the motion of the vessel walls, allowing the local assessment of arterial properties. Conventionally, a distinctive feature of the displacement wave (e.g. the 50% upstroke) is tracked across the map to estimate the PWV. However, the presence of reflections, such as those generated at the carotid bifurcation, can bias the PWV estimation. In this paper, we propose a two-step cross-correlation based method to characterize arteries using the information available in the PWI spatio-temporal map. First, the area under the cross-correlation curve is proposed as an index for locating the regions of different properties. Second, a local peak of the cross-correlation function is tracked to obtain a less biased estimate of the PWV. Three series of experiments were conducted in phantoms to evaluate the capabilities of the proposed method compared with the conventional method. In the ideal case of a homogeneous phantom, the two methods performed similarly and correctly estimated the PWV. In the presence of reflections, the proposed method provided a more accurate estimate than conventional processing: e.g. for the soft phantom, biases of -0.27 and -0.71 m·s⁻¹ were observed. In a third series of experiments, the correlation-based method was able to locate two regions of different properties with an error smaller than 1 mm. It also provided more accurate PWV estimates than conventional processing (biases: -0.12 versus -0.26 m·s⁻¹). Finally, the in vivo feasibility of the proposed method was demonstrated in eleven healthy subjects. The results indicate that the correlation-based method might be less precise in vivo but more accurate than the conventional method.
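A minimal sketch of cross-correlation-based transit-time estimation, simplified from the two-step method described above, follows; the waveforms, spacing and sampling rate are hypothetical.

```python
import numpy as np

def pwv_from_xcorr(w1, w2, dx, fs):
    """Transit time between wall-displacement waveforms at two locations
    via the peak of their cross-correlation; PWV = distance / delay.
    A simplified sketch of correlation-based tracking."""
    w1 = (w1 - w1.mean()) / w1.std()
    w2 = (w2 - w2.mean()) / w2.std()
    xc = np.correlate(w2, w1, mode="full")
    lag = np.argmax(xc) - (len(w1) - 1)     # samples by which w2 trails w1
    return dx * fs / lag if lag else np.inf

# e.g. waveforms 2 cm apart sampled at 10 kHz with a 40-sample lag
# give PWV = 0.02 * 10000 / 40 = 5 m/s.
```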
A novel spinal kinematic analysis using X-ray imaging and vicon motion analysis: a case study.
Noh, Dong K; Lee, Nam G; You, Joshua H
2014-01-01
This study highlights a novel spinal kinematic analysis method and the feasibility of X-ray imaging measurements to accurately assess thoracic spine motion. The advanced X-ray Nash-Moe method and analysis were used to compute the segmental range of motion in thoracic vertebra pedicles in vivo. This Nash-Moe X-ray imaging method was compared with a standardized method using the Vicon 3-dimensional motion capture system. Linear regression analysis showed an excellent and significant correlation between the two methods (R2 = 0.99, p < 0.05), suggesting that the analysis of spinal segmental range of motion using X-ray imaging measurements was accurate and comparable to the conventional 3-dimensional motion analysis system. Clinically, this novel finding is compelling evidence demonstrating that measurements with X-ray imaging are useful to accurately decipher pathological spinal alignment and movement impairments in idiopathic scoliosis (IS).
NASA Technical Reports Server (NTRS)
Stoner, Mary Cecilia; Hehir, Austin R.; Ivanco, Marie L.; Domack, Marcia S.
2016-01-01
This cost-benefit analysis assesses the benefits of the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. These preliminary, rough order-of-magnitude results report a 46 to 58 percent reduction in production costs and a 7 percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Production cost savings of 35 to 58 percent were reported over the composite manufacturing technique used in this study for comparison; however, the ANNST concept was heavier. In this study, the investment in equipment required for the ANNST method was predicted to be recouped after production of ten cryogenic tank barrels, when compared with conventional metallic manufacturing. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. A case study compared these three alternatives for manufacturing a cylinder of specified geometry, with particular focus placed on production costs and process complexity; cost analyses were performed by the analogy and parametric methods. Furthermore, a scalability study was conducted for three tank diameters to assess the highest potential payoff of the ANNST process for the manufacture of large-diameter cryogenic tanks. The analytical hierarchy process (AHP) was subsequently used with a group of selected subject matter experts to assess the value of the various benefits achieved by the ANNST method for potential stakeholders. The AHP study results revealed that decreased final cylinder mass and quality assurance were the most valued benefits of cylinder manufacturing methods, emphasizing the relevance of the benefits achieved with the ANNST process for future projects.
Taguchi, Y-h; Iwadate, Mitsuo; Umeyama, Hideaki
2015-04-30
Feature extraction (FE) is difficult, particularly if there are more features than samples, as small sample numbers often result in biased outcomes or overfitting. Furthermore, multiple sample classes often complicate FE because evaluating performance, as is usual in supervised FE, is generally harder than in the two-class problem. Developing sample-classification-independent unsupervised methods would solve many of these problems. Two principal component analysis (PCA)-based FE methods were tested as sample-classification-independent unsupervised FE methods: variational Bayes PCA (VBPCA), extended here to perform unsupervised FE, and conventional PCA (CPCA)-based unsupervised FE. VBPCA- and CPCA-based unsupervised FE both performed well when applied to simulated data and to a posttraumatic stress disorder (PTSD)-mediated heart disease data set that had multiple categorical class observations in mRNA/microRNA expression of stressed mouse heart. A critical set of PTSD miRNAs/mRNAs was identified that shows aberrant expression between treatment and control samples, and significant negative correlation with one another. Moreover, greater stability and biological feasibility than conventional supervised FE were also demonstrated. Based on the results obtained, in silico drug discovery was performed as translational validation of the methods. Our two proposed unsupervised FE methods (CPCA- and VBPCA-based) worked well on simulated data and outperformed two conventional supervised FE methods on a real data set. Thus, the two methods appear equivalent for FE on categorical multiclass data sets, with potential translational utility for in silico drug discovery.
Principal Component Analysis for pulse-shape discrimination of scintillation radiation detectors
NASA Astrophysics Data System (ADS)
Alharbi, T.
2016-01-01
In this paper, we report on the application of principal component analysis (PCA) for pulse-shape discrimination (PSD) of scintillation radiation detectors. The details of the method are described, and the performance of the method is experimentally examined by discriminating between neutrons and gamma-rays with a liquid scintillation detector in a mixed radiation field. The performance of the method is also compared against that of the conventional charge-comparison method, demonstrating the superior performance of the method, particularly in the low light-output range. PCA has the important advantage of automatic extraction of the pulse-shape characteristics, which makes the PSD method directly applicable to various scintillation detectors without the need for adjustment of a PSD parameter.
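The core of a PCA-based PSD can be sketched in a few lines: digitized pulses are normalized, mean-centered, and projected onto the leading principal components obtained by SVD, so the pulse-shape characteristics are extracted automatically rather than through a hand-tuned charge-comparison window. The preprocessing, array shapes and decay constants below are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

def pca_psd_scores(pulses, n_components=2):
    """Project amplitude-normalized detector pulses onto their leading
    principal components (illustrative; preprocessing is an assumption)."""
    X = pulses / np.abs(pulses).max(axis=1, keepdims=True)  # normalize amplitude
    X = X - X.mean(axis=0)                                  # mean-center each sample point
    _, _, Vt = np.linalg.svd(X, full_matrices=False)        # PCA via SVD
    return X @ Vt[:n_components].T                          # pulse-shape scores

# toy pulses: gamma events decay fast, neutron events carry a larger slow component
rng = np.random.default_rng(0)
t = np.arange(200)
fast, slow = np.exp(-t / 10.0), np.exp(-t / 60.0)
gammas = 0.95 * fast + 0.05 * slow + 0.01 * rng.standard_normal((500, 200))
neutrons = 0.80 * fast + 0.20 * slow + 0.01 * rng.standard_normal((500, 200))
scores = pca_psd_scores(np.vstack([gammas, neutrons]))
# a simple threshold on scores[:, 0] then separates the two event classes,
# with no manually tuned integration windows as in charge comparison
```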
Assessing a novel polymer-wick based electrode for EEG neurophysiological research.
Pasion, Rita; Paiva, Tiago O; Pedrosa, Paulo; Gaspar, Hugo; Vasconcelos, Beatriz; Martins, Ana C; Amaral, Maria H; Nóbrega, João M; Páscoa, Ricardo; Fonseca, Carlos; Barbosa, Fernando
2016-07-15
The EEG technique has decades of valid applications in clinical and experimental neurophysiology. EEG equipment and data analysis methods have seen remarkable developments, but the skin-to-electrode signal transfer remains a challenge for EEG recording. A novel quasi-dry system, the polymer wick-based electrode, was developed to overcome the limitations of conventional dry and wet silver/silver-chloride (Ag/AgCl) electrodes for EEG recording. Nine participants completed an auditory oddball protocol with simultaneous EEG acquisition using both the conventional Ag/AgCl and the wick electrodes. The wick system successfully recorded the expected P300 modulation. Standard ERP analysis, residual random noise analysis, and single-trial analysis of the P300 wave were performed in order to compare the signals acquired by the two electrode types. It was found that the novel wick electrode performed similarly to the conventional Ag/AgCl electrodes. The developed wick electrode appears to be a reliable alternative for EEG research, representing a promising halfway alternative between wet and dry electrodes.
Ahn, Tae-Jung; Jung, Yongmin; Oh, Kyunghwan; Kim, Dug Young
2005-12-12
We propose a new chromatic dispersion measurement method for the higher-order modes of an optical fiber using optical frequency-modulated continuous-wave (FMCW) interferometry. An optical fiber that supports a few excited modes was prepared for our experiments. Three different guided modes of the fiber were identified by using far-field spatial beam profile measurements and confirmed with numerical mode analysis. Using the principle of conventional FMCW interferometry with a tunable external cavity laser, we have demonstrated that the chromatic dispersion of a few-mode optical fiber can be obtained directly and quantitatively as well as qualitatively. We have also compared our measurement results with those of the conventional modulation phase-shift method.
Acoustic analysis of the propfan
NASA Technical Reports Server (NTRS)
Farassat, F.; Succi, G. P.
1979-01-01
A review of propeller noise prediction technology is presented. Two methods for the prediction of the noise from conventional and advanced propellers in forward flight are described. These methods are based on different time domain formulations. Brief descriptions of the computer algorithms based on these formulations are given. The output of the programs (the acoustic pressure signature) was Fourier analyzed to get the acoustic pressure spectrum. The main difference between the two programs is that one can handle propellers with supersonic tip speed while the other is for subsonic tip speed propellers. Comparisons of the calculated and measured acoustic data for a conventional and an advanced propeller show good agreement in general.
A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.
Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian
2018-01-19
This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during the headspace sampling process has a great impact on measurement precision in conventional headspace analysis (i.e., the single sealing technique). The results (using an ethanol solution as the model sample) show that the present technique is effective in minimizing this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work that used the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds.
Amidžić Klarić, Daniela; Klarić, Ilija; Mornar, Ana; Velić, Darko; Velić, Natalija
2015-08-01
This study presents data on the content of 21 minerals and heavy metals in 15 blackberry wines made from conventionally and organically grown blackberries. The objective of this study was to classify the blackberry wine samples based on their mineral composition and the applied cultivation method of the starting raw material by using chemometric analysis. The metal content of the Croatian blackberry wine samples was determined by AAS after dry ashing. The comparison between the organic and conventional groups of investigated blackberry wines showed a statistically significant difference in the concentrations of Si and Li, with the organic group containing higher concentrations of these elements. According to multivariate data analysis, the model based on the original metal content data set finally included seven original variables (K, Fe, Mn, Cu, Ba, Cd and Cr) and gave a satisfactory separation of the two applied cultivation methods of the starting raw material.
Polarimetric Thomson scattering for high Te fusion plasmas
NASA Astrophysics Data System (ADS)
Giudicotti, L.
2017-11-01
Polarimetric Thomson scattering (TS) is a technique for the analysis of TS spectra in which the electron temperature Te is determined from the depolarization of the scattered radiation, a relativistic effect noticeable only in very hot (Te >= 10 keV) fusion plasmas. It has been proposed as a complementary technique to supplement the conventional spectral analysis in the ITER CPTS (Core Plasma Thomson Scattering) system for measurements in high Te, low ne plasma conditions. In this paper we review the characteristics of the depolarized TS radiation, with special emphasis on the conditions of the ITER CPTS system, and we describe a possible implementation of this diagnostic method that could significantly improve the performance of conventional TS spectral analysis in the high Te range.
NASA Astrophysics Data System (ADS)
Zhang, Rui; Xin, Binjie
2016-08-01
Yarn density is considered a fundamental structural parameter for the quality evaluation of woven fabrics. The conventional yarn density measurement method is based on one-side analysis. In this paper, a novel density measurement method is developed for yarn-dyed woven fabrics based on a dual-side fusion technique. Firstly, a lab-built dual-side imaging system is established to acquire both face-side and back-side images of a woven fabric, and the affine transform is used for the alignment and fusion of the dual-side images. Then, the color images of the woven fabrics are transferred from the RGB to the CIE-Lab color space, and the intensity information of the image extracted from the L component is used for texture fusion and analysis. Subsequently, three image fusion methods are developed and utilized to merge the dual-side images: the weighted average method, the wavelet transform method and the Laplacian pyramid blending method. The fusion efficacy of each method is evaluated by three evaluation indicators, and the best of them is selected to reconstruct the complete fabric texture. Finally, the yarn density of the fused image is measured based on the fast Fourier transform, and the yarn alignment image can be reconstructed using the inverse fast Fourier transform. Our experimental results show that the accuracy of density measurement using the proposed method is close to 99.44% compared with the traditional method, and the robustness of the proposed method is better than that of conventional analysis methods.
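The final FFT step can be sketched as follows: the fused grayscale image is collapsed to a one-dimensional intensity profile, and the dominant peak of its spectrum gives the yarn spatial frequency, converted to yarns per centimeter via the image resolution. The function, the dpi-based conversion and the synthetic test image are assumptions for illustration; the paper's preprocessing and inverse-FFT reconstruction are omitted.

```python
import numpy as np

def yarn_density(gray, dpi, axis=0):
    """Yarns/cm from a fused grayscale fabric image via the dominant
    FFT peak of its intensity profile (minimal sketch; the paper's
    preprocessing and reconstruction steps are omitted)."""
    profile = gray.mean(axis=axis)                 # collapse to a 1-D profile
    profile = profile - profile.mean()             # drop the DC component
    spec = np.abs(np.fft.rfft(profile))
    freqs = np.fft.rfftfreq(profile.size, d=1.0)   # cycles per pixel
    f0 = freqs[1:][np.argmax(spec[1:])]            # dominant spatial frequency
    return f0 * dpi / 2.54                         # cycles/pixel -> yarns/cm

# toy fabric: 30 yarns/cm imaged at 600 dpi (hypothetical values)
cols = np.arange(1200)
img = 0.5 + 0.4 * np.sin(2 * np.pi * cols * 30 * 2.54 / 600) * np.ones((300, 1))
print(yarn_density(img, dpi=600))   # ~30
```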
[Nitrogen status diagnosis of rice by using a digital camera].
Jia, Liang-Liang; Fan, Ming-Sheng; Zhang, Fu-Suo; Chen, Xin-Ping; Lü, Shi-Hua; Sun, Yan-Ming
2009-08-01
In the present research, a field experiment with different N application rates was conducted to study the possibility of using visible-band color analysis methods to monitor the N status of a rice canopy. The correlations between the visible-band color intensities of rice canopy images acquired with a digital camera and conventional nitrogen status diagnosis parameters (leaf SPAD chlorophyll meter readings, total N content, upland biomass and N uptake) were studied. The results showed that the red color intensity (R), green color intensity (G) and normalized redness intensity (NRI) have significant inverse linear correlations with the conventional N diagnosis parameters of SPAD readings, total N content, upland biomass and total N uptake. The correlation coefficients (r) ranged from -0.561 to -0.714 for the red band (R), from -0.452 to -0.505 for the green band (G), and from -0.541 to -0.817 for the normalized redness intensity (NRI). The normalized greenness intensity (NGI), in contrast, showed a significant positive correlation with the conventional N parameters, with correlation coefficients (r) from 0.505 to 0.559. Compared with SPAD readings, the normalized redness intensity (NRI), with a high |r| of 0.541-0.780 against the conventional N parameters, could better express the N status of rice. The digital image color analysis method shows potential for use in rice N status diagnosis in the future.
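A minimal sketch of the color-index computation is given below. NRI and NGI are taken here as the normalized chromatic coordinates R/(R+G+B) and G/(R+G+B); this definition is a common convention in canopy color analysis and is an assumption, since the paper's exact formulas are not restated above.

```python
import numpy as np

def canopy_color_indices(rgb):
    """Mean color indices from a canopy image (H x W x 3 array).
    NRI and NGI are assumed here to be the normalized chromatic
    coordinates R/(R+G+B) and G/(R+G+B)."""
    R, G, B = (rgb[..., i].astype(float) for i in range(3))
    total = R + G + B + 1e-9                      # avoid division by zero
    return {"R": R.mean(), "G": G.mean(),
            "NRI": (R / total).mean(), "NGI": (G / total).mean()}

# indices from plots at different N rates can then be regressed against
# SPAD readings or N uptake, e.g. slope, icpt = np.polyfit(nri, spad, 1)
```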
USDA-ARS?s Scientific Manuscript database
Various technologies have been developed for pathogen detection using optical, electrochemical, biochemical and physical properties. Conventional microbiological methods need from days to a week to produce a result. Though these methods are very sensitive and accurate, a rapid detection of pathogens i...
Current trends in endotoxin detection and analysis of endotoxin-protein interactions.
Dullah, Elvina Clarie; Ongkudon, Clarence M
2017-03-01
Endotoxin is a type of pyrogen found in Gram-negative bacteria. Endotoxin can form stable interactions with other biomolecules, making its removal difficult, especially during the production of biopharmaceutical drugs. Preventing endotoxins from contaminating biopharmaceutical products is paramount, as endotoxin contamination, even in small quantities, can result in fever, inflammation, sepsis, tissue damage and even death. Highly sensitive and accurate detection of endotoxin is key in the development of biopharmaceutical products derived from Gram-negative bacteria. It facilitates the study of the intermolecular interaction of an endotoxin with other biomolecules, and hence the selection of appropriate endotoxin removal strategies. Currently, most researchers rely on the conventional LAL-based endotoxin detection method. However, new methods have been and are being developed to overcome the problems associated with the LAL-based method. This review paper highlights current research trends in endotoxin detection, from conventional methods to newly developed biosensors. Additionally, it provides an overview of the use of electron microscopy, dynamic light scattering (DLS), fluorescence resonance energy transfer (FRET) and docking programs in endotoxin-protein analysis.
Feasibility of ballistic strengthening exercises in neurologic rehabilitation.
Williams, Gavin; Clark, Ross A; Hansson, Jessica; Paterson, Kade
2014-09-01
Conventional methods for strength training in neurologic rehabilitation are not task specific for walking. Ballistic strength training was developed to improve the functional transfer of strength training; however, no research has investigated this in neurologic populations. The aim of this pilot study was to evaluate the feasibility of applying ballistic principles to conventional leg strengthening exercises in individuals with mobility limitations as a result of neurologic injuries. Eleven individuals with neurologic injuries completed seated and reclined leg press using conventional and ballistic techniques. A 2 × 2 repeated-measures analysis of variance was used to compare power measures (peak movement height and peak velocity) between exercises and conditions. Peak jump velocity and peak jump height were greater when using the ballistic jump technique rather than the conventional concentric technique (P < 0.01). These findings suggest that when compared with conventional strengthening exercises, the incorporation of ballistic principles was associated with increased peak height and peak velocities.
Deviation Value for Conventional X-ray in Hospitals in South Sulawesi Province from 2014 to 2016
NASA Astrophysics Data System (ADS)
Bachtiar, Ilham; Abdullah, Bualkar; Tahir, Dahlan
2018-03-01
This paper describes the conventional X-ray machine parameters tested in the region of South Sulawesi from 2014 to 2016. The objective of this research is to determine the deviation of each parameter of conventional X-ray machines. The testing parameters were analyzed using quantitative methods with a participatory observational approach. Data collection was performed by testing the output of conventional X-ray machines using a non-invasive X-ray multimeter. The test parameters include tube voltage (kV) accuracy, radiation output linearity, reproducibility and radiation beam quality (half-value layer, HVL). The results of the analysis show that the four conventional X-ray test parameters have varying deviation spans: the tube voltage (kV) accuracy has an average value of 4.12%, the average radiation output linearity is 4.47%, the average reproducibility is 0.62%, and the average radiation beam quality (HVL) is 3.00 mm.
Shivakumarswamy, Udasimath; Arakeri, Surekha U; Karigowdar, Mahesh H; Yelikar, Br
2012-01-01
The cytological examination of serous effusions has been well accepted, and a positive diagnosis is often considered a definitive diagnosis. It helps in the staging, prognosis and management of patients with malignancies and also gives information about various inflammatory and non-inflammatory lesions. Diagnostic problems arise in everyday practice in differentiating reactive atypical mesothelial cells from malignant cells by the routine conventional smear (CS) method. To compare the morphological features of the CS method with those of the cell block (CB) method, and also to assess the utility and sensitivity of the CB method in the cytodiagnosis of pleural effusions. The study was conducted in the cytology section of the Department of Pathology. Sixty pleural fluid samples were subjected to diagnostic evaluation over a period of 20 months. Along with the conventional smears, cell blocks were prepared using 10% alcohol-formalin as a fixative agent. Statistical analysis with the z test was performed to assess cellularity using the CS and CB methods. McNemar's χ² test was used to identify the additional yield for malignancy by the CB method. Cellularity and the additional yield for malignancy were 15% higher with the CB method. The CB method provides higher cellularity, better architectural patterns and morphological features, and an additional yield of malignant cells, and thereby increases the sensitivity of cytodiagnosis when compared with the CS method.
Meyer, Georg F; Spray, Amy; Fairlie, Jo E; Uomini, Natalie T
2014-01-01
Current neuroimaging techniques with high spatial resolution constrain participant motion so that many natural tasks cannot be carried out. The aim of this paper is to show how a time-locked correlation-analysis of cerebral blood flow velocity (CBFV) lateralization data, obtained with functional TransCranial Doppler (fTCD) ultrasound, can be used to infer cerebral activation patterns across tasks. In a first experiment we demonstrate that the proposed analysis method results in data that are comparable with the standard Lateralization Index (LI) for within-task comparisons of CBFV patterns, recorded during cued word generation (CWG) at two difficulty levels. In the main experiment we demonstrate that the proposed analysis method shows correlated blood-flow patterns for two different cognitive tasks that are known to draw on common brain areas, CWG, and Music Synthesis. We show that CBFV patterns for Music and CWG are correlated only for participants with prior musical training. CBFV patterns for tasks that draw on distinct brain areas, the Tower of London and CWG, are not correlated. The proposed methodology extends conventional fTCD analysis by including temporal information in the analysis of cerebral blood-flow patterns to provide a robust, non-invasive method to infer whether common brain areas are used in different cognitive tasks. It complements conventional high resolution imaging techniques.
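The time-locked correlation analysis can be sketched as follows: left and right CBFV envelopes are expressed as percent change from a pre-cue baseline, averaged across trials into a lateralization trace, and traces from two tasks are then compared by Pearson correlation. The baseline-relative normalization and array shapes are assumptions of this sketch, not necessarily the authors' exact pipeline.

```python
import numpy as np

def lateralization_trace(left, right, baseline_slice):
    """Trial-averaged, time-locked CBFV lateralization trace from
    left/right velocity envelopes (trials x samples)."""
    def rel(x):  # percent change from the pre-cue baseline
        base = x[:, baseline_slice].mean(axis=1, keepdims=True)
        return 100.0 * (x - base) / base
    return (rel(left) - rel(right)).mean(axis=0)

def task_similarity(trace_a, trace_b):
    """Pearson correlation of two lateralization traces; high values
    suggest the tasks draw on overlapping lateralized networks."""
    return np.corrcoef(trace_a, trace_b)[0, 1]
```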
Monsen, T; Ryden, P
2017-09-01
Urinary tract infections (UTIs) are among the most common bacterial infections in man, and urine culture is the gold standard for diagnosis. Considering the high prevalence of culture-negative specimens, any method that identifies such specimens is of interest. The aim was to evaluate a new screening concept for flow cytometry analysis (FCA). The outcomes were evaluated against urine culture, uropathogen species and three conventional screening methods. A prospective, consecutive study examined 1,312 urine specimens collected during January and February 2012. The specimens were analyzed using the Sysmex UF1000i FCA. Based on the FCA data, culture-negative specimens were identified in a new model by use of linear discriminant analysis (FCA-LDA). In total, 1,312 patients were included. In- and outpatients represented 19.6% and 79.4%, respectively; 68.3% of the specimens originated from women. Of the 610 culture-positive specimens, Escherichia coli represented 64%, enterococci 8% and Klebsiella spp. 7%. Screening with FCA-LDA at 95% sensitivity identified 42% (552/1312) of specimens as culture negative when UTI was defined according to European guidelines. The proposed screening method was either superior or similar in comparison to the three conventional screening methods. In conclusion, the new FCA-LDA screening method was superior or similar to three conventional screening methods. We recommend the proposed screening method be used in the clinic to exclude culture-negative specimens and to reduce workload, costs and turnaround time. In addition, the FCA data may add information that enhances handling and supports diagnosis of patients with suspected UTI pending urine culture.
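A sketch of the screening idea follows: an LDA score is computed from the flow-cytometry channel data, and the decision threshold is set so that 95% of culture-positive specimens score above it; specimens below the threshold are screened out as presumptively culture negative. The synthetic data and feature layout are assumptions for illustration, not the Sysmex channel set or the authors' model.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# X: flow-cytometry channel data (synthetic stand-in for the UF1000i
# channels); y: 1 = culture positive, 0 = culture negative
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (800, 4)), rng.normal(1.5, 1.0, (500, 4))])
y = np.r_[np.zeros(800), np.ones(500)]

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.decision_function(X)

# threshold chosen so 95% of culture-positive specimens stay above it
thr = np.quantile(scores[y == 1], 0.05)
screened_out = (scores < thr) & (y == 0)   # predicted culture negative
print(f"excluded {screened_out.sum()} of {(y == 0).sum()} culture-negative specimens")
```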
Wu, Shuaibin; Yang, Kaiguang; Liang, Zhen; Zhang, Lihua; Zhang, Yukui
2011-10-30
A formic acid (FA)-assisted sample preparation method was presented for protein identification via mass spectrometry (MS). Specifically, an aqueous solution containing 2% FA and dithiothreitol was selected to perform protein denaturation, cleavage at aspartic acid (D) sites and reduction of disulfide linkages simultaneously at 108°C for 2 h. Subsequently, FA was removed via vacuum concentration. Finally, iodoacetamide (IAA) alkylation and trypsin digestion could be performed sequentially. A series of model proteins (BSA, β-lactoglobulin and apo-transferrin) were treated using this method, followed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) analysis. The number of identified peptides increased by ∼80% in comparison with the conventional urea-assisted sample preparation method. Moreover, BSA identification was achieved efficiently down to the femtomole level (25 ± 0% sequence coverage and 16 ± 1 peptides) via this method. In contrast, no peptides were identified confidently via the urea-assisted method before desalination via the C18 ZipTip. The absence of urea in this sample preparation method was an advantage for more favorable digestion and MALDI-TOF MS analysis. The performance of the two methods on a real sample (rat liver proteome) was also compared, followed by analysis on a nanoflow reversed-phase liquid chromatography with electrospray ionization tandem mass spectrometry system. As a result, 1335 ± 43 peptides were identified confidently (false discovery rate <1%) via the FA-assisted method, corresponding to 295 ± 12 proteins (of top match = 1 and requiring at least 2 unique peptides). In contrast, only 1107 ± 16 peptides (corresponding to 231 ± 10 proteins) were obtained from the conventional urea-assisted method. The approach serves as a more efficient protein sample preparation method for better researching specific proteomes, and provides assistance in developing other proteomics analysis methods, such as peptide quantitative analysis.
Interactive-predictive detection of handwritten text blocks
NASA Astrophysics Data System (ADS)
Ramos Terrades, O.; Serrano, N.; Gordó, A.; Valveny, E.; Juan, A.
2010-01-01
A method for text block detection is introduced for old handwritten documents. The proposed method takes advantage of sequential book structure, taking into account layout information from pages previously transcribed. This glance at the past is used to predict the position of text blocks in the current page with the help of conventional layout analysis methods. The method is integrated into the GIDOC prototype: a first attempt to provide integrated support for interactive-predictive page layout analysis, text line detection and handwritten text transcription. Results are given in a transcription task on a 764-page Spanish manuscript from 1891.
Effects of 99mTc-TRODAT-1 drug template on image quantitative analysis
Yang, Bang-Hung; Chou, Yuan-Hwa; Wang, Shyh-Jen; Chen, Jyh-Cheng
2018-01-01
99mTc-TRODAT-1 is a type of drug that can bind to dopamine transporters in living organisms and is often used in SPECT imaging to observe changes in the uptake activity of dopamine in the striatum. Therefore, it is currently widely used in studies on the clinical diagnosis of Parkinson's disease (PD) and movement-related disorders. In conventional 99mTc-TRODAT-1 SPECT image evaluation, visual inspection or manual selection of ROIs for semiquantitative analysis is mainly used to observe and evaluate the degree of striatal defects. However, these methods depend on the subjective opinions of observers, which leads to human error, and have shortcomings such as long duration, increased effort, and low reproducibility. To solve this problem, this study aimed to establish an automatic semiquantitative analytical method for 99mTc-TRODAT-1. This method combines three drug templates (one built-in SPECT template in the SPM software and two self-generated MRI-based and HMPAO-based TRODAT-1 templates) for the semiquantitative analysis of the striatal phantom and clinical images. At the same time, the results of automatic analysis with the three templates were compared with results from conventional manual analysis to examine the feasibility of automatic analysis and the effects of drug templates on automatic semiquantitative results. After comparison, it was found that the MRI-based TRODAT-1 template generated from MRI images is the most suitable template for 99mTc-TRODAT-1 automatic semiquantitative analysis. PMID:29543874
Retention of denture bases fabricated by three different processing techniques – An in vivo study
Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen
2016-01-01
Aim: Distortion due to polymerization shrinkage compromises retention. To evaluate the amount of retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them with a retention apparatus. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, and the least retention was seen with the conventional molding technique. PMID:27382542
NASA Technical Reports Server (NTRS)
Hooke, F. H.
1972-01-01
Both the conventional and reliability analyses for determining safe fatigue life are predicated on a population having a specified (usually log normal) distribution of life to collapse under a fatigue test load. Under a random service load spectrum, random occurrences of loads larger than the fatigue test load may confront, and cause collapse of, structures which are weakened, though not yet to the fatigue-test-load strength. These collapses are included in reliability analysis but excluded in conventional analysis. The theory of risk determination by each method is given, and several reasonably typical examples have been worked out, in which it transpires that, if one excludes collapse through exceedance of the uncracked strength, the reliability and conventional analyses give virtually identical probabilities of failure or survival.
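The distinction between the two analyses can be made concrete with a toy Monte Carlo: conventional analysis counts only failures where fatigue life is exhausted, while reliability analysis additionally counts collapses of partially weakened structures under rare large loads. Every distribution, strength-degradation rule and numerical value below is invented for illustration and is not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(1)
n, service = 200_000, 10_000.0            # structures, service hours
med_life, cov = 40_000.0, 0.4             # lognormal life at the test load
sigma = np.sqrt(np.log(1.0 + cov ** 2))
life = med_life * np.exp(sigma * rng.standard_normal(n))

# toy residual strength: declines linearly from 1.5x to 1.0x the fatigue
# test load as each structure's life is consumed
frac = np.minimum(service / life, 1.0)
strength = 1.5 - 0.5 * frac

# largest service load in the period, in units of the fatigue test load
peak_load = 1.0 + 0.08 * rng.gumbel(size=n)

conventional = life < service                          # life exhausted
reliability = conventional | (peak_load > strength)    # + overload of weakened structures
print(conventional.mean(), reliability.mean())         # reliability >= conventional
```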
Bass, Roger
2010-01-01
Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless—a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to Enlightenment and Samādhi. The concept of stimulus singularity is introduced to account for why, within Zen's frame of reference, its methods can be studied but its primary outcomes (e.g., Samādhi and Satori) cannot be described in any conventional sense. PMID:22479128
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mavroidis, P; Boci, N; Kostopoulos, S
2015-06-15
Purpose: The aim of this study is to increase bandwidth (BW) and echo train length (ETL) in proton density turbo spin echo (PD TSE) sequences with and without fat saturation (FS), as well as in turbo inversion recovery magnitude (TIRM) sequences, in order to assess whether these sequences are capable of reducing susceptibility artifacts. Methods: We compared 1) TIRM coronal (COR) with the same sequence with both BW and ETL increased; 2) conventional PD TSE sagittal (SAG) with FS with an increased BW; 3) conventional PD TSE SAG without FS with an increased BW; 4) conventional PD TSE SAG without FS with both BW and ETL increased. A quantitative analysis was performed to measure the extent of the susceptibility artifacts. Furthermore, a qualitative analysis was performed by two radiologists in order to evaluate the susceptibility artifacts, image distortion and fat suppression. The depiction of cartilage, menisci, muscles, tendons and bone marrow was also qualitatively analyzed. Results: The quantitative analysis found that the modified TIRM sequence is significantly superior to the conventional one regarding the extent of the susceptibility artifacts. In the qualitative analysis, the modified TIRM sequence was superior to the corresponding conventional one in eight of the ten characteristics that were analyzed. The modified PD TSE with FS was superior to the corresponding conventional one regarding the susceptibility artifacts, image distortion and depiction of bone marrow and cartilage, while achieving effective fat saturation. The modified PD TSE sequence without FS with a high (H) BW was found to be superior to the corresponding conventional one in the depiction of cartilage. Conclusion: Consequently, a TIRM sequence with increased BW and ETL is proposed for producing images of high quality, and modified PD TSE with H BW for smaller metals, especially when FS is used.
Mapping brain activity in gradient-echo functional MRI using principal component analysis
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Singh, Manbir; Don, Manuel
1997-05-01
The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods that are based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique where a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis (CSF) technique. As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential for detecting the true stimulus-correlated changes in the presence of other interfering signals.
Wang, Hongrui; Wang, Cheng; Wang, Ying; ...
2017-04-05
This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo algorithm and applies this method to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a relatively narrower reliable interval than the MLE confidence interval and thus a more precise estimation by using the related information from regional gage stations. As a result, the Bayesian MCMC method might be more favorable in uncertainty analysis and risk management.
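A generic random-walk Metropolis-Hastings sampler of the kind used in such analyses is sketched below, applied to a lognormal daily-flow model with flat priors; the credible interval from the chain plays the role of the "reliable interval" discussed above. The likelihood, priors, step size and data are illustrative assumptions, not the paper's model.

```python
import numpy as np

def metropolis_hastings(logpost, theta0, n_iter=20_000, step=0.1, seed=0):
    """Random-walk Metropolis-Hastings sampler (generic sketch; the
    paper's flow-rate likelihood and priors are not reproduced here)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, float)
    lp = logpost(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# e.g. lognormal daily flows with flat priors on (mu, log_sigma)
flows = np.random.default_rng(1).lognormal(2.0, 0.5, size=365)
def logpost(th):
    mu, log_s = th
    s = np.exp(log_s)
    return -np.sum(np.log(flows * s) + 0.5 * ((np.log(flows) - mu) / s) ** 2)

chain = metropolis_hastings(logpost, [0.0, 0.0])
lo, hi = np.percentile(chain[5000:, 0], [2.5, 97.5])   # credible interval for mu
```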
A New Method for Extubation: Comparison between Conventional and New Methods.
Yousefshahi, Fardin; Barkhordari, Khosro; Movafegh, Ali; Tavakoli, Vida; Paknejad, Omalbanin; Bina, Payvand; Yousefshahi, Hadi; Sheikh Fathollahi, Mahmood
2012-08-01
Extubation is associated with the risk of complications such as accumulated secretion above the endotracheal tube cuff, eventual atelectasis following a reduction in pulmonary volumes because of a lack of physiological positive end-expiratory pressure, and intra-tracheal suction. In order to reduce these complications, and based on basic physiological principles, a new practical extubation method is presented in this article. The study was designed as a six-month prospective cross-sectional clinical trial. Two hundred fifty-seven patients undergoing coronary artery bypass grafting (CABG) were divided into two groups based on their scheduled surgery time. The first group underwent the conventional extubation method, while the other group was extubated according to the newly described method. Arterial blood gas (ABG) analysis results before and after extubation were compared between the two groups to find the effect of the extubation method on the ABG parameters and the oxygenation profile. In all time intervals, the partial pressure of oxygen in arterial blood / fraction of inspired oxygen (PaO(2)/FiO(2)) ratio in the new method group was improved compared to that in the conventional method group; some differences, such as PaO(2)/FiO(2) four hours after extubation, were statistically significant (p = 0.0063). The new extubation method improved some respiratory parameters and thus attenuated oxygenation complications and amplified oxygenation after extubation.
Development of ocular viscosity characterization method.
Shu-Hao Lu; Guo-Zhen Chen; Leung, Stanley Y Y; Lam, David C C
2016-08-01
Glaucoma is the second leading cause of blindness. Irreversible and progressive optic nerve damage results when the intraocular pressure (IOP) exceeds 21 mmHg. The elevated IOP is attributed to blocked fluid drainage from the eye. Methods to measure the IOP are widely available, but a method to measure the viscous response to blocked drainage has yet to be developed. An indentation method to characterize ocular flow is developed in this study. Analysis of the load-relaxation data from indentation tests on drainage-controlled porcine eyes showed that blocked drainage is correlated with increases in ocular viscosity. Successful correlation of ocular viscosity with drainage suggests that ocular viscosity may be further developed as a new diagnostic parameter for the assessment of normal tension glaucoma, where nerve damage occurs without noticeable IOP elevation, and as a diagnostic parameter complementary to conventional IOP in diagnosis.
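One way to extract a viscosity-like parameter from indentation load-relaxation data is to fit a single-exponential (Maxwell-type) relaxation and form eta ~ k*tau from the relaxing stiffness and time constant. This model choice, and all names below, are assumptions for illustration; the paper's analysis is not specified here.

```python
import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, f_inf, f0, tau):
    """Single-exponential load relaxation after a step indentation."""
    return f_inf + (f0 - f_inf) * np.exp(-t / tau)

def ocular_viscosity(t, load, depth):
    """Fit indentation load-relaxation data and form a viscosity-like
    parameter from a Maxwell-type model (an assumed model, not the
    paper's analysis)."""
    (f_inf, f0, tau), _ = curve_fit(relaxation, t, load,
                                    p0=[load[-1], load[0], t[-1] / 3])
    k = (f0 - f_inf) / depth        # relaxing stiffness, N/m
    return k * tau                  # eta ~ k*tau, N*s/m (Maxwell dashpot)
```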
Design component method for sensitivity analysis of built-up structures
NASA Technical Reports Server (NTRS)
Choi, Kyung K.; Seong, Hwai G.
1986-01-01
A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.
Current approaches for the assessment of in situ biodegradation.
Bombach, Petra; Richnow, Hans H; Kästner, Matthias; Fischer, Anko
2010-04-01
Considering the high costs and technical difficulties associated with conventional remediation strategies, in situ biodegradation has become a promising approach for cleaning up contaminated aquifers. To verify if in situ biodegradation of organic contaminants is taking place at a contaminated site and to determine if these processes are efficient enough to replace conventional cleanup technologies, a comprehensive characterization of site-specific biodegradation processes is essential. In recent years, several strategies including geochemical analyses, microbial and molecular methods, tracer tests, metabolite analysis, compound-specific isotope analysis, and in situ microcosms have been developed to investigate the relevance of biodegradation processes for cleaning up contaminated aquifers. In this review, we outline current approaches for the assessment of in situ biodegradation and discuss their potential and limitations. We also discuss the benefits of research strategies combining complementary methods to gain a more comprehensive understanding of the complex hydrogeological and microbial interactions governing contaminant biodegradation in the field.
Biomarkers of Selenium Action in Prostate Cancer
2005-01-01
secretory by conventional methods according to published literature. In addition, we have determined the similarities and differences in global gene... arrays in the resulting data tables were ordered by their... hundred fifteen genes identified by ELISA method. ...transition zone tissue of a 42-year-old man according to previously described methods [4]. The pre... Replicating the conditions used for the SAM analysis showed significant differential expres-... microarray
Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen
2015-01-01
Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scans is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning the classification performance of a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and seamless performance tuning. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods can achieve better discriminative results and hold promise in the CAD application domain. PMID:26346558
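To make the contrast with an ad hoc pipeline concrete, a minimal CNN nodule classifier is sketched below: raw CT patches go in, class logits come out, and feature extraction is learned end-to-end. The architecture, patch size and class count are illustrative assumptions; the paper's DBN and CNN configurations are not reproduced.

```python
import torch
import torch.nn as nn

class NoduleCNN(nn.Module):
    """Minimal CNN for two-class classification of 32x32 CT nodule
    patches (illustrative architecture, not the paper's network)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.classifier = nn.Linear(32 * 8 * 8, 2)   # 32x32 -> 8x8 after pooling

    def forward(self, x):                 # x: (batch, 1, 32, 32)
        return self.classifier(self.features(x).flatten(1))

# training uses a standard cross-entropy loop; no hand-crafted feature
# or segmentation steps are needed, which is the point made above
logits = NoduleCNN()(torch.randn(4, 1, 32, 32))   # -> shape (4, 2)
```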
Environmental scanning electron microscope imaging examples related to particle analysis.
Wight, S A; Zeissler, C J
1993-08-01
This work provides examples of some of the imaging capabilities of environmental scanning electron microscopy applied to easily charged samples relevant to particle analysis. Environmental SEM (also referred to as high-pressure or low-vacuum SEM) can address uncoated samples that are known to be difficult to image. Most of these specimens are difficult to image by conventional SEM even when coated with a conductive layer. Another area where environmental SEM is particularly applicable is for specimens not compatible with high vacuum, such as volatile specimens. Samples imaged here, which may not otherwise have been possible by conventional methods, included fly ash particles on an oiled plastic membrane impactor substrate, a one-micrometer-diameter fiber mounted on the end of a wire, uranium oxide particles embedded in oil-bearing cellulose nitrate, Teflon and polycarbonate filter materials with collected airborne particulate matter, polystyrene latex spheres on cellulosic filter paper, polystyrene latex spheres "loosely" sitting on a glass slide, and subsurface tracks in an etched nuclear track-etch detector. Surface charging problems experienced in high-vacuum SEMs are virtually eliminated in the low-vacuum SEM, extending imaging capabilities to samples previously difficult to image or incompatible with conventional methods.
[Utility of MALDI-TOF MS for the identification of anaerobic bacteria].
Zárate, Mariela S; Romano, Vanesa; Nievas, Jimena; Smayevsky, Jorgelina
2014-01-01
The analysis by MALDI-TOF MS (matrix-assisted laser desorption/ionization time-of-flight mass spectrometry) has become a reference method for the identification of microorganisms in clinical microbiology. However, data on some groups of microorganisms are still controversial. The aim of this study was to determine the utility of MALDI-TOF MS for the identification of clinical isolates of anaerobic bacteria. One hundred and six anaerobic bacterial isolates were analyzed by MALDI-TOF MS and by conventional biochemical tests. In those cases where identification by conventional methodology was not applicable, or in the face of discordance between the two methodologies, 16S rRNA gene sequence analysis was performed. The conventional method and MALDI-TOF MS agreed at the genus and species level in 95.3% of cases. Concordance was 91.4% among gram-negative bacilli and 100% among gram-positive bacilli; there was also concordance in the 8 gram-positive cocci isolates studied and in the single gram-negative coccus included. The data obtained in this study demonstrate that MALDI-TOF MS offers the possibility of adequate identification of anaerobic bacteria.
[Analysis of scatterer microstructure feature based on Chirp-Z transform cepstrum].
Guo, Jianzhong; Lin, Shuyu
2007-12-01
The characterization of tissue scatterers has been a fundamental research field in medical ultrasound, and signal processing methods are widely used in this field. A new chirp-Z transform cepstrum method for estimating the mean spacing of tissue scatterers from ultrasonic scattered signals has been developed. Using this method together with the conventional AR cepstrum method, we processed the backscattered signals of tissue-mimicking material and pig liver in vitro. The results illustrate that the chirp-Z transform cepstrum method is effective for signal analysis of ultrasonic scattering and characterization of tissue scatterers, and that it can improve the resolution of mean spacing estimation of tissue scatterers.
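A sketch of a chirp-Z transform cepstrum is given below: the log-magnitude spectrum of an RF line is transformed onto a finely sampled quefrency band via the CZT, and the peak quefrency q gives the mean scatterer spacing d = cq/2 for pulse-echo data. The zoom band, the window-free preprocessing and the parameter values are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np
from scipy.signal import czt   # available in SciPy >= 1.8

def czt_cepstrum_spacing(x, fs, q1, q2, c=1540.0, m=512):
    """Mean scatterer spacing (m) from a backscattered RF line via a
    zoomed cepstrum evaluated on the quefrency band [q1, q2] seconds."""
    spec = np.log(np.abs(np.fft.fft(x)) + 1e-12)   # log-magnitude spectrum
    df = fs / x.size                               # spectral sample spacing, Hz
    a = np.exp(2j * np.pi * q1 * df)               # start of the quefrency zoom
    w = np.exp(-2j * np.pi * (q2 - q1) * df / m)   # zoom step
    ceps = np.abs(czt(spec, m, w, a))              # zoomed cepstrum magnitude
    q = q1 + np.arange(m) * (q2 - q1) / m          # quefrency axis, s
    return c * q[np.argmax(ceps)] / 2.0            # spacing d = c*q/2 (pulse-echo)
```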
Geometric Integration of Weakly Dissipative Systems
NASA Astrophysics Data System (ADS)
Modin, K.; Führer, C.; Söderlind, G.
2009-09-01
Some problems in mechanics, e.g. in bearing simulation, contain subsystems that are conservative as well as weakly dissipative subsystems. Our experience is that geometric integration methods are often superior for such systems, as long as the dissipation is weak. Here we develop adaptive methods for dissipative perturbations of Hamiltonian systems. The methods are "geometric" in the sense that the form of the dissipative perturbation is preserved. The methods are linearly explicit, i.e., they require the solution of a linear subsystem. We sketch an analysis in terms of backward error analysis, and numerical comparisons with a conventional RK method of the same order are given.
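One simple geometric scheme in this spirit is a splitting that composes the exact flow of a linear dissipative part with a symplectic leapfrog step for the Hamiltonian part (a "conformal symplectic" method). The sketch below illustrates that idea on a weakly damped oscillator; it is not the adaptive, linearly explicit scheme developed in the paper.

```python
import numpy as np

def conformal_leapfrog(q, p, h, grad_v, gamma):
    """One step for dq/dt = p, dp/dt = -grad_v(q) - gamma*p: exact
    linear-damping half-steps wrapped around a Stormer-Verlet step for
    the conservative part, so the dissipative perturbation's form is
    preserved (a sketch, not the paper's adaptive scheme)."""
    p = p * np.exp(-gamma * h / 2)     # exact dissipative flow, half step
    p = p - 0.5 * h * grad_v(q)        # leapfrog kick
    q = q + h * p                      # leapfrog drift
    p = p - 0.5 * h * grad_v(q)        # leapfrog kick
    p = p * np.exp(-gamma * h / 2)     # exact dissipative flow, half step
    return q, p

# weakly damped oscillator: energy decays smoothly at a rate ~ gamma
q, p, h = 1.0, 0.0, 0.01
for _ in range(10_000):
    q, p = conformal_leapfrog(q, p, h, lambda x: x, gamma=1e-3)
```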
Faraji, Hakim; Helalizadeh, Masoumeh; Kordi, Mohammad Reza
2018-01-01
A rapid, simple, and sensitive approach to the analysis of trihalomethanes (THMs) in swimming pool water samples has been developed. The main goal of this study was to overcome or improve upon the shortcomings of conventional dispersive liquid-liquid microextraction (DLLME) and to maximize the realization of green analytical chemistry principles. The method involves a simple vortex-assisted microextraction step in the absence of a dispersive solvent, followed by a salting-out effect that eliminates the centrifugation step. A bell-shaped device and a solidifiable solvent were used to simplify collection of the extraction solvent after phase separation. Optimization of the independent variables was performed using chemometric methods in three steps. The method was statistically validated based on authoritative guidance documents. The completion time for extraction was less than 8 min, and the limits of detection were in the range of 4 to 72 ng L⁻¹. Using this method, good linearity and precision were achieved. The results of THM determination in different real samples showed that in some cases the concentration of total THMs exceeded the threshold values determined by accredited healthcare organizations. The method exhibited satisfactory analytical figures of merit. Graphical Abstract: A novel green microextraction technique for overcoming the challenges of conventional DLLME. The proposed procedure complies with the principles of green/sustainable analytical chemistry, comprising decreasing the sample size, making automation of the process easy, reducing organic waste, diminishing energy consumption, replacing toxic reagents with safer reagents, and enhancing operator safety.
Thermal Desorption Analysis of Effective Specific Soil Surface Area
NASA Astrophysics Data System (ADS)
Smagin, A. V.; Bashina, A. S.; Klyueva, V. V.; Kubareva, A. V.
2017-12-01
A new method of assessing the effective specific surface area, based on the successive thermal desorption of water vapor at different temperature stages of sample drying, is analyzed in comparison with the conventional static adsorption method using a representative set of soil samples of different genesis and degree of dispersion. The theory of the method uses the fundamental relationship between the thermodynamic water potential (Ψ) and the absolute temperature of drying (T): Ψ = Q - aT, where Q is the specific heat of vaporization and a is a physically based parameter related to the initial temperature and relative humidity of the air in the external thermodynamic reservoir (laboratory). From gravimetric data on the mass fraction of water (W) and the Ψ value, Polanyi potential curves (W(Ψ)) for the studied samples are plotted. Water sorption isotherms are then calculated, from which the monolayer capacity and the target effective specific surface area are determined using the BET theory. Comparative analysis shows that the new method agrees well with the conventional estimation of the degree of dispersion by the BET and Kutilek methods over a wide range of specific surface area values between 10 and 250 m²/g.
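The chain from Ψ to surface area can be sketched as follows: relative vapor pressure is recovered from the water potential via the Kelvin relation, the linear BET plot over the usual 0.05-0.35 range yields the monolayer capacity, and the molecular cross-sectional area of water converts it to m²/g. The fitting window and the value of the water molecular area are common conventions assumed here, not values quoted from the paper.

```python
import numpy as np

R, M_W, N_A, A_M = 8.314, 0.018, 6.022e23, 1.06e-19  # SI units; A_M: water, m^2

def bet_surface_area(psi, w, temp=293.15):
    """Effective specific surface area (m^2/g) from water potential psi
    (J/kg, negative) and water content w (g/g). The Kelvin relation, the
    0.05-0.35 BET window and A_M are assumed conventions."""
    x = np.exp(psi * M_W / (R * temp))          # relative vapor pressure P/P0
    sel = (x > 0.05) & (x < 0.35)               # BET linear range
    y = x[sel] / (w[sel] * (1.0 - x[sel]))      # linear BET transform
    slope, intercept = np.polyfit(x[sel], y, 1)
    w_m = 1.0 / (slope + intercept)             # monolayer capacity, g/g
    return w_m / 18.0 * N_A * A_M               # m^2 per g of soil
```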
Nichols, Jessica E; Harries, Megan E; Lovestead, Tara M; Bruno, Thomas J
2014-03-21
In this paper we present results of the application of PLOT-cryoadsorption (PLOT-cryo) to the analysis of ignitable liquids in fire debris. We tested ignitable liquids, broadly divided into fuels and solvents (although the majority of the results presented here were obtained with gasoline and diesel fuel), on three substrates: Douglas fir, oak plywood and nylon carpet. We determined that PLOT-cryo allows the analyst to distinguish all of the ignitable liquids tested by use of a very rapid sampling protocol, and that it performs better (more recovered components, higher efficiency, lower elution solvent volumes) than a conventional purge-and-trap method. We also tested the effect of latency (the time period between applying the ignitable liquid and ignition), and we tested a variety of sampling times and PLOT capillary lengths. Reliable results can be obtained with sampling periods as short as 3 min and on PLOT capillaries as short as 20 cm. The variability of separate samples was also assessed, a study made possible by the high-throughput nature of the PLOT-cryo method. We also determined that the method performs better than the conventional carbon strip method that is commonly used in fire debris analysis.
Sajduda, Anna; Martin, Anandi; Portaels, Françoise; Palomino, Juan Carlos
2010-02-01
We developed a scheme for rapid identification of Mycobacterium species using an automated fluorescence capillary electrophoresis instrument. A 441-bp region of the hsp65 gene was examined using PCR-restriction analysis (PRA). The assay was initially evaluated on 38 reference strains. The observed sizes of restriction fragments were consistently smaller than the real sizes for each of the species as deduced from sequence analysis (mean variance = 7 bp). Nevertheless, the obtained PRA patterns were highly reproducible and resulted in correct species identifications. A blind test was then successfully performed on 64 test isolates previously characterized by conventional biochemical methods, a commercial INNO-LiPA Mycobacteria assay and/or sequence determination of the 5' end of the 16S rRNA gene. A total of 14 of 64 isolates were erroneously identified by conventional methods (78% accuracy). In contrast, PRA performed very well in comparison with the LiPA (89% concordance) and especially with DNA sequencing (93.3% concordant results). Also, PRA identified seven isolates representing five previously unreported hsp65 alleles. We conclude that hsp65 PRA based on automated capillary electrophoresis is a rapid, simple and reliable method for the identification of mycobacteria.
Exploring nursing students’ experience of peer learning in clinical practice
Ravanipour, Maryam; Bahreini, Masoud; Ravanipour, Masoumeh
2015-01-01
Background: Peer learning is an educational process wherein someone of the same age or experience level interacts with other students interested in the same topic. There is limited evidence specifically focusing on the practical use of peer learning in Iran. The aim of this study was to explore nursing students' experiences of peer learning in clinical practice. Materials and Methods: A qualitative content analysis was conducted. Focus groups were used to elicit the students' experiences of peer learning. Twenty-eight baccalaureate nursing students at Bushehr University of Medical Sciences were selected purposively and arranged in four groups of seven students each. The focus group interviews were conducted using a semi-structured interview schedule. All interviews were tape-recorded, transcribed verbatim, and analyzed using the conventional content analysis method. Results: The analysis identified four themes: paradoxical dualism, peer exploitation, first learning efficacy, and socialization practice. Gained advantages and perceived disadvantages created paradoxical dualism, and peer exploitation resulted from peer selection and peer training. Conclusion: Nursing students reported general satisfaction with peer learning because it offered much more in-depth learning with less stress than conventional learning methods. Peer learning is a useful method for nursing students to practice educational leadership and learn clinical skills before they get a job. PMID:26097860
Yang, S; Liu, D G
2014-01-01
Objectives: The purposes of the study are to investigate the consistency of linear measurements between CBCT orthogonally synthesized cephalograms and conventional cephalograms and to evaluate the influence of different magnifications on these comparisons based on a simulation algorithm. Methods: Conventional cephalograms and CBCT scans were taken on 12 dry skulls with spherical metal markers. Orthogonally synthesized cephalograms were created from CBCT data. Linear parameters on both cephalograms were measured via Photoshop CS v. 5.0 (Adobe® Systems, San Jose, CA), named measurement group (MG). Bland–Altman analysis was utilized to assess the agreement of two imaging modalities. Reproducibility was investigated using paired t-test. By a specific mathematical programme “cepha”, corresponding linear parameters [mandibular corpus length (Go-Me), mandibular ramus length (Co-Go), posterior facial height (Go-S)] on these two types of cephalograms were calculated, named simulation group (SG). Bland–Altman analysis was used to assess the agreement between MG and SG. Simulated linear measurements with varying magnifications were generated based on “cepha” as well. Bland–Altman analysis was used to assess the agreement of simulated measurements between two modalities. Results: Bland–Altman analysis suggested the agreement between measurements on conventional cephalograms and orthogonally synthesized cephalograms, with a mean bias of 0.47 mm. Comparison between MG and SG showed that the difference did not reach clinical significance. The consistency between simulated measurements of both modalities with four different magnifications was demonstrated. Conclusions: Normative data of conventional cephalograms could be used for CBCT orthogonally synthesized cephalograms during this transitional period. PMID:25029593
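The Bland–Altman computation used repeatedly above reduces to the bias (mean difference) and 95% limits of agreement between paired measurements, as in the generic sketch below; the example values are invented.

```python
import numpy as np

def bland_altman(conventional, synthesized):
    """Bias and 95% limits of agreement between paired measurements
    from two modalities (generic sketch)."""
    diff = np.asarray(synthesized, float) - np.asarray(conventional, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement
    return bias, (bias - half_width, bias + half_width)

# e.g. hypothetical Go-Me lengths (mm) from the two cephalogram types
bias, (lo, hi) = bland_altman([78.1, 80.4, 75.2], [78.6, 80.9, 75.8])
```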
Sahm, Maik; Otto, Ronny; Pross, Matthias; Mantke, Rene
2018-06-25
Approximately 90,000 thyroid operations are performed in Germany each year. Minimally invasive video-assisted thyroidectomy (MIVAT) accounts for 5 - 10% of these operations. There are few data comparing long-term cosmetic results after MIVAT with those after conventional surgery. Current systematic reviews show no advantage for MIVAT. The goal of this study was to analyse the long-term postoperative results of both procedures and to evaluate the relevant factors. The analysis of the long-term results is based on follow-up examinations using a validated method for scar appraisal (POSAS). Cohort analysis was performed on MIVAT operations in our hospital between 2004 and 2011 and on conventional thyroid operations in 2011. Follow-up examination data were analysed from 117 patients in the MIVAT group and 102 patients in the conventional group. The follow-up examination was performed at a mean of 23.1 vs. 23.6 months postoperatively (MIVAT vs. conventional). The Friedman test showed that scar pigmentation (mean rank 4.79) and scar surface structure (mean rank 3.62) were the deciding factors influencing the long-term cosmetic results. Both MIVAT and conventional surgery gave very good long-term cosmetic results. From the patient's perspective, there is no significant advantage with conventional surgery. The evaluation of the long-term results largely depends on factors such as scar pigmentation and surface structure that can only be influenced to a limited extent by the surgical procedure. Georg Thieme Verlag KG Stuttgart · New York.
Commercial Crop Yields Reveal Strengths and Weaknesses for Organic Agriculture in the United States.
Kniss, Andrew R; Savage, Steven D; Jabbour, Randa
2016-01-01
Land area devoted to organic agriculture has increased steadily over the last 20 years in the United States, and elsewhere around the world. A primary criticism of organic agriculture is lower yield compared to non-organic systems. Previous analyses documenting the yield deficiency in organic production have relied mostly on data generated under experimental conditions, but these studies do not necessarily reflect the full range of innovation or practical limitations that are part of commercial agriculture. The analysis we present here offers a new perspective, based on organic yield data collected from over 10,000 organic farmers representing nearly 800,000 hectares of organic farmland. We used publicly available data from the United States Department of Agriculture to estimate yield differences between organic and conventional production methods for the 2014 production year. Similar to previous work, organic crop yields in our analysis were lower than conventional crop yields for most crops. Averaged across all crops, organic yield was 67% of conventional yield [corrected]. However, several crops had no significant difference in yields between organic and conventional production, and organic yields surpassed conventional yields for some hay crops. The organic to conventional yield ratio varied widely among crops, and in some cases, among locations within a crop. For soybean (Glycine max) and potato (Solanum tuberosum), organic yield was more similar to conventional yield in states where conventional yield was greatest. The opposite trend was observed for barley (Hordeum vulgare), wheat (Triticum aestivum), and hay crops, however, suggesting that geographical yield potential has an inconsistent effect on the organic yield gap.
Commercial Crop Yields Reveal Strengths and Weaknesses for Organic Agriculture in the United States
Savage, Steven D.; Jabbour, Randa
2016-01-01
Land area devoted to organic agriculture has increased steadily over the last 20 years in the United States, and elsewhere around the world. A primary criticism of organic agriculture is lower yield compared to non-organic systems. Previous analyses documenting the yield deficiency in organic production have relied mostly on data generated under experimental conditions, but these studies do not necessarily reflect the full range of innovation or practical limitations that are part of commercial agriculture. The analysis we present here offers a new perspective, based on organic yield data collected from over 10,000 organic farmers representing nearly 800,000 hectares of organic farmland. We used publicly available data from the United States Department of Agriculture to estimate yield differences between organic and conventional production methods for the 2014 production year. Similar to previous work, organic crop yields in our analysis were lower than conventional crop yields for most crops. Averaged across all crops, organic yield was 80% of conventional yield. However, several crops had no significant difference in yields between organic and conventional production, and organic yields surpassed conventional yields for some hay crops. The organic to conventional yield ratio varied widely among crops, and in some cases, among locations within a crop. For soybean (Glycine max) and potato (Solanum tuberosum), organic yield was more similar to conventional yield in states where conventional yield was greatest. The opposite trend was observed for barley (Hordeum vulgare), wheat (Triticum aestivum), and hay crops, however, suggesting that geographical yield potential has an inconsistent effect on the organic yield gap. PMID:27552217
Vojdani, M; Torabi, K; Farjood, E; Khaledi, Aar
2013-09-01
Metal-ceramic crowns are the most commonly used complete-coverage restorations in daily clinical practice. The disadvantages of conventional hand-made wax patterns have prompted alternative fabrication routes based on CAD/CAM technologies. This study compares the marginal and internal fit of copings cast from CAD/CAM and conventionally fabricated wax patterns. Twenty-four standardized brass dies were prepared and randomly divided into 2 groups according to the wax-pattern fabrication method (CAD/CAM technique and conventional method) (n=12). All the wax patterns were fabricated in a standard fashion with respect to contour, thickness and internal relief (M1-M12: CAD/CAM group; C1-C12: conventional group). A CAD/CAM milling machine (Cori TEC 340i; imes-icore GmbH, Eiterfeld, Germany) was used to fabricate the CAD/CAM group wax patterns. The copings cast from the 24 wax patterns were cemented to the corresponding dies. For all coping-die assemblies, the cross-sectional technique was used to evaluate the marginal and internal fit at 15 points. Student's t-test was used for statistical analysis (α=0.05). The overall mean (SD) absolute marginal discrepancy (AMD) was 254.46 (25.10) μm for the CAD/CAM group and 88.08 (10.67) μm for the conventional (control) group. The overall mean internal gap total (IGT) was 110.77 (5.92) μm for the CAD/CAM group and 76.90 (10.17) μm for the conventional group. Student's t-test revealed significant differences between the 2 groups: marginal and internal gaps were significantly larger at all measured areas in the CAD/CAM group than in the conventional group (p < 0.001). Within the limitations of this study, the conventional method of wax-pattern fabrication produced copings with significantly better marginal and internal fit than the CAD/CAM (machine-milled) technique. All factors were standardized between the 2 groups except the wax-pattern fabrication technique; therefore, only the conventional group yielded copings with clinically acceptable margins of less than 120 μm.
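The groupwise comparison reported above can be illustrated with a Student's t-test sketch; the values are simulated from the reported means and SDs, not the study's actual measurements:

```python
# Two-sample Student's t-test (equal variances assumed, as in the abstract).
# Discrepancy values are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
cadcam       = rng.normal(254.46, 25.10, 12)   # AMD in um, simulated
conventional = rng.normal(88.08, 10.67, 12)    # AMD in um, simulated

t, p = stats.ttest_ind(cadcam, conventional, equal_var=True)
print(f"t = {t:.2f}, p = {p:.3g}")
```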
Added Value of Assessing Adnexal Masses with Advanced MRI Techniques
Thomassin-Naggara, I.; Balvay, D.; Rockall, A.; Carette, M. F.; Ballester, M.; Darai, E.; Bazot, M.
2015-01-01
This review will present the added value of perfusion and diffusion MR sequences to characterize adnexal masses. These two functional MR techniques are readily available in routine clinical practice. We will describe the acquisition parameters and a method of analysis to optimize their added value compared with conventional images. We will then propose a model of interpretation that combines the anatomical and morphological information from conventional MRI sequences with the functional information provided by perfusion and diffusion weighted sequences. PMID:26413542
Farjood, Ehsan; Vojdani, Mahroo; Torabi, Kiyanoosh; Khaledi, Amir Ali Reza
2017-01-01
Given the limitations of conventional waxing, computer-aided design and computer-aided manufacturing (CAD-CAM) technologies have been developed as alternative methods of making patterns. The purpose of this in vitro study was to compare the marginal and internal fit of metal copings derived from wax patterns fabricated by rapid prototyping (RP) with those created by the conventional handmade technique. Twenty-four standardized brass dies were milled and divided into 2 groups (n=12) according to the wax pattern fabrication method. The CAD-RP group was assigned as the experimental group, and the conventional group as the control group. The cross-sectional technique was used to assess the marginal and internal discrepancies at 15 points on the master die by using a digital microscope. An independent t test was used for statistical analysis (α=.01). The CAD-RP group had a total mean (±SD) absolute marginal discrepancy of 117.1 (±11.5) μm and a mean marginal discrepancy of 89.8 (±8.3) μm. The conventional group had an absolute marginal discrepancy of 88.1 (±10.7) μm and a mean marginal discrepancy of 69.5 (±15.6) μm. The overall mean (±SD) of the total internal discrepancy, separately calculated as the axial internal discrepancy and occlusal internal discrepancy, was 95.9 (±8.0) μm for the CAD-RP group and 76.9 (±10.2) μm for the conventional group. The independent t test results showed significant differences between the 2 groups. The CAD-RP group had larger discrepancies at all measured areas than the conventional group, which was statistically significant (P<.01). Within the limitations of this in vitro study, the conventional method of wax pattern fabrication produced copings with better marginal and internal fit than the CAD-RP method. However, the marginal and internal fit for both groups were within clinically acceptable ranges. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Gray, Bonnie L.
2012-04-01
Microfluidics is revolutionizing laboratory methods and biomedical devices, offering new capabilities and instrumentation in multiple areas such as DNA analysis, proteomics, enzymatic analysis, single-cell analysis, immunology, point-of-care medicine, personalized medicine, drug delivery, and environmental toxin and pathogen detection. For many applications (e.g., wearable and implantable health monitors, drug delivery devices, and prosthetics), mechanically flexible polymer devices and systems that can conform to the body offer benefits that cannot be achieved with systems based on conventional rigid substrate materials. However, difficulties in implementing active devices and reliable packaging technologies have limited the success of flexible microfluidics. Focusing on highly compliant materials such as PDMS that are typically employed for prototyping, we review mechanically flexible polymer microfluidic technologies based on free-standing polymer substrates and novel electronic and microfluidic interconnection schemes. Central to these new technologies are hybrid microfabrication methods employing novel nanocomposite polymer materials and devices. We review microfabrication methods using these materials, along with demonstrations of example devices and packaging schemes that employ them. We place these recent developments in the context of the fields of flexible microfluidics and conformable systems, and discuss cross-over applications to conventional rigid-substrate microfluidics.
Tambe, Varsha H; Nagmode, Pradnya S; Vishwas, Jayshree R; P, Saujanya K; Angadi, Prabakar; Ali, Fareedi Mukram
2013-01-01
Background: To compare the amount of debris extruded apically by conventional syringe, EndoVac and ultrasonic irrigation. Materials & Methods: Thirty freshly extracted mandibular premolars were selected, the working length was determined, and the teeth were mounted in a debris-collection apparatus. The canals were prepared. After each instrument change, 1 ml of 3% sodium hypochlorite was used as the irrigant. The debris extruded apically with the conventional syringe, EndoVac and ultrasonic irrigation techniques was weighed using an electronic balance, and statistical analysis was performed. Mean differences within and between the groups were determined assuming equal variances. Results: Among all the groups, significantly less debris was found apically in the EndoVac group (0.96) compared with the conventional syringe and ultrasonic groups (1.23). Conclusion: The present study showed that the EndoVac system extrudes less debris apically than ultrasonic irrigation, followed by the conventional syringe, so the incidence of flare-ups can be reduced by using the EndoVac irrigation system. How to cite this article: Tambe V H, Nagmode P S, Vishwas J R, Saujanya K P, Angadi P, Ali F M. Evaluation of the Amount of Debris extruded apically by using Conventional Syringe, Endovac and Ultrasonic Irrigation Technique: An In Vitro Study. J Int Oral Health 2013; 5(3):63-66. PMID:24155604
Analysis of International Space Station Materials on MISSE-3 and MISSE-4
NASA Technical Reports Server (NTRS)
Finckenor, Miria M.; Golden, Johnny L.; O'Rourke, Mary Jane
2008-01-01
For high-temperature applications (>2,000°C) such as solid rocket motors, hypersonic aircraft, nuclear electric/thermal propulsion for spacecraft, and more efficient jet engines, creep becomes one of the most important design factors to be considered. Conventional creep-testing methods, where the specimen and test apparatus are in contact with each other, are limited to temperatures below about 1,700°C. Development of alloys for higher-temperature applications is limited by the availability of testing methods at temperatures above 2,000°C. Development of alloys for applications requiring a long service life at temperatures as low as 1,500°C, such as the next generation of jet turbine superalloys, is limited by the difficulty of accelerated testing at temperatures above 1,700°C. For these reasons, a new, non-contact creep-measurement technique is needed for higher-temperature applications. A new non-contact method for creep measurements of ultra-high-temperature metals and ceramics has been developed and validated. Using the electrostatic levitation (ESL) facility at NASA Marshall Space Flight Center, a spherical sample is rotated quickly enough to cause creep deformation due to centrifugal acceleration. Very accurate measurement of the deformed shape through digital image analysis allows the stress exponent n to be determined very precisely from a single test, rather than from numerous conventional tests. Validation tests on single-crystal niobium spheres showed excellent agreement with conventional tests at 1,985°C; however, the non-contact method provides much greater precision while using only about 40 milligrams of material. This method is being applied to materials including metals and ceramics for non-eroding throats in solid rockets and next-generation superalloys for turbine engines. Recent advances in the method and the current state of these new measurements will be presented.
Jitian, Simion; White, Samuel R; Yang, H-H Wendy; Weisz, Adrian
2014-01-10
Specifications in the U.S. Code of Federal Regulations for the color additive D&C Green No. 8 (G8; Colour Index No. 59040) limit the levels of the subsidiary colors 1,3,6-pyrenetrisulfonic acid trisodium salt (P3S) and 1,3,6,8-pyrenetetrasulfonic acid tetrasodium salt (P4S). The present paper describes a comparative study of two possible methods to replace the currently used multi-step TLC/spectrophotometry method for separating and quantifying the minor components P3S and P4S in G8. One of the new approaches uses conventional high-performance liquid chromatography (HPLC) and the other, derivative spectrophotometry. While the derivative spectrophotometric method was shown to be inadequate for the analysis of minor components overwhelmed by components of much higher concentration, the HPLC method proved highly effective. The closely related, very polar compounds P3S and P4S were separated by the new HPLC method in less than 4 min using a conventional HPLC instrument. P3S and P4S were quantified by using five-point calibration curves with data points that ranged from 0.45 to 7.63% and from 0.13 to 1.82%, by weight, for P3S and P4S, respectively. The HPLC method was applied to the analysis of test portions from 20 batches of D&C Green No. 8 submitted to the U.S. Food and Drug Administration for certification. Published by Elsevier B.V.
Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw
2017-01-01
Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes. PMID:29250096
Application of Grey Relational Analysis to Decision-Making during Product Development
ERIC Educational Resources Information Center
Hsiao, Shih-Wen; Lin, Hsin-Hung; Ko, Ya-Chuan
2017-01-01
A multi-attribute decision-making (MADM) approach was proposed in this study as a prediction method that differs from the conventional production and design methods for a product. When a client has different dimensional requirements, this approach can quickly provide a company with design decisions for each product. The production factors of a…
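For readers unfamiliar with grey relational analysis, the following hedged sketch shows the standard GRA computation (deviation sequences, grey relational coefficients with distinguishing coefficient ζ = 0.5, and grades); the alternatives and attribute values are invented and are not from the study:

```python
# Grey relational analysis (GRA) sketch for ranking alternatives against an
# ideal reference sequence. Data are illustrative only.
import numpy as np

X = np.array([[0.7, 0.9, 0.6],    # alternative A: normalized attributes in [0, 1]
              [0.8, 0.5, 0.9],    # alternative B
              [0.6, 0.8, 0.8]])   # alternative C
ref = X.max(axis=0)               # ideal reference (larger-is-better attributes)

delta = np.abs(X - ref)           # absolute deviation sequences
zeta = 0.5                        # distinguishing coefficient, conventionally 0.5
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grades = coeff.mean(axis=1)       # grey relational grade per alternative
print(grades, "best alternative:", grades.argmax())
```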
Improved methods of vibration analysis of pretwisted, airfoil blades
NASA Technical Reports Server (NTRS)
Subrahmanyam, K. B.; Kaza, K. R. V.
1984-01-01
Vibration analysis of pretwisted blades of asymmetric airfoil cross section is performed by using two mixed variational approaches. Numerical results obtained from these two methods are compared to those obtained from an improved finite difference method and also to those given by the ordinary finite difference method. The relative merits, convergence properties and accuracies of all four methods are studied and discussed. The effects of asymmetry and pretwist on natural frequencies and mode shapes are investigated. The improved finite difference method is shown to be far superior to the conventional finite difference method in several respects. Close lower-bound solutions are provided by the improved finite difference method for untwisted blades with a relatively coarse mesh, while the mixed methods have not indicated any specific bound.
Granchi, Simona; Vannacci, Enrico; Biagi, Elena
2017-04-22
To evaluate the capability of the HyperSPACE (Hyper SPectral Analysis for Characterization in Echography) method in tissue characterization, in order to provide information for the laser treatment of benign thyroid nodules beyond that available from conventional B-mode images and elastography. The method, based on spectral analysis of the raw radiofrequency ultrasonic signal, was applied to characterize the nodule before and after laser treatment. Thirty patients (25 females and 5 males, aged between 37 and 81 years) with a benign thyroid nodule at cytology (Thyr 2) were evaluated by conventional ultrasonography, elastography, and HyperSPACE, before and after laser ablation. The images processed by HyperSPACE exhibit different color distributions that correspond to different tissue features. By calculating the percentage color coverages, the analysed nodules were subdivided into 3 groups. Nodules belonging to the same group experienced, on average, similar extents of necrosis. The nodules exhibit different configuration (color) distributions that could be indicative of the response of nodular tissue to the laser treatment. Conclusions: HyperSPACE can characterize benign nodules by providing additional information beyond conventional ultrasound and elastography, which is useful for supporting the laser treatment of nodules in order to increase the probability of success.
Knowledge-Base Semantic Gap Analysis for the Vulnerability Detection
NASA Astrophysics Data System (ADS)
Wu, Raymond; Seki, Keisuke; Sakamoto, Ryusuke; Hisada, Masayuki
Web security has become a pressing concern in Internet computing. To cope with ever-rising security complexity, semantic analysis is proposed to fill the gap that current approaches fail to address. Conventional methods limit their focus to the physical source code rather than an abstraction of its semantics; they thus bypass new types of vulnerability and cause tremendous business loss.
Perception of Spiritual Health: A Qualitative Content Analysis in Iranian Older Adults
ERIC Educational Resources Information Center
Zibad, Hosein Ajam; Foroughan, Mahshid; Shahboulaghi, Farahnaz Mohammadi; Rafiey, Hassan; Rassouli, Maryam
2017-01-01
The present study was performed with the aim to explain older people's perceptions of spiritual health. It was conducted using the conventional content analysis method. Twelve individuals aged 60 years or older with normal cognition participated in the study using purposive sampling. Data were collected by in-depth interviews. Data analysis…
NASA Astrophysics Data System (ADS)
He, Xin; Frey, Eric C.
2007-03-01
Binary ROC analysis has solid decision-theoretic foundations and a close relationship to linear discriminant analysis (LDA). In particular, for the case of Gaussian equal covariance input data, the area under the ROC curve (AUC) value has a direct relationship to the Hotelling trace. Many attempts have been made to extend binary classification methods to multi-class. For example, Fukunaga extended binary LDA to obtain multi-class LDA, which uses the multi-class Hotelling trace as a figure-of-merit, and we have previously developed a three-class ROC analysis method. This work explores the relationship between conventional multi-class LDA and three-class ROC analysis. First, we developed a linear observer, the three-class Hotelling observer (3-HO). For Gaussian equal covariance data, the 3-HO provides equivalent performance to the three-class ideal observer and, under less strict conditions, maximizes the signal to noise ratio for classification of all pairs of the three classes simultaneously. The 3-HO templates are not the eigenvectors obtained from multi-class LDA. Second, we show that the three-class Hotelling trace, which is the figure-of-merit in the conventional three-class extension of LDA, has significant limitations. Third, we demonstrate that, under certain conditions, there is a linear relationship between the eigenvectors obtained from multi-class LDA and 3-HO templates. We conclude that the 3-HO based on decision theory has advantages both in its decision theoretic background and in the usefulness of its figure-of-merit. Additionally, there exists the possibility of interpreting the two linear features extracted by the conventional extension of LDA from a decision theoretic point of view.
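A minimal sketch of the binary Hotelling observer underlying this discussion, on synthetic Gaussian equal-covariance data (the three-class extension of the paper is not reproduced here):

```python
# Binary Hotelling observer: template w = S^-1 (mu1 - mu0); for Gaussian
# equal-covariance data, AUC = Phi(SNR / sqrt(2)). Data are synthetic.
import numpy as np
from scipy.stats import norm

d = 5                                    # feature dimension (arbitrary)
mu0, mu1 = np.zeros(d), np.full(d, 0.8)  # class means
S = np.eye(d)                            # common covariance
w = np.linalg.solve(S, mu1 - mu0)        # Hotelling template

snr2 = (mu1 - mu0) @ w                   # squared detectability (Hotelling trace, 2-class)
auc = norm.cdf(np.sqrt(snr2) / np.sqrt(2))
print(f"SNR = {np.sqrt(snr2):.2f}, AUC = {auc:.3f}")
```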
DOT National Transportation Integrated Search
1971-04-01
An automated fluorometric trihydroxyindole procedure is described for the measurement of norepinephrine (NE) and epinephrine (E) in blood plasma or urine. The method employs conventional techniques for isolation of the catecholamines by alumina colum...
Woynaroski, Tiffany; Oller, D. Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul
2017-01-01
Theory and research suggest that vocal development predicts “useful speech” in preschoolers with autism spectrum disorder (ASD), but conventional methods for measuring vocal development are costly and time-consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human-coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently “in development” and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time-intensive and expensive conventional communication samples for measuring the vocal development of preverbal preschoolers with ASD in research and clinical practice. PMID:27459107
Vaccaro, Calogero; Busetto, Roberto; Bernardini, Daniele; Anselmi, Carlo; Zotti, Alessandro
2012-03-01
To evaluate the precision and accuracy of assessing bone mineral density (BMD) by use of mean gray value (MGV) on digitalized and digital images of conventional and digital radiographs, respectively, of ex vivo bovine and equine bone specimens in relation to the gold-standard technique of dual-energy x-ray absorptiometry (DEXA). Left and right metatarsal bones from 11 beef cattle and right femurs from 2 horses. Bovine specimens were imaged by use of conventional radiography, whereas equine specimens were imaged by use of computed radiography (digital radiography). Each specimen was subsequently scanned by use of the same DEXA equipment. The BMD values resulting from each DEXA scan were paired with the MGVs obtained by use of software on the corresponding digitalized or digital radiographic image. The MGV analysis of digitalized and digital x-ray images was a precise (coefficient of variation, 0.1 and 0.09, respectively) and highly accurate method for assessing BMD, compared with DEXA (correlation coefficient, 0.910 and 0.937 for conventional and digital radiography, respectively). The high correlation between MGV and BMD indicated that MGV analysis may be a reliable alternative to DEXA in assessing radiographic bone density. This may provide a new, inexpensive, and readily available estimate of BMD.
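A hedged sketch of the MGV-versus-BMD comparison: compute the mean gray value over a region of interest of a radiographic image, then correlate per-specimen values with DEXA; all numbers are placeholders, not the study's data:

```python
# Mean gray value (MGV) of an ROI and its correlation with DEXA BMD.
# The image and all values are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(200, 200)).astype(float)  # stand-in radiograph
roi = (slice(50, 120), slice(60, 140))                       # region over the bone
print(f"MGV = {image[roi].mean():.1f}")

# Hypothetical per-specimen MGV and DEXA BMD values -> Pearson correlation
mgv_all = np.array([112.3, 98.7, 130.5, 105.2, 121.8])
bmd_all = np.array([0.92, 0.78, 1.10, 0.85, 1.01])           # g/cm^2
print(f"r = {np.corrcoef(mgv_all, bmd_all)[0, 1]:.3f}")      # paper reports r ~ 0.91-0.94
```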
Shivakumarswamy, Udasimath; Arakeri, Surekha U; Karigowdar, Mahesh H; Yelikar, BR
2012-01-01
Background: The cytological examination of serous effusions is well accepted, and a positive diagnosis is often considered definitive. It helps in the staging, prognosis and management of patients with malignancies and also gives information about various inflammatory and non-inflammatory lesions. Diagnostic problems arise in everyday practice in differentiating reactive atypical mesothelial cells from malignant cells by the routine conventional smear (CS) method. Aims: To compare the morphological features of the CS method with those of the cell block (CB) method and to assess the utility and sensitivity of the CB method in the cytodiagnosis of pleural effusions. Materials and Methods: The study was conducted in the cytology section of the Department of Pathology. Sixty pleural fluid samples were subjected to diagnostic evaluation over a period of 20 months. Along with the conventional smears, cell blocks were prepared using 10% alcohol–formalin as a fixative agent. Statistical analysis with the ‘z test’ was performed to assess cellularity using the CS and CB methods. McNemar's χ² test was used to identify the additional yield for malignancy by the CB method. Results: Cellularity and the additional yield for malignancy were 15% greater with the CB method. Conclusions: The CB method provides higher cellularity, better architectural patterns and morphological features, and an additional yield of malignant cells, and thereby increases the sensitivity of cytodiagnosis compared with the CS method. PMID:22438610
An improved silver staining procedure for schizodeme analysis in polyacrylamide gradient gels.
Gonçalves, A M; Nehme, N S; Morel, C M
1990-01-01
A simple protocol is described for the silver staining of polyacrylamide gradient gels used for the separation of restriction fragments of kinetoplast DNA [schizodeme analysis of trypanosomatids (Morel et al., 1980)]. The method overcomes the problems of non-uniform staining and strong background color which are frequently encountered when conventional protocols for silver staining of linear gels are applied to gradient gels. The method described has proven to be of general applicability for DNA, RNA and protein separations in gradient gels.
Haiko, Johanna; Savolainen, Laura E; Hilla, Risto; Pätäri-Sampo, Anu
2016-10-01
Complicated urinary tract infections, such as pyelonephritis, may lead to sepsis. Rapid diagnosis is needed to identify the causative urinary pathogen and to verify the appropriate empirical antimicrobial therapy. We describe here a rapid identification method for urinary pathogens: urine is incubated on chocolate agar for 3 h at 35°C with 5% CO2 and subjected to MALDI-TOF MS analysis by VITEK MS. Overall, 207 screened clinical urine samples were tested in parallel with conventional urine culture. The method, called U-si-MALDI-TOF (urine short incubation MALDI-TOF), showed correct identification for 86% of Gram-negative urinary tract pathogens (Escherichia coli, Klebsiella pneumoniae, and other Enterobacteriaceae) when present at >10⁵ cfu/ml in culture (n=107), compared with the conventional culture method. However, Gram-positive bacteria (n=28) were not successfully identified by U-si-MALDI-TOF. This method is especially suitable for rapid identification of E. coli, the most common cause of urinary tract infections and urosepsis. Turnaround time for identification using U-si-MALDI-TOF compared with conventional urine culture was improved from 24 h to 4–6 h. Copyright © 2016 Elsevier B.V. All rights reserved.
Shin, Hye Young; Suh, Mina; Baik, Hyung Won; Choi, Kui Son; Park, Boyoung; Jun, Jae Kwan; Hwang, Sang-Hyun; Kim, Byung Chang; Lee, Chan Wha; Oh, Jae Hwan; Lee, You Kyoung; Han, Dong Soo; Lee, Do-Hoon
2016-11-15
We are in the process of conducting a randomized trial to determine whether compliance with the fecal immunochemical test (FIT) for colorectal cancer screening differs according to the stool-collection method. This study was an interim analysis of the performance of two stool-collection devices (sampling bottle vs conventional container). In total, 1,701 individuals (age range, 50 to 74 years) were randomized into the sampling bottle group (intervention arm) or the conventional container group (control arm). In both groups, we evaluated the FIT positivity rate, the positive predictive value for advanced neoplasia, and the detection rate for advanced neoplasia. The FIT positivity rates were 4.1% for the sampling bottles and 2.0% for the conventional containers; these values were significantly different. The positive predictive values for advanced neoplasia in the sampling bottles and conventional containers were 11.1% (95% confidence interval [CI], -3.4 to 25.6) and 12.0% (95% CI, -0.7 to 24.7), respectively. The detection rates for advanced neoplasia in the sampling bottles and conventional containers were 4.5 per 1,000 persons (95% CI, 2.0 to 11.0) and 2.4 per 1,000 persons (95% CI, 0.0 to 5.0), respectively. The impact of these findings on FIT screening performance was unclear in this interim analysis. This impact should therefore be evaluated in the final analysis following the final enrollment period.
da Silva Neto, Ulisses Tavares; Joly, Julio Cesar; Gehrke, Sergio Alexandre
2014-02-01
We used resonance frequency analysis to evaluate the implant stability quotient (ISQ) of dental implants installed in sites prepared by either conventional drilling or piezoelectric tips. We studied 30 patients with bilateral edentulous areas in the maxillary premolar region who were randomised to have the implant inserted with conventional drilling or with piezoelectric surgery. The stability of each implant was measured by resonance frequency analysis immediately after placement to assess the immediate stability (time 1) and again at 90 days (time 2) and 150 days (time 3). In the conventional group the mean (SD) ISQ for time 1 was 69.1 (6.1) (95% CI 52.4-77.3); for time 2, 70.7 (5.7) (95% CI 60.4-82.8); and for time 3, 71.7 (4.5) (95% CI 64.2-79.2). In the piezosurgery group the corresponding values were: 77.5 (4.6) (95% CI 71.1-84.3) for time 1, 77.0 (4.2) (95% CI 69.7-85.2) for time 2, and 79.1 (3.1) (95% CI 74.5-87.3) for time 3. The results showed significant increases in the ISQ values for the piezosurgery group at each time point (p=0.04). The stability of implants placed using the piezoelectric method was greater than that of implants placed using the conventional technique. Copyright © 2013 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Design sensitivity analysis with Applicon IFAD using the adjoint variable method
NASA Technical Reports Server (NTRS)
Frederick, Marjorie C.; Choi, Kyung K.
1984-01-01
A numerical method is presented to implement structural design sensitivity analysis, using the versatility and convenience of an existing finite element structural analysis program and the theoretical foundation of structural design sensitivity analysis. Conventional design variables, such as thickness and cross-sectional areas, are considered. The structural performance functionals considered include compliance, displacement, and stress. It is shown that the calculations can be carried out outside existing finite element codes, using postprocessing data only; that is, design sensitivity analysis software does not have to be embedded in an existing finite element code. The finite element structural analysis program used in the implementation presented is IFAD. Feasibility of the method is shown through the analysis of several problems, including built-up structures. Accurate design sensitivity results are obtained without the uncertainty of numerical accuracy associated with the selection of a finite difference perturbation.
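The adjoint-variable idea for a self-adjoint functional such as compliance can be sketched on a toy spring chain; this is an illustration of the general technique using only post-processed displacements, not the IFAD implementation:

```python
# Adjoint sensitivity of compliance C = f^T u with K(k) u = f.
# Compliance is self-adjoint, so dC/dk_i = -u^T (dK/dk_i) u; only the
# displacement solution is needed, as in the postprocessing approach above.
import numpy as np

k = np.array([2.0, 3.0, 4.0])                 # design variables: spring stiffnesses
f = np.array([0.0, 0.0, 1.0])                 # load vector (toy data)

def assemble(k):
    K = np.zeros((3, 3))
    for i, ki in enumerate(k):                # springs: ground-1, 1-2, 2-3
        if i == 0:
            K[0, 0] += ki
        else:
            K[np.ix_([i - 1, i], [i - 1, i])] += ki * np.array([[1, -1], [-1, 1]])
    return K

u = np.linalg.solve(assemble(k), f)           # displacements from the "FE code"
for i in range(3):
    dK = assemble(np.eye(3)[i])               # dK/dk_i (assembly is linear in k)
    print(f"dC/dk{i} = {-u @ dK @ u:.4f}")
```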
Piippo-Huotari, Oili; Norrman, Eva; Anderzén-Carlsson, Agneta; Geijer, Håkan
2018-05-01
The radiation dose for patients can be reduced by many methods, and one way is to use abdominal compression. In this study, the radiation dose and image quality for a new patient-controlled compression device were compared with conventional compression and compression in the prone position. The aim was to compare the radiation dose and image quality of patient-controlled compression with those of conventional and prone compression in general radiography. An experimental design with a quantitative approach was used. After obtaining the approval of the ethics committee, a consecutive sample of 48 patients was examined with the standard clinical urography protocol. The radiation doses were measured as dose-area product and analyzed with a paired t-test. The image quality was evaluated by visual grading analysis. Four radiologists evaluated each image individually by scoring nine criteria modified from the European quality criteria for diagnostic radiographic images. There was no significant difference in radiation dose or image quality between conventional and patient-controlled compression. The prone position resulted in both a higher dose and inferior image quality. Patient-controlled compression gave dose levels similar to conventional compression and lower than prone compression. Image quality was similar with both patient-controlled and conventional compression and was judged to be better than in the prone position.
Video-based teleradiology for intraosseous lesions. A receiver operating characteristic analysis.
Tyndall, D A; Boyd, K S; Matteson, S R; Dove, S B
1995-11-01
Private dental practitioners often lack immediate access to off-site expert diagnostic consultants regarding unusual radiographic findings or radiographic quality assurance issues. Teleradiology, a system for transmitting radiographic images, offers a potential solution to this problem. Although much research has been done to evaluate the feasibility and utilization of teleradiology systems in medical imaging, little research on dental applications has been performed. In this investigation, 47 panoramic films with an equal distribution of images showing intraosseous jaw lesions or no disease were viewed by a panel of observers using teleradiology and conventional viewing methods. The teleradiology system consisted of an analog video-based system simulating remote radiographic consultation between a general dentist and a dental imaging specialist. Conventional viewing consisted of traditional viewbox methods. Observers were asked to identify the presence or absence of 24 intraosseous lesions and to determine their locations. No statistically significant differences between modalities or observers were identified at the 0.05 level. The results indicate that viewing intraosseous lesions on video-based panoramic images is equivalent to conventional viewbox viewing.
Analysis of Endocrine Disrupting Pesticides by Capillary GC with Mass Spectrometric Detection
Matisová, Eva; Hrouzková, Svetlana
2012-01-01
Endocrine disrupting chemicals, among them many pesticides, alter the normal functioning of the endocrine system of both wildlife and humans at very low concentration levels. Therefore, the importance of method development for their analysis in food and the environment is increasing. This also covers contributions in the field of ultra-trace analysis of multicomponent mixtures of organic pollutants in complex matrices. Consequently, conventional capillary gas chromatography (CGC) and fast CGC with mass spectrometric detection (MS) have acquired real importance in the analysis of endocrine disrupting pesticide (EDP) residues. This paper provides an overview of GC methods, including sample preparation steps, for the analysis of EDPs in a variety of matrices at ultra-trace concentration levels. Emphasis is placed on the separation method, the mode of MS detection and ionization, and the limits of detection and quantification obtained. Analysis time is one of the most important aspects to consider when choosing analytical methods for routine analysis. Therefore, the benefits of the fast GC methods developed are important. PMID:23202677
NASA Technical Reports Server (NTRS)
Klein, L. R.
1974-01-01
The free vibrations of elastic structures of arbitrary complexity were analyzed in terms of their component modes. The method was based upon the use of the normal unconstrained modes of the components in a Rayleigh-Ritz analysis. The continuity conditions were enforced by means of Lagrange Multipliers. Examples of the structures considered are: (1) beams with nonuniform properties; (2) airplane structures with high or low aspect ratio lifting surface components; (3) the oblique wing airplane; and (4) plate structures. The method was also applied to the analysis of modal damping of linear elastic structures. Convergence of the method versus the number of modes per component and/or the number of components is discussed and compared to more conventional approaches, ad-hoc methods, and experimental results.
A GPU-based calculation using the three-dimensional FDTD method for electromagnetic field analysis.
Nagaoka, Tomoaki; Watanabe, Soichi
2010-01-01
Numerical simulations with numerical human models using the finite-difference time-domain (FDTD) method have recently been performed frequently in a number of fields in biomedical engineering. However, the FDTD calculation runs too slowly. We focus, therefore, on general-purpose computing on the graphics processing unit (GPGPU). The three-dimensional FDTD method was implemented on the GPU using the Compute Unified Device Architecture (CUDA). In this study, we used the NVIDIA Tesla C1060 as a GPGPU board. The performance of the GPU was evaluated in comparison with that of a conventional CPU and a vector supercomputer. The results indicate that three-dimensional FDTD calculations using a GPU can significantly reduce the run time compared with a conventional CPU, even for a native GPU implementation of the three-dimensional FDTD method, although the GPU/CPU speed ratio varies with the calculation domain and thread block size.
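The core FDTD update stencil that such GPU ports parallelize can be shown in one dimension; this NumPy sketch uses normalized units and an arbitrary grid, and illustrates the method only, not the paper's CUDA kernel:

```python
# 1D FDTD leapfrog update (normalized units, Courant number 0.5 folded in).
# Each array update below is the loop a GPU would map onto threads.
import numpy as np

nx, nt = 400, 1000
ez = np.zeros(nx)          # electric field on integer grid points
hy = np.zeros(nx - 1)      # magnetic field, staggered half a cell
for t in range(nt):
    hy += 0.5 * (ez[1:] - ez[:-1])                 # H update from curl of E
    ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])           # E update; PEC at both ends
    ez[nx // 2] += np.exp(-((t - 30) / 10) ** 2)   # soft Gaussian source
print(ez[:5])
```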
Matsudate, Yoshihiro; Naruto, Takuya; Hayashi, Yumiko; Minami, Mitsuyoshi; Tohyama, Mikiko; Yokota, Kenji; Yamada, Daisuke; Imoto, Issei; Kubo, Yoshiaki
2017-06-01
Nevoid basal cell carcinoma syndrome (NBCCS) is an autosomal dominant disorder mainly caused by heterozygous mutations of PTCH1. In addition to characteristic clinical features, detection of a mutation in causative genes is reliable for the diagnosis of NBCCS; however, no mutations have been identified in some patients using conventional methods. To improve the method for the molecular diagnosis of NBCCS, we performed targeted exome sequencing (TES) analysis using a multi-gene panel, including PTCH1, PTCH2, SUFU, and other sonic hedgehog signaling pathway-related genes, based on next-generation sequencing (NGS) technology in 8 cases in whom possible causative mutations were not detected by previously performed conventional analysis and in 2 recent cases of NBCCS. Subsequent analysis of gross deletions within or around PTCH1 detected by TES was performed using chromosomal microarray (CMA). Through TES analysis, specific single nucleotide variants or small indels of PTCH1 causing inferred amino acid changes were identified in 2 novel cases and 2 undiagnosed cases, whereas gross deletions within or around PTCH1, validated by CMA, were found in 3 undiagnosed cases. However, no mutations were detected, even by TES, in 3 cases. Among the 3 cases with gross deletions of PTCH1, deletions containing the entire PTCH1 and additional neighboring genes were detected in 2 cases, one of which exhibited atypical clinical features, such as severe mental retardation, likely associated with genes located within the 4.3-Mb deleted region. TES-based simultaneous evaluation of sequences and copy number status in all targeted coding exons by NGS is likely to be more useful for the molecular diagnosis of NBCCS than conventional methods. CMA is recommended as a subsequent analysis for validation and detailed mapping of deleted regions, which may explain atypical clinical features of NBCCS cases. Copyright © 2017 Japanese Society for Investigative Dermatology. Published by Elsevier B.V. All rights reserved.
Relative Displacement Method for Track-Structure Interaction
Ramos, Óscar Ramón; Pantaleón, Marcos J.
2014-01-01
The track-structure interaction effects are usually analysed with conventional FEM programs, where it is difficult to implement the complex track-structure connection behaviour, which is nonlinear, elastic-plastic and depends on the vertical load. The authors developed an alternative analysis method, which they call the relative displacement method. It is based on the calculation of deformation states in single DOF element models that satisfy the boundary conditions. For its solution, an iterative optimisation algorithm is used. This method can be implemented in any programming language or analysis software. A comparison with ABAQUS calculations shows a very good result correlation and compliance with the standard's specifications. PMID:24634610
NASA Astrophysics Data System (ADS)
Nauleau, Pierre; Apostolakis, Iason; McGarry, Matthew; Konofagou, Elisa
2018-06-01
The stiffness of the arteries is known to be an indicator of the progression of various cardiovascular diseases. Clinically, the pulse wave velocity (PWV) is used as a surrogate for arterial stiffness. Pulse wave imaging (PWI) is a non-invasive, ultrasound-based imaging technique capable of mapping the motion of the vessel walls, allowing the local assessment of arterial properties. Conventionally, a distinctive feature of the displacement wave (e.g. the 50% upstroke) is tracked across the map to estimate the PWV. However, the presence of reflections, such as those generated at the carotid bifurcation, can bias the PWV estimation. In this paper, we propose a two-step cross-correlation based method to characterize arteries using the information available in the PWI spatio-temporal map. First, the area under the cross-correlation curve is proposed as an index for locating the regions of different properties. Second, a local peak of the cross-correlation function is tracked to obtain a less biased estimate of the PWV. Three series of experiments were conducted in phantoms to evaluate the capabilities of the proposed method compared with the conventional method. In the ideal case of a homogeneous phantom, the two methods performed similarly and correctly estimated the PWV. In the presence of reflections, the proposed method provided a more accurate estimate than conventional processing: e.g. for the soft phantom, biases of −0.27 and −0.71 m·s⁻¹ were observed. In a third series of experiments, the correlation-based method was able to locate two regions of different properties with an error smaller than 1 mm. It also provided more accurate PWV estimates than conventional processing (biases: −0.12 versus −0.26 m·s⁻¹). Finally, the in vivo feasibility of the proposed method was demonstrated in eleven healthy subjects. The results indicate that the correlation-based method might be less precise in vivo but more accurate than the conventional method.
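A hedged sketch of the cross-correlation step on synthetic waveforms (spacing, frame rate and pulse shapes are invented): the transit delay of a wall-displacement waveform between two locations is taken from the correlation peak, and PWV follows as distance over delay:

```python
# Time-delay estimation by cross-correlation -> PWV = distance / delay.
# All signals and parameters are synthetic placeholders.
import numpy as np

fs = 10_000.0                        # frames per second (hypothetical)
dx = 0.02                            # 2 cm between the two measurement locations
t = np.arange(0, 0.05, 1 / fs)
pulse = lambda t0: np.exp(-((t - t0) / 0.002) ** 2)
w1, w2 = pulse(0.010), pulse(0.014)  # true delay 4 ms -> PWV = 5 m/s

xc = np.correlate(w2, w1, mode="full")
lag = xc.argmax() - (len(w1) - 1)    # samples by which w2 trails w1
pwv = dx / (lag / fs)
print(f"estimated PWV = {pwv:.2f} m/s")
```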
NASA Astrophysics Data System (ADS)
Delgado, Carlos; Cátedra, Manuel Felipe
2018-05-01
This work presents a technique that allows a very noticeable relaxation of the computational requirements for full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results, requiring a fraction of the resources that the conventional analysis would utilize.
Method for improving accuracy in full evaporation headspace analysis.
Xie, Wei-Qi; Chai, Xin-Sheng
2017-05-01
We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy in conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
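The multiple headspace extraction arithmetic can be illustrated as follows: successive extractions of a vial decay geometrically with ratio q, so the total analyte signal is the sum of the series A1/(1 − q). The peak areas below are invented solely to show the calculation:

```python
# Multiple headspace extraction (MHE): fit the geometric decay ratio q from
# consecutive peak areas, then extrapolate the total analyte signal.
import numpy as np

areas = np.array([1000.0, 640.0, 409.6, 262.1])   # consecutive headspace peak areas (toy)
q = np.exp(np.polyfit(np.arange(len(areas)), np.log(areas), 1)[0])  # fitted ratio
total = areas[0] / (1 - q)                        # sum of the geometric series
print(f"q = {q:.3f}, total area = {total:.1f}")   # calibrate total against standards
```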
Mason, H. E.; Uribe, E. C.; Shusterman, J. A.
2018-01-01
Tensor-rank decomposition methods have been applied to variable contact time ²⁹Si{¹H} CP/CPMG NMR data sets to extract NMR dynamics information and dramatically decrease conventional NMR acquisition times.
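As a generic illustration of tensor-rank (CP/PARAFAC) decomposition of a multi-way data set such as this one, the following sketch assumes the TensorLy library and a random stand-in tensor; it is not the authors' processing code, and the mode sizes are arbitrary:

```python
# Rank-2 CP decomposition of a 3-way array (e.g. contact time x echo x frequency),
# assuming TensorLy's parafac; the tensor here is random, for illustration only.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(2)
X = tl.tensor(rng.standard_normal((16, 32, 64)))   # toy 3-way data set

weights, factors = parafac(X, rank=2)              # CP model: sum of rank-1 terms
print([f.shape for f in factors])                  # [(16, 2), (32, 2), (64, 2)]
```

In such a scheme, the factor matrix along the contact-time mode would carry the build-up profiles from which dynamics parameters are then fitted.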
POLLUTION PREVENTION AND ENHANCEMENT OF BIODEGRADABILITY VIA ISOMER ELIMINATION IN CONSUMER PRODUCTS
The purpose of this project is to develop novel methodologies for the analysis and detection of chiral environmental contaminants. Conventional analytical techniques do not discriminate between enantiomers. By using newly developed enantioselective methods, the environmental pers...
Accurate estimates of 3D Ising critical exponents using the coherent-anomaly method
NASA Astrophysics Data System (ADS)
Kolesik, Miroslav; Suzuki, Masuo
1995-02-01
An analysis of the critical behavior of the three-dimensional Ising model using the coherent-anomaly method (CAM) is presented. Various sources of errors in CAM estimates of critical exponents are discussed, and an improved scheme for the CAM data analysis is tested. Using a set of mean-field type approximations based on the variational series expansion approach, accuracy comparable to the most precise conventional methods has been achieved. Our results for the critical exponents are given by α = 0.108(5), β = 0.327(4), γ = 1.237(4) and δ = 4.77(5).
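The CAM estimate of γ reduces to a log-log regression of the coherent anomalies against the approximant critical points, using the CAM scaling χ*(Tc*) ∝ (Tc* − Tc)^−(γ−1); the sketch below uses fabricated approximant data purely to show the fit:

```python
# CAM fit sketch: gamma from the divergence of mean-field coefficients chi*
# as the approximant critical points Tc* approach the true Tc.
# All approximant data below are fabricated for illustration.
import numpy as np

Tc = 4.5115                                      # assumed "true" critical coupling/temperature
Tc_star = np.array([4.70, 4.62, 4.57, 4.54])     # approximant critical points (toy)
chi_star = np.array([1.9, 2.6, 3.5, 4.9])        # corresponding coherent anomalies (toy)

slope, _ = np.polyfit(np.log(Tc_star - Tc), np.log(chi_star), 1)
gamma = 1.0 - slope                              # since log chi* = -(gamma-1) log(Tc*-Tc) + c
print(f"gamma ~ {gamma:.3f}")
```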
Automated quantification of pancreatic β-cell mass
Golson, Maria L.; Bush, William S.
2014-01-01
β-Cell mass is a parameter commonly measured in studies of islet biology and diabetes. However, the rigorous quantification of pancreatic β-cell mass using conventional histological methods is a time-consuming process. Rapidly evolving virtual slide technology with high-resolution slide scanners and newly developed image analysis tools has the potential to transform β-cell mass measurement. To test the effectiveness and accuracy of this new approach, we assessed pancreata from normal C57Bl/6J mice and from mouse models of β-cell ablation (streptozotocin-treated mice) and β-cell hyperplasia (leptin-deficient mice), using a standardized systematic sampling of pancreatic specimens. Our data indicate that automated analysis of virtual pancreatic slides is highly reliable and yields results consistent with those obtained by conventional morphometric analysis. This new methodology will allow investigators to dramatically reduce the time required for β-cell mass measurement by automating high-resolution image capture and analysis of entire pancreatic sections. PMID:24760991
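A hedged sketch of the automated morphometry step: segment the insulin-positive (β-cell) area of a scanned section and scale the area fraction by pancreas mass. It assumes scikit-image and synthetic data, not the study's pipeline:

```python
# Beta-cell area fraction via automated thresholding, scaled by pancreas mass.
# The "stained section" is synthetic; real pipelines segment scanned slides.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(3)
stain = rng.random((512, 512))                 # stand-in for an insulin-stained scan
stain[200:260, 150:230] += 2.0                 # fake "islet" of strong signal

mask = stain > threshold_otsu(stain)           # automated segmentation
beta_fraction = mask.mean()                    # beta-cell area / section area
pancreas_mass_mg = 95.0                        # hypothetical wet weight
print(f"beta-cell mass ~ {beta_fraction * pancreas_mass_mg:.2f} mg")
```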
Application of ECT inspection to the first wall of a fusion reactor with wavelet analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, G.; Yoshida, Y.; Miya, K.
1994-12-31
The first wall of a fusion reactor will be subjected to intensive loads during fusion operations. Since these loads may cause defects in the first wall, nondestructive evaluation techniques for the first wall should be developed. In this paper, we apply the eddy current testing (ECT) technique to the inspection of the first wall. A method based on the current vector potential and wavelet analysis is proposed. Owing to the use of wavelet analysis, a recently developed theory, the accuracy of the present method is shown to be better than that of a conventional one.
Facebook advertising for participant recruitment into a blood pressure clinical trial.
Nash, Erin L; Gilroy, Deborah; Srikusalanukul, Wichat; Abhayaratna, Walter P; Stanton, Tony; Mitchell, Geoffrey; Stowasser, Michael; Sharman, James E
2017-12-01
Recruitment of a sufficient sample size into clinical trials is challenging. Conventional advertising methods are expensive and often ineffective. The effectiveness of Facebook for recruitment into blood pressure clinical trials of middle-to-older-aged people is unknown. This study aimed to assess this by comparing Facebook advertising with conventional recruitment methods in a retrospective analysis within a clinical trial. Conventional advertisements (newspaper, radio and posters) were employed for the first 20 months of a randomized controlled clinical trial conducted in three Australian capital cities, in Tasmania, Queensland and the Australian Capital Territory. With participant recruitment dwindling, at 20 months a Facebook advertising campaign was employed intermittently over a 4-month period. Recruitment results were retrospectively compared with those using conventional methods in the previous 4 months. Compared with conventional recruitment methods, Facebook advertisement was associated with a significant increase in the number of participants recruited in the Australian Capital Territory (from an average of 1.8 to 7.3/month; P < 0.05). There was also an increase in Tasmania that was of borderline significance (from 4.0 participants recruited/month to 9.3/month; P = 0.052). However, there was no effect in Queensland (from 6.0 participants recruited/month to 3.0/month; P = 0.15). Facebook advertisement was associated with a significant decrease in the age of participants enquiring about the study (from 60.9 to 58.7 years; P < 0.001). Facebook advertising was successful in helping to increase recruitment of middle-to-older-aged participants into a blood pressure clinical trial, although there may be some variability in effect depending on location.
Lacroix, C; Gicquel, A; Sendid, B; Meyer, J; Accoceberry, I; François, N; Morio, F; Desoubeaux, G; Chandenier, J; Kauffmann-Lacroix, C; Hennequin, C; Guitard, J; Nassif, X; Bougnoux, M-E
2014-02-01
Candida spp. are responsible for severe infections in immunocompromised patients and those undergoing invasive procedures. The accurate identification of Candida species is important because emerging species can be associated with various antifungal susceptibility spectra. Conventional methods have been developed to identify the most common pathogens, but they often fail to identify uncommon species. Several studies have reported the efficiency of matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) for the identification of clinically relevant Candida species. In this study, we evaluated two commercially available MALDI-TOF systems, Andromas™ and Bruker Biotyper™, for Candida identification in routine diagnosis. For this purpose, we investigated 1383 Candida isolates prospectively collected in eight hospital laboratories during routine practice. MALDI-TOF MS results were compared with those obtained using conventional phenotypic methods. Analysis of rDNA sequences of the internal transcribed spacer or D1-D2 regions was considered the reference standard for identification. Both MALDI-TOF MS systems accurately identified 98.3% of the isolates at the species level (1359/1383 for Andromas™; 1360/1383 for Bruker Biotyper™) vs. 96.5% for conventional techniques. Furthermore, whereas conventional methods failed to identify rare or emerging species, these were correctly identified by MALDI-TOF MS. Both MALDI-TOF MS systems are accurate and cost-effective alternatives to conventional methods for the mycological identification of clinically relevant Candida species, and should improve the diagnosis of fungal infections as well as patient management. © 2013 The Authors Clinical Microbiology and Infection © 2013 European Society of Clinical Microbiology and Infectious Diseases.
2013-01-01
Background A conventional gravimetry and electro-gravimetry study was carried out for the precise and accurate determination of the purity of lead (Pb) in a high-purity lead stick and for the preparation of a reference standard. Reference materials are standards containing a known amount of an analyte, and provide a reference value for determining unknown concentrations or calibrating analytical instruments. A stock solution of approximately 2 kg was prepared by dissolving approximately 2 g of the Pb stick in 5% ultra-pure nitric acid. From the stock solution, five replicates of approximately 50 g were taken for purity determination by each method. The Pb was determined as PbSO4 by conventional gravimetry and as PbO2 by electro-gravimetry. The percentage purity of the metallic Pb was calculated accordingly from the PbSO4 and PbO2. Results On the basis of the experimental observations, the purity of Pb by conventional gravimetry and electro-gravimetry was found to be 99.98 ± 0.24 and 99.97 ± 0.27 g/100 g, respectively, and on the basis of the Pb purity the concentrations of the reference standard solutions were found to be 1000.88 ± 2.44 and 1000.81 ± 2.68 mg kg-1, respectively, at the 95% confidence level (k = 2). Uncertainty evaluation for the Pb determination was also carried out following the EURACHEM/GUM guidelines. The final analytical results, with quantified uncertainty, give a measure of the confidence level of the laboratory concerned. Conclusions Gravimetry is the most reliable technique in comparison with titrimetry and instrumental methods, and the results of gravimetry are directly traceable to SI units. Gravimetric analysis, if the methods are followed carefully, provides exceedingly precise analysis. In classical gravimetry the major uncertainties are due to repeatability, but in electro-gravimetry several other factors also affect the final results. PMID:23800080
Singh, Nahar; Singh, Niranjan; Tripathy, S Swarupa; Soni, Daya; Singh, Khem; Gupta, Prabhat K
2013-06-26
A conventional gravimetry and electro-gravimetry study was carried out for the precise and accurate determination of the purity of lead (Pb) in a high-purity lead stick and for the preparation of a reference standard. Reference materials are standards containing a known amount of an analyte, and provide a reference value for determining unknown concentrations or calibrating analytical instruments. A stock solution of approximately 2 kg was prepared by dissolving approximately 2 g of the Pb stick in 5% ultra-pure nitric acid. From the stock solution, five replicates of approximately 50 g were taken for purity determination by each method. The Pb was determined as PbSO4 by conventional gravimetry and as PbO2 by electro-gravimetry. The percentage purity of the metallic Pb was calculated accordingly from the PbSO4 and PbO2. On the basis of the experimental observations, the purity of Pb by conventional gravimetry and electro-gravimetry was found to be 99.98 ± 0.24 and 99.97 ± 0.27 g/100 g, respectively, and on the basis of the Pb purity the concentrations of the reference standard solutions were found to be 1000.88 ± 2.44 and 1000.81 ± 2.68 mg kg-1, respectively, at the 95% confidence level (k = 2). Uncertainty evaluation for the Pb determination was also carried out following the EURACHEM/GUM guidelines. The final analytical results, with quantified uncertainty, give a measure of the confidence level of the laboratory concerned. Gravimetry is the most reliable technique in comparison with titrimetry and instrumental methods, and the results of gravimetry are directly traceable to SI units. Gravimetric analysis, if the methods are followed carefully, provides exceedingly precise analysis. In classical gravimetry the major uncertainties are due to repeatability, but in electro-gravimetry several other factors also affect the final results.
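As a rough illustration of the uncertainty bookkeeping described in the record above, the following Python sketch computes a gravimetric purity from the PbSO4 gravimetric factor and an expanded uncertainty with coverage factor k = 2; the masses and component uncertainties are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch of a gravimetric purity calculation with a k = 2
# expanded uncertainty, loosely following the EURACHEM/GUM approach
# described above. All input values are hypothetical placeholders.
import math

M_PB, M_PBSO4 = 207.2, 303.26          # molar masses, g/mol
grav_factor = M_PB / M_PBSO4           # mass fraction of Pb in PbSO4

m_sample = 0.0500                      # aliquot mass of Pb taken, g (hypothetical)
m_precipitate = 0.07315                # weighed PbSO4, g (hypothetical)

purity = 100.0 * m_precipitate * grav_factor / m_sample   # g/100 g

# Combine relative standard uncertainties (balance, repeatability, ...)
u_rel = math.sqrt(sum(u**2 for u in (0.0008, 0.0007, 0.0005)))
U_expanded = 2 * u_rel * purity        # k = 2, ~95% confidence level

print(f"purity = {purity:.2f} +/- {U_expanded:.2f} g/100 g (k = 2)")
```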
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.
1980-11-01
The optical design and analysis of the LASL carbon dioxide laser fusion systems required the use of techniques quite different from those currently used in conventional optical design. The reasons for this are explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. The method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and the other nonlinear effects that occur in these types of systems. The various programs used for the analysis are briefly discussed.
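The propagation step described above can be shown in a short sketch: apply one term of a Zernike phase set to a pupil field, then propagate it with an FFT-based angular-spectrum method. This is a minimal stand-in for the Los Alamos codes; the grid size, beam shape and aberration amplitude are arbitrary assumptions.

```python
# Minimal sketch of FFT-based beam propagation with a Zernike-style
# aberration, in the spirit of the method described above. Parameters
# (grid, aperture, defocus amplitude) are invented for illustration.
import numpy as np

N, L, wl, z = 256, 10e-3, 10.6e-6, 1.0   # grid pts, aperture (m), CO2 wavelength (m), distance (m)
x = np.linspace(-L/2, L/2, N)
X, Y = np.meshgrid(x, x)
r = np.sqrt(X**2 + Y**2) / (L/2)         # normalized pupil radius

field = np.exp(-(2*r)**2)                # Gaussian beam amplitude
defocus = 2*r**2 - 1                     # Zernike Z(2,0), one term of the polynomial set
field = field * np.where(r <= 1, np.exp(1j*2*np.pi*0.5*defocus), 0)

fx = np.fft.fftfreq(N, d=L/N)            # spatial frequencies
FX, FY = np.meshgrid(fx, fx)
kz = 2*np.pi*np.sqrt(np.maximum(0.0, 1/wl**2 - FX**2 - FY**2))  # propagating components only
out = np.fft.ifft2(np.fft.fft2(field) * np.exp(1j*kz*z))        # angular-spectrum step
print("peak intensity after propagation:", np.abs(out).max()**2)
```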
Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A
2008-01-01
Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. The use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including the selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide-labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.
Charpentier, R.R.; Klett, T.R.
2005-01-01
During the last 30 years, the methodology for the assessment of undiscovered conventional oil and gas resources used by the U.S. Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of methods that rely solely on statistical analysis without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in ways that increase its utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing the numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS can also be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability to economic analysis. The separate methodology for assessing continuous (unconventional) resources has also been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely affect the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
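The numbers-and-sizes framework lends itself to a compact Monte Carlo illustration: draw a number of undiscovered accumulations and a size for each, then accumulate the total over many trials. The distributions and parameters below are invented for illustration and are not USGS inputs or the Emc2 implementation.

```python
# Hedged sketch of the Monte Carlo aggregation idea behind programs such
# as Emc2: sample accumulation counts and sizes, then summarize fractiles
# of the total undiscovered resource. All parameters are made up.
import numpy as np

rng = np.random.default_rng(7)
trials = 100_000
totals = np.empty(trials)
for i in range(trials):
    n = int(rng.triangular(1, 5, 20))                    # number of accumulations
    sizes = rng.lognormal(mean=3.0, sigma=1.0, size=n)   # accumulation sizes (e.g., MMBO)
    totals[i] = sizes.sum()

print("mean total:", totals.mean())
# F95 = 95% chance of exceeding = 5th percentile, and so on
print("F95/F50/F5:", np.percentile(totals, [5, 50, 95]))
```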
Giannaki, Christoforos D; Aphamis, George; Sakkis, Panikos; Hadjicharalambous, Marios
2016-04-01
High intensity interval training (HIIT) has recently been promoted as an effective, low-volume and time-efficient training method for improving fitness and health-related parameters. The aim of the current study was to examine the effect of combining group-based HIIT with conventional gym training on physical fitness and body composition parameters in healthy adults. Thirty-nine healthy adults volunteered to participate in this eight-week intervention study. Twenty-three participants performed regular gym training 4 days a week (C group), whereas the remaining 16 participants engaged in HIIT twice a week and in regular gym training twice a week (HIIT-C group). Total body fat and visceral adiposity levels were calculated using bioelectrical impedance analysis. Physical fitness parameters such as cardiorespiratory fitness, speed, lower limb explosiveness, flexibility and isometric arm strength were assessed through a battery of field tests. Both exercise programs were effective in reducing total body fat and visceral adiposity (P<0.05) and improving handgrip strength, sprint time, jumping ability and flexibility (P<0.05), whilst only the combination of HIIT and conventional training improved cardiorespiratory fitness levels (P<0.05). A between-group analysis of changes revealed that HIIT-C resulted in a significantly greater reduction in both abdominal girth and visceral adiposity compared with conventional training (P<0.05). Eight weeks of combined group-based HIIT and conventional training improve various physical fitness parameters and reduce both total and visceral fat levels. This type of training was also superior to conventional exercise training alone in reducing visceral adiposity. Group-based HIIT may be considered a good method for individuals who exercise in gyms and want significant fitness benefits in a relatively short period of time.
Zhang, Lei; Zeng, Zhi; Ji, Qiang
2011-09-01
A chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis has been very limited, due to the lack of principled learning and inference methods for CGs of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to a CG model with more general topology, together with the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference in complex real-world problems.
Measurement of RBC agglutination with microscopic cell image analysis in a microchannel chip.
Cho, Chi Hyun; Kim, Ju Yeon; Nyeck, Agnes E; Lim, Chae Seung; Hur, Dae Sung; Chung, Chanil; Chang, Jun Keun; An, Seong Soo A; Shin, Sehyun
2014-01-01
Since Landsteiner's discovery of the ABO blood groups, RBC agglutination has been one of the most important immunohematologic techniques for ABO and RhD blood grouping. The conventional RBC agglutination grading system for RhD blood typing relies on macroscopic reading, followed by the assignment of a grade ranging from (-) to (4+) to the degree of red blood cell clumping. With the new scoring method introduced in this report, however, microscopically captured images of agglutinated RBCs placed in a microchannel chip are used for analysis. The pixel counts of the imaged objects first allow agglutinated and non-agglutinated red blood cells to be differentiated. The ratio of agglutinated RBCs per total RBC count (CRAT) is then calculated from 90 captured images. During the trial, the agglutinated group's CRAT was significantly higher (3.77-0.003) than that of the normal control (0). Based on these results, it was established that the microchannel method was more suitable for discriminating agglutinated RBCs from non-agglutinated RhD-negative cells, and thus more reliable for grading RBC agglutination, than the conventional method.
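A sketch of how such a pixel-count-based agglutination ratio might be computed is shown below; the single-cell area and clump threshold are made-up parameters, and the study's actual image-processing pipeline may differ.

```python
# Illustrative sketch of a CRAT-style calculation: classify imaged objects
# by pixel area and take the ratio of agglutinated cells to total cells
# across the captured images. Thresholds are hypothetical.
import numpy as np

def crat(object_areas_per_image, single_cell_area=250, clump_factor=2.0):
    """object_areas_per_image: list of 1-D arrays of object pixel counts."""
    agglutinated, total = 0.0, 0.0
    for areas in object_areas_per_image:
        areas = np.asarray(areas, dtype=float)
        cells = np.maximum(1, np.round(areas / single_cell_area))  # est. cells per object
        agglutinated += cells[areas > clump_factor * single_cell_area].sum()
        total += cells.sum()
    return agglutinated / total if total else 0.0

# e.g., per specimen, feed in the object areas from all 90 captured images
print(crat([np.array([240, 260, 1500]), np.array([255, 3100])]))
```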
Mori, Masanobu; Nakano, Koji; Sasaki, Masaya; Shinozaki, Haruka; Suzuki, Shiho; Okawara, Chitose; Miró, Manuel; Itabashi, Hideyuki
2016-02-01
A dynamic flow-through microcolumn extraction system based on extractant re-circulation is proposed herein as a novel analytical approach for simplifying bioaccessibility tests of trace elements in sediments. On-line metal leaching is undertaken in the format of all-injection (AI) analysis, a sequel to flow injection analysis that involves extraction under steady-state conditions. The minimum circulation times and flow rates required to determine the maximum bioaccessible pools of the target metals (viz., Cu, Zn, Cd, and Pb) in lake and river sediment samples were estimated using Tessier's sequential extraction scheme and an acid single-extraction test. The on-line AI method was successfully validated by mass balance studies of CRM and real sediment samples. Tessier's test in the on-line AI format was completed in one third of the extraction time (6 h versus more than 17 h for the conventional method), with better analytical precision (<9.2% versus >15% for the conventional method) and a significant decrease in blank readouts compared with the manual batch counterpart. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Roberts, Laura; Mancuso, Steven V.
2014-01-01
This mixed-methods study of 84 job advertisements for international school leaders on six continents from 2006 to 2012 entailed both qualitative and quantitative research methods. Job advertisements were obtained from the most active recruiting agency for school leaders worldwide. Conventional and summative content analysis procedures were used to…
USDA-ARS?s Scientific Manuscript database
Aims: Conventional phenotypic and genotypic analyses for the differentiation of phenotypically ambiguous Edwardsiella congeners was evaluated and historical E. tarda designations were linked to current taxonomic nomenclature. Methods and Results: Forty-seven Edwardsiella spp. isolates recovered over...
Strength and life criteria for corrugated fiberboard by three methods
Thomas J. Urbanik
1997-01-01
The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...
Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm
Veladi, H.
2014-01-01
A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm. PMID:25202717
Performance-based seismic design of steel frames utilizing colliding bodies algorithm.
Veladi, H
2014-01-01
A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm.
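For readers unfamiliar with the optimizer named in the two records above, the sketch below implements the collision update rules in the form commonly published for colliding bodies optimization, applied to a toy test function; it is an illustration of the metaheuristic under those assumed update rules, not the seismic-design code used in the paper.

```python
# Compact sketch of a colliding bodies optimization (CBO) loop in its
# commonly published form: the better half of the population is stationary,
# the worse half "collides" with it, and post-collision velocities drive
# the position updates. Assumes a non-negative objective (here, sphere).
import numpy as np

def cbo(f, dim=5, n=20, iters=200, lo=-5.0, hi=5.0, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    for it in range(iters):
        fit = np.apply_along_axis(f, 1, x)
        order = np.argsort(fit)                    # best (lowest) first
        x, fit = x[order], fit[order]
        m = 1.0 / (fit + 1e-12)
        m /= m.sum()                               # heavier mass = better body
        eps = 1.0 - it / iters                     # coefficient of restitution decays
        half = n // 2
        x_new = x.copy()
        for j in range(half):                      # stationary body j, moving body i
            i = j + half
            v_i = x[j] - x[i]                      # moving body's pre-collision velocity
            v_j_after = (m[i] + eps * m[i]) * v_i / (m[i] + m[j])
            v_i_after = (m[i] - eps * m[j]) * v_i / (m[i] + m[j])
            x_new[j] = x[j] + rng.random(dim) * v_j_after
            x_new[i] = x[j] + rng.random(dim) * v_i_after
        x = np.clip(x_new, lo, hi)
    fit = np.apply_along_axis(f, 1, x)
    return x[np.argmin(fit)], float(fit.min())

best_x, best_f = cbo(lambda v: float(np.sum(v**2)))  # sphere test function
print(best_f)
```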
Use of vectors in sequence analysis.
Ishikawa, T; Yamamoto, K; Yoshikura, H
1987-10-01
Applications of the vector diagram, a new type of representation of protein structure, to homology searches of various proteins, including oncogene products, are presented. The method takes into account various kinds of information concerning the properties of amino acids, such as Chou and Fasman's probability data. The method can detect conformational similarities between proteins that may not be detected by conventional programs.
Challenging convention: symbolic interactionism and grounded theory.
Newman, Barbara
2008-01-01
Little is written in the literature about the decisions made by researchers, and the justifications for method, that arise from a particular clinical problem together with an appropriate and congruent theoretical perspective, particularly for Glaserian grounded theory. I contend that the utilisation of symbolic interactionism as a theoretical perspective to inform and guide the evolving research process and the analysis of data is not always appropriate when using the classic or Glaserian grounded theory (GT) method. Within this article I offer an analysis of the key issues to be addressed when contemplating the use of Glaserian GT and the utilisation of an appropriate theoretical perspective, rather than accepting the convention of symbolic interactionism (SI). The analysis became imperative in a study I conducted that sought to explore the concerns, adaptive behaviours, psychosocial processes and relevant interactions, over a 12-month period, among newly diagnosed persons with end-stage renal disease who were dependent on haemodialysis in the home environment for survival. The reality of perception was central to the end product of the study. Human ethics approval was granted by six committees within the New South Wales Health Department and one from a university.
NASA Technical Reports Server (NTRS)
Lawson, Denise L.; James, Mark L.
1989-01-01
The Spacecraft Health Automated Reasoning Prototype (SHARP) is a system designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. Telecommunications link analysis of the Voyager 2 spacecraft is the initial focus for the SHARP system demonstration, which will occur during Voyager's encounter with the planet Neptune in August 1989, in parallel with real-time Voyager operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real-time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. A brief introduction is given to the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory. The current method of operation for monitoring the Voyager Telecommunications subsystem is described, and the difficulties associated with the existing technology are highlighted. The approach taken in the SHARP system to overcome the current limitations is also described, as well as both the conventional and artificial intelligence solutions developed in SHARP.
SHARP: A multi-mission AI system for spacecraft telemetry monitoring and diagnosis
NASA Technical Reports Server (NTRS)
Lawson, Denise L.; James, Mark L.
1989-01-01
The Spacecraft Health Automated Reasoning Prototype (SHARP) is a system designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. Telecommunications link analysis of the Voyager II spacecraft is the initial focus for the SHARP system demonstration, which will occur during Voyager's encounter with the planet Neptune in August 1989, in parallel with real-time Voyager operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real-time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. A brief introduction is given to the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory. The current method of operation for monitoring the Voyager Telecommunications subsystem is described, and the difficulties associated with the existing technology are highlighted. The approach taken in the SHARP system to overcome the current limitations is also described, as well as both the conventional and artificial intelligence solutions developed in SHARP.
Point focusing using loudspeaker arrays from the perspective of optimal beamforming.
Bai, Mingsian R; Hsieh, Yu-Hao
2015-06-01
Sound focusing aims to create a concentrated acoustic field in a region surrounded by a loudspeaker array. This problem has been tackled in previous research via the Helmholtz integral approach, brightness control, acoustic contrast control, etc. In this paper, the same problem is revisited from the perspective of beamforming. A source array model is reformulated in terms of the steering matrix between the source and the field points, which lends itself to the use of beamforming algorithms such as minimum variance distortionless response (MVDR) and linearly constrained minimum variance (LCMV), originally intended for sensor arrays. The beamforming methods are compared with the conventional methods in terms of beam pattern, directional index, and control effort. Objective tests are conducted to assess audio quality using perceptual evaluation of audio quality (PEAQ). Experiments on the produced sound field and listening tests are conducted in a listening room, with the results processed using analysis of variance and regression analysis. In contrast to the conventional energy-based methods, the results show that the proposed methods are phase-sensitive, in light of the distortionless constraint used in formulating the array filters, which helps enhance audio quality and focusing performance.
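A minimal numpy sketch of the MVDR weight computation mentioned above is given below, using a free-field steering vector from loudspeaker positions to a focal point; the geometry, frequency and the identity covariance model are illustrative assumptions, not the paper's setup.

```python
# Sketch of MVDR-style point focusing: w = R^{-1} a / (a^H R^{-1} a),
# which enforces a distortionless response (w^H a = 1) at the focal point.
import numpy as np

M = 16
theta = 2 * np.pi * np.arange(M) / M
src = np.column_stack([np.cos(theta), np.sin(theta)])  # loudspeakers on a 1 m circle
focus = np.array([0.3, 0.0])                           # desired focal point inside the array

k = 2 * np.pi * 1000 / 343.0                           # wavenumber at 1 kHz, c = 343 m/s
r = np.linalg.norm(src - focus, axis=1)
a = np.exp(-1j * k * r) / (4 * np.pi * r)              # free-field steering vector to the focus

R = np.eye(M)                                          # assumed covariance / control-effort model
Ri_a = np.linalg.solve(R, a)
w = Ri_a / (a.conj() @ Ri_a)                           # MVDR (distortionless) weights

print("response at the focus (should be 1):", abs(w.conj() @ a))
```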
Shivasakthy, M.; Asharaf Ali, Syed
2013-01-01
Statement of Problem: A new material, in the form of strips, has been proposed in dentistry for producing gingival retraction. The clinical efficacy of the material remains untested. Purpose of the Study: This study aimed to determine whether polyvinyl acetate strips can effectively displace the gingival tissues in comparison with the conventional retraction cord. Material and Methods: Complete metal-ceramic preparation with a supra-gingival margin was performed on fourteen maxillary incisors, and gingival retraction was performed using Merocel strips and conventional retraction cords alternately, at a 2-week interval. The amount of displacement was compared using a digital vernier caliper with 0.01 mm accuracy. Results were analyzed statistically using the paired Student's t-test. Results: The statistical analysis of the data revealed that both the conventional retraction cord and the Merocel strip produce significant retraction. Of the two materials, Merocel proved to be significantly more effective. Conclusion: The Merocel strip produces more gingival displacement than the conventional retraction cord. PMID:24298531
Holographic Refraction and the Measurement of Spherical Ametropia.
Nguyen, Nicholas Hoai Nam
2016-10-01
To evaluate the performance of a holographic logMAR chart for the subjective spherical refraction of the human eye, Bland-Altman analysis was used to assess the level of agreement between subjective spherical refraction using the holographic logMAR chart and both conventional autorefraction and conventional subjective spherical refraction. The 95% limits of agreement (LoA) were calculated between holographic refraction and the two standard methods (subjective refraction and autorefraction). Holographic refraction gave a lower mean spherical refraction when compared with conventional refraction (LoA 0.11 ± 0.65 D) and when compared with autorefraction (LoA 0.36 ± 0.77 D). After correcting for systematic bias, the latter is comparable to the agreement between autorefraction and conventional subjective refraction (LoA 0.45 ± 0.79 D). After correcting for differences in vergence distance and chromatic aberration between holographic and conventional refraction, approximately 65% (group 1) of measurements between holography and conventional subjective refraction were similar (MD = 0.13 D, SD = 0.00 D). The remaining 35% (group 2) had a mean difference of 0.45 D (SD = 0.12 D) between the two subjective methods. Descriptive statistics showed that group 2's mean age (21 years, SD = 13 years) was considerably lower than group 1's mean age (41 years, SD = 17 years), suggesting that accommodation may have a role in the greater mean difference of group 2. Overall, holographic refraction has good agreement with conventional refraction and is a viable alternative for spherical subjective refraction. A larger bias between holographic and conventional refraction was found in younger subjects than in older subjects, suggesting an association between accommodation and myopic over-correction during holographic refraction.
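The Bland-Altman quantities quoted above (bias and 95% limits of agreement) reduce to a few lines of code; the refraction values below are placeholders, not the study's measurements.

```python
# Quick sketch of a Bland-Altman computation: mean bias and 95% limits of
# agreement (bias +/- 1.96 SD of the paired differences). Placeholder data.
import numpy as np

holo = np.array([-1.25, -0.50, -2.00, 0.25, -3.25, -0.75])  # dioptres (hypothetical)
conv = np.array([-1.00, -0.50, -1.75, 0.50, -3.00, -0.75])

diff = holo - conv
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
print(f"bias = {bias:+.2f} D, 95% LoA = [{loa[0]:+.2f}, {loa[1]:+.2f}] D")
```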
Embryonic development in human oocytes fertilized by split insemination
Kim, Myo Sun; Kim, Jayeon; Youm, Hye Won; Park, Jung Yeon; Choi, Hwa Young
2015-01-01
Objective To compare the laboratory outcomes of intracytoplasmic sperm injection (ICSI) and conventional insemination using sibling oocytes in poor-prognosis IVF cycles where ICSI is not indicated. Methods Couples undergoing IVF with the following conditions were enrolled: a history of more than 3 years of unexplained infertility, a history of ≥3 failed intrauterine inseminations, leukocytospermia or wide variation in semen analysis, poor oocyte quality, or ≥50% poor-quality embryos in previous IVF cycle(s). Couples with severe male factor infertility requiring ICSI were excluded. Oocytes were randomly assigned to conventional insemination (conventional group) or ICSI (ICSI group). The fertilization rate (FR), total fertilization failure, and embryonic development at day 3 and day 5 were assessed. Results A total of 309 mature oocytes from 37 IVF cycles (32 couples) were obtained: 161 were assigned to the conventional group and 148 to the ICSI group. FR was significantly higher in the ICSI group than in the conventional group (90.5% vs. 72.7%, P<0.001). Total fertilization failure occurred in only one cycle, in the conventional group. On day 3, the percentage of cleavage-stage embryos was higher in the ICSI group; however, the difference was only marginally significant (P=0.055). In the 11 cycles in which day 5 culture was attempted, the percentage of blastocysts (per cleaved embryo) was significantly higher in the ICSI group than in the conventional group (55.9% vs. 25.9%, P=0.029). Conclusion A higher FR and more blastocysts could be achieved by ICSI in specific circumstances. The fertilization method can be tailored accordingly to improve IVF outcomes. PMID:26023671
Shao, Shiying; Guo, Tiannan; Gross, Vera; Lazarev, Alexander; Koh, Ching Chiek; Gillessen, Silke; Joerger, Markus; Jochum, Wolfram; Aebersold, Ruedi
2016-06-03
The reproducible and efficient extraction of proteins from biopsy samples for quantitative analysis is a critical step in biomarker and translational research. Recently, we described a method consisting of pressure-cycling technology (PCT) and sequential windowed acquisition of all theoretical fragment ions-mass spectrometry (SWATH-MS) for the rapid quantification of thousands of proteins from biopsy-size tissue samples. As an improvement of the method, we have incorporated the PCT-MicroPestle into the PCT-SWATH workflow. The PCT-MicroPestle is a novel, miniaturized, disposable mechanical tissue homogenizer that fits directly into the microTube sample container. We optimized the pressure-cycling conditions for tissue lysis with the PCT-MicroPestle and benchmarked the performance of the system against the conventional PCT-MicroCap method using mouse liver, heart, brain, and human kidney tissues as test samples. The data indicate that the digestion of the PCT-MicroPestle-extracted proteins yielded 20-40% more MS-ready peptide mass from all tissues tested with a comparable reproducibility when compared to the conventional PCT method. Subsequent SWATH-MS analysis identified a higher number of biologically informative proteins from a given sample. In conclusion, we have developed a new device that can be seamlessly integrated into the PCT-SWATH workflow, leading to increased sample throughput and improved reproducibility at both the protein extraction and proteomic analysis levels when applied to the quantitative proteomic analysis of biopsy-level samples.
Ullattuthodi, Sujana; Cherian, Kandathil Phillip; Anandkumar, R; Nambiar, M Sreedevi
2017-01-01
This in vitro study sought to evaluate and compare the marginal and internal fit of cobalt-chromium copings fabricated using conventional and direct metal laser sintering (DMLS) techniques. A master model of a prepared molar tooth was made using a cobalt-chromium alloy. A silicone impression of the master model was made, and thirty standardized working models were then produced: twenty working models for the conventional lost-wax technique and ten working models for the DMLS technique. A total of twenty metal copings were fabricated using the two different production techniques, conventional lost-wax and DMLS, with ten samples in each group. The conventional and DMLS copings were cemented to the working models using glass ionomer cement. The marginal gap of each coping was measured at four predetermined points. The dies with the cemented copings were sectioned in a standardized manner with a heavy-duty lathe. Each sectioned sample was then analyzed for the internal gap between the die and the metal coping using a metallurgical microscope. Digital photographs were taken at ×50 magnification and analyzed using measurement software. Statistical analysis was done by unpaired t-test and analysis of variance (ANOVA). The results of this study reveal that no significant difference was present in the marginal gap between conventional and DMLS copings (P > 0.05) by ANOVA. The mean internal gap of DMLS copings was significantly greater than that of conventional copings (P < 0.05). Within the limitations of this in vitro study, it was concluded that the internal fit of conventional copings was superior to that of the DMLS copings. The marginal fit of the copings fabricated by the two different techniques showed no significant difference.
ERIC Educational Resources Information Center
Firdausiah Mansur, Andi Besse; Yusof, Norazah
2013-01-01
Clustering on Social Learning Networks is still not widely explored, especially when the network focuses on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…
Chu, Y; Wu, D; Hou, Q F; Huo, X D; Gao, Y; Wang, T; Wang, H D; Yang, Y L; Liao, S X
2016-08-25
To investigate the value of the array-based comparative genomic hybridization (array-CGH) technique for chromosomal analysis of miscarried embryos, and to provide genetic counseling for couples with spontaneous abortion. A total of 382 patients who underwent miscarriage were enrolled in this study. All aborted tissues were analyzed with conventional cytogenetic karyotyping and array-CGH, respectively. All of the 382 specimens were successfully analyzed by array-CGH (100.0%, 382/382), and the detection rate of chromosomal aberrations was 46.6% (178/382). However, conventional karyotype analysis was successful in only 281 cases (73.6%, 281/382), of which 113 (40.2%, 113/281) were found to have chromosomal aberrations. Of the 178 abnormal samples identified by array-CGH, 163 (91.6%, 163/178) were aneuploid and 15 (8.4%, 15/178) had segmental deletions and/or duplications. Four of the 10 cases with small segmental deletions and duplications were validated to have been inherited from fathers or mothers who were carriers of submicroscopic reciprocal translocations. Of the 113 abnormal karyotypes found by conventional karyotyping, 108 (95.6%, 108/113) were aneuploid and 5 (4.4%, 5/113) had structural chromosome aberrations. Most array-CGH results were consistent with conventional karyotyping, with 3 cases of discrepancy: 2 cases of triploidy and 1 case of low-level mosaicism that were undetected by array-CGH. Compared with conventional karyotyping, there is an increased detection rate of chromosomal abnormalities when array-CGH is used to analyse the products of conception, primarily because of its success with nonviable tissues. It could be a first-line method for determining the cause of miscarriage, with higher accuracy and sensitivity.
Dionysopoulos, Dimitrios; Strakas, Dimitrios; Tolidis, Kosmas; Tsitrou, Effrosyni; Koumpia, Effimia; Koliniotou-Koumpia, Eugenia
2017-11-01
The purpose of this in vitro study was to compare the effectiveness of a novel Er,Cr:YSGG laser-assisted in-office tooth bleaching method with a conventional method by spectrophotometric analysis of the tooth color change. Furthermore, the influence of the application time of the bleaching gel on the effectiveness of the methods and the maintenance of the results 7 days and 1 month after the treatments were also evaluated. Twenty-four bovine incisors were stained and randomly distributed into four groups. Group 1 specimens received an in-office bleaching treatment with 35% H2O2 for 2 × 15 min. Group 2 specimens received the same treatment but with extended application time (2 × 20 min). In Group 3, the same in-office bleaching procedure (2 × 15 min) was carried out as in Group 1, using Er,Cr:YSGG laser irradiation for 2 × 15 s on each specimen to catalyze the H2O2 breakdown reaction. Group 4 specimens received the same bleaching treatment as Group 3 but with extended application time (2 × 20 min). Er,Cr:YSGG laser-assisted tooth bleaching treatment is more effective than the conventional treatment regarding color change of the teeth. Application time of the bleaching agent may influence the effectiveness of the methods. The color change of the tested treatments decreases after 7 days and 1 month. The clinical relevance of this study is that this novel laser-assisted bleaching treatment may be more advantageous in color change and application time compared to the conventional bleaching treatment.
NASA Astrophysics Data System (ADS)
Ren, Weiwei; Yang, Tao; Shi, Pengfei; Xu, Chong-yu; Zhang, Ke; Zhou, Xudong; Shao, Quanxi; Ciais, Philippe
2018-06-01
Climate change imposes a profound influence on the regional hydrological cycle and water security in many alpine regions worldwide. Investigating regional climate impacts using watershed-scale hydrological models requires a large amount of input data, such as topography and meteorological and hydrological data. However, data scarcity in alpine regions seriously restricts the evaluation of climate change impacts on the water cycle using conventional approaches based on global or regional climate models, statistical downscaling methods and hydrological models. This study is therefore dedicated to the development of a probabilistic model to replace the conventional approaches for streamflow projection. The probabilistic model was built upon an advanced Bayesian Neural Network (BNN) approach fed directly by large-scale climate predictor variables, and was tested in a typical data-sparse alpine region, the Kaidu River basin in Central Asia. Results show that the BNN model performs better than the general methods across a number of statistical measures. The BNN method, with flexible model structures provided by active indicator functions that reduce the dependence on the initial specification of the input variables and the number of hidden units, can work well in a data-limited region. Moreover, it can provide more reliable streamflow projections with robust generalization ability. Forced by the latest bias-corrected GCM scenarios, streamflow projections for the 21st century under three RCP emission pathways were constructed and analyzed. In brief, the proposed probabilistic projection approach could improve runoff predictive ability over conventional methods, better support water resources planning and management under data-limited conditions, and facilitate climate change impact analysis on runoff and water resources in alpine regions worldwide.
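As a simplified stand-in for the probabilistic-projection idea (the study used a Bayesian Neural Network, which is considerably more involved), the sketch below fits a Bayesian linear regression from climate predictors to streamflow and returns a predictive distribution rather than a point estimate; all data are synthetic.

```python
# Hedged sketch: Bayesian linear regression as a toy analogue of probabilistic
# streamflow projection. Posterior over weights -> predictive mean + variance.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))                                    # monthly climate predictors
y = X @ np.array([0.8, -0.3, 0.5]) + 0.2 * rng.normal(size=120)  # synthetic streamflow

alpha, beta = 1.0, 25.0                          # prior precision, noise precision (assumed)
S_inv = alpha * np.eye(3) + beta * X.T @ X       # posterior precision of the weights
S = np.linalg.inv(S_inv)
mu = beta * S @ X.T @ y                          # posterior mean of the weights

x_new = np.array([0.5, -1.0, 0.2])               # a future climate scenario (hypothetical)
pred_mean = x_new @ mu
pred_var = 1.0 / beta + x_new @ S @ x_new        # predictive variance
print(f"projection: {pred_mean:.2f} +/- {1.96 * np.sqrt(pred_var):.2f}")
```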
Otsuji, Kazutaka; Sasaki, Takeshi; Tanaka, Atsushi; Kunita, Akiko; Ikemura, Masako; Matsusaka, Keisuke; Tada, Keiichiro; Fukayama, Masashi; Seto, Yasuyuki
2017-02-01
Digital polymerase chain reaction (dPCR) has been used to yield an absolute measure of nucleic acid concentrations. Recently, a method referred to as droplet digital PCR (ddPCR) has gained attention as a more precise and less subjective assay for quantifying DNA amplification. We demonstrate the usefulness of ddPCR for determining HER2 gene amplification in breast cancer. In this study, we used ddPCR to measure the HER2 gene copy number in clinical formalin-fixed paraffin-embedded samples from 41 primary breast cancer patients. To improve the accuracy of the ddPCR analysis, we also estimated the tumor content ratio (TCR) for each sample. Our determination method for HER2 gene amplification, using the ddPCR ratio (ERBB2:ch17cent copy number ratio) combined with the TCR, showed high consistency with the conventionally defined HER2 gene status according to the ASCO-CAP (American Society of Clinical Oncology/College of American Pathologists) guidelines (P<0.0001, Fisher's exact test). The equivocal area was established by adopting 99% confidence intervals obtained from cell line assays, which made it possible to identify all conventionally HER2-positive cases with our method. In addition, we succeeded in automating a major part of the process, from DNA extraction to determination of HER2 gene status. The introduction of ddPCR to determine HER2 gene status in breast cancer is feasible in clinical practice and might complement or even replace conventional methods of examination in the future.
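One plausible form of a tumor-content correction is sketched below, assuming that non-tumor cells contribute a balanced ERBB2:ch17cent ratio of 1.0; the study's exact correction may differ.

```python
# Hedged sketch of a tumor-content correction for a ddPCR copy-number ratio.
# Assumption: observed = tcr * tumor_ratio + (1 - tcr) * 1.0, so solving for
# the tumor ratio undoes the dilution by normal cells.
def corrected_ratio(observed_ratio, tcr):
    """observed_ratio: ERBB2/ch17cent copies; tcr: tumor content ratio in (0, 1]."""
    return (observed_ratio - (1.0 - tcr)) / tcr

print(corrected_ratio(1.8, 0.4))   # diluted signal of 1.8 at 40% tumor -> ~3.0
```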
2012-01-01
Background Conventional transabdominal ultrasound usually fails to visualize parts of the ureter or extrahepatic bile duct covered by bowel gas. In this study, we propose a new method for gaining acoustic access to the ureters and extrahepatic bile duct to help determine the nature of obstruction to these structures when conventional transabdominal ultrasound fails. Methods The normal saline retention enema method, that is, using normal saline-filled colons to gain acoustic access to the bilateral ureters and extrahepatic bile duct and detecting the lesions with transabdominal ultrasonic diagnostic apparatus, was applied to 777 patients with obstructive lesions, including 603 with hydroureter and 174 with dilated common bile duct, which were not visualized by conventional ultrasonography. The follow-up data of all the patients were collected to verify the results obtained by this method. Results Of the 755 patients who successfully finished the examination after normal saline retention enema (the success rate of the enema is about 98%), the nature of obstruction in 718 patients was determined (the visualizing rate is approximately 95%), including 533 with ureteral calculus, 23 with ureteral stricture, 129 with extrahepatic bile duct calculus, and 33 with common bile duct tumor. Conclusions Colons filled fully with normal saline can surely give acoustic access to the bilateral ureters and extrahepatic bile duct so as to determine the nature of obstruction of these structures when conventional transabdominal ultrasound fails. PMID:22871226
Correlation and agreement of a digital and conventional method to measure arch parameters.
Nawi, Nes; Mohamed, Alizae Marny; Marizan Nor, Murshida; Ashar, Nor Atika
2018-01-01
The aim of the present study was to determine the overall reliability and validity of arch parameters measured digitally compared with conventional measurement. A sample of 111 plaster study models of Down syndrome (DS) patients was digitized using a blue-light three-dimensional (3D) scanner. Digital and manual measurements of the defined parameters were performed using Geomagic analysis software (Geomagic Studio 2014 software, 3D Systems, Rock Hill, SC, USA) on the digital models and with a digital calliper (Tuten, Germany) on the plaster study models. Both sets of measurements were repeated twice, and intraexaminer reliability was assessed with intraclass correlation coefficients (ICCs), the independent t-test and Pearson's correlation. The Bland-Altman method of analysis was used to evaluate the agreement between measurements on the digital and plaster models. No statistically significant differences (p > 0.05) were found between the manual and digital methods when measuring arch width, arch length, and space analysis. In addition, all parameters showed a significant correlation coefficient (r ≥ 0.972; p < 0.01) between all digital and manual measurements. Furthermore, positive agreement between digital and manual measurements of arch width (90-96%), and of arch length and space analysis (95-99%), was also established using the Bland-Altman method. These results demonstrate that 3D blue-light scanning and measurement software can precisely produce 3D digital models and measure arch width, arch length, and space analysis. The 3D digital model is valid for use in various clinical applications.
Parish, Chad M.; Miller, Michael K.
2014-12-09
Nanostructured ferritic alloys (NFAs) exhibit complex microstructures consisting of 100-500 nm ferrite grains, grain boundary solute enrichment, and multiple populations of precipitates and nanoclusters (NCs). Understanding these materials' excellent creep and radiation-tolerance properties requires a combination of multiple atomic-scale experimental techniques. Recent advances in scanning transmission electron microscopy (STEM) hardware and data analysis methods have the potential to revolutionize nanometer- to micrometer-scale materials analysis. These methods are applied to NFAs as a test case and compared with both conventional STEM methods and complementary methods such as scanning electron microscopy and atom probe tomography. In this paper, we review past results and present new results illustrating the effectiveness of latest-generation STEM instrumentation and data analysis.
Crystal structure prediction supported by incomplete experimental data
NASA Astrophysics Data System (ADS)
Tsujimoto, Naoto; Adachi, Daiki; Akashi, Ryosuke; Todo, Synge; Tsuneyuki, Shinji
2018-05-01
We propose an efficient theoretical scheme for structure prediction based on the idea of combining methods that optimize against theoretical calculation and experimental data simultaneously. In this scheme, we formulate a cost function based on a weighted sum of interatomic potential energies and a penalty function defined with partial experimental data that are totally insufficient for conventional structure analysis. In particular, we define the cost function using a "crystallinity" formulated with only the peak positions within a small range of the X-ray diffraction pattern. We apply this method to well-known polymorphs of SiO2 and C with up to 108 atoms in the simulation cell and show that it reproduces the correct structures efficiently with very limited diffraction-peak information. This scheme opens a new avenue for determining and predicting structures that are difficult to determine by conventional methods.
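Schematically, the combined cost function described above might look like the following sketch, with a Lennard-Jones term standing in for the interatomic potential and a peak-position mismatch standing in for the crystallinity penalty; the peak predictor is an assumed callable, not the authors' formulation.

```python
# Toy sketch of a cost = potential energy + w * diffraction-peak penalty,
# schematic of the combined-optimization idea described above.
import numpy as np

def lj_energy(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones energy, a toy stand-in for the interatomic potential."""
    iu = np.triu_indices(len(pos), k=1)
    r = np.linalg.norm(pos[iu[0]] - pos[iu[1]], axis=1)
    return float(np.sum(4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)))

def peak_penalty(predicted, observed):
    """For each observed peak position, squared distance to the nearest predicted peak."""
    return sum(min((p - o) ** 2 for p in predicted) for o in observed)

def cost(pos, observed_peaks, predict_peaks, w=10.0):
    # predict_peaks(pos) is an assumed callable mapping a structure to peak positions
    return lj_energy(pos) + w * peak_penalty(predict_peaks(pos), observed_peaks)

# toy usage with a fake peak predictor based on a nearest-neighbour distance
fake_peaks = lambda pos: [float(np.min(np.linalg.norm(pos[0] - pos[1:], axis=1)))]
pos = np.random.default_rng(0).normal(size=(8, 3))
print(cost(pos, observed_peaks=[1.1], predict_peaks=fake_peaks))
```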
Effect of organic and conventional rearing system on the mineral content of pork.
Zhao, Yan; Wang, Donghua; Yang, Shuming
2016-08-01
Dietary composition and rearing regime largely determine the trace element composition of pigs, and consequently the concentrations of these elements in animal products. The present study evaluates thirteen macro- and trace element concentrations in pork from organic and conventional farms. Conventional pigs were given a commercial feed with added minerals; organic pigs were given a feed based on organic feedstuffs. The contents of macro-elements (Na, K, Mg and Ca) and some trace elements (Ni, Fe, Zn and Sr) in organic and conventional meat samples showed no significant differences (P>0.05). Several trace element concentrations were significantly higher (P<0.05) in organic pork than in conventional pork: Cr (808 and 500 μg/kg in organic and conventional pork, respectively), Mn (695 and 473 μg/kg) and Cu (1.80 and 1.49 mg/kg). The results showed considerable differences in mineral content between samples from pigs reared in organic and conventional systems. Our results also indicate that authentication of organic pork can be achieved by applying multivariate chemometric methods, such as discriminant analysis, to these multi-element data. Copyright © 2016 Elsevier Ltd. All rights reserved.
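The discriminant-analysis authentication idea can be sketched with scikit-learn as below; the element profiles are fabricated around the reported means and are for illustration only.

```python
# Sketch of organic-vs-conventional authentication via linear discriminant
# analysis on multi-element profiles. Values are fabricated around the
# means reported above, not the study's dataset.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# columns: Cr (ug/kg), Mn (ug/kg), Cu (mg/kg)
X = np.array([[808, 695, 1.80], [790, 710, 1.75], [820, 680, 1.85],   # "organic"
              [500, 473, 1.49], [510, 460, 1.52], [495, 480, 1.45]])  # "conventional"
y = ["organic"] * 3 + ["conventional"] * 3

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[770, 650, 1.70]]))   # classify a new, unlabeled sample
```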
PRODUCTION ENGINEERING AND MARKETING ANALYSIS OF THE ROTATING DISK EVAPORATOR
Recent EPA-funded research into the onsite, mechanical evaporation of wastewater from single family homes revealed that a rotating disk evaporator (RDE) could function in a nondischarging mode. Such a device has potential use where site limitations preclude conventional methods o...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandez, Jose M., E-mail: joseman@sas.upenn.edu; Plaza, Cesar; Polo, Alfredo
2012-01-15
Highlights: • Thermal analysis was used to assess the stability and composition of organic matter in three diverse municipal waste streams. • Results were compared with C mineralization during a 90-day incubation, FTIR and 13C NMR. • Thermal analysis reflected the differences between the organic wastes before and after the incubation. • The calculated energy density showed a strong correlation with cumulative respiration. • Conventional and thermal methods provide complementary means of characterizing organic wastes. - Abstract: The use of organic municipal wastes as soil amendments is an increasing practice that can divert significant amounts of waste from landfill, and provides a potential source of nutrients and organic matter to ameliorate degraded soils. Due to the high heterogeneity of organic municipal waste streams, it is difficult to rapidly and cost-effectively establish their suitability as soil amendments using a single method. Thermal analysis has been proposed as an evolving technique to assess the stability and composition of the organic matter present in these wastes. In this study, three different organic municipal waste streams (i.e., a municipal waste compost (MC), a composted sewage sludge (CS) and a thermally dried sewage sludge (TS)) were characterized using conventional and thermal methods. The conventional methods used to test organic matter stability included laboratory incubation with measurement of respired C, and spectroscopic methods to characterize chemical composition. Carbon mineralization was measured during a 90-day incubation, and samples before and after incubation were analyzed by chemical (elemental analysis) and spectroscopic (infrared and nuclear magnetic resonance) methods. Results were compared with those obtained by thermogravimetry (TG) and differential scanning calorimetry (DSC) techniques. Total amounts of CO2 respired indicated that the organic matter in the TS was the least stable, while that in the CS was the most stable. This was confirmed by changes detected with the spectroscopic methods in the composition of the organic wastes due to C mineralization. Differences were especially pronounced for TS, which showed a remarkable loss of aliphatic and proteinaceous compounds during the incubation process. TG, and especially DSC analysis, clearly reflected these differences between the three organic wastes before and after the incubation. Furthermore, the calculated energy density, which represents the energy available per unit of organic matter, showed a strong correlation with cumulative respiration. Results obtained support the hypothesis of a potential link between the thermal and biological stability of the studied organic materials, and consequently the ability of thermal analysis to characterize the maturity of municipal organic wastes and composts.
ERIC Educational Resources Information Center
Umeasiegbu, Veronica I.; Bishop, Malachy; Mpofu, Elias
2013-01-01
This article presents an analysis of the United Nations Convention on the Rights of Persons with Disabilities (CRPD) in relation to prior United Nations conventions on disability and U.S. disability policy law with a view to identifying the conventional and also the incremental advances of the CRPD. Previous United Nations conventions related to…
Method for measuring visual resolution at the retinal level.
Liang, J; Westheimer, G
1993-08-01
To measure the intrinsic resolving capacity of the retinal and neural levels of vision, we devised a method that creates two lines with controllable contrast on the retina. The line separation can be varied at will, down to values below those achievable with conventional optical techniques. Implementation of the method with a He-Ne laser leads to a procedure that permits analysis of the performance of the human visual apparatus.
Formalizing Space Shuttle Software Requirements
NASA Technical Reports Server (NTRS)
Crow, Judith; DiVito, Ben L.
1996-01-01
This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.
Automated lithology prediction from PGNAA and other geophysical logs.
Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T
2006-02-01
Different methods of lithology prediction from geophysical data have been developed over the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma) and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ elemental analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, which is based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology; a success rate of 73% was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance lithology prediction.
Parikh, Harshal R; De, Anuradha S; Baveja, Sujata M
2012-07-01
Physicians and microbiologists have long recognized that the presence of living microorganisms in the blood of a patient carries considerable morbidity and mortality. Hence, blood cultures have become a critically important and frequently performed test in clinical microbiology laboratories for the diagnosis of sepsis. The aim was to compare the conventional blood culture method with the lysis centrifugation method in cases of sepsis. Two hundred nonduplicate blood cultures from patients diagnosed clinically with sepsis were analyzed concurrently using two methods for the recovery of bacteria: the conventional blood culture method using trypticase soy broth, and the lysis centrifugation method using saponin with centrifugation at 3000 g for 30 minutes. Overall, bacteria were recovered from 17.5% of the 200 blood cultures. The conventional blood culture method had a higher yield of organisms, especially Gram-positive cocci. The lysis centrifugation method was comparable with the former with respect to Gram-negative bacilli. In this study, the sensitivity of the lysis centrifugation method in comparison with the conventional blood culture method was 49.75%, the specificity was 98.21% and the diagnostic accuracy was 89.5%. In almost every instance, growth was detected earlier by the lysis centrifugation method, a difference that was statistically significant. Contamination with lysis centrifugation was minimal, while that with the conventional method was high. The difference in time to growth between the lysis centrifugation method and the conventional blood culture method was highly significant (P = 0.000). For the diagnosis of sepsis, a combination of the lysis centrifugation method and the conventional blood culture method, with trypticase soy broth or biphasic media, is advisable in order to achieve faster recovery and a better yield of microorganisms.
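The diagnostic metrics quoted above follow from a standard 2x2 table; the sketch below reconstructs plausible counts (not the study's raw data) that reproduce values close to those reported, with the conventional broth culture treated as the reference method.

```python
# Worked check of sensitivity, specificity and diagnostic accuracy from a
# 2x2 table. Counts are illustrative reconstructions, not the study's data.
def diagnostics(tp, fn, fp, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + fp + tn)
    return sens, spec, acc

sens, spec, acc = diagnostics(tp=17, fn=17, fp=3, tn=163)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} accuracy={acc:.1%}")
```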
Advancement of Analysis Method for Electromagnetic Screening Effect of Mountain Tunnel
NASA Astrophysics Data System (ADS)
Okutani, Tamio; Nakamura, Nobuyuki; Terada, Natsuki; Fukuda, Mitsuyoshi; Tate, Yutaka; Inada, Satoshi; Itoh, Hidenori; Wakao, Shinji
In this paper we report the advancement of an analysis method for the electromagnetic screening effect of mountain tunnels using a multiple conductor circuit model. On A.C. electrified railways, managing the influence of electromagnetic induction caused by feeding circuits is a major issue. Tunnels are said to have a screening effect that reduces the electromagnetic induction, because a large amount of steel is used in them. Recently, however, less screening effect can be expected, because the New Austrian Tunneling Method (NATM), in which less steel is used than in conventional methods, has been adopted as the standard method for constructing mountain tunnels. We therefore measured and analyzed the actual screening effect of mountain tunnels constructed with NATM. In the course of this analysis we have advanced a method to analyze the screening effect more precisely. With this method we can adequately model the tunnel structure as part of the multiple conductor circuit.
Fasihi, Yasser; Fooladi, Saba; Mohammadi, Mohammad Ali; Emaneini, Mohammad; Kalantar-Neyestanaki, Davood
2017-09-06
Molecular typing is an important tool for the control and prevention of infection. A suitable molecular typing method for epidemiological investigation must be easy to perform, highly reproducible, inexpensive, rapid, and easy to interpret. In this study, two molecular typing methods, the conventional PCR-sequencing method and high resolution melting (HRM) analysis, were used for staphylococcal protein A (spa) typing of 30 methicillin-resistant Staphylococcus aureus (MRSA) isolates recovered from clinical samples. Based on the PCR-sequencing results, 16 different spa types were identified among the 30 MRSA isolates. Of these 16 spa types, 14 were separated by the HRM method; two spa types, t4718 and t2894, were not separated from each other. According to our results, spa typing based on HRM analysis is rapid, easy to perform, and cost-effective, but the method must be standardized for different regions, spa types, and real-time machinery.
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul
1998-01-01
This paper describes a finite volume computational thermo-fluid dynamics method to solve the Navier-Stokes equations in conjunction with the energy equation and a thermodynamic equation of state in an unstructured coordinate system. The system of equations has been solved by a simultaneous Newton-Raphson method and compared with several benchmark solutions. Excellent agreement has been obtained in each case, and the method has been found to be significantly faster than conventional Computational Fluid Dynamics (CFD) methods; it therefore has potential for implementation in multi-disciplinary analysis and design optimization of fluid and thermal systems. The paper also describes an algorithm for design optimization based on the Newton-Raphson method, which has recently been tested in a turbomachinery application.
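As a small illustration of the simultaneous Newton-Raphson strategy named here (a generic sketch on a toy nonlinear system, not the authors' finite volume solver), the iteration repeatedly solves J Δx = -F over the full system:

```python
import numpy as np

# Generic simultaneous Newton-Raphson iteration on a small nonlinear
# system F(x) = 0; illustrative only, not the paper's thermo-fluid solver.
def newton_raphson(F, J, x0, tol=1e-10, max_iter=50):
    x = x0.astype(float)
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), -F(x))  # solve J dx = -F
        x += dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Example: intersect a circle (x^2 + y^2 = 4) with a hyperbola (xy = 1).
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0] * x[1] - 1.0])
J = lambda x: np.array([[2 * x[0], 2 * x[1]], [x[1], x[0]]])
print(newton_raphson(F, J, np.array([2.0, 0.5])))
```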
Chen, Xiaodong; Ren, Liqiang; Zheng, Bin; Liu, Hong
2013-01-01
Conventional optical microscopes have been used widely in scientific research and in clinical practice. Modern digital microscopic devices combine the power of optical imaging with computerized analysis, archiving, and communication techniques, and have great potential in pathological examinations for improving the efficiency and accuracy of clinical diagnosis. This chapter reviews the basic optical principles of conventional microscopes, fluorescence microscopes, and electron microscopes. The recent developments and future clinical applications of advanced digital microscopic imaging methods and computer-assisted diagnosis schemes are also discussed.
NASA Astrophysics Data System (ADS)
Tang, Xiaoxing; Qian, Yuan; Guo, Yanchuan; Wei, Nannan; Li, Yulan; Yao, Jian; Wang, Guanghua; Ma, Jifei; Liu, Wei
2017-12-01
A novel method has been developed for analyzing atmospheric pollutant metals (Be, Mn, Fe, Co, Ni, Cu, Zn, Se, Sr, Cd, and Pb) by laser ablation inductively coupled plasma mass spectrometry. In this method, solid standards are prepared by depositing droplets of aqueous standard solutions on the surface of a membrane filter of the same type as used for collecting atmospheric pollutant metals. Laser parameters were optimized, and the ablation behavior of the filter discs was studied. Radial line scans across the filter disc proved to be a representative ablation strategy that avoids error from inhomogeneous filter standards and the marginal effect of the filter disc. Pt, as the internal standard, greatly improved the correlation coefficient of the calibration curve. The developed method provides low detection limits, from 0.01 ng m⁻³ for Be and Co to 1.92 ng m⁻³ for Fe. It was successfully applied to the determination of atmospheric pollutant metals collected in Lhasa, China. The analytical results showed good agreement with those obtained by conventional liquid analysis. In contrast to the conventional acid digestion procedure, the novel method not only greatly reduces sample preparation and shortens the analysis time but also provides a possible means for studying the spatial distribution of atmospheric filter samples.
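A minimal sketch of the internal-standard calibration idea described above: the analyte signal is ratioed to the internal-standard signal before fitting the calibration line. All intensities and concentrations below are hypothetical placeholders, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration: analyte counts ratioed to the internal
# standard (Pt) before a least-squares fit; all values are placeholders.
conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])           # ng per spot
analyte = np.array([12., 310., 605., 1230., 3020.])  # analyte counts
pt = np.array([1000., 980., 1015., 995., 1005.])     # Pt counts

ratio = analyte / pt
slope, intercept = np.polyfit(conc, ratio, 1)
r = np.corrcoef(conc, ratio)[0, 1]
print(f"slope={slope:.4f}, intercept={intercept:.4f}, r={r:.5f}")

# Quantify an unknown filter sample from its measured ratio:
unknown_ratio = 1.25
print("estimated conc:", (unknown_ratio - intercept) / slope)
```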
Factors influencing the results of faculty evaluation in Isfahan University of Medical Sciences.
Kamali, Farahnaz; Yamani, Nikoo; Changiz, Tahereh; Zoubin, Fatemeh
2018-01-01
This study aimed to explore factors influencing the results of faculty member evaluation from the viewpoints of faculty members affiliated with Isfahan University of Medical Sciences, Isfahan, Iran. This qualitative study was done using a conventional content analysis method. Participants were faculty members of Isfahan University of Medical Sciences who, considering maximum variation in sampling, were chosen with a purposive sampling method. Semi-structured interviews were held with 11 faculty members until data saturation was reached. The interviews were transcribed verbatim and analyzed with the conventional content analysis method for theme development. Further, the MAXQDA software was used for data management. The data analysis led to the development of two main themes, namely, "characteristics of the educational system" and "characteristics of the faculty member evaluation system." The first main theme consists of three categories, i.e. "characteristics of influential people in evaluation," "features of the courses," and "background characteristics." The other theme has the following as its categories: "evaluation methods," "evaluation tools," "evaluation process," and "application of evaluation results." Each category has its own subcategories. Many factors affect the evaluation of faculty members and should be taken into account by educational policymakers for improving the quality of the educational process. In addition to the factors that directly influence the educational system, methodological problems in the evaluation system need special attention.
[An improved medical image fusion algorithm and quality evaluation].
Chen, Meiling; Tao, Ling; Qian, Zhiyu
2009-08-01
Medical image fusion is of great value in medical image analysis and diagnosis. In this paper, the conventional wavelet fusion method is improved and a new medical image fusion algorithm is presented, in which the high-frequency and low-frequency coefficients are treated separately. When high-frequency coefficients are chosen, the regional edge intensities of each sub-image are calculated to realize adaptive fusion. The choice of low-frequency coefficients is based on the edges of the images, so that the fused image preserves all useful information and appears more distinct. We apply the conventional and the improved wavelet-transform-based fusion algorithms to fuse two images of the human body and evaluate the fusion results through a quality evaluation method. Experimental results show that the improved algorithm effectively retains the detailed information of the original images and enhances their edge and texture features, and that it outperforms the conventional fusion algorithm based on the wavelet transform.
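A minimal sketch of this style of wavelet-domain fusion: one decomposition level, a regional-energy rule for the high-frequency bands, and a simple average for the low-frequency band. This is an illustrative simplification, not the authors' exact adaptive rule:

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

# Illustrative one-level wavelet fusion: pick high-frequency coefficients
# by regional edge energy, average the low-frequency approximation.
def fuse(img1, img2, wavelet="db2", win=3):
    cA1, hf1 = pywt.dwt2(img1, wavelet)
    cA2, hf2 = pywt.dwt2(img2, wavelet)
    fused_hf = []
    for c1, c2 in zip(hf1, hf2):          # horizontal, vertical, diagonal
        e1 = uniform_filter(c1**2, win)   # regional energy of band 1
        e2 = uniform_filter(c2**2, win)
        fused_hf.append(np.where(e1 >= e2, c1, c2))
    cA = 0.5 * (cA1 + cA2)                # simple low-frequency average
    return pywt.idwt2((cA, tuple(fused_hf)), wavelet)

a = np.random.rand(64, 64)
b = np.random.rand(64, 64)
print(fuse(a, b).shape)
```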
Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.
2018-01-01
Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent extraction as an efficient methodology for metabolomics studies. PMID:28787673
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data with latent cluster-level random effects, which are ignored in the conventional Cox model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, valid type I error rates, and acceptable coverage rates, regardless of the true random-effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
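A sketch of the cluster-bootstrap idea in generic form: resample whole clusters with replacement and recompute the coefficient each time. Here `fit_coef` is a placeholder for any model fit (e.g., a Cox regression via a library such as lifelines), and the demo statistic is a toy stand-in:

```python
import numpy as np
import pandas as pd

# Sketch of the cluster-bootstrap SE: resample whole clusters with
# replacement, refit, and take the SD of the bootstrap estimates.
def cluster_bootstrap_se(df, cluster_col, fit_coef, n_boot=500, seed=0):
    rng = np.random.default_rng(seed)
    clusters = df[cluster_col].unique()
    estimates = []
    for _ in range(n_boot):
        sampled = rng.choice(clusters, size=len(clusters), replace=True)
        # relabel so repeated clusters stay distinct in the refit
        parts = [df[df[cluster_col] == c].assign(**{cluster_col: i})
                 for i, c in enumerate(sampled)]
        estimates.append(fit_coef(pd.concat(parts, ignore_index=True)))
    return np.std(estimates, ddof=1)

# Demo with a toy statistic (a column mean) standing in for a Cox
# coefficient; substitute a real Cox fit for fit_coef in practice.
df = pd.DataFrame({"cluster": np.repeat(np.arange(20), 5),
                   "time": np.random.default_rng(1).exponential(1, 100)})
print(cluster_bootstrap_se(df, "cluster", lambda d: d["time"].mean()))
```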
Tupinambá, Rogerio Amaral; Claro, Cristiane Aparecida de Assis; Pereira, Cristiane Aparecida; Nobrega, Celestino José Prudente; Claro, Ana Paula Rosifini Alves
2017-01-01
ABSTRACT Introduction: Plasma-polymerized film deposition was developed to modify the surface properties of metallic orthodontic brackets in order to inhibit bacterial adhesion. Methods: Hexamethyldisiloxane (HMDSO) polymer films were deposited on conventional (n = 10) and self-ligating (n = 10) stainless steel orthodontic brackets using the Plasma-Enhanced Chemical Vapor Deposition (PECVD) radio frequency technique. The samples were divided into two groups according to the kind of bracket and two subgroups after surface treatment. Scanning Electron Microscopy (SEM) analysis was performed to assess bacterial adhesion on the sample surfaces (slot and wing regions) and film layer integrity. Surface roughness was assessed by Confocal Interferometry (CI) and surface wettability by goniometry. For bacterial adhesion analysis, samples were exposed for 72 hours to a Streptococcus mutans solution for biofilm formation. The values obtained for surface roughness were analyzed using the Mann-Whitney test, while biofilm adhesion was assessed by the Kruskal-Wallis and SNK tests. Results: Statistically significant differences (p < 0.05) in surface roughness and bacterial adhesion reduction were observed on conventional brackets after surface treatment and between conventional and self-ligating brackets; no statistically significant differences were observed between the self-ligating groups (p > 0.05). Conclusion: Plasma-polymerized film deposition was effective in reducing surface roughness and bacterial adhesion only on conventional brackets. It was also noted that conventional brackets showed lower biofilm adhesion than self-ligating brackets despite the absence of film. PMID:28902253
Wong, M S; Cheng, J C Y; Wong, M W; So, S F
2005-04-01
A study was conducted to compare the CAD/CAM method with the conventional manual method in the fabrication of spinal orthoses for patients with adolescent idiopathic scoliosis. Ten subjects were recruited for this study. Efficiency analyses of the two methods were performed from the cast filling/digitization process to completion of cast/image rectification. The dimensional changes of the casts/models rectified by the two cast rectification methods were also investigated. The results demonstrated that the CAD/CAM method was faster than the conventional manual method in the studied processes. The mean rectification time of the CAD/CAM method was shorter than that of the conventional manual method by 108.3 min (63.5%), indicating that the CAD/CAM method took about one third of the time of the conventional manual method to finish cast rectification. In the comparison of cast/image dimensional differences between the two methods, five major dimensions in each of the five rectified regions, namely the axilla, thoracic, lumbar, abdominal, and pelvic regions, were involved. There were no significant dimensional differences (at the p < 0.05 level) in 19 of the 25 studied dimensions. This study demonstrated that the CAD/CAM system could save time in the rectification process and offer a relatively high resemblance in cast rectification compared with the conventional manual method.
Moon, Myungjin; Nakai, Kenta
2018-04-01
Currently, cancer biomarker discovery is one of the important research topics worldwide. In particular, detecting significant genes related to cancer is an important task for early diagnosis and treatment of cancer. Conventional studies mostly focus on genes that are differentially expressed in different states of cancer; however, noise in gene expression datasets and the limited information in small datasets impede precise analysis of novel candidate biomarkers. In this study, we propose an integrative analysis of gene expression and DNA methylation using normalization and unsupervised feature extraction to identify candidate cancer biomarkers from renal cell carcinoma RNA-seq datasets. The gene expression and DNA methylation datasets are normalized by Box-Cox transformation and integrated into a one-dimensional dataset that retains the major characteristics of the original datasets by unsupervised feature extraction, and differentially expressed genes are selected from the integrated dataset. Use of the integrated dataset demonstrated improved performance compared with conventional approaches that use gene expression or DNA methylation datasets alone. Validation against the literature showed that a considerable number of the top-ranked genes from the integrated dataset have known relationships with cancer, implying that novel candidate biomarkers can also be acquired from the proposed analysis method. Furthermore, we expect that the proposed method can be extended to applications involving various types of multi-omics datasets.
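A minimal sketch of the integration pipeline described: Box-Cox normalization of each omics layer followed by a one-dimensional unsupervised projection. PCA is used here as a stand-in for the paper's feature extraction, and the data are random:

```python
import numpy as np
from scipy.stats import boxcox
from sklearn.decomposition import PCA

# Sketch: Box-Cox-normalize two omics measurements per gene, then project
# them to one dimension with an unsupervised extraction (PCA as a
# stand-in). Random data; illustrative only.
rng = np.random.default_rng(0)
expr = rng.lognormal(mean=2.0, sigma=1.0, size=500)   # gene expression
meth = rng.beta(2, 5, size=500) + 1e-6                # DNA methylation

expr_bc, _ = boxcox(expr)   # Box-Cox needs strictly positive input
meth_bc, _ = boxcox(meth)

X = np.column_stack([(expr_bc - expr_bc.mean()) / expr_bc.std(),
                     (meth_bc - meth_bc.mean()) / meth_bc.std()])
integrated = PCA(n_components=1).fit_transform(X).ravel()
print(integrated[:5])   # one integrated value per gene
```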
2009-12-01
ERDC/EL TR-09-21: Preconstruction Biogeochemical Analysis of Mercury in Wetlands Bordering the Hamilton Army Airfield (HAAF) Wetlands Restoration Site, Part 3. Elly P. H... The report covers mercury methylation and demethylation, and biogeochemical parameters related to the mercury cycle, as measured by both conventional and emerging methods.
[Cytocompatibility of nanophase hydroxyapatite ceramics].
Wen, Bo; Chen, Zhi-qing; Jiang, Yin-shan; Yang, Zheng-wen; Xu, Yong-zhong
2004-12-01
To evaluate the cytocompatibility of nanophase hydroxyapatite ceramics in vitro. Hydroxyapatite (HA) was prepared via a wet method. The grain size of the hydroxyapatite was determined by scanning electron microscopy and atomic force microscopy with image analysis software. Primary osteoblast cultures were established from rat calvaria. Cell adherence and proliferation on nanophase and conventional hydroxyapatite ceramics were examined at 1, 3, 5, and 7 days, and cell morphology was observed by microscope. The average grain sizes of the nanophase and conventional HA were 55 nm and 780 nm, respectively. Throughout the 7-day period, osteoblast proliferation on the HA was similar to that on tissue culture borosilicate glass controls; osteoblasts could attach, spread, and proliferate on HA. However, compared with conventional ceramics, osteoblast proliferation on nanophase HA was significantly better after 1, 3, 5, and 7 days. The cytocompatibility of nanophase HA was significantly better than that of conventional ceramics.
Saha, Sonali; Jaiswal, JN; Samadi, Firoza
2014-01-01
ABSTRACT Aim: The present study was undertaken to clinically evaluate and compare the effectiveness of a transcutaneous electrical nerve stimulator (TENS) and the comfort control syringe (CCS) in various pediatric dental procedures as alternatives to the conventional method of local anesthesia (LA) administration. Materials and methods: Ninety healthy children aged 6 to 10 years, each having at least one deciduous molar indicated for extraction in either the maxillary right or left quadrant, were randomly divided into three equal groups of 30 subjects each. Group I: LA administration using a conventional syringe; group II: LA administration using TENS along with the conventional syringe; group III: LA administration using the CCS. After LA by the three techniques, pain, anxiety, and heart rate were measured. Statistical analysis: The observations thus obtained were subjected to statistical analysis using analysis of variance (ANOVA), Student's t-test, and the paired t-test. Results: The mean pain score was maximum in group I followed by group II, while group III, where LA was administered using the CCS, revealed the minimum pain. The mean anxiety score was maximum in group I followed by group II, while group III revealed the minimum score. The mean heart rate was maximum in group I, followed in descending order by groups II and III. Conclusion: The study supports the belief that the CCS could be a viable alternative to the other two methods of LA delivery in children. How to cite this article: Bansal N, Saha S, Jaiswal JN, Samadi F. Pain Elimination during Injection with Newer Electronic Devices: A Comparative Evaluation in Children. Int J Clin Pediatr Dent 2014;7(2):71-76. PMID:25356003
Digital versus conventional techniques for pattern fabrication of implant-supported frameworks
Alikhasi, Marzieh; Rohanian, Ahmad; Ghodsi, Safoura; Kolde, Amin Mohammadpour
2018-01-01
Objective: The aim of this experimental study was to compare the retention of frameworks cast from wax patterns fabricated by three different methods. Materials and Methods: Thirty-six implant analogs connected to one-piece abutments were divided randomly into three groups according to the wax pattern fabrication method (n = 12). A computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine, a three-dimensional printer, and the conventional technique were used for fabrication of the wax patterns. All laboratory procedures were performed by a single experienced technician to eliminate intra-operator bias. The wax patterns were cast, finished, and seated on the related abutment analogs. The number of adjustments was recorded and analyzed by the Kruskal-Wallis test. Frameworks were cemented on the corresponding analogs with zinc phosphate cement, and a tensile resistance test was used to measure retention. Statistical Analysis Used: One-way analysis of variance (ANOVA) and post hoc Tukey tests were used for statistical analysis. The level of significance was set at P < 0.05. Results: Mean retentive values of 680.36 ± 21.93 N, 440.48 ± 85.98 N, and 407.23 ± 67.48 N were recorded for the CAD/CAM, rapid prototyping, and conventional groups, respectively. The one-way ANOVA test revealed significant differences among the three groups (P < 0.001). The post hoc Tukey test showed significantly higher retention for the CAD/CAM group (P < 0.001), while there was no significant difference between the two other groups (P = 0.54). The CAD/CAM group required significantly more adjustments (P < 0.001). Conclusions: CAD/CAM-fabricated wax patterns showed significantly higher retention for implant-supported cement-retained frameworks; this could be valuable when there are limitations in the retention of single-unit implant restorations. PMID:29657528
Optical coherence tomography use in the diagnosis of enamel defects
NASA Astrophysics Data System (ADS)
Al-Azri, Khalifa; Melita, Lucia N.; Strange, Adam P.; Festy, Frederic; Al-Jawad, Maisoon; Cook, Richard; Parekh, Susan; Bozec, Laurent
2016-03-01
Molar incisor hypomineralization (MIH) affects the permanent incisors and molars, whose undermineralized matrix is evidenced by lesions ranging from white to yellow/brown opacities to crumbling enamel incapable of withstanding normal occlusal forces and function. Diagnosing the condition involves clinical and radiographic examination of these teeth, with known limitations in determining the depth extent of the enamel defects in particular. Optical coherence tomography (OCT) is an emerging hard and soft tissue imaging technique, which was investigated as a potential new diagnostic method in dentistry. A comparison between the diagnostic potential of the conventional methods and OCT was conducted. Compared with conventional imaging methods, OCT gave more information on the structure of the enamel defects as well as on how far the defects extend into the enamel. Different types of enamel defects were compared, each type presenting a unique identifiable pattern when imaged using OCT. Additionally, advanced methods of OCT image analysis, including backscattered light intensity profile analysis and en-face reconstruction, were performed. Both methods confirmed the potential of OCT in enamel defect diagnosis. In conclusion, OCT imaging enabled the identification of the type of enamel defect and the determination of the extent of enamel defects in MIH, with the advantage of being a radiation-free diagnostic technique.
[Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].
Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia
2008-07-01
Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirement for rapid and on-line analysis because they involve complex processing and require considerable experience. In recent years, near-infrared spectroscopy has been used for rapid determination of active components, on-line quality control, identification of counterfeits, and discrimination of the geographical origins of herbal medicines, owing to its advantages of simple sample pretreatment, high efficiency, and the convenience of solid diffuse-reflectance spectroscopy and fiber optics. In the present paper, the principles and methods of near-infrared spectroscopy are introduced concisely. In particular, the applications of this technique to quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.
Kuang, Ming-Jie; Du, Yuren; Ma, Jian-Xiong; He, Weiwei; Fu, Lin; Ma, Xin-Long
2017-04-01
Total knee arthroplasty (TKA) is gradually emerging as the treatment of choice for end-stage osteoarthritis. In the past, liposomal bupivacaine administered by periarticular injection (PAI) showed better pain reduction and lower opioid consumption after surgery. However, some recent studies have reported that liposomal bupivacaine by PAI did not improve pain control and functional recovery in patients undergoing TKA. Therefore, this meta-analysis was conducted to determine whether liposomal bupivacaine provides better pain relief and functional recovery after TKA. Web of Science, PubMed, Embase, and the Cochrane Library were comprehensively searched. Randomized controlled trials, controlled clinical trials, and cohort studies were included in our meta-analysis. Eleven studies that compared liposomal bupivacaine delivered by the PAI technique with the conventional PAI method were included. The preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines and the Cochrane Handbook were applied to assess the quality of the results published in all included studies, to ensure that the results of our meta-analysis were reliable and valid. Our pooled data analysis demonstrated that liposomal bupivacaine was as effective as the control in terms of visual analog scale score at 24 hours (P = .46), 48 hours (P = .43), and 72 hours (P = .21), total opioid consumption (P = .25), range of motion (P = .28), length of hospital stay (P = .53), postoperative nausea (P = .34), and ambulation distance (P = .07). Compared with the conventional PAI method, liposomal bupivacaine shows similar pain control and functional recovery after TKA. Considering the cost of pain control, liposomal bupivacaine is not worth recommending as a long-acting alternative analgesic agent for the PAI method.
A New SEYHAN's Approach in Case of Heterogeneity of Regression Slopes in ANCOVA.
Ankarali, Handan; Cangur, Sengul; Ankarali, Seyit
2018-06-01
In this study, a new approach, named SEYHAN, has been suggested that allows conventional ANCOVA to be used instead of robust or nonlinear ANCOVA when the assumptions of linearity and homogeneity of regression slopes of conventional ANCOVA are not met. The proposed SEYHAN's approach involves transforming the continuous covariate into a categorical structure when the relationship between the covariate and the dependent variable is nonlinear and the regression slopes are not homogeneous. A simulated dataset was used to explain SEYHAN's approach. In this approach, conventional ANCOVA was performed in each subgroup constituted according to the knot values, and a two-factor analysis of variance was performed after the MARS method was used to categorize the covariate. The first model is simpler than the second model, which includes the interaction term. Since the model with the interaction effect has more subjects, the power of the test also increases and the existing significant difference is revealed better. With this approach, nonlinearity and heterogeneity of regression slopes are no longer an obstacle to analyzing data with the conventional linear ANCOVA model. It can be used quickly and efficiently in the presence of one or more covariates.
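A minimal sketch of the approach under stated assumptions: the covariate is cut at a hypothetical knot value of the kind MARS would supply, and a two-factor ANOVA with the group × covariate-category interaction is fitted. The data and knot are invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Sketch: categorize a nonlinearly acting covariate at a knot value
# (placeholder; MARS would supply it in the actual approach), then fit a
# two-factor ANOVA with the group x covariate-category interaction.
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({"group": rng.choice(["A", "B"], n),
                   "x": rng.uniform(0, 10, n)})
df["y"] = (np.where(df.x < 4, 2 * df.x, 8 + 0.2 * df.x)   # nonlinear in x
           + (df.group == "B") * 1.5 + rng.normal(0, 1, n))

knots = [4.0]                                  # hypothetical MARS knot
df["xcat"] = pd.cut(df.x, [-np.inf, *knots, np.inf],
                    labels=["low", "high"])

model = smf.ols("y ~ C(group) * C(xcat)", data=df).fit()
print(anova_lm(model, typ=2))
```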
Acquired Codes of Meaning in Data Visualization and Infographics: Beyond Perceptual Primitives.
Byrne, Lydia; Angus, Daniel; Wiles, Janet
2016-01-01
While information visualization frameworks and heuristics have traditionally been reluctant to include acquired codes of meaning, designers are making use of them in a wide variety of ways. Acquired codes leverage a user's experience to understand the meaning of a visualization. They range from figurative visualizations which rely on the reader's recognition of shapes, to conventional arrangements of graphic elements which represent particular subjects. In this study, we used content analysis to codify acquired meaning in visualization. We applied the content analysis to a set of infographics and data visualizations which are exemplars of innovative and effective design. 88% of the infographics and 71% of data visualizations in the sample contain at least one use of figurative visualization. Conventions on the arrangement of graphics are also widespread in the sample. In particular, a comparison of representations of time and other quantitative data showed that conventions can be specific to a subject. These results suggest that there is a need for information visualization research to expand its scope beyond perceptual channels, to include social and culturally constructed meaning. Our paper demonstrates a viable method for identifying figurative techniques and graphic conventions and integrating them into heuristics for visualization design.
Bicchi, Carlo; Liberto, Erica; Cagliero, Cecilia; Cordero, Chiara; Sgorbini, Barbara; Rubiolo, Patrizia
2008-11-28
The analysis of complex real-world samples of vegetable origin requires rapid and accurate routine methods, enabling laboratories to increase sample throughput and productivity while reducing analysis costs. This study examines shortening enantioselective-GC (ES-GC) analysis time following the approaches used in fast GC. ES-GC separations arise from a weak enantiomer-cyclodextrin host-guest interaction; the separation is thermodynamically driven and strongly influenced by temperature. As a consequence, fast temperature rates can interfere with enantiomeric discrimination, so the use of short and/or narrow-bore columns is a possible approach to speeding up ES-GC analyses. The performance of ES-GC with a conventional inner diameter (I.D.) column (25 m length x 0.25 mm I.D., 0.15 microm and 0.25 microm d(f)) coated with 30% of 2,3-di-O-ethyl-6-O-tert-butyldimethylsilyl-beta-cyclodextrin in PS-086 is compared with those of a short conventional-I.D. column (5 m length x 0.25 mm I.D., 0.15 microm d(f)) and of narrow-bore columns of different lengths (1, 2, 5 and 10 m long x 0.10 mm I.D., 0.10 microm d(f)) in analysing racemate standards of pesticides and of flavour and fragrance compounds, as well as real-world samples. Short conventional-I.D. columns gave shorter analysis times and comparable or lower resolutions with the racemate standards, depending mainly on analyte volatility. Narrow-bore columns were tested under different analysis conditions; they provided shorter analysis times and resolutions comparable to those of conventional-I.D. ES columns. The narrow-bore columns offering the most effective compromise between separation efficiency and analysis time are the 5 and 2 m columns; combined with mass spectrometric detection and applied to lavender and bergamot essential oil analyses, they reduced analysis time by a factor of at least three while the separation of chiral markers remained unaltered.
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis, can be applied more generally (including to Poincaré plots with multiple clusters) and more consistently than the conventional measures, and can address questions regarding potential structure underlying the variability of a data set.
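For reference, a sketch of the conventional lag-m Poincaré descriptors (SD1/SD2) computed across several delays; this illustrates only the standard cumulative measures the abstract contrasts with TPV, not TPV itself, and the R-R data are synthetic:

```python
import numpy as np

# Conventional lag-m Poincare descriptors: SD1 measures spread across the
# identity line (short-term variability), SD2 spread along it.
def poincare_sd(x, lag=1):
    a, b = x[:-lag], x[lag:]
    sd1 = np.std((b - a) / np.sqrt(2), ddof=1)
    sd2 = np.std((b + a) / np.sqrt(2), ddof=1)
    return sd1, sd2

rr = np.random.normal(800, 50, 1000)   # synthetic R-R intervals (ms)
for m in (1, 2, 5):                    # multiple time delays
    print(m, poincare_sd(rr, m))
```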
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuo, Rui; Wu, C. F. Jeff
Many computer models contain unknown parameters which need to be estimated using physical observations. Furthermore, calibration methods based on Gaussian process models may lead to unreasonable estimates for imperfect computer models. In this work, we extend this line of study to calibration problems with stochastic physical data. We propose a novel method, called the L2 calibration, and show its semiparametric efficiency. The conventional method of ordinary least squares is also studied; theoretical analysis shows that it is consistent but not efficient. Numerical examples show that the proposed method outperforms the existing ones.
Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A
2013-09-01
The clinical usefulness of olfactory event-related brain potentials (OERPs) to assess olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the obtained EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using a Receiver Operating Characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of the EEG responses to olfactory stimulation could be used as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function in patients.
Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared with available experimental data and with other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition, and the causality method.
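As a pointer to one of the named analysis tools, here is a minimal proper orthogonal decomposition (POD) via the SVD of a snapshot matrix, on synthetic data rather than the paper's simulation output:

```python
import numpy as np

# Minimal POD via SVD of a (space x time) snapshot matrix; the leading
# left-singular vectors are the POD modes, ranked by modal energy.
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((2000, 50))   # synthetic flow snapshots

mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

energy = s**2 / np.sum(s**2)                  # modal energy fractions
print("energy captured by first 5 modes:", energy[:5].sum())
modes = U[:, :5]                              # leading POD modes
coeffs = np.diag(s[:5]) @ Vt[:5]              # their temporal coefficients
```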
The multiple decrement life table: a unifying framework for cause-of-death analysis in ecology.
Carey, James R
1989-01-01
The multiple decrement life table is used widely in the human actuarial literature and provides statistical expressions for mortality in three different forms: i) the life table from all causes-of-death combined; ii) the life table disaggregated into selected cause-of-death categories; and iii) the life table with particular causes and combinations of causes eliminated. The purpose of this paper is to introduce the multiple decrement life table to the ecological literature by applying the methods to published death-by-cause information on Rhagoletis pomonella. Interrelations between the current approach and conventional tools used in basic and applied ecology are discussed including the conventional life table, Key Factor Analysis and Abbott's Correction used in toxicological bioassay.
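A minimal numerical sketch of the multiple decrement bookkeeping (invented counts): all-cause probabilities of death, their split by cause, and a cause-eliminated probability under the standard proportional-hazards (Chiang) assumption:

```python
import numpy as np

# Multiple-decrement sketch: all-cause death probabilities, their crude
# split by cause, and cause-B-eliminated probabilities under the
# proportional (Chiang) assumption. Counts are invented placeholders.
deaths = np.array([[30, 10],    # age class 0: cause A, cause B
                   [50, 40],    # age class 1
                   [60, 90]])   # age class 2
alive = np.array([1000, 800, 500])   # entering each age class

d_all = deaths.sum(axis=1)
q_all = d_all / alive                        # all-cause probability
q_cause = deaths / alive[:, None]            # crude cause-specific split
frac_remaining = (d_all - deaths[:, 1]) / d_all
q_without_B = 1 - (1 - q_all) ** frac_remaining   # cause B eliminated

print("all-cause q:", q_all)
print("q with cause B eliminated:", q_without_B)
```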
Natural leathers from natural materials: progressing toward a new arena in leather processing.
Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari
2004-02-01
Globally, the leather industry is currently undergoing radical transformation due to pollution and discharge legislation. Thus, the leather industry is under pressure to look for cleaner options for processing raw hides and skins. Conventional pre-tanning, tanning, and post-tanning processes are known to contribute more than 98% of the total pollution load from leather processing. The conventional tanning process involves a "do-undo" principle. Furthermore, the conventional methods employed in leather processing subject the skin/hide to wide variations in pH (2.8-13.0). This results in the discharge of large pollution loads, measured as BOD, COD, TDS, TS, sulfates, chlorides, and chromium. In the approach illustrated here, hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0, followed by pickle-free natural tanning employing vegetable tannins and post-tanning using environmentally friendly chemicals. Hence, this process involves dehairing, fiber opening, and pickle-free natural tanning followed by ecofriendly post-tanning. The extent of hair removal and opening up of fiber bundles has been found to be comparable to that of conventionally processed leathers, as substantiated through scanning electron microscopic analysis and softness measurements. The performance of the leathers is shown to be on par with conventionally chrome-tanned leathers through physical testing and hand evaluation. The process also exhibits zero metal (chromium) discharge and significant reductions in BOD, COD, TDS, and TS loads of 83, 69, 96, and 96%, respectively. Furthermore, the developed process appears to be economically viable.
Hey, Hwee Weng Dennis; Lau, Eugene Tze-Chun; Lim, Joel-Louis; Choong, Denise Ai-Wen; Tan, Chuen-Seng; Liu, Gabriel Ka-Po; Wong, Hee-Kit
2017-03-01
Flexion radiographs have been used to identify cases of spinal instability. However, current methods are not standardized and are not sufficiently sensitive or specific to identify instability. This study aimed to introduce a new slump sitting method for performing lumbar spine flexion radiographs and to compare the angular ranges of motion (ROMs) and displacements between the conventional method and this new method. This is a prospective study on radiological evaluation of lumbar spine flexion ROMs and displacements using dynamic radiographs. Sixty patients were recruited from a single tertiary spine center. Angular and displacement measurements of lumbar spine flexion were carried out. Participants were randomly allocated into two groups: those who did the new method first, followed by the conventional method, and those who did the conventional method first, followed by the new method. A comparison of the angular and displacement measurements of lumbar spine flexion between the conventional method and the new method was performed and tested for superiority and non-inferiority. The measurements of global lumbar angular ROM were, on average, 17.3° larger (p < .0001) using the new slump sitting method compared with the conventional method. The differences were most significant at the levels of L3-L4, L4-L5, and L5-S1 (p < .0001, p < .0001, and p = .001, respectively). There was no significant difference between the methods when measuring lumbar displacements (p = .814). The new slump sitting dynamic radiograph was shown to be superior to the conventional method in measuring angular ROM and non-inferior to the conventional method in the measurement of displacement.
The influence of the compression interface on the failure behavior and size effect of concrete
NASA Astrophysics Data System (ADS)
Kampmann, Raphael
The failure behavior of concrete materials is not completely understood because conventional test methods fail to assess the material response independent of the sample size and shape. To study the influence of strength- and strain-affecting test conditions, four typical concrete sample types were experimentally evaluated in uniaxial compression and analyzed for strength, deformational behavior, crack initiation/propagation, and fracture patterns under varying boundary conditions. Both low-friction and conventional compression interfaces were assessed. High-speed video technology was used to monitor macrocracking. Inferential data analysis showed reliably lower strength results for reduced surface friction at the compression interfaces, regardless of sample shape. Reciprocal comparisons revealed statistically significant strength differences between most sample shapes. Crack initiation and propagation were found to differ for dissimilar compression interfaces. The principal stress and strain distributions were analyzed; the strain domain was found to resemble the experimental results, whereas the stress analysis failed to explain failure for reduced end confinement. Neither stresses nor strains indicated strength reductions due to reduced friction, and therefore buckling effects were considered. The high-speed video analysis revealed localized buckling phenomena, regardless of end confinement. Slender elements were the result of low friction, and stocky fragments developed under conventional confinement; the critical buckling load increased accordingly. The research showed that current test methods do not reflect the "true" compressive strength and that concrete failure is strain driven.
High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets
NASA Astrophysics Data System (ADS)
Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong
2008-02-01
Stock investors usually make their short-term investment decisions according to recent stock information such as late market news, technical analysis reports, and price fluctuations. To reflect these short-term factors which impact stock price, this paper proposes a comprehensive fuzzy time-series model, which factors both linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time-series into the forecasting process. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, to compare with a conventional statistical method, least squares is used to estimate auto-regressive models over the testing periods within the datasets. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which factor only fuzzy logical relationships into the forecasting process. Both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
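As an illustration of the conventional statistical baseline mentioned here, a least-squares fit of an auto-regressive model on a synthetic series (not the TAIEX/HSI data):

```python
import numpy as np

# Least-squares estimation of an AR(p) model, the conventional statistical
# baseline; the series below is simulated, illustrative only.
def fit_ar(x, p):
    # column k holds the lag-(k+1) values aligned with y = x[p:]
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])   # intercept column
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef   # [intercept, a1, ..., ap]

rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):                 # simulate an AR(2) process
    x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.normal()
print(fit_ar(x, 2))                     # expect roughly [0, 0.6, -0.2]
```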
Kremen, Arie; Tsompanakis, Yiannis
2010-04-01
The slope stability of a proposed vertical extension of a balefill was investigated in the present study, in an attempt to determine a geotechnically conservative design, compliant with New Jersey Department of Environmental Protection regulations, that maximizes the utilization of unclaimed disposal capacity. Conventional geotechnical analytical methods are generally limited to well-defined failure modes, which may not occur in landfills or balefills due to the presence of preferential slip surfaces. In addition, these models assume an a priori stress distribution to solve essentially indeterminate problems. In this work, a different approach has been applied, which avoids several of the drawbacks of conventional methods. Specifically, the analysis was performed in a two-stage process: (a) calculation of the stress distribution, and (b) application of an optimization technique to identify the most probable failure surface. The stress analysis was performed using a finite element formulation, and the most probable failure surface was located by a dynamic programming optimization method. A sensitivity analysis was performed to evaluate the effect of the various waste strength parameters of the underlying mathematical model on the results, namely the factor of safety of the landfill. Although this study focuses on the stability investigation of an expanded balefill, the methodology presented can easily be applied to general geotechnical investigations.
Sudo, Hirotaka; O'driscoll, Michael; Nishiwaki, Kenji; Kawamoto, Yuji; Gammell, Philip; Schramm, Gerhard; Wertli, Toni; Prinz, Heino; Mori, Atsuhide; Sako, Kazuhiro
2012-01-01
The application of a head space analyzer for oxygen concentration was examined to develop a novel ampoule leak test method. Studies using ampoules filled with an ethanol-based solution and with nitrogen in the headspace demonstrated that the head space analysis (HSA) method showed sufficient sensitivity in detecting an ampoule crack. The proposed method uses HSA in conjunction with the pretreatment of an overpressurising process known as bombing to facilitate oxygen flow through the crack in the ampoule, for use in routine production. The method was examined in comparative studies with a conventional dye ingress method, and the results showed that the HSA method exhibits sensitivity superior to that of the dye method. The results indicate that the HSA method in combination with the bombing treatment has potential application as a leak test for the detection of container defects, not only for ampoule products with ethanol-based solutions but also for testing lyophilized products in vials with nitrogen in the headspace.
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance safety and capacity, as these ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON, and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze mechanical properties of the model such as total deformation, stress-strain distribution, von Mises stress, and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.
Multichannel analysis of surface waves (MASW) - Active and passive methods
Park, C.B.; Miller, R.D.; Xia, J.; Ivanov, J.
2007-01-01
The conventional seismic approaches for near-surface investigation have usually been either high-resolution reflection or refraction surveys that deal with a depth range of a few tens to hundreds of meters. Seismic signals from these surveys consist of wavelets with frequencies higher than 50 Hz. The multichannel analysis of surface waves (MASW) method deals with surface waves at lower frequencies (e.g., 1-30 Hz) and uses a much shallower depth range of investigation (e.g., a few to a few tens of meters).
An analysis method for two-dimensional transonic viscous flow
NASA Technical Reports Server (NTRS)
Bavitz, P. C.
1975-01-01
A method for the approximate calculation of transonic flow over airfoils, including shock waves and viscous effects, is described. Numerical solutions are obtained by use of a computer program which is discussed in the appendix. The importance of including the boundary layer in the analysis is clearly demonstrated, as well as the need to improve on existing procedures near the trailing edge. Comparisons between calculations and experimental data are presented for both conventional and supercritical airfoils, emphasis being on the surface pressure distribution, and good agreement is indicated.
NASA Technical Reports Server (NTRS)
Nakazawa, Shohei
1989-01-01
The internal structure of the MHOST finite element program, designed for 3-D inelastic analysis of gas turbine hot section components, is discussed. The computer code is the first implementation of the mixed iterative solution strategy for improved efficiency and accuracy over the conventional finite element method. The control structure of the program is covered, along with the data storage scheme, the memory allocation procedure, and the file handling facilities, including the read/write sequences.
Advances in biological dosimetry
NASA Astrophysics Data System (ADS)
Ivashkevich, A.; Ohnesorg, T.; Sparbier, C. E.; Elsaleh, H.
2017-01-01
Rapid retrospective biodosimetry methods are essential for the fast triage of persons occupationally or accidentally exposed to ionizing radiation. Identification and detection of a radiation specific molecular ‘footprint’ should provide a sensitive and reliable measurement of radiation exposure. Here we discuss conventional (cytogenetic) methods of detection and assessment of radiation exposure in comparison to emerging approaches such as gene expression signatures and DNA damage markers. Furthermore, we provide an overview of technical and logistic details such as type of sample required, time for sample preparation and analysis, ease of use and potential for a high throughput analysis.
MUTAGENIC PROPERTIES OF RIVER WATERS FLOWING THROUGH BIG CITY AREAS IN NORTH AMERICA
The hanging technique using Blue rayon, which specifically adsorbs mutagens with multicyclic planar structures, has advantages over the conventional method of bringing large volumes of water back to the laboratory for extraction. Therefore, the analysis of many samples from remot...
USDA-ARS?s Scientific Manuscript database
Aflatoxins are secondary metabolites produced by certain fungal species of the Aspergillus genus. Aflatoxin contamination remains a problem in agricultural products due to its toxic and carcinogenic properties. Conventional chemical methods for aflatoxin detection are time-consuming and destructive....
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik
Scientists working in a particular domain often adhere to conventional data analysis and presentation methods, and this leads to familiarity with these methods over time. But does high familiarity always lead to better analytical judgment? This question is especially relevant when visualizations are used in scientific tasks, as there can be discrepancies between visualization best practices and domain conventions. However, there is little empirical evidence of the relationships between scientists' subjective impressions about familiar and unfamiliar visualizations and objective measures of their effect on scientific judgment. To address this gap and to study these factors, we focus on the climate science domain, specifically on visualizations used for comparison of model performance. We present a comprehensive user study with 47 climate scientists in which we explored the following factors: i) relationships between scientists' familiarity, their perceived levels of comfort, confidence, and accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.
NASA Technical Reports Server (NTRS)
Achtemeier, Gary L.; Kidder, Stanley Q.; Scott, Robert W.
1988-01-01
The variational multivariate assimilation method described in a companion paper by Achtemeier and Ochs is applied to conventional and conventional-plus-satellite data. Ground-based and space-based meteorological data are weighted according to the respective measurement errors and blended into a data set that is a solution of numerical forms of the two nonlinear horizontal momentum equations, the hydrostatic equation, and an integrated continuity equation for a dry atmosphere. The analyses serve, first, to evaluate the accuracy of the model and, second, to contrast the analyses with and without satellite data. Evaluation criteria measure the extent to which: (1) the assimilated fields satisfy the dynamical constraints, (2) the assimilated fields depart from the observations, and (3) the assimilated fields are judged to be realistic through pattern analysis. The last criterion requires that the signs, magnitudes, and patterns of the hypersensitive vertical velocity and the local tendencies of the horizontal velocity components be physically consistent with respect to the larger-scale weather systems.
Rapid Prototyping Technology for Manufacturing GTE Turbine Blades
NASA Astrophysics Data System (ADS)
Balyakin, A. V.; Dobryshkina, E. M.; Vdovin, R. A.; Alekseev, V. P.
2018-03-01
The conventional approach to manufacturing turbine blades by investment casting is expensive and time-consuming, as it takes a lot of time to make geometrically precise and complex wax patterns. Turbine blade manufacturing in pilot production can be sped up by accelerating the casting process while keeping the geometric precision of the final product. This paper compares the rapid prototyping method (casting the wax pattern composition into elastic silicone molds) to the conventional technology. Analysis of the size precision of blade casts shows that silicon-mold casting features sufficient geometric precision. Thus, this method for making wax patterns can be a cost-efficient solution for small-batch or pilot production of turbine blades for gas-turbine units (GTU) and gas-turbine engines (GTE). The paper demonstrates how additive technology and thermographic analysis can speed up the cooling of wax patterns in silicone molds. This is possible at an optimal temperature and solidification time, which make the process more cost-efficient while keeping the geometric quality of the final product.
Fast focus estimation using frequency analysis in digital holography.
Oh, Seungtaik; Hwang, Chi-Young; Jeong, Il Kwon; Lee, Sung-Keun; Park, Jae-Hyeung
2014-11-17
A novel fast frequency-based method to estimate the focus distance of a digital hologram for a single object is proposed. The focus distance is computed by analyzing the distribution of intersections of smoothed rays. The smoothed rays are determined by the directions of energy flow, which are computed from the local spatial frequency spectrum based on the windowed Fourier transform. Our method therefore uses only the intrinsic frequency information of the optical field on the hologram and requires neither sequential numerical reconstructions nor the focus detection techniques of conventional photography, both of which are essential parts of previous methods. To show the effectiveness of our method, numerical results and analysis are presented as well.
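The ray-intersection idea can be illustrated with a minimal one-dimensional sketch: a windowed Fourier transform yields a dominant local spatial frequency per window, each window defines a paraxial ray, and the focus is the propagation distance where the rays converge. The window sizes, parameter names, and the closed-form least-squares intersection below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def estimate_focus_distance(field, dx, wavelength, win=64, step=32):
    """Estimate the focus distance of a 1-D complex hologram slice.

    Each window's dominant local spatial frequency (peak of a windowed FFT)
    defines a paraxial ray x(z) = x0 + wavelength*f_local*z; the focus is
    the z that minimizes the spread of ray positions (least squares).
    """
    centers, slopes = [], []
    window = np.hanning(win)
    freqs = np.fft.fftfreq(win, d=dx)
    for start in range(0, field.size - win, step):
        seg = field[start:start + win] * window
        f_local = freqs[np.argmax(np.abs(np.fft.fft(seg)))]  # dominant frequency
        centers.append((start + win / 2) * dx)               # ray origin
        slopes.append(wavelength * f_local)                  # paraxial direction
    x0 = np.asarray(centers)
    s = np.asarray(slopes)
    # Minimizing the variance of x0 + s*z over z has a closed-form solution.
    s_c, x_c = s - s.mean(), x0 - x0.mean()
    return -np.dot(x_c, s_c) / np.dot(s_c, s_c)
```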
The triangle of the urinary bladder in American mink (Mustela vison (Brisson, 1756)).
Gościcka, D; Krakowiak, E; Kepczyńska, M
1994-01-01
Sixty bladders of American minks were dissected using a conventional method. Biometrical analysis with a digital image analysis system was applied to the triangles of the bladders. It was found that these triangles differ both in shape (narrow, broad) and symmetry (considerable asymmetry). The ureteral orifices also showed variety in shape (five types) and number (double orifices).
Wang, Liuqing; Yamashita, Yoko; Saito, Akiko; Ashida, Hitoshi
2017-07-01
Procyanidins belong to a family of flavan-3-ols, which consist of monomers, (+)-catechin and (-)-epicatechin, and their oligomers and polymers, and are distributed in many plant-derived foods. Procyanidins are reported to have many beneficial physiological activities, such as antihypertensive and anticancer effects. However, the bioavailability of procyanidins is not well understood owing to a lack of convenient and highly sensitive analysis methods. The aim of this study was to develop an improved method for determining procyanidin content in both food materials and biological samples. High performance liquid chromatography (HPLC) coupled with a fluorescence detector was used in this study. The limits of detection (LODs) of (+)-catechin, (-)-epicatechin, procyanidin B2, procyanidin C1, and cinnamtannin A2 were 3.0 × 10⁻³ ng, 4.0 × 10⁻³ ng, 14.0 × 10⁻³ ng, 18.5 × 10⁻³ ng, and 23.0 × 10⁻³ ng, respectively; the limits of quantification (LOQs) were 10.0 × 10⁻³ ng, 29.0 × 10⁻³ ng, 28.5 × 10⁻³ ng, 54.1 × 10⁻³ ng, and 115.0 × 10⁻³ ng, respectively. The LOD and LOQ values indicated that the sensitivity of the fluorescence detector method was around 1000 times higher than that of conventional HPLC coupled with a UV detector. We applied the developed method to measure procyanidins in black soybean seed coat extract (BE) prepared from soybeans grown under three different fertilization conditions, namely, conventional farming, basal manure application, and intertillage. The amount of flavan-3-ols in these BEs decreased in the order intertillage > basal manure application > conventional farming. Commercially available BE was orally administered to mice at a dose of 250 mg/kg body weight, and we measured the blood flavan-3-ol content. Data from plasma analysis indicated that procyanidins were detectable up to the tetramer, and that flavan-3-ols mainly existed in conjugated forms in the plasma. In conclusion, we developed a highly sensitive and convenient analytical method for the analysis of flavan-3-ols, and applied this technique to investigate the bioavailability of flavan-3-ols in biological samples and to measure flavan-3-ol content in food material and plants. Copyright © 2017. Published by Elsevier B.V.
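Reported LOD and LOQ values can be related to a calibration curve through the common ICH Q2(R1) conventions (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the slope). The sketch below uses hypothetical calibration data; the abstract does not state which formula the authors applied.

```python
import numpy as np

# Hypothetical calibration: injected amount (ng) vs. fluorescence peak area.
amount = np.array([0.005, 0.01, 0.05, 0.1, 0.5, 1.0])
area = np.array([12.0, 25.0, 121.0, 239.0, 1205.0, 2410.0])

slope, intercept = np.polyfit(amount, area, 1)
residuals = area - (slope * amount + intercept)
sigma = residuals.std(ddof=2)        # residual standard deviation of the fit

lod = 3.3 * sigma / slope            # ICH Q2(R1) convention
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.4g} ng, LOQ = {loq:.4g} ng")
```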
Kim, Sung Jae; Kim, Sung Hwan; Kim, Young Hwan; Chun, Yong Min
2015-01-01
The authors have observed a failure to achieve secure fixation in elderly patients when inserting a half-pin at the anteromedial surface of the tibia. The purpose of this study was to compare two methods for inserting a half-pin into the tibia diaphysis in elderly patients. Twenty cadaveric tibias were divided into Groups C and V. A half-pin was inserted into the tibias of Group C via the conventional method, from the anteromedial surface to the interosseous border of the tibia diaphysis, and into the tibias of Group V via the vertical method, from the anterior border to the posterior surface at the same level. The maximum insertion torque was measured during the bicortical insertion with a torque driver. The thickness of the cortex was measured by micro-computed tomography. The relationship between the thickness of the cortex engaged and the insertion torque was investigated. The maximum insertion torque and the thickness of the cortex were significantly higher in Group V than in Group C. Both groups exhibited a statistically significant linear correlation between torque and thickness by Spearman's rank correlation analysis. Half-pins inserted by the vertical method achieved purchase of more cortex than those inserted by the conventional method. Considering that cortical thickness and insertion torque in Group V were significantly greater than those in Group C, we suggest that the vertical method of half-pin insertion may be an alternative to the conventional method in elderly patients.
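The rank correlation reported here is straightforward to compute with SciPy; the paired torque and thickness values below are hypothetical, stand-in data, not the study's measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements for one group of cadaveric tibias.
torque_Nm = np.array([1.8, 2.1, 2.6, 2.9, 3.3, 3.8, 4.0, 4.4, 4.9, 5.2])
cortex_mm = np.array([2.0, 2.2, 2.5, 2.7, 3.1, 3.4, 3.6, 3.9, 4.2, 4.5])

rho, p_value = stats.spearmanr(torque_Nm, cortex_mm)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")

# A group comparison (Group V vs. Group C) could use a rank-based test too,
# e.g. stats.mannwhitneyu(torque_V, torque_C).
```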
Zhang, Xue; Zhang, Chong; Zhou, Qian-Qian; Zhang, Xiao-Fei; Wang, Li-Yan; Chang, Hai-Bo; Li, He-Ping; Oda, Yoshimitsu; Xing, Xin-Hui
2015-07-01
DNA damage is the dominant source of mutation, which is the driving force of evolution. Therefore, it is important to quantitatively analyze the DNA damage caused by different mutagenesis methods, the subsequent mutation rates, and their relationship. Atmospheric and room temperature plasma (ARTP) mutagenesis has been used for the mutation breeding of more than 40 microorganisms. However, ARTP mutagenesis has not been quantitatively compared with conventional mutation methods. In this study, the umu test using a flow-cytometric analysis was developed to quantify the DNA damage in individual viable cells using Salmonella typhimurium NM2009 as the model strain and to determine the mutation rate. The newly developed method was used to evaluate four different mutagenesis systems: a new ARTP tool, ultraviolet radiation, 4-nitroquinoline-1-oxide (4-NQO), and N-methyl-N'-nitro-N-nitrosoguanidine (MNNG) mutagenesis. The mutation rate was proportional to the corresponding SOS response induced by DNA damage. ARTP caused greater DNA damage to individual living cells than the other conventional mutagenesis methods, and the mutation rate was also higher. By quantitatively comparing the DNA damage and consequent mutation rate after different types of mutagenesis, we have shown that ARTP is a potentially powerful mutagenesis tool with which to improve the characteristics of microbial cell factories.
Gray, Nicola; Lewis, Matthew R; Plumb, Robert S; Wilson, Ian D; Nicholson, Jeremy K
2015-06-05
A new generation of metabolic phenotyping centers are being created to meet the increasing demands of personalized healthcare, and this has resulted in a major requirement for economical, high-throughput metabonomic analysis by liquid chromatography-mass spectrometry (LC-MS). Meeting these new demands represents an emerging bioanalytical problem that must be solved if metabolic phenotyping is to be successfully applied to large clinical and epidemiological sample sets. Ultraperformance (UP)LC-MS-based metabolic phenotyping, based on 2.1 mm i.d. LC columns, enables comprehensive metabolic phenotyping but, when employed for the analysis of thousands of samples, results in high solvent usage. The use of UPLC-MS employing 1 mm i.d. columns for metabolic phenotyping rather than the conventional 2.1 mm i.d. methodology shows that the resulting optimized microbore method provided equivalent or superior performance in terms of peak capacity, sensitivity, and robustness. On average, we also observed, when using the microbore scale separation, an increase in response of 2-3 fold over that obtained with the standard 2.1 mm scale method. When applied to the analysis of human urine, the 1 mm scale method showed no decline in performance over the course of 1000 analyses, illustrating that microbore UPLC-MS represents a viable alternative to conventional 2.1 mm i.d. formats for routine large-scale metabolic profiling studies while also resulting in a 75% reduction in solvent usage. The modest increase in sensitivity provided by this methodology also offers the potential to either reduce sample consumption or increase the number of metabolite features detected with confidence due to the increased signal-to-noise ratios obtained. Implementation of this miniaturized UPLC-MS method of metabolic phenotyping results in clear analytical, economic, and environmental benefits for large-scale metabolic profiling studies with similar or improved analytical performance compared to conventional UPLC-MS.
Nuclear reactor transient analysis via a quasi-static kinetics Monte Carlo method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jo, YuGwon; Cho, Bumhee; Cho, Nam Zin, E-mail: nzcho@kaist.ac.kr
2015-12-31
The predictor-corrector quasi-static (PCQS) method is applied to the Monte Carlo (MC) calculation for reactor transient analysis. To solve the transient fixed-source problem of the PCQS method, fission source iteration is used, and a linear approximation of fission source distributions during a macro-time step is introduced to provide the delayed neutron source. The conventional particle-tracking procedure is modified to solve the transient fixed-source problem via MC calculation. The PCQS method with MC calculation is compared with the direct time-dependent method of characteristics (MOC) on a TWIGL two-group problem for verification of the computer code. Then, the results on a continuous-energy problem are presented.
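The linear-source approximation mentioned here can be sketched for a single delayed-neutron precursor in a point model: the fission source is assumed to vary linearly across the macro step, and the precursor balance is integrated exactly via an integrating factor. This is a minimal illustration of the approximation, not the authors' Monte Carlo implementation; all parameter names are assumptions.

```python
import numpy as np

def precursor_update(C0, F0, F1, lam, beta, dt, substeps=1000):
    """Advance one precursor group over a macro time step dt.

    Solves dC/dt = beta*F(t) - lam*C with F(t) varying linearly from F0
    to F1 across the step (the linear source approximation). Uses the
    integrating-factor solution with a trapezoidal quadrature.
    """
    t = np.linspace(0.0, dt, substeps)
    F = F0 + (F1 - F0) * t / dt
    integrand = beta * F * np.exp(lam * t)
    integral = np.sum((integrand[1:] + integrand[:-1]) * np.diff(t)) / 2.0
    return np.exp(-lam * dt) * (C0 + integral)

# Toy numbers: precursor decay constant 0.08 1/s, 1 s macro step.
print(precursor_update(C0=1.0, F0=1.0, F1=1.2, lam=0.08, beta=0.0065, dt=1.0))
```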
Hori, Katsuhito; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi
2014-01-01
Supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry was applied to the profiling of sucrose fatty acid esters (SEs). The SFC conditions (column and modifier gradient) were optimized for the effective separation of SEs. In the column test, a silica gel reversed-phase column was selected. Then, the method was used for the detailed characterization of commercial SEs and the successful analysis of SEs containing different fatty acids. The present method allowed for fast and high-resolution separation of monoesters to tetra-esters within a shorter time (15 min) as compared to the conventional high-performance liquid chromatography. The applicability of our method for the analysis of SEs was thus demonstrated. PMID:26819875
Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.
2008-01-01
Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859
Cameron, Chris; Ewara, Emmanuel; Wilson, Florence R; Varu, Abhishek; Dyrda, Peter; Hutton, Brian; Ingham, Michael
2017-11-01
Adaptive trial designs present a methodological challenge when performing network meta-analysis (NMA), as data from such adaptive trial designs differ from conventional parallel design randomized controlled trials (RCTs). We aim to illustrate the importance of considering study design when conducting an NMA. Three NMAs comparing anti-tumor necrosis factor drugs for ulcerative colitis were compared and the analyses replicated using Bayesian NMA. The NMA comprised 3 RCTs comparing 4 treatments (adalimumab 40 mg, golimumab 50 mg, golimumab 100 mg, infliximab 5 mg/kg) and placebo. We investigated the impact of incorporating differences in the study design among the 3 RCTs and presented 3 alternative methods on how to convert outcome data derived from one form of adaptive design to more conventional parallel RCTs. Combining RCT results without considering variations in study design resulted in effect estimates that were biased against golimumab. In contrast, using the 3 alternative methods to convert outcome data from one form of adaptive design to a format more consistent with conventional parallel RCTs facilitated more transparent consideration of differences in study design. This approach is more likely to yield appropriate estimates of comparative efficacy when conducting an NMA, which includes treatments that use an alternative study design. RCTs based on adaptive study designs should not be combined with traditional parallel RCT designs in NMA. We have presented potential approaches to convert data from one form of adaptive design to more conventional parallel RCTs to facilitate transparent and less-biased comparisons.
Scerbo, Michelle H; Kaplan, Heidi B; Dua, Anahita; Litwin, Douglas B; Ambrose, Catherine G; Moore, Laura J; Murray, Col Clinton K; Wade, Charles E; Holcomb, John B
2016-06-01
Sepsis from bacteremia occurs in 250,000 cases annually in the United States, has a mortality rate as high as 60%, and is associated with a poorer prognosis than localized infection. Because of these high figures, empiric antibiotic administration for patients with systemic inflammatory response syndrome (SIRS) and suspected infection is the second most common indication for antibiotic administration in intensive care units (ICUs). However, overuse of empiric antibiotics contributes to the development of opportunistic infections, antibiotic resistance, and the increase in multi-drug-resistant bacterial strains. The current method of diagnosing and ruling out bacteremia is via blood culture (BC) and Gram stain (GS) analysis. Conventional and molecular methods for diagnosing bacteremia were reviewed and compared. The clinical implications, use, and current clinical trials of polymerase chain reaction (PCR)-based methods to detect bacterial pathogens in the blood stream were detailed. BC/GS has several disadvantages. These include: some bacteria do not grow in culture media; others do not GS appropriately; and cultures can require up to 5 d to guide or discontinue antibiotic treatment. PCR-based methods can potentially be applied to detect microbes in human blood samples rapidly, accurately, and directly. Compared with the conventional BC/GS, particular advantages of molecular methods (specifically, PCR-based methods) include faster results, leading to possible improved antibiotic stewardship when bacteremia is not present.
A general numerical analysis of the superconducting quasiparticle mixer
NASA Technical Reports Server (NTRS)
Hicks, R. G.; Feldman, M. J.; Kerr, A. R.
1985-01-01
For very low noise millimeter-wave receivers, the superconductor-insulator-superconductor (SIS) quasiparticle mixer is now competitive with conventional Schottky mixers. Tucker (1979, 1980) has developed a quantum theory of mixing which has provided a basis for the rapid improvement in SIS mixer performance. The present paper is concerned with a general method of numerical analysis for SIS mixers which allows arbitrary terminating impedances for all the harmonic frequencies. This analysis provides an approach for an examination of the range of validity of the three-frequency results of the quantum mixer theory. The new method has been implemented with the aid of a Fortran computer program.
Revealing representational content with pattern-information fMRI--an introductory guide.
Mur, Marieke; Bandettini, Peter A; Kriegeskorte, Nikolaus
2009-03-01
Conventional statistical analysis methods for functional magnetic resonance imaging (fMRI) data are very successful at detecting brain regions that are activated as a whole during specific mental activities. The overall activation of a region is usually taken to indicate involvement of the region in the task. However, such activation analysis does not consider the multivoxel patterns of activity within a brain region. These patterns of activity, which are thought to reflect neuronal population codes, can be investigated by pattern-information analysis. In this framework, a region's multivariate pattern information is taken to indicate representational content. This tutorial introduction motivates pattern-information analysis, explains its underlying assumptions, introduces the most widespread methods in an intuitive way, and outlines the basic sequence of analysis steps.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, M.; Ma, L.Q.
1998-11-01
It is critical to compare existing sample digestion methods for evaluating soil contamination and remediation. USEPA Methods 3050, 3051, 3051a, and 3052 were used to digest standard reference materials and representative Florida surface soils. Fifteen trace metals (Ag, As, Ba, Be, Cd, Cr, Cu, Hg, Mn, Mo, Ni, Pb, Sb, Se, and Zn) and six macro elements (Al, Ca, Fe, K, Mg, and P) were analyzed. Precise analysis was achieved for all elements except for Cd, Mo, Se, and Sb in NIST SRMs 2704 and 2709 by USEPA Methods 3050 and 3051, and for all elements except for As, Mo, Sb, and Se in NIST SRM 2711 by USEPA Method 3052. No significant differences were observed for the three NIST SRMs between the microwave-assisted USEPA Methods 3051 and 3051a and the conventional USEPA Method 3050 except for Hg, Sb, and Se. USEPA Method 3051a provided comparable values for NIST SRMs certified using USEPA Method 3050. However, based on method correlation coefficients and elemental recoveries in 40 Florida surface soils, USEPA Method 3051a was an overall better alternative to Method 3050 than was Method 3051. Among the four digestion methods, the microwave-assisted USEPA Method 3052 achieved satisfactory recoveries for all elements except As and Mg using NIST SRM 2711. This total-total digestion method provided greater recoveries for 12 elements (Ag, Be, Cr, Fe, K, Mn, Mo, Ni, Pb, Sb, Se, and Zn) but lower recoveries for Mg in Florida soils than did the total-recoverable digestion methods.
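The two quantities this comparison rests on, percent recovery against a certified value and the between-method correlation, are simple to compute. The sketch below uses hypothetical Pb concentrations for illustration only.

```python
import numpy as np

# Hypothetical Pb results (mg/kg) for the same six soils by two digestion methods.
method_3050 = np.array([12.1, 45.3, 8.7, 102.4, 33.0, 61.8])
method_3051a = np.array([11.8, 44.1, 9.1, 99.7, 32.2, 60.5])
certified_value = 100.0   # hypothetical certified Pb value of one reference soil

recovery = method_3051a[3] / certified_value * 100.0   # recovery for that soil, %
r = np.corrcoef(method_3050, method_3051a)[0, 1]       # method correlation
print(f"Recovery vs. certified value: {recovery:.1f}%")
print(f"Pearson r between methods:    {r:.3f}")
```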
NASA Astrophysics Data System (ADS)
Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong
2018-02-01
Quantitative analysis methods for multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth profiling analysis with secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS). An intensity-to-composition conversion factor (ICF) was used as a means of converting intensities to compositions instead of relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, the average intensity (AI) method, and the total number counting (TNC) method. In the case of SIMS, although the atomic fractions measured with oxygen ion beams were not quantitative due to a severe matrix effect, the results obtained with a cesium ion beam were very quantitative. The quantitative analysis results by SIMS using MCs2+ ions are comparable to the results by XPS. In the case of XPS, the measurement uncertainty was greatly improved by the AI and TNC methods.
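One plausible binary-alloy form of an intensity-to-composition conversion is sketched below: the factor is calibrated on a reference film of known composition and then applied to measured intensities. The abstract does not give the authors' exact ICF definition, so the formula, names, and numbers here are assumptions for illustration.

```python
def icf_from_reference(x_ref, I_si_ref, I_ge_ref):
    """ICF calibrated from a reference film of known Ge fraction x_ref
    (one plausible binary-alloy form, not necessarily the paper's)."""
    return (x_ref / (1.0 - x_ref)) * (I_si_ref / I_ge_ref)

def ge_fraction(I_si, I_ge, icf):
    """Convert measured Si and Ge signal intensities to a Ge atomic fraction."""
    return icf * I_ge / (I_si + icf * I_ge)

# Hypothetical intensities: calibrate on a 30% Ge reference, apply to a sample.
icf = icf_from_reference(x_ref=0.30, I_si_ref=7.0e5, I_ge_ref=2.5e5)
print(f"x(Ge) = {ge_fraction(I_si=6.1e5, I_ge=3.3e5, icf=icf):.3f}")
```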
Akay, Erdem; Yilmaz, Cagatay; Kocaman, Esat S; Turkmen, Halit S; Yildiz, Mehmet
2016-09-19
The significance of strain measurement is obvious for the analysis of Fiber-Reinforced Polymer (FRP) composites. Conventional strain measurement methods are sufficient for static testing in general. Nevertheless, if the requirements exceed the capabilities of these conventional methods, more sophisticated techniques are necessary to obtain strain data. Fiber Bragg Grating (FBG) sensors have many advantages for strain measurement over conventional ones. Thus, the present paper suggests a novel method for biaxial strain measurement using embedded FBG sensors during the fatigue testing of FRP composites. Poisson's ratio and its reduction were monitored for each cyclic loading by using embedded FBG sensors for a given specimen and correlated with the fatigue stages determined based on the variations of the applied fatigue loading and temperature due to the autogenous heating to predict an oncoming failure of the continuous fiber-reinforced epoxy matrix composite specimens under fatigue loading. The results show that FBG sensor technology has a remarkable potential for monitoring the evolution of Poisson's ratio on a cycle-by-cycle basis, which can reliably be used towards tracking the fatigue stages of composite for structural health monitoring purposes.
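The cycle-by-cycle Poisson's ratio tracking described here reduces to a per-cycle linear fit of transverse versus axial strain. The function below is a minimal sketch under that interpretation; array layout, names, and the slope-based estimate are assumptions, not the paper's processing chain.

```python
import numpy as np

def poisson_ratio_per_cycle(eps_axial, eps_transverse, cycle_idx):
    """Apparent Poisson's ratio per loading cycle from two embedded FBG strains.

    eps_axial, eps_transverse: 1-D strain arrays sampled over the whole test;
    cycle_idx: list of (start, stop) sample indices, one pair per cycle.
    """
    ratios = []
    for start, stop in cycle_idx:
        ea = eps_axial[start:stop]
        et = eps_transverse[start:stop]
        # Slope of transverse vs. axial strain within the cycle; its negative
        # is the apparent Poisson's ratio for that cycle.
        slope = np.polyfit(ea, et, 1)[0]
        ratios.append(-slope)
    return np.asarray(ratios)
```

Plotting the returned array against cycle number would show the reduction in Poisson's ratio that the authors correlate with fatigue stages.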
Wang, Weihao; Xing, Zhihua
2014-01-01
Objective. Xingnaojing injection (XNJ) is a well-known traditional Chinese patent medicine (TCPM) for stroke. The aim of this study is to assess the efficacy of XNJ for stroke, including ischemic stroke, intracerebral hemorrhage (ICH), and subarachnoid hemorrhage (SAH). Methods. An extensive search was performed using eight databases up to November 2013. Randomized controlled trials (RCTs) on XNJ for treatment of stroke were collected. Study selection, data extraction, quality assessment, and meta-analysis were conducted according to the Cochrane standards, and RevMan5.0 was used for meta-analysis. Results. This review included 13 RCTs and a total of 1,514 subjects. The overall methodological quality was poor. The meta-analysis showed that XNJ combined with conventional treatment was more effective for total efficacy, neurological deficit improvement, and reduction of TNF-α levels compared with conventional treatment alone. Three trials reported adverse events; of these, one trial reported mild impairment of kidney and liver function, whereas the other two studies failed to report specific adverse events. Conclusion. Despite the limitations of this review, we suggest that XNJ in combination with conventional medicines might be beneficial for the treatment of stroke. Currently there are various methodological problems in the studies. Therefore, high-quality, large-scale RCTs are urgently needed. PMID:24707306
Analysis of an arched outer-race ball bearing considering centrifugal forces
NASA Technical Reports Server (NTRS)
Hamrock, B. J.; Anderson, W. J.
1972-01-01
A Newton-Raphson method of iteration was used in evaluating the radial and axial projection of the distance between the ball center and the outer raceway groove curvature center (V and W). Fatigue life evaluations were made. The analysis of a conventional bearing can be obtained directly from the arched bearing analysis by simply letting the amount of arching be zero (g = 0) and not considering the equations related to the unloaded half of the outer race. The analysis was applied to a 150-mm angular contact ball bearing. Results for life, contact loads, and angles are shown for a conventional bearing (g = 0) and two arched bearings (g = 0.127 mm (0.005 in.) and 0.254 mm (0.010 in.)). The results indicate that an arched bearing is highly desirable for high-speed applications. In particular, for a DN value of 3 million (20,000 rpm) and an applied axial load of 4448 N (1000 lb), an arched bearing shows an improvement in life of 306 percent over that of a conventional bearing. At 4.2 million DN (28,000 rpm), the corresponding improvement is 340 percent. It was also found that, at low speeds, the arched bearing does not offer the advantages that it does for high-speed applications.
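The two-variable Newton-Raphson iteration named here has a generic structure: linearize the residual equations, solve for the correction, repeat. The sketch below shows that structure on a toy two-variable system with a known root at (1, 2); it is not the bearing-geometry residual for (V, W).

```python
import numpy as np

def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Generic Newton-Raphson iteration for a small nonlinear system,
    of the kind used to solve for the ball-center offsets (V, W)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = residual(x)
        if np.linalg.norm(f) < tol:
            break
        x = x - np.linalg.solve(jacobian(x), f)   # Newton correction step
    return x

# Toy system with root (1, 2): x^2 + y = 3 and x + y^2 = 5.
res = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
jac = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])
print(newton_raphson(res, jac, [1.5, 1.5]))   # converges to [1. 2.]
```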
Pereira, Graziane Olímpio; Gimenez, Carla Maria Melleiro; Prieto, Lucas; Prieto, Marcos Gabriel do Lago; Basting, Roberta Tarkany
2016-01-01
Objective: To evaluate stainless steel archwire static friction in active and passive self-ligating lingual and conventional brackets with second-order angulations. Methods: Two conventional lingual brackets for canines (STb light/Ormco; PSWb/Tecnident) and two self-ligating brackets, one active (In-Ovation L/GAC) and the other passive (3D/Forestadent), were evaluated. A stainless steel archwire was used at 0°, 3°, and 5° angulations. Metal ligatures, conventional elastic ligatures, and low-friction elastic ligatures were also tested. A universal testing machine applied friction between brackets and wires, simulating sliding mechanics, to produce 2-mm sliding at 3 mm/minute speed. Results: Two-way analysis of variance demonstrated a significant effect of the interaction between brackets and angulations (p < 0.001). The Tukey test indicated that the highest frictional resistance values were observed at 5° angulation for In-Ovation L, the PSWb bracket with non-conventional ligature, and the STb bracket with metal ligature. The 3D bracket, PSWb with conventional or metal ligatures, and STb brackets with non-conventional ligature showed significantly lower static frictional resistance at 0° angulation. At 0° angulation, STb brackets with metal ties, In-Ovation L brackets, and 3D brackets had the lowest frictional resistance. Conclusions: As the angulation increased from 0° to 3°, static friction resistance increased. When angulation increased from 3° to 5°, static friction resistance increased or remained the same. Self-ligating 3D and In-Ovation L brackets, as well as conventional STb brackets, seem to be the best option when sliding mechanics is used to perform lingual orthodontic treatment. PMID:27653262
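The bracket-by-angulation analysis reported here (two-way ANOVA with interaction, followed by Tukey comparisons) can be reproduced with statsmodels. The data below are simulated with made-up effect sizes purely to make the sketch runnable.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
rows = []
for bracket in ["STb", "PSWb", "InOvationL", "3D"]:
    for angle in [0, 3, 5]:
        base = 1.0 + 0.3 * angle + (0.2 if bracket == "InOvationL" else 0.0)
        for _ in range(5):  # five replicates per cell (simulated)
            rows.append({"bracket": bracket, "angle": angle,
                         "friction": base + rng.normal(0, 0.1)})
df = pd.DataFrame(rows)

model = ols("friction ~ C(bracket) * C(angle)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))            # two-way ANOVA with interaction
print(pairwise_tukeyhsd(df["friction"], df["bracket"]))  # post-hoc comparisons
```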
NASA Astrophysics Data System (ADS)
Wang, Jing-peng; Zhang, Yi-min; Huang, Jing; Liu, Tao
2018-04-01
The leaching kinetics of the vanadium leaching process were investigated by the comparison of microwave heating and conventional heating methods. Microwave heating with CaF2 had a synergistic effect and improved the vanadium leaching efficiency. In contrast to conventional heating leaching, microwave heating accelerated the vanadium leaching rate by approximately 1-3% and by approximately 15% when CaF2 was also used. The kinetics analysis showed that the calculated activation energy decreased in the microwave heating method in the presence and absence of CaF2. The control procedure of leaching also changed from a chemical reaction control step to a mixed chemical diffusion control step upon the addition of CaF2. Microwave heating was shown to be suitable for leaching systems with diffusion or mixed chemical diffusion control steps when the target mineral does not have a microwave absorbing ability.
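The activation-energy comparison behind this kinetics analysis typically comes from an Arrhenius fit: the slope of ln k versus 1/T gives -Ea/R. The rate constants below are hypothetical placeholders, not the study's values.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Hypothetical apparent rate constants at several leaching temperatures.
T = np.array([343.0, 353.0, 363.0, 373.0])        # K
k = np.array([2.1e-4, 3.9e-4, 7.0e-4, 1.2e-3])    # 1/min

# ln k = ln A - Ea/(R*T): slope of ln k vs. 1/T is -Ea/R.
slope, _ = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R / 1000.0                          # kJ/mol
print(f"Apparent activation energy: {Ea:.1f} kJ/mol")
```

A lower fitted Ea under microwave heating, as the abstract reports, is the usual signature of a shift from chemical-reaction control toward diffusion or mixed control.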
ASPECTS: an automation-assisted SPE method development system.
Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu
2013-07-01
A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combinations of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly reduces the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
CNN based approach for activity recognition using a wrist-worn accelerometer.
Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R
2017-07-01
In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, feature engineering has dominated conventional methods, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data is collected from four different subjects using a single wrist-worn accelerometer sensor. The validation of the proposed model is done with different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8% as opposed to conventional methods based on K-means clustering, linear discriminant analysis, and support vector machines.
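A CNN of the kind described, one that learns features directly from raw accelerometer windows, can be sketched in a few lines of PyTorch. The layer counts, kernel sizes, and window length below are illustrative assumptions; the paper's exact architecture is not given in the abstract.

```python
import torch
import torch.nn as nn

class AccelCNN(nn.Module):
    """Minimal 1-D CNN classifying tri-axial accelerometer windows into
    three forearm movements (all layer sizes are illustrative only)."""
    def __init__(self, n_classes=3, window=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(32 * (window // 4), n_classes)

    def forward(self, x):                  # x: (batch, 3 axes, window samples)
        z = self.features(x)
        return self.classifier(z.flatten(1))

model = AccelCNN()
dummy = torch.randn(8, 3, 128)             # a batch of 8 raw windows
print(model(dummy).shape)                   # torch.Size([8, 3])
```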
Petigny, Loïc; Périno, Sandrine; Minuti, Matteo; Visinoni, Francesco; Wajsman, Joël; Chemat, Farid
2014-01-01
Microwave extraction and separation was used to increase the concentration of the extract compared to the conventional method with the same solid/liquid ratio, reducing the extraction time while simultaneously separating Volatile Organic Compounds (VOC) from non-Volatile Organic Compounds (NVOC) of boldo leaves. As a preliminary study, a response surface method was used to optimize the extraction of soluble material and the separation of VOC from the plant at laboratory scale. The results from the statistical analysis revealed that the optimized conditions were: microwave power 200 W, extraction time 56 min, and a solid/liquid ratio of 7.5% of plants in water. The optimized lab-scale microwave method was compared to conventional distillation and requires a power/mass ratio of 0.4 W/g of water engaged. This power/mass ratio is kept in order to upscale from lab to pilot plant. PMID:24776762
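Response surface optimization of this kind amounts to fitting a quadratic model in the factors (power, time, solid/liquid ratio) and maximizing the fitted surface. The sketch below does exactly that on synthetic yields whose optimum is planted near the abstract's reported conditions; the design, data, and noise level are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def quad_design(X):
    """Full quadratic design matrix in three factors (power, time, ratio)."""
    p, t, r = X.T
    return np.column_stack([np.ones_like(p), p, t, r,
                            p*t, p*r, t*r, p**2, t**2, r**2])

# Hypothetical experimental design and synthetic extraction yields.
X = rng.uniform([100, 20, 2], [300, 80, 12], size=(20, 3))
true = lambda x: (-((x[:, 0]-200)/100)**2 - ((x[:, 1]-56)/30)**2
                  - ((x[:, 2]-7.5)/5)**2)
y = true(X) + rng.normal(0, 0.05, 20)

coef, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)
predict = lambda x: quad_design(x.reshape(1, 3)) @ coef

opt = minimize(lambda x: -predict(x)[0], x0=[200, 50, 7],
               bounds=[(100, 300), (20, 80), (2, 12)])
print("Fitted optimum (power W, time min, ratio %):", opt.x.round(1))
```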
Real-Time PCR in Clinical Microbiology: Applications for Routine Laboratory Testing
Espy, M. J.; Uhl, J. R.; Sloan, L. M.; Buckwalter, S. P.; Jones, M. F.; Vetter, E. A.; Yao, J. D. C.; Wengenack, N. L.; Rosenblatt, J. E.; Cockerill, F. R.; Smith, T. F.
2006-01-01
Real-time PCR has revolutionized the way clinical microbiology laboratories diagnose many human microbial infections. This testing method combines PCR chemistry with fluorescent probe detection of amplified product in the same reaction vessel. In general, both PCR and amplified product detection are completed in an hour or less, which is considerably faster than conventional PCR detection methods. Real-time PCR assays provide sensitivity and specificity equivalent to that of conventional PCR combined with Southern blot analysis, and since amplification and detection steps are performed in the same closed vessel, the risk of releasing amplified nucleic acids into the environment is negligible. The combination of excellent sensitivity and specificity, low contamination risk, and speed has made real-time PCR technology an appealing alternative to culture- or immunoassay-based testing methods for diagnosing many infectious diseases. This review focuses on the application of real-time PCR in the clinical microbiology laboratory. PMID:16418529
Denoising time-domain induced polarisation data using wavelet techniques
NASA Astrophysics Data System (ADS)
Deo, Ravin N.; Cull, James P.
2016-05-01
Time-domain induced polarisation (TDIP) methods are routinely used for near-surface evaluations in quasi-urban environments harbouring networks of buried civil infrastructure. A conventional technique for improving signal to noise ratio in such environments is by using analogue or digital low-pass filtering followed by stacking and rectification. However, this induces large distortions in the processed data. In this study, we have conducted the first application of wavelet based denoising techniques for processing raw TDIP data. Our investigation included laboratory and field measurements to better understand the advantages and limitations of this technique. It was found that distortions arising from conventional filtering can be significantly avoided with the use of wavelet based denoising techniques. With recent advances in full-waveform acquisition and analysis, incorporation of wavelet denoising techniques can further enhance surveying capabilities. In this work, we present the rationale for utilising wavelet denoising methods and discuss some important implications, which can positively influence TDIP methods.
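Wavelet denoising of a raw TDIP decay can be sketched with PyWavelets: decompose, soft-threshold the detail coefficients, and reconstruct. The wavelet choice, decomposition level, and universal-threshold rule below are common defaults, not necessarily the settings used in this study.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=5):
    """Soft-threshold wavelet denoising of a 1-D decay curve."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal threshold estimated from the finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(signal.size))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[:signal.size]

# Idealized IP decay (mV) buried in noise, then denoised.
t = np.linspace(0, 2, 2048)
decay = 50.0 * np.exp(-t / 0.5)
noisy = decay + np.random.default_rng(2).normal(0, 2.0, t.size)
clean = wavelet_denoise(noisy)
```

Unlike a low-pass filter, thresholding in the wavelet domain preserves the sharp early-time portion of the decay, which is the distortion issue the abstract raises.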
Tuning fork enhanced interferometric photoacoustic spectroscopy: a new method for trace gas analysis
NASA Astrophysics Data System (ADS)
Köhring, M.; Pohlkötter, A.; Willer, U.; Angelmahr, M.; Schade, W.
2011-01-01
A photoacoustic trace gas sensor based on an optical read-out method of a quartz tuning fork is shown. Instead of conventional piezoelectric signal read-out, as applied in well-known quartz-enhanced photoacoustic spectroscopy (QEPAS), an interferometric read-out method for measurement of the tuning fork's oscillation is presented. To demonstrate the potential of the optical read-out of tuning forks in photoacoustics, a comparison between the performances of a sensor with interferometric read-out and conventional QEPAS with piezoelectric read-out is reported. The two sensors show similar characteristics. The detection limit (L) for the optical read-out is determined to be L_opt = (2598 ± 84) ppm (1σ), compared to L_elec = (2579 ± 78) ppm (1σ) for piezoelectric read-out. In both cases the detection limit is defined by the thermal noise of the tuning fork.
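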
High-rate deposition of LiNb1-xTaxO3 films by thermal plasma spray CVD
NASA Astrophysics Data System (ADS)
Majima, T.; Yamamoto, H.; Kulinich, S. A.; Terashima, K.
2000-12-01
LiNb1-xTaxO3 films were prepared by a thermal plasma spray CVD method using liquid source materials. Preferentially (0 0 1)-oriented LiNb1-xTaxO3 films with satisfactory in-plane and out-of-plane alignment were fabricated on sapphire (0 0 1) substrates. The full-width at half-maximum (FWHM) of the (0 0 6) rocking curve reached 0.12°, which was comparable with those of LiNbO3 and LiTaO3 films prepared by other conventional vapor-phase deposition methods. The deposition rate was up to 0.07 μm/min, which was 5-40 times faster than those of most other conventional vapor-phase deposition methods. From inductively coupled plasma atomic emission spectroscopy analysis, the x values of these films were estimated to be 0.36-0.49.
STUDY ON SYNTHESIS AND EVOLUTION OF NANOCRYSTALLINE Mg4Ta2O9 BY AQUEOUS SOL-GEL PROCESS
NASA Astrophysics Data System (ADS)
Wu, H. T.; Yang, C. H.; Wu, W. B.; Yue, Y. L.
2012-06-01
Nanosized and highly reactive Mg4Ta2O9 was successfully synthesized by an aqueous sol-gel method and compared with the conventional solid-state method. A Ta-Mg-citric acid solution was first formed and then evaporated, resulting in a dry gel that was calcined at temperatures ranging from 600°C to 800°C for crystallization in an oxygen atmosphere. The crystallization process from the gel to crystalline Mg4Ta2O9 was identified by thermal analysis, and the phase evolution of the powders during calcination was studied using the X-ray diffraction (XRD) technique. Particle size and morphology were examined by transmission electron microscopy (TEM) and high-resolution scanning electron microscopy (HR-SEM). The results revealed that the sol-gel process showed great advantages over the conventional solid-state method, and Mg4Ta2O9 nanopowders with a size of 20-30 nm were obtained at 800°C.
Model-based spectral estimation of Doppler signals using parallel genetic algorithms.
Solano González, J; Rodríguez Vázquez, K; García Nocetti, D F
2000-05-01
Conventional spectral analysis methods use a fast Fourier transform (FFT) on consecutive or overlapping windowed data segments. For Doppler ultrasound signals, this approach suffers from inadequate frequency resolution due to the time segment duration and the non-stationary characteristics of the signals. Parametric or model-based estimators can give significant improvements in time-frequency resolution at the expense of higher computational complexity. This work describes an approach which implements, in real time, a parametric spectral estimation method using genetic algorithms (GAs) to find the optimum set of parameters for the adaptive filter that minimises the error function. The aim is to reduce the computational complexity of the conventional algorithm by using the simplicity associated with GAs and exploiting their parallel characteristics. This allows the implementation of higher-order filters, increasing the spectral resolution, and opens a greater scope for using more complex methods.
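The core idea, evolving adaptive-filter coefficients with a GA instead of a gradient rule, can be sketched with a toy serial GA fitting an autoregressive (AR) model to a synthetic Doppler-like signal. The GA operators, population sizes, and the test signal are all assumptions; the paper's parallel implementation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def ar_error(coeffs, x):
    """Sum of squared one-step prediction errors for an AR model."""
    p = coeffs.size
    pred = np.zeros(x.size - p)
    for k in range(p):
        pred += coeffs[k] * x[p - 1 - k : x.size - 1 - k]
    return np.sum((x[p:] - pred) ** 2)

def ga_fit(x, order=4, pop=60, gens=200, mut=0.1):
    """Toy GA: evolve AR coefficients minimizing the prediction error."""
    P = rng.uniform(-1, 1, (pop, order))
    for _ in range(gens):
        fitness = np.array([ar_error(c, x) for c in P])
        P = P[np.argsort(fitness)]            # best individuals first
        elite = P[: pop // 2]
        # Arithmetic crossover of random elite pairs plus Gaussian mutation.
        pairs = rng.integers(0, elite.shape[0], (pop - elite.shape[0], 2))
        w = rng.uniform(size=(pairs.shape[0], 1))
        children = w * elite[pairs[:, 0]] + (1 - w) * elite[pairs[:, 1]]
        children += rng.normal(0, mut, children.shape)
        P = np.vstack([elite, children])
    return P[0]

# Synthetic Doppler-like signal: two tones in noise.
n = np.arange(512)
x = np.sin(0.3 * n) + 0.7 * np.sin(0.8 * n) + 0.2 * rng.normal(size=n.size)
a = ga_fit(x)
# AR power spectrum from the evolved coefficients.
w = np.linspace(0, np.pi, 256)
denom = np.abs(1 - sum(a[k] * np.exp(-1j * w * (k + 1)) for k in range(a.size)))
spectrum = 1.0 / denom ** 2
```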
Lead Pipe Scale Analysis Using Broad-Beam Argon Ion Milling to Elucidate Drinking Water Corrosion
Herein, we compared the characterization of lead pipe scale removed from a drinking water distribution system using two different cross section methods (conventional polishing and argon ion beam etching). The pipe scale solids were analyzed using scanning electron microscopy (SEM...
Prediction of crude protein and oil content of soybeans using Raman spectroscopy
USDA-ARS?s Scientific Manuscript database
While conventional chemical analysis methods for food nutrients require time-consuming, labor-intensive, and invasive pretreatment procedures, Raman spectroscopy can be used to measure a variety of food components rapidly and non-destructively and does not require supervision from experts. The purpo...
Semi-volatile compounds present special analytical challenges not met by conventional methods for analysis of ambient particulate matter (PM). Accurate quantification of PM-associated organic compounds requires validation of the laboratory procedures for recovery over a wide v...
Adi-Dako, Ofosua; Oppong Bekoe, Samuel; Ofori-Kwakye, Kwabena; Appiah, Enoch; Peprah, Paul
2017-01-01
An isocratic, sensitive, and precise reverse-phase high-performance liquid chromatography (RP-HPLC) method was developed and validated for the determination and quantification of hydrocortisone in controlled-release and conventional (tablets and injections) pharmaceutical preparations. Chromatographic separation was achieved on an ODS (C18) column, 5 μm, 4.6 × 150 mm, with isocratic elution using a freshly prepared mobile phase of methanol : water : acetic acid (60 : 30 : 10, v/v/v) at a flow rate of 1.0 ml/min. Detection of the drug was successfully achieved at a wavelength of 254 nm. The retention time obtained for the drug was 2.26 min. The proposed method produced linear detectable responses in the concentration range of 0.02 to 0.4 mg/ml of hydrocortisone. High recoveries of 98-101% were attained at concentration levels of 80%, 100%, and 120%. The intraday and interday precision (RSD) were 0.19-0.55% and 0.33-0.71%, respectively. A comparison of hydrocortisone analysis data from the developed method and the official USP method showed no significant difference (p > 0.05) at a 95% confidence interval. The method was successfully applied to the determination and quantification of hydrocortisone in six controlled-release and fifteen conventional-release pharmaceutical preparations.
Full waveform inversion in the frequency domain using classified time-domain residual wavefields
NASA Astrophysics Data System (ADS)
Son, Woohyun; Koo, Nam-Hyung; Kim, Byoung-Yeop; Lee, Ho-Young; Joo, Yonghwan
2017-04-01
We perform acoustic full waveform inversion in the frequency domain using residual wavefields that have been separated in the time domain. We sort the residual wavefields in the time domain according to the order of their absolute amplitudes, and the residual wavefields are then separated into several groups in the time domain. To analyze the characteristics of the residual wavefields, we compare the residual wavefields of the conventional method with those of our residual separation method. From the residual analysis, the amplitude spectrum obtained from the trace before separation appears to have little energy at the lower frequency bands. However, the amplitude spectrum obtained with our strategy is regularized by the separation process, meaning that the low-frequency components are emphasized. Therefore, our method helps to emphasize the low-frequency components of the residual wavefields. We then generate the frequency-domain residual wavefields by taking the Fourier transform of the separated time-domain residual wavefields. With these wavefields, we perform gradient-based full waveform inversion in the frequency domain using the back-propagation technique. Through a comparison of gradient directions, we confirm that our separation method can better describe the sub-salt image than the conventional approach. The proposed method is tested on the SEG/EAGE salt-dome model. The inversion results show that our algorithm is better than the conventional gradient-based waveform inversion in the frequency domain, especially for the deeper parts of the velocity model.
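One reading of the sorting-and-grouping step is sketched below: samples of a residual trace are ranked by absolute amplitude, split into groups, and each group is transformed to the frequency domain separately. This is an illustrative interpretation of the abstract's description, not the authors' code; the trace is synthetic.

```python
import numpy as np

def separate_residuals(residual, n_groups=4):
    """Split a time-domain residual trace into groups by absolute amplitude
    and return each group's frequency-domain residual."""
    order = np.argsort(np.abs(residual))[::-1]      # largest amplitudes first
    groups = np.array_split(order, n_groups)
    separated = np.zeros((n_groups, residual.size))
    for g, idx in enumerate(groups):
        separated[g, idx] = residual[idx]           # keep only this group's samples
    return separated, np.fft.rfft(separated, axis=1)

rng = np.random.default_rng(4)
trace = rng.normal(size=1000) * np.exp(-np.linspace(0, 3, 1000))
groups_t, groups_f = separate_residuals(trace)
```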
Coformer screening using thermal analysis based on binary phase diagrams.
Yamashita, Hiroyuki; Hirakura, Yutaka; Yuda, Masamichi; Terada, Katsuhide
2014-08-01
The advent of cocrystals has demonstrated a growing need for efficient and comprehensive coformer screening in search of better development forms, including salt forms. Here, we investigated a coformer screening system for salts and cocrystals based on binary phase diagrams using thermal analysis and examined the effectiveness of the method. Indomethacin and tenoxicam were used as models of active pharmaceutical ingredients (APIs). Physical mixtures of an API and 42 kinds of coformers were analyzed using Differential Scanning Calorimetry (DSC) and X-ray DSC. We also conducted coformer screening using a conventional slurry method and compared these results with those from the thermal analysis method and previous studies. Compared with the slurry method, the thermal analysis method was a high-performance screening system, particularly for APIs with low solubility and/or propensity to form solvates. However, this method faced hurdles for screening coformers combined with an API in the presence of kinetic hindrance for salt or cocrystal formation during heating or if there is degradation near the metastable eutectic temperature. The thermal analysis and slurry methods are considered complementary to each other for coformer screening. Feasibility of the thermal analysis method in drug discovery practice is ensured given its small scale and high throughput.
Alavi, Shiva; Kachuie, Marzie
2017-01-01
Background: This study was conducted to assess the hardness of orthodontic brackets produced by metal injection molding (MIM) and conventional methods and of different orthodontic wires (stainless steel, nickel-titanium [Ni-Ti], and beta-titanium alloys) for better clinical results. Materials and Methods: A total of 15 specimens from each brand of orthodontic brackets and wires were examined. The brackets (Elite Opti-Mim, produced by the MIM process, and Ultratrimm, produced by a conventional brazing method) and the wires (stainless steel, Ni-Ti, and beta-titanium) were embedded in epoxy resin, followed by grinding, polishing, and coating. Then, X-ray energy dispersive spectroscopy (EDS) microanalysis was applied to assess their elemental composition. The same specimen surfaces were repolished and used for Vickers microhardness assessment. Hardness was statistically analyzed with the Kruskal–Wallis test, followed by the Mann–Whitney test at the 0.05 level of significance. Results: The X-ray EDS analysis revealed different ferrous or cobalt-based alloys in each bracket. The maximum mean hardness values of the wires were achieved for stainless steel (SS) (529.85 Vickers hardness [VHN]) versus the minimum values for beta-titanium (334.65 VHN). Among the brackets, Elite Opti-Mim exhibited significantly higher VHN values (262.66 VHN) compared to Ultratrimm (206.59 VHN). VHN values of wire alloys were significantly higher than those of the brackets. Conclusion: MIM orthodontic brackets exhibited hardness values much lower than those of SS orthodontic archwires and were more compatible with Ni-Ti and beta-titanium archwires. A wide range of microhardness values has been reported for conventional orthodontic brackets, and it should be considered that the manufacturing method might be only one of the factors affecting the mechanical properties of orthodontic brackets, including hardness. PMID:28928783
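The nonparametric testing sequence used here (Kruskal–Wallis across groups, then pairwise Mann–Whitney) is a two-line SciPy exercise. The hardness values below are simulated around the reported group means purely for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Simulated Vickers hardness values (VHN), 15 specimens per group,
# centered on the means reported in the abstract.
ss_wire = rng.normal(530, 15, 15)
beta_ti = rng.normal(335, 12, 15)
mim_bracket = rng.normal(263, 10, 15)
conv_bracket = rng.normal(207, 11, 15)

H, p = stats.kruskal(ss_wire, beta_ti, mim_bracket, conv_bracket)
print(f"Kruskal-Wallis: H = {H:.1f}, p = {p:.2e}")

# Post-hoc pairwise comparison, e.g. MIM vs. conventional brackets.
U, p_pair = stats.mannwhitneyu(mim_bracket, conv_bracket, alternative="two-sided")
print(f"Mann-Whitney (MIM vs. conventional): U = {U:.0f}, p = {p_pair:.2e}")
```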
Shahabi, Sima; Assadian, Hadi; Mahmoudi Nahavandi, Alireza; Nokhbatolfoghahaei, Hanieh
2018-01-01
Introduction: The demand for esthetic dental treatments has increased in recent years, mainly due to improved oral hygiene and better maintenance of oral health and teeth in older individuals. Bleaching of discolored anterior teeth is the most popular esthetic dental treatment. Even individuals with sound teeth and adequate esthetics seek to have whiter teeth in the anterior region. The aim of this study was to evaluate tooth color changes following conventional in-office bleaching techniques compared to light-activated methods using different light sources. Methods: Seventy sound anterior teeth (devoid of caries and fractures), extracted for periodontal and orthodontic reasons, were selected and allocated to 7 groups: (A) control, (B) conventional bleaching, (C) LED-activated bleaching, (D) KTP laser-activated bleaching, (E) diode laser-activated bleaching, (F) Nd:YAG laser-activated bleaching, and (G) CO2 laser-activated bleaching. Colorimetric evaluation was carried out before and after treatment using a spectrophotoradiometer. Data were analyzed by one- and two-way analysis of variance (ANOVA) as well as multiple comparison methods. Results: The results showed that all bleaching procedures were effective in reducing the yellowness index. However, the KTP laser-activated bleaching was significantly more effective than the other techniques at the 95% confidence level. It was also seen that the CO2 laser-activated method outperformed groups E, F, and G, and that the conventional bleaching without light activation was not effective at all, giving results similar to the control group. Furthermore, groups E and G had almost the same results in decreasing the yellowness index. Conclusion: The results showed that all bleaching techniques were effective; however, the KTP laser-activated bleaching was significantly more efficient, closely followed by the CO2 laser-activated bleaching technique.
NASA Astrophysics Data System (ADS)
Hassan, Said A.; Elzanfaly, Eman S.; Salem, Maissa Y.; El-Zeany, Badr A.
2016-01-01
A novel spectrophotometric method was developed for the determination of ternary mixtures without previous separation, showing significant advantages over conventional methods. The new method is based on mean centering of double divisor ratio spectra. The mathematical explanation of the procedure is illustrated. The method was evaluated by the determination of a model ternary mixture and by the determination of Amlodipine (AML), Aliskiren (ALI), and Hydrochlorothiazide (HCT) in laboratory-prepared mixtures and in a commercial pharmaceutical preparation. For proper presentation of the advantages and applicability of the new method, a comparative study was established between the new mean centering of double divisor ratio spectra (MCDD) method and two similar methods used for the analysis of ternary mixtures, namely mean centering (MC) and double divisor of ratio spectra-derivative spectrophotometry (DDRS-DS). The method was also compared with a reported one for the analysis of the pharmaceutical preparation. The method was validated according to the ICH guidelines, and accuracy, precision, repeatability, and robustness were found to be within the acceptable limits.
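A simplified one-step version of the double divisor ratio idea is sketched below: divide the mixture spectrum by the sum of two standard spectra, then mean center the ratio. The full MCDD procedure and the synthetic Gaussian bands used here are assumptions for illustration, not the paper's exact algorithm or data.

```python
import numpy as np

def mean_center(v):
    """Mean centering: subtract the vector's own mean."""
    return v - v.mean()

def mcdd_signal(mixture, divisor1, divisor2):
    """Mean-centered double-divisor ratio spectrum (simplified one-step form)."""
    ratio = mixture / (divisor1 + divisor2)
    return mean_center(ratio)

# Hypothetical Gaussian absorption bands for a three-component mixture.
wl = np.linspace(200, 400, 801)
band = lambda center, width: np.exp(-((wl - center) / width) ** 2)
A, B, C = band(240, 15), band(280, 20), band(330, 18)
mixture = 0.5 * A + 0.3 * B + 0.2 * C

signal_A = mcdd_signal(mixture, B, C)   # signal used to quantify component A
```

In practice the amplitude of such a processed signal at a selected wavelength is regressed against standards of the target component to build the calibration.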
NASA Astrophysics Data System (ADS)
Huerta, F. V.; Granados, I.; Aguirre, J.; Carrera, R. Á.
2017-12-01
Nowadays, in the hydrocarbon industry, there is a need to optimize and reduce exploration costs in different types of reservoirs, motivating the specialized community to search for and develop alternative geophysical exploration methods. This study shows the reflection response obtained from a shale gas/oil deposit through the method of seismic interferometry of ambient vibrations in combination with Wavelet analysis and conventional seismic reflection techniques (CMP and NMO). The method consists of generating seismic responses from virtual sources through cross-correlation of records of Ambient Seismic Vibrations (ASV) collected at different receivers. The seismic response obtained is interpreted as the response that would be measured at one of the receivers if a virtual source were placed at the other. The ASV records were acquired in northern Mexico through semi-rectangular arrays of multi-component geophones with an instrumental response of 10 Hz. The in-line distance between geophones was 40 m, while the cross-line distance was 280 m; the sampling interval was 2 ms, and the total duration of the records was 6 hours. The results show the reflection response of two lines in the in-line direction and two in the cross-line direction, for which the continuity of coherent events has been identified and interpreted as reflectors. There is confidence that the identified events correspond to reflections because the time-frequency analysis performed with the Wavelet Transform allowed identification of the frequency band containing body waves. On the other hand, the CMP and NMO techniques allowed us to emphasize and correct the reflection response obtained during the correlation process in the frequency band of interest. The processing and analysis of ASV records through the seismic interferometry method, in combination with Wavelet analysis and conventional seismic reflection techniques, yielded interesting results: it was possible to recover the seismic response for each analyzed source-receiver pair, allowing us to obtain the reflection response of each analyzed seismic line.
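The virtual-source construction at the heart of ambient-noise interferometry is a stack of windowed cross-correlations between two receivers. The sketch below shows that step only; window length, normalization, and names are common choices assumed here, not the study's processing parameters.

```python
import numpy as np
from scipy.signal import correlate

def virtual_source_response(trace_a, trace_b, fs, win_s=60.0):
    """Stack windowed cross-correlations of ambient-noise records at two
    receivers; the stack approximates the response at B to a virtual
    source at A."""
    n_win = int(win_s * fs)
    n_seg = trace_a.size // n_win
    stack = np.zeros(2 * n_win - 1)
    for i in range(n_seg):
        a = trace_a[i * n_win:(i + 1) * n_win]
        b = trace_b[i * n_win:(i + 1) * n_win]
        a = (a - a.mean()) / (a.std() + 1e-12)   # per-window normalization
        b = (b - b.mean()) / (b.std() + 1e-12)
        stack += correlate(b, a, mode="full")
    lags = np.arange(-n_win + 1, n_win) / fs
    return lags, stack / n_seg
```

Band-passing the stack to the body-wave band identified by the Wavelet analysis, then applying CMP sorting and NMO, would correspond to the subsequent steps the abstract describes.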
Off-line real-time FTIR analysis of a process step in imipenem production
NASA Astrophysics Data System (ADS)
Boaz, Jhansi R.; Thomas, Scott M.; Meyerhoffer, Steven M.; Staskiewicz, Steven J.; Lynch, Joseph E.; Egan, Richard S.; Ellison, Dean K.
1992-08-01
We have developed an FT-IR method, using a Spectra-Tech Monit-IR 400 system, to monitor the completion of a reaction off-line in real time. The reaction is moisture-sensitive, and analysis by more conventional methods (normal-phase HPLC) is difficult to reproduce. The FT-IR method is based on the shift of a diazo band when a conjugated beta-diketone is transformed into a silyl enol ether during the reaction. The reaction mixture is examined directly by IR and does not require sample workup. Data acquisition time is less than one minute. The method has been validated for specificity, precision, and accuracy. The results obtained by the FT-IR method for known mixtures and in-process samples compare favorably with those from a normal-phase HPLC method.
Konishi, Takahiro; Nakajima, Kenichi; Okuda, Koichi; Yoneyama, Hiroto; Matsuo, Shinro; Shibutani, Takayuki; Onoguchi, Masahisa; Kinuya, Seigo
2017-07-01
Although IQ-single-photon emission computed tomography (SPECT) provides rapid acquisition and attenuation-corrected images, the unique technology may create characteristic distributions different from those of conventional imaging. This study aimed to compare the diagnostic performance of IQ-SPECT using Japanese normal databases (NDBs) with that of conventional SPECT for thallium-201 (201Tl) myocardial perfusion imaging (MPI). A total of 36 patients underwent 1-day 201Tl adenosine stress-rest MPI. Images were acquired with IQ-SPECT at approximately one-quarter of the standard time of conventional SPECT. Projection data acquired with the IQ-SPECT system were reconstructed via an ordered subset conjugate gradient minimizer method with or without scatter and attenuation correction (SCAC). Projection data obtained using the conventional SPECT were reconstructed via a filtered back projection method without SCAC. The summed stress score (SSS) was calculated using NDBs created by the Japanese Society of Nuclear Medicine working group, and scores were compared between IQ-SPECT and conventional SPECT using the acquisition-condition-matched NDBs. The diagnostic performance of the methods for the detection of coronary artery disease was also compared. SSSs were 6.6 ± 8.2 for the conventional SPECT, 6.6 ± 9.4 for IQ-SPECT without SCAC, and 6.5 ± 9.7 for IQ-SPECT with SCAC (p = n.s. for each comparison). The SSS showed a strong positive correlation between conventional SPECT and IQ-SPECT (r = 0.921, p < 0.0001), and the correlation between IQ-SPECT with and without SCAC was also good (r = 0.907, p < 0.0001). Regarding diagnostic performance, the sensitivity, specificity, and accuracy were 80.8, 78.9, and 79.4%, respectively, for the conventional SPECT; 80.8, 80.3, and 82.0%, respectively, for IQ-SPECT without SCAC; and 88.5, 86.8, and 87.3%, respectively, for IQ-SPECT with SCAC. The areas under the curve obtained via receiver operating characteristic analysis were 0.77, 0.80, and 0.86 for conventional SPECT, IQ-SPECT without SCAC, and IQ-SPECT with SCAC, respectively (p = n.s. for each comparison). When appropriate NDBs were used, the diagnostic performance of 201Tl IQ-SPECT was comparable with that of the conventional system, despite the different characteristics of myocardial accumulation relative to the conventional system.
Wangroongsarb, Piyada; Kohda, Tomoko; Jittaprasartsin, Chutima; Suthivarakom, Karun; Kamthalang, Thanitchi; Umeda, Kaoru; Sawanpanyalert, Pathom; Kozaki, Shunji; Ikuta, Kazuyoshi
2014-01-01
Background Thailand has had several foodborne outbreaks of botulism, one of the biggest being in 2006, when laboratory investigations identified the etiologic agent as Clostridium botulinum type A. Identification of the etiologic agent from outbreak samples is laborious using conventional microbiological methods and the neurotoxin mouse bioassay. Advances in molecular techniques have added enormous information regarding the etiology of outbreaks and characterization of isolates. We applied these methods in three outbreaks of botulism in Thailand in 2010. Methodology/Principal Findings A total of 19 cases were involved (seven each in Lampang and Saraburi and five in Maehongson provinces). The first outbreak, in Lampang province in April 2010, was associated with C. botulinum type F, which was detected by conventional methods. The outbreaks in Saraburi and Maehongson provinces occurred in May and December and were due to C. botulinum types A1(B) and B, which were identified by conventional methods and molecular techniques, respectively. Phylogenetic sequence analysis showed that C. botulinum type A1(B) strain Saraburi 2010 was close to strain Iwate 2007. Molecular analysis of the third outbreak, in Maehongson province, showed C. botulinum type B8, which differed from subtypes B1–B7. The nontoxic component genes of strain Maehongson 2010 revealed that the ha33, ha17, and botR genes were close to strain Okra (B1), while the ha70 and ntnh genes were close to strain 111 (B2). Conclusion/Significance This study demonstrates the utility of molecular genotyping of C. botulinum and how it contributes to our understanding of the epidemiology and variation of the boNT gene. Thus, the recent botulism outbreaks in Thailand were caused by various C. botulinum types. PMID:24475015
Lee, Wan-Ning; Huang, Ching-Hua; Zhu, Guangxuan
2018-08-01
Chlorine sanitizers used in washing fresh and fresh-cut produce can lead to the generation of disinfection by-products (DBPs) that are harmful to human health. Monitoring of DBPs is necessary to protect food safety, but comprehensive analytical methods have been lacking. This study has optimized three U.S. Environmental Protection Agency methods for drinking water DBPs to improve their performance for produce wash water. The method development encompasses 40 conventional and emerging DBPs. Good recoveries (60-130%) were achieved for most DBPs in deionized water and in lettuce, strawberry, and cabbage wash water. The method detection limits are in the range of 0.06-0.58 μg/L for most DBPs and 10-24 ng/L for nitrosamines in produce wash water. Preliminary results revealed the formation of many DBPs when produce is washed with chlorine. The analytical methods optimized in this study effectively reduce matrix interference and can serve as useful tools for future research on food DBPs.
Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech
2012-12-01
Our objective was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We conducted a comprehensive survey of purity methods and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.
Unsedated transnasal small-caliber esophagogastroduodenoscopy in elderly and bedridden patients
Yuki, Mika; Amano, Yuji; Komazawa, Yoshinori; Fukuhara, Hiroyuki; Shizuku, Toshihiro; Yamamoto, Shun; Kinoshita, Yoshikazu
2009-01-01
AIM: To evaluate the safety of unsedated transnasal small-caliber esophagogastroduodenoscopy (EGD) for elderly and critically ill bedridden patients. METHODS: One prospective randomized comparative study and one crossover comparative study between transnasal small-caliber EGD and transoral conventional EGD were done. For the comparative study (Study 1), we enrolled 240 elderly patients aged > 65 years. For the crossover analysis (Study 2), we enrolled 30 bedridden patients with percutaneous endoscopic gastrostomy (PEG). We evaluated cardiopulmonary effects by measuring arterial oxygen saturation (SpO2) and calculating the rate-pressure product (RPP) (pulse rate × systolic blood pressure/100) at baseline and at 2 and 5 min after endoscopic intubation in Study 1. To assess the risk for endoscopy-related aspiration pneumonia during EGD, we also measured blood leukocyte counts and serum C-reactive protein (CRP) levels before and 3 d after EGD in Study 2. RESULTS: In Study 1, we observed significant decreases in SpO2 during conventional transoral EGD, but not during transnasal small-caliber EGD (0.24% vs -0.24% after 2 min, and 0.18% vs -0.29% after 5 min; P = 0.034, P = 0.044). Significant differences in the RPP were not found between conventional transoral and transnasal small-caliber EGD. In Study 2, crossover analysis showed statistically significant increases in the RPP at 2 min after intubation and at the end of endoscopy (26.8 and 34.6 vs 3.1 and 15.2; P = 0.044, P = 0.046), and decreases in SpO2 (-0.8% vs -0.1%, P = 0.042), during transoral conventional EGD in comparison with transnasal small-caliber endoscopy. Thus, for bedridden patients with PEG feeding, who were examined in the supine position, transoral conventional EGD more severely suppressed cardiopulmonary function than transnasal small-caliber EGD. There were also significant increases in markers of inflammation, blood leukocyte counts and serum CRP values, in bedridden patients after transoral conventional EGD, but not after transnasal small-caliber EGD performed with the patient in the supine position. Leukocyte count increased from 6053 ± 1975/μL to 6900 ± 3392/μL (P = 0.0008) and CRP values increased from 0.93 ± 0.24 to 2.49 ± 0.91 mg/dL (P = 0.0005) at 3 d after transoral conventional EGD. Aspiration pneumonia, possibly caused by the endoscopic examination, was subsequently found in two of 30 patients after transoral conventional EGD. CONCLUSION: Transnasal small-caliber EGD is a safer method than transoral conventional EGD in critically ill, bedridden patients who are undergoing PEG feeding. PMID:19938199
Khan, Nazmul Abedin; Haque, Enamul; Jhung, Sung Hwa
2010-03-20
A typical MOF material, Cu-BTC, has been synthesized with microwave and conventional electric heating under various conditions to elucidate, for the first time, the quantitative acceleration in the synthesis of a MOF by microwaves. The acceleration by microwaves is mainly due to rapid nucleation rather than rapid crystal growth, even though both stages are accelerated. The acceleration in the nucleation stage by microwaves is due to the very large pre-exponential factor (about 1.4 × 10¹⁰ times that of conventional synthesis) in the Arrhenius plot. However, the activation energy for nucleation in the case of microwave synthesis is higher than the activation energy of conventional synthesis. The large acceleration in nucleation, compared with that in crystal growth, is observed once again in two-step syntheses (switching the heating method from microwave to conventional heating, or from conventional to microwave heating, just after nucleation is completed). The crystal size of Cu-BTC obtained by microwave nucleation is generally smaller than that of Cu-BTC made by conventional nucleation, probably due to rapid nucleation and the small size of nuclei with microwave nucleation.
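As a hedged illustration of how the Arrhenius parameters behind such statements are obtained, the sketch below fits ln k against 1/T for microwave and conventional nucleation rate constants; the temperatures and k values are invented for the example, not the paper's measurements.

```python
# Hedged sketch: estimating the activation energy Ea and pre-exponential
# factor A from rate constants at several temperatures via an Arrhenius plot.
# All k and T values below are hypothetical.
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius_fit(T, k):
    """Linear fit of ln k = ln A - Ea/(R*T); returns (A, Ea)."""
    slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
    return np.exp(intercept), -slope * R

T = np.array([403.0, 413.0, 423.0, 433.0])          # K (hypothetical)
k_mw = np.array([2.1e-2, 4.8e-2, 1.0e-1, 2.1e-1])   # microwave (hypothetical)
k_cv = np.array([1.0e-3, 1.9e-3, 3.4e-3, 6.0e-3])   # conventional (hypothetical)

A_mw, Ea_mw = arrhenius_fit(T, k_mw)
A_cv, Ea_cv = arrhenius_fit(T, k_cv)
print(f"pre-exponential factor ratio (MW/conventional): {A_mw / A_cv:.2e}")
print(f"Ea (MW): {Ea_mw / 1e3:.0f} kJ/mol vs Ea (conv): {Ea_cv / 1e3:.0f} kJ/mol")
```

The study's finding corresponds to a very large A ratio together with a higher microwave-side Ea; the same two-parameter fit applied to measured nucleation rate constants would yield those quantities.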
Marin-Oyaga, Victor A; Salavati, Ali; Houshmand, Sina; Pasha, Ahmed Khurshid; Gharavi, Mohammad; Saboury, Babak; Basu, Sandip; Torigian, Drew A; Alavi, Abass
2015-01-01
Treatment of malignant pleural mesothelioma (MPM) remains very challenging. Assessment of response to treatment is necessary for modifying treatment and using new drugs. Global disease assessment (GDA), which implements image processing methods to extract more information from positron emission tomography (PET) images, may provide reliable information. In this study we show the feasibility of this method of semi-quantification in patients with mesothelioma, compare it with the conventional methods, and present a review of the literature on this topic. Nineteen subjects with histologically proven MPM who had undergone fluorine-18-fluorodeoxyglucose PET/computed tomography ((18)F-FDG PET/CT) before and after treatment were included in this study. An adaptive contrast-oriented thresholding algorithm was used for image analysis and semi-quantification. Metabolic tumor volume (MTV), maximum and mean standardized uptake value (SUVmax, SUVmean), and total lesion glycolysis (TLG) were calculated for each region of interest. The global tumor glycolysis (GTG) was obtained by summing all TLG values. Treatment response was assessed by the European Organisation for Research and Treatment of Cancer (EORTC) criteria and the changes in GTG. Agreement between global disease assessment and the conventional method was also determined. In patients with progressive disease based on EORTC criteria, GTG showed an increase of 150.7, but in patients with stable disease or partial response, GTG showed a decrease of 433.1. The SUVmax of patients before treatment was 5.95 (SD: 2.93), and after treatment it increased to 6.38 (SD: 3.19). Overall concordance of the conventional method with the GDA method was 57%. Concordance for progression of disease based on the conventional method was 44%, for stable disease 85%, and for partial response 33%; discordance was 55%, 14%, and 66%, respectively. The adaptive contrast-oriented thresholding algorithm is a promising method to quantify whole-tumor glycolysis in patients with mesothelioma. We were able to assess the total metabolic lesion volume, lesion glycolysis, SUVmax, tumor SUVmean, and GTG for this particular tumor, and to demonstrate the potential use of this technique in monitoring treatment response. More studies comparing this technique with conventional and other global disease assessment methods are needed in order to clarify its role in the assessment of treatment response and prognosis of these patients.
Effect of Sling Exercise Training on Balance in Patients with Stroke: A Meta-Analysis
Peng, Qiyuan; Chen, Jingjie; Zou, Yucong; Liu, Gang
2016-01-01
Objective This study aims to evaluate the effect of sling exercise training (SET) on balance in patients with stroke. Methods PubMed, Cochrane Library, Ovid LWW, CBM, CNKI, WanFang, and VIP databases were searched for randomized controlled trials of the effect of SET on balance in patients with stroke. The study design and participants were subjected to metrological analysis. The Berg Balance Scale (BBS), Barthel index (BI), and Fugl-Meyer Assessment (FMA) were used as independent parameters for evaluating balance function, activities of daily living (ADL), and motor function after stroke, respectively, and were subjected to meta-analysis using RevMan5.3 software. Results Nine studies with 460 participants were analyzed. Results of the meta-analysis showed that SET combined with conventional rehabilitation was superior to conventional rehabilitation alone, with increased BBS (WMD = 3.81, 95% CI [0.15, 7.48], P = 0.04), BI (WMD = 12.98, 95% CI [8.39, 17.56], P < 0.00001), and FMA (SMD = 0.76, 95% CI [0.41, 1.11], P < 0.0001) scores. Conclusion Based on limited evidence from 9 trials, SET combined with conventional rehabilitation was superior to conventional rehabilitation alone, with increased BBS, BI, and FMA scores, so SET can improve balance function after stroke; however, our findings should be interpreted with caution due to limitations of the included trials, such as small sample sizes and risk of bias. Therefore, more multi-center, large-sample randomized controlled trials are needed to confirm its clinical applications. PMID:27727288
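For readers unfamiliar with how pooled WMD values and confidence intervals of this kind arise, a minimal inverse-variance sketch follows, using DerSimonian-Laird random effects; the per-trial mean differences and standard errors below are hypothetical stand-ins, not the nine included trials.

```python
# Hedged sketch of random-effects pooling of weighted mean differences
# (DerSimonian-Laird), the kind of computation behind a reported WMD and
# 95% CI. Trial-level estimates are hypothetical.
import numpy as np

def pool_wmd(md, se):
    """Return the random-effects pooled WMD and its 95% CI."""
    w = 1.0 / se**2
    fixed = np.sum(w * md) / np.sum(w)           # fixed-effect estimate
    q = np.sum(w * (md - fixed) ** 2)            # Cochran's Q heterogeneity
    df = len(md) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-trial variance
    w_re = 1.0 / (se**2 + tau2)
    est = np.sum(w_re * md) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return est, (est - 1.96 * se_re, est + 1.96 * se_re)

md = np.array([2.5, 4.1, 5.0, 3.2])   # per-trial BBS mean differences (hypothetical)
se = np.array([1.2, 1.8, 2.0, 1.5])   # their standard errors (hypothetical)
est, ci = pool_wmd(md, se)
print(f"WMD = {est:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```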
Theory of viscous transonic flow over airfoils at high Reynolds number
NASA Technical Reports Server (NTRS)
Melnik, R. E.; Chow, R.; Mead, H. R.
1977-01-01
This paper considers viscous flows with unseparated turbulent boundary layers over two-dimensional airfoils at transonic speeds. Conventional theoretical methods are based on boundary layer formulations which do not account for the effect of the curved wake and static pressure variations across the boundary layer in the trailing edge region. In this investigation an extended viscous theory is developed that accounts for both effects. The theory is based on a rational analysis of the strong turbulent interaction at airfoil trailing edges. The method of matched asymptotic expansions is employed to develop formal series solutions of the full Reynolds equations in the limit of Reynolds numbers tending to infinity. Procedures are developed for combining the local trailing edge solution with numerical methods for solving the full potential flow and boundary layer equations. Theoretical results indicate that conventional boundary layer methods account for only about 50% of the viscous effect on lift, the remaining contribution arising from wake curvature and normal pressure gradient effects.
Islam, Md Mainul; Shareef, Hussain; Mohamed, Azah
2017-01-01
The electric vehicle (EV) is considered a premium solution to global warming and various types of pollution. Nonetheless, a key concern is the recharging of EV batteries. Therefore, this study proposes a novel approach that considers the costs of transportation loss, buildup, and substation energy loss and that incorporates harmonic power loss into optimal rapid charging station (RCS) planning. A novel optimization technique, called binary lightning search algorithm (BLSA), is proposed to solve the optimization problem. BLSA is also applied to a conventional RCS planning method. A comprehensive analysis is conducted to assess the performance of the two RCS planning methods by using the IEEE 34-bus test system as the power grid. The comparative studies show that the proposed BLSA is better than other optimization techniques. The daily total cost in RCS planning of the proposed method, including harmonic power loss, decreases by 10% compared with that of the conventional method.
Schmidberger, Andreas; Durner, Bernhard; Gehrmeyer, David; Schupfner, Robert
2018-06-19
The age determination of elephant ivory provides necessary and crucial information for all criminal prosecution authorities enforcing the Convention on International Trade in Endangered Species of Wild Fauna and Flora. Knowledge of the age of ivory makes it possible to distinguish between pre-convention, hence legal, material and ivory deriving from recent, illegal poaching incidents. The commonly applied method to determine the age of ivory is radiocarbon dating in the form of bomb pulse dating, which, however, will soon cease to be applicable. This work provides an enhancement of the radiocarbon dating method by supplementary determination of the isotope profile of 90Sr and the two thorium isotopes 228Th and 232Th. This combined analysis allows for a precise and unambiguous age determination of ivory. We provide calibration curves for all involved radionuclides, obtained by analyzing ivory samples of known age, and investigate a new method for the extraction of strontium from ivory.
Linear and nonlinear dynamic analysis of redundant load path bearingless rotor systems
NASA Technical Reports Server (NTRS)
Murthy, V. R.
1985-01-01
The bearingless rotorcraft offers reduced weight, less complexity, and superior flying qualities. Almost all current industrial structural dynamics programs for conventional rotors, which consist of single-load-path rotor blades, employ the transfer matrix method to determine natural vibration characteristics, because this method is ideally suited for one-dimensional chain-like structures. This method is extended here to multiple-load-path rotor blades without resorting to an equivalent single-load-path approximation. Unlike the conventional blades, it is necessary to introduce the axial degree of freedom into the solution process to account for the differential axial displacements in the different load paths. With the present extension, current rotor dynamics programs can be modified with relative ease to account for multiple load paths without resorting to equivalent single-load-path modeling. The results obtained by the transfer matrix method are validated by comparison with finite element solutions. A differential stiffness matrix due to blade rotation is derived to facilitate the finite element solutions.
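A minimal sketch of the underlying transfer matrix idea for a single-load-path chain follows; extending it to multiple load paths, as the abstract describes, amounts to enlarging the state vector (e.g., with the axial degree of freedom). The spring-mass values are illustrative, not a blade model.

```python
# Transfer matrix method for a fixed-free lumped spring-mass chain (single
# load path). The state vector is [displacement, internal force]; natural
# frequencies are the omegas where the free-end force residual vanishes.
import numpy as np

def end_force(omega, masses, springs):
    """Propagate the state from the fixed end; a natural frequency makes
    the internal force at the free end zero."""
    state = np.array([0.0, 1.0])      # x = 0 at fixed wall, unit internal force
    for k, m in zip(springs, masses):
        spring = np.array([[1.0, 1.0 / k], [0.0, 1.0]])        # field matrix
        mass = np.array([[1.0, 0.0], [-omega**2 * m, 1.0]])    # point matrix
        state = mass @ spring @ state
    return state[1]                    # force residual at the free end

masses = [1.0, 1.0, 1.0]               # illustrative values
springs = [1000.0, 1000.0, 1000.0]

# Scan omega for sign changes of the boundary residual.
omegas = np.linspace(0.1, 80.0, 8000)
residual = np.array([end_force(w, masses, springs) for w in omegas])
roots = omegas[:-1][np.sign(residual[:-1]) != np.sign(residual[1:])]
print("natural frequencies (rad/s):", np.round(roots, 2))
# Cross-check: the assembled 3-DOF eigenproblem gives ~14.1, 39.4, 57.0 rad/s.
```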
Design of a compact disk-like microfluidic platform for enzyme-linked immunosorbent assay.
Lai, Siyi; Wang, Shengnian; Luo, Jun; Lee, L James; Yang, Shang-Tian; Madou, Marc J
2004-04-01
This paper presents an integrated microfluidic device on a compact disk (CD) that performs an enzyme-linked immunosorbent assay (ELISA) for rat IgG from a hybridoma cell culture. Centrifugal and capillary forces were used to control the flow sequence of the different solutions involved in the ELISA process. The microfluidic device was fabricated on a plastic CD. Each step of the ELISA process was carried out automatically by controlling the rotation speed of the CD. Analysis of rat IgG from hybridoma culture showed that the microchip-based ELISA has the same detection range as the conventional method on a 96-well microtiter plate, with advantages such as lower reagent consumption and shorter assay time.
Characterization of Organic and Conventional Coffee Using Neutron Activation Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. A. De Nadai Fernandes; P. Bode; F. S. Tagliaferro
2000-11-12
Countries importing organic coffee are facing the difficulty of assessing the quality of the product to distinguish original organic coffee from other coffees, thereby eliminating possible fraud. Many analytical methods are matrix sensitive and require matrix-matching reference materials for validation, which are currently nonexistent. This work aims to establish the trace element characterization of organic and conventional Brazilian coffees and to establish correlations with the related soil and the type of fertilizer and agrochemicals applied. It was observed that the variability in element concentrations between the various types of coffee is not so large, which emphasizes the need for analytical methods of high accuracy, reproducibility, and well-known uncertainty. Moreover, the analyses indicate that sometimes the coffee packages may contain some soil remnants.
A technology roadmap of smart biosensors from conventional glucose monitoring systems.
Shende, Pravin; Sahu, Pratiksha; Gaud, Ram
2017-06-01
The objective of this review article is to trace the technology roadmap from conventional glucose monitoring systems to smart biosensors. The estimation of glucose with commercially available devices involves analysis of blood samples obtained by pricking the finger or extracting blood from the forearm. Since pain and discomfort are associated with invasive methods, non-invasive measurement techniques have been investigated. Non-invasive methods offer advantages such as avoiding exposure to sharp objects like needles and syringes, which increases testing frequency, improves control of glucose concentration, and eliminates pain and biohazardous materials. This review describes recent invasive techniques and the major non-invasive techniques, viz. biosensors, optical techniques, and sensor-embedded contact lenses for glucose estimation.
Bai, Xiaomei; Wen, Zhongming; An, Shaoshan; Li, Bicheng
2015-01-01
Evaluating the sustainability of cropland use is essential for guaranteeing a secure food supply and accomplishing sustainable agricultural development. This study was conducted in the ecologically vulnerable Loess Plateau region of China to evaluate the sustainability of cropland use based on an ecological footprint model that integrates emergy analysis. One modified method, proposed in 2005, is known as the emergetic ecological footprint (EEF). We enhanced the method by accounting for both the surface soil energy in the carrying capacity calculation and the net topsoil loss for human consumption in the EF calculation. This paper evaluates whether the cropland of the study area was overloaded or sustainably managed during the period from 1981 to 2009. Toward this end, the final results obtained from EEF were compared to those of the conventional EF and previous methods. The results showed that the cropland of Yuanzhou County has not been used sustainably since 1983, and the conventional EF analysis provided similar results. In contrast, a deficit did not appear during this time period when previous calculation methods of others were used. Additionally, the ecological sustainability index (ESI) from the three models indicated that the recently used cropland system is unlikely to be sustainable. PMID:25738289
Jiménez-Pajares, María Soledad; Herrera, Laura; Valverde, Azucena; Saiz, Pilar; Sáez-Nieto, Juan Antonio
2005-05-01
Mycobacterium kansasii is an opportunistic pathogen that mainly causes pulmonary infections. This species accounted for 9.7% of Mycobacteria other than tuberculosis complex identified in the reference laboratory of the Spanish Centro Nacional de Microbiologia during the period of 2000-2003. In this study we analyzed the phenotypic and genotypic characteristics of 298 M. kansasii strains isolated over this 4-year period. The phenotypic characteristics were determined by conventional methods: biochemical testing, culture and morphological study. Genotypic characteristics were studied using PCR restriction fragment analysis of a fragment of the hsp65 gene and digestion with BstEII and HaeIII, according to the method of Telenti. Among the total of tested strains, 57.4% had the typical phenotypic characteristics described for M. kansasii. The rest had atypical patterns that we grouped into 17 biotypes. Strains belonging to six of the seven described genotypes were identified, with 86.6% of the strains falling into genotype I. Analysis of the phenotypic characteristics of M. kansasii showed a higher discrimination index for intraspecific differentiation than genotypic methods. Nevertheless, the high variability of phenotypic characteristics, some of which were very specific for the species (e.g., photochromogenicity), could complicate their identification. Hence both conventional and molecular methods should be used to accurately identify the atypical isolates.
Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping
2017-05-01
To develop an efficient, cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased sample throughput compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we obtained a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
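As a hedged illustration of the bounded-support point, the sketch below clusters synthetic methylation beta-values with a two-component beta mixture fitted by EM with method-of-moments updates; it is a stand-in for, not a reproduction of, the specific non-Gaussian models compared in the paper.

```python
# Hedged sketch: clustering bounded DNA-methylation beta-values in (0, 1)
# with a two-component beta mixture (EM, method-of-moments M-step), which a
# Gaussian model cannot represent faithfully. Data are synthetic.
import numpy as np
from scipy.stats import beta

x = np.concatenate([beta.rvs(2, 10, size=300, random_state=1),   # hypo-methylated
                    beta.rvs(8, 2, size=200, random_state=2)])   # hyper-methylated

def mom_beta(x, w):
    """Weighted method-of-moments estimate of beta parameters (a, b)."""
    m = np.average(x, weights=w)
    v = np.average((x - m) ** 2, weights=w)
    common = m * (1 - m) / v - 1
    return max(m * common, 1e-3), max((1 - m) * common, 1e-3)

pi, params = np.array([0.5, 0.5]), [(1.0, 3.0), (3.0, 1.0)]
for _ in range(100):
    dens = np.stack([pi[k] * beta.pdf(x, *params[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)                  # E-step: responsibilities
    pi = resp.mean(axis=1)                          # M-step: mixing weights
    params = [mom_beta(x, resp[k]) for k in range(2)]

labels = resp.argmax(axis=0)
print("mixing weights:", np.round(pi, 2))
print("component (a, b):", [tuple(np.round(p, 1)) for p in params])
```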
The Flight Optimization System Weights Estimation Method
NASA Technical Reports Server (NTRS)
Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.
2017-01-01
FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-with-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presenting the development process along with the method itself.
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)
2001-01-01
Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.
Optimization for Peptide Sample Preparation for Urine Peptidomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sigdel, Tara K.; Nicora, Carrie D.; Hsieh, Szu-Chuan
2014-02-25
Analysis of native or endogenous peptides in biofluids can provide valuable insights into disease mechanisms. Furthermore, the detected peptides may also have utility as potential biomarkers for non-invasive monitoring of human diseases. The non-invasive nature of urine collection and the abundance of peptides in urine make analysis by high-throughput 'peptidomics' methods an attractive approach for investigating the pathogenesis of renal disease. However, urine peptidomics methodologies can be problematic with regard to difficulties associated with sample preparation. The urine matrix can provide significant background interference in the analytical measurements, hampering both the identification of peptides and the depth of the peptidomics read when utilizing LC-MS based peptidome analysis. We report on a novel adaptation of the standard solid phase extraction (SPE) method to a modified SPE (mSPE) approach for improved peptide yield and analysis sensitivity with LC-MS based peptidomics, compared in terms of time, cost, clogging of the LC-MS column, peptide yield, peptide quality, and number of peptides identified by each method. Expense and time requirements were comparable for both SPE and mSPE, but more interfering contaminants from the urine matrix were evident in the SPE preparations (e.g., clogging of the LC-MS columns, yellowish background coloration of prepared samples due to retained urobilin, lower peptide yields) when compared to the mSPE method. When we compared data from technical replicates of 4 runs, the mSPE method provided significantly improved efficiencies for the preparation of samples from urine (e.g., mSPE peptide identification 82% versus 18% with SPE; p = 8.92E-05). Additionally, peptide identifications, when applying the mSPE method, highlighted the biology of differential activation of urine peptidases during acute renal transplant rejection with distinct laddering of specific peptides, which was obscured for most proteins when utilizing the conventional SPE method. In conclusion, the mSPE method was found to be superior to the conventional, standard SPE method for urine peptide sample preparation when applying LC-MS peptidomics analysis, due to the optimized sample cleanup that provided improved experimental inference from the confidently identified peptides.
Efficient biotechnological approach for lentiviral transduction of induced pluripotent stem cells.
Zare, Mehrak; Soleimani, Masoud; Mohammadian, Mozhdeh; Akbarzadeh, Abolfazl; Havasi, Parvaneh; Zarghami, Nosratollah
2016-01-01
Induced pluripotent stem (iPS) cells are generated from differentiated adult somatic cells by reprogramming them. Unlimited self-renewal and the potential to differentiate into any cell type make iPS cells very promising candidates for basic and clinical research. Furthermore, iPS cells can be genetically manipulated for use as therapeutic tools. DNA can be introduced into iPS cells using lentiviral vectors, which represent a helpful choice for efficient transduction and stable integration of transgenes. In this study, we compare two methods of lentiviral transduction of iPS cells, namely, the suspension method and the hanging drop method. In contrast to the conventional suspension method, in the hanging drop method, embryoid body (EB) formation and transduction occur concurrently. The iPS cells were cultured to form EBs and then transduced with lentiviruses, using the conventional suspension method and the hanging drop method, to express miR-128 and green fluorescent protein (GFP). The number of transduced cells was assessed by fluorescence microscopy and flow cytometry. MTT assay and real-time PCR were performed to determine cell viability and transgene expression, respectively. Morphologically, GFP+ cells were more detectable in the hanging drop method, and this finding was quantified by flow cytometric analysis. According to the results of the MTT assay, cell viability was considerably higher in the hanging drop method, and real-time PCR revealed higher relative expression of miR-128 in the iPS cells transduced with lentiviruses in drops. Altogether, it seems that lentiviral transduction of challenging iPS cells using the hanging drop method offers a suitable and efficient strategy for gene transfer, with less toxicity than the conventional suspension method.
Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.
Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G
2018-06-01
This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.
Wang, X-H; Zhang, G; Fan, Y-Y; Yang, X; Sui, W-J; Lu, X-X
2013-03-01
Rapid identification of bacterial pathogens from clinical specimens is essential to establish adequate empirical antibiotic therapy for urinary tract infections (UTIs). We used matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) combined with UF-1000i urine flow cytometry of urine specimens to quickly and accurately identify bacteria causing UTIs. We divided each urine sample into three aliquots for conventional identification, UF-1000i, and MALDI-TOF MS, respectively. We compared the results of the conventional method with those of MALDI-TOF MS combined with UF-1000i, and discrepancies were resolved by 16S rRNA gene sequencing. We analyzed 1456 urine samples from patients with UTI symptoms, and 932 (64.0%) were negative by all three testing methods. The combined method used UF-1000i to eliminate negative specimens and then MALDI-TOF MS to identify the remaining positive samples. The combined method was consistent with the conventional method in 1373 of 1456 cases (94.3%) and gave the correct result in 1381 of 1456 cases (94.8%). Therefore, the combined method described here can directly provide a rapid, accurate, definitive bacterial identification for the vast majority of urine samples, though the MALDI-TOF MS software analysis capabilities should be improved with regard to mixed bacterial infections.
NASA Astrophysics Data System (ADS)
Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar
2017-09-01
The present work compares dissimilarity-based and covariance-based unsupervised chemometric classification approaches using total synchronous fluorescence spectroscopy data sets acquired for cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups, and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pairwise dissimilarity matrix.
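The two decompositions being contrasted can be sketched as follows: PCA via eigendecomposition of the covariance matrix versus eigendecomposition of a double-centered pairwise-dissimilarity matrix (classical multidimensional scaling). Euclidean dissimilarity and synthetic spectra are used purely for illustration; with Euclidean distances the two embeddings coincide, so the gain reported in the paper comes from choosing a dissimilarity measure suited to the class structure.

```python
# Sketch of the two unsupervised decompositions: PCA on the covariance of
# spectra vs. eigendecomposition of a double-centered squared-dissimilarity
# matrix (classical MDS). Spectra are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 50)),     # e.g., cumin-based samples
               rng.normal(0.8, 1.0, (20, 50))])    # e.g., non-cumin samples

# (1) Conventional: eigendecomposition of the covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
eigval, eigvec = np.linalg.eigh(cov)
scores_pca = Xc @ eigvec[:, ::-1][:, :2]           # top-2 PC scores

# (2) Dissimilarity-based: double-center D^2, i.e., B = -J D^2 J / 2.
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
J = np.eye(len(X)) - np.ones((len(X), len(X))) / len(X)
B = -0.5 * J @ D2 @ J
lam, V = np.linalg.eigh(B)
scores_dis = V[:, ::-1][:, :2] * np.sqrt(np.maximum(lam[::-1][:2], 0))

# Compare group separation along the first coordinate of each embedding.
for name, s in [("covariance/PCA", scores_pca), ("dissimilarity", scores_dis)]:
    gap = abs(s[:20, 0].mean() - s[20:, 0].mean()) / s[:, 0].std()
    print(f"{name}: standardized group gap = {gap:.2f}")
```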
Single-Cell RNA Sequencing of Glioblastoma Cells.
Sen, Rajeev; Dolgalev, Igor; Bayin, N Sumru; Heguy, Adriana; Tsirigos, Aris; Placantonakis, Dimitris G
2018-01-01
Single-cell RNA sequencing (sc-RNASeq) is a recently developed technique used to evaluate the transcriptome of individual cells. As opposed to conventional RNASeq in which entire populations are sequenced in bulk, sc-RNASeq can be beneficial when trying to better understand gene expression patterns in markedly heterogeneous populations of cells or when trying to identify transcriptional signatures of rare cells that may be underrepresented when using conventional bulk RNASeq. In this method, we describe the generation and analysis of cDNA libraries from single patient-derived glioblastoma cells using the C1 Fluidigm system. The protocol details the use of the C1 integrated fluidics circuit (IFC) for capturing, imaging and lysing cells; performing reverse transcription; and generating cDNA libraries that are ready for sequencing and analysis.
2015-01-01
This section provides background information on inversion methods for IC fault detection. Conventional inversion techniques and their shortcomings are discussed, along with physical techniques: electron beam imaging/analysis, ion beam techniques, and scanning probe techniques. Electrical tests are used to detect faults; on the other hand, there is also the second-harmonic technique, through which duty-cycle degradation faults are detected by collecting the magnitude and the phase of the second harmonic.
Fungicide residue identification and discrimination using a conducting polymer electronic-nose
Alphus D. Wilson
2013-01-01
The identification of fungicide residues on crop foliage is necessary for making periodic pest management decisions. Determining fungicide residue identities is currently difficult and time-consuming using conventional chemical analysis methods such as gas chromatography-mass spectrometry. Different fungicide types produce unique electronic aroma signature...
Using Mixed-Effects Structural Equation Models to Study Student Academic Development.
ERIC Educational Resources Information Center
Pike, Gary R.
1992-01-01
A study at the University of Tennessee Knoxville used mixed-effect structural equation models incorporating latent variables as an alternative to conventional methods of analyzing college students' (n=722) first-year-to-senior academic gains. Results indicate, contrary to previous analysis, that coursework and student characteristics interact to…
ERIC Educational Resources Information Center
Sun, Jerry Chih-Yuan; Wu, Yu-Ting
2016-01-01
This study aimed to investigate the effectiveness of two different teaching methods on learning effectiveness. OpenCourseWare was integrated into the flipped classroom model (experimental group) and distance learning (control group). Learning effectiveness encompassed learning achievement, teacher-student interactions, and learning satisfaction.…
Using Visualization and Computation in the Analysis of Separation Processes
ERIC Educational Resources Information Center
Joo, Yong Lak; Choudhary, Devashish
2006-01-01
For decades, every chemical engineer has been asked to have a background in separations. The required separations course can, however, be uninspiring and superficial because understanding many separation processes involves conventional graphical methods and commercial process simulators. We utilize simple, user-friendly mathematical software,…
DOT National Transportation Integrated Search
2010-01-01
Current AASHTO provisions for the conventional load rating of flat slab bridges rely on the equivalent strip method of analysis for determining live load effects; this is generally regarded as overly conservative by many professional engineers. A...
A PDE approach for quantifying and visualizing tumor progression and regression
NASA Astrophysics Data System (ADS)
Sintay, Benjamin J.; Bourland, J. Daniel
2009-02-01
Quantification of changes in tumor shape and size allows physicians to determine the effectiveness of various treatment options, adapt treatment, predict outcome, and map potential problem sites. Conventional methods are often based on metrics such as volume, diameter, or maximum cross-sectional area. This work seeks to improve the visualization and analysis of tumor changes by simultaneously analyzing changes over the entire tumor volume. The method utilizes an elliptic partial differential equation (PDE) to provide a roadmap of boundary displacement that does not suffer from the discontinuities associated with other measures such as Euclidean distance. Streamline pathways defined by Laplace's equation (a commonly used elliptic PDE) are used to track tumor progression and regression at the tumor boundary. Laplace's equation is particularly useful because it provides a smooth, continuous solution that can be evaluated with sub-pixel precision on variable grid sizes. Several metrics are demonstrated, including maximum, average, and total regression and progression. This method provides advantages over conventional means of quantifying change in tumor shape because it is observer independent, stable for highly unusual geometries, and provides an analysis of the entire three-dimensional tumor volume.
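A rough sketch of the idea, under toy assumptions (concentric circular contours on a coarse 2-D grid rather than real tumor boundaries), is to solve Laplace's equation between the two contours and integrate along the gradient to obtain a smooth displacement measure:

```python
# Toy sketch: Laplace's equation between two tumor contours (0 on the earlier
# boundary, 1 on the later one) solved by Jacobi relaxation; a streamline
# traced along grad(phi) gives a smooth boundary-displacement distance.
import numpy as np

n = 101
y, x = np.mgrid[0:n, 0:n]
r = np.hypot(x - 50, y - 50)
inner, outer = r <= 15, r >= 30          # earlier / later boundaries (toy)
phi = np.where(outer, 1.0, 0.0)

for _ in range(5000):                    # Jacobi relaxation on the annulus
    avg = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                  np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
    phi = np.where(inner | outer, phi, avg)   # keep boundary values fixed

# Trace one streamline from the inner contour along the gradient; its arc
# length is the local progression distance (continuous, unlike nearest-point
# Euclidean maps).
gy, gx = np.gradient(phi)
p, length = np.array([50.0, 65.0]), 0.0       # start on the inner boundary
for _ in range(2000):
    i, j = int(round(p[1])), int(round(p[0]))
    g = np.array([gx[i, j], gy[i, j]])
    if np.linalg.norm(g) < 1e-12 or phi[i, j] >= 0.999:
        break
    step = 0.5 * g / np.linalg.norm(g)
    p += step
    length += np.linalg.norm(step)
print(f"streamline displacement ~ {length:.1f} pixels")  # ~15 for this annulus
```

For this symmetric toy geometry the streamlines are radial and the displacement equals the annulus width; the value of the approach shows up on irregular clinical contours, where Euclidean nearest-point maps become discontinuous but the Laplace solution stays smooth.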
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heyman, Heino M.; Zhang, Xing; Tang, Keqi
2016-02-16
Metabolomics is the quantitative analysis of all metabolites in a given sample. Due to the chemical complexity of the metabolome, optimal separations are required for comprehensive identification and quantification of sample constituents. This chapter provides an overview of both conventional and advanced separation methods in practice for reducing the complexity of metabolite extracts delivered to the mass spectrometer, covering gas chromatography (GC), liquid chromatography (LC), capillary electrophoresis (CE), supercritical fluid chromatography (SFC), and ion mobility spectrometry (IMS) separation techniques coupled with mass spectrometry (MS), in both one-dimensional and multi-dimensional approaches.
Composite turbine blade design options for Claude (open) cycle OTEC power systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Penney, T R
1985-11-01
Small-scale turbine rotors made from composites offer several technical advantages for a Claude (open) cycle ocean thermal energy conversion (OTEC) power system. Westinghouse Electric Corporation has designed a composite turbine rotor/disk using state-of-the-art analysis methods for large-scale (100-MWe) open cycle OTEC applications. Near-term demonstrations using conventional low-pressure turbine blade shapes with composite material would establish the feasibility and credibility of the open cycle OTEC power system. Application of composite blades in low-pressure turbomachinery could potentially improve reliability relative to conventional metal blades, which are affected by stress corrosion.
NASA Astrophysics Data System (ADS)
Mehrpooya, Mehdi; Ansarinasab, Hojat; Moftakhari Sharifzadeh, Mohammad Mehdi; Rosen, Marc A.
2017-10-01
An integrated power plant with a net electrical power output of 3.71 × 10⁵ kW is developed and investigated. The electrical efficiency of the process is found to be 60.1%. The process includes three main sub-systems: a molten carbonate fuel cell system, a heat recovery section, and a cryogenic carbon dioxide capture process. Conventional and advanced exergoeconomic methods are used to analyze the process. Advanced exergoeconomic analysis is a comprehensive evaluation tool that combines an exergetic approach with economic analysis procedures. With this method, investment and exergy destruction costs of the process components are divided into endogenous/exogenous and avoidable/unavoidable parts. Results of the conventional exergoeconomic analysis demonstrate that the combustion chamber has the largest exergy destruction rate (182 MW) and cost rate (13,100 /h). Also, the total process cost rate can be decreased by reducing the cost rate of the fuel cell and improving the efficiency of the combustion chamber and heat recovery steam generator. Based on the total avoidable endogenous cost rate, the priorities for modification are the heat recovery steam generator, a compressor, and a turbine of the power plant, in rank order. A sensitivity analysis is performed to investigate how the exergoeconomic results respond to variations in the effective parameters.
Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.
2008-01-01
The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.
Ahn, T; Moon, S; Youk, Y; Jung, Y; Oh, K; Kim, D
2005-05-30
A novel mode analysis method and differential mode delay (DMD) measurement technique for a multimode optical fiber, based on optical frequency domain reflectometry (OFDR), is proposed for the first time. We used a conventional OFDR setup with a tunable external cavity laser and a Michelson interferometer. A few-mode multimode optical fiber was prepared to test the proposed measurement technique. We also compared the OFDR measurement results with those obtained using a traditional time-domain measurement method.
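The OFDR principle behind such a DMD measurement can be sketched as follows: under a linear optical frequency sweep, each mode's delay appears as a beat frequency f = γτ in the interference signal, so an FFT separates the modes. Sweep rate, delays, amplitudes, and sampling below are illustrative assumptions, not the paper's parameters.

```python
# Hedged OFDR sketch: two fiber modes with delays tau1, tau2 produce beat
# tones at gamma*tau; the FFT peak spacing gives the DMD. Numbers invented.
import numpy as np
from scipy.signal import find_peaks

gamma = 1e10          # optical sweep rate, Hz/s (assumed)
fs = 2e5              # detector sampling rate, Hz (assumed)
t = np.arange(0, 0.05, 1 / fs)
tau1, tau2 = 4.90e-6, 4.95e-6   # group delays of two modes, s (assumed)

sig = (np.cos(2 * np.pi * gamma * tau1 * t) +
       0.6 * np.cos(2 * np.pi * gamma * tau2 * t))

spec = np.abs(np.fft.rfft(sig * np.hanning(len(t))))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
peaks, _ = find_peaks(spec, height=0.2 * spec.max())
delays = np.sort(freqs[peaks]) / gamma          # beat frequency -> delay
print(f"DMD ~ {(delays[-1] - delays[0]) * 1e9:.1f} ns")   # expect ~50 ns
```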
Kaplan, Heidi B.; Dua, Anahita; Litwin, Douglas B.; Ambrose, Catherine G.; Moore, Laura J.; Murray, COL Clinton K.; Wade, Charles E.; Holcomb, John B.
2016-01-01
Background: Sepsis from bacteremia occurs in 250,000 cases annually in the United States, has a mortality rate as high as 60%, and is associated with a poorer prognosis than localized infection. Because of these high figures, empiric antibiotic administration for patients with systemic inflammatory response syndrome (SIRS) and suspected infection is the second most common indication for antibiotic administration in intensive care units (ICUs). However, overuse of empiric antibiotics contributes to the development of opportunistic infections, antibiotic resistance, and the increase in multi-drug-resistant bacterial strains. The current method of diagnosing and ruling out bacteremia is via blood culture (BC) and Gram stain (GS) analysis. Methods: Conventional and molecular methods for diagnosing bacteremia were reviewed and compared. The clinical implications, use, and current clinical trials of polymerase chain reaction (PCR)-based methods to detect bacterial pathogens in the blood stream were detailed. Results: BC/GS has several disadvantages: some bacteria do not grow in culture media; others do not stain appropriately on GS; and cultures can require up to 5 d to guide or discontinue antibiotic treatment. PCR-based methods can potentially be applied to detect microbes in human blood samples rapidly, accurately, and directly. Conclusions: Compared with conventional BC/GS, particular advantages of molecular methods (specifically, PCR-based methods) include faster results, leading to possibly improved antibiotic stewardship when bacteremia is not present. PMID:26918696
Sugimura, Natsuhiko; Igarashi, Yoko; Aoyama, Reiko; Shibue, Toshimichi
2017-02-01
Analysis of the fragmentation pathways of molecules in mass spectrometry gives fundamental insight into gas-phase ion chemistry. However, the conventional intrinsic reaction coordinate method requires knowledge of the transition-state ion structures along the fragmentation pathways. Herein, we use the nudged elastic band method, which requires only the initial and final ion structures in the fragmentation pathways, and report the advantages and limitations of the method. We found a minimum energy path for p-benzoquinone ion fragmentation with two saddle points and one intermediate structure. The primary energy barrier, which corresponded to the cleavage of the C-C bond adjacent to the CO group, was calculated to be 1.50 eV. An additional energy barrier, which corresponded to the cleavage of the CO group, was calculated to be 0.68 eV. We also found an energy barrier of 3.00 eV, which was the rate-determining step of the keto-enol tautomerization in CO elimination from the molecular ion of phenol. The nudged elastic band method allowed the determination of a minimum energy path using only the initial and final ion structures in the fragmentation pathways, and it provided results faster than the conventional intrinsic reaction coordinate method. In addition, this method was found to be effective in analyzing the charge structures of the molecules during fragmentation in mass spectrometry.
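A compact illustration of the nudged elastic band idea is given below, on a toy two-dimensional double-well surface rather than an actual molecular-ion energy surface: only the two end structures are specified, and the band of intermediate images relaxes onto the minimum energy path through the saddle point, whose height gives the barrier.

```python
# Minimal nudged elastic band (NEB) on a toy double-well potential with
# minima at (-1, 0) and (1, 0) and a saddle of height 1.0 at the origin.
import numpy as np

def V(p):
    x, y = p
    return (x**2 - 1)**2 + 2.0 * y**2

def gradV(p):
    x, y = p
    return np.array([4 * x * (x**2 - 1), 4.0 * y])

n_img, k_spring, step = 11, 5.0, 0.01
path = np.linspace([-1.0, 0.0], [1.0, 0.0], n_img)  # endpoints = the two minima
path[1:-1, 1] += 0.3                                # perturb interior images

for _ in range(2000):
    new = path.copy()
    for i in range(1, n_img - 1):
        tau = path[i + 1] - path[i - 1]
        tau /= np.linalg.norm(tau)                  # local tangent
        g = gradV(path[i])
        g_perp = g - np.dot(g, tau) * tau           # true force, perpendicular part
        f_spring = k_spring * (np.linalg.norm(path[i + 1] - path[i])
                               - np.linalg.norm(path[i] - path[i - 1]))
        new[i] = path[i] + step * (-g_perp + f_spring * tau)
    path = new

energies = np.array([V(p) for p in path])
print(f"barrier ~ {energies.max() - energies[0]:.3f} (exact saddle height: 1.0)")
```

The spring term only redistributes images along the band while the perpendicular true force pulls the band onto the valley floor, which is why no prior knowledge of the transition state is needed.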
Wong, M S; Cheng, C Y; Ng, B K W; Lam, T P; Chiu, S W
2006-01-01
Spinal orthoses are commonly prescribed to patients with moderate adolescent idiopathic scoliosis (AIS) to prevent further deterioration. In the conventional manufacturing method, plaster bandages are used to capture the patient's body contour, and the plaster cast is rectified manually. With the introduction of a CAD/CAM system, a series of automated processes from body scanning to digital rectification and milling of the positive model can be performed in a fast and accurate fashion. This project studied the impact of the CAD/CAM method compared with the conventional method. In assessing the 147 recruited subjects fitted with spinal orthoses (43 subjects using the conventional method and 104 subjects using the CAD/CAM method), significant decreases (p<0.05) were found in the Cobb angles when comparing the pre-intervention data with those of the first year of intervention. Regarding the learning curve, orthotists became competent with the CAD/CAM technique within four years. The mean productivity of the CAD/CAM method was 2.75 times higher than that of the conventional method. The CAD/CAM method achieved similar clinical outcomes and, given its high efficiency, could be considered a substitute for conventional methods in fabricating spinal orthoses for patients with AIS.
Biointervention makes leather processing greener: an integrated cleansing and tanning system.
Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari
2003-06-01
The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing, including a great deal of solid waste such as lime and chrome sludge. In the approach described here, hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely, dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, opening up of fiber bundles, and penetration and distribution of chromium are comparable to those produced by traditional methods. This has been substantiated through scanning electron microscopy, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical testing and hand evaluation. Importantly, softness of the leathers is numerically proven to be comparable with that of the control. The process also demonstrates reductions in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, reducing the discharge by 90% compared to the conventional multistep processing. Furthermore, the process is demonstrated to be technoeconomically viable.
Combining real-time monitoring and knowledge-based analysis in MARVEL
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.
1993-01-01
Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.
Improved dynamic analysis method using load-dependent Ritz vectors
NASA Technical Reports Server (NTRS)
Escobedo-Torres, J.; Ricles, J. M.
1993-01-01
The dynamic analysis of large space structures is important in order to predict their behavior under operating conditions. Computer models of large space structures are characterized by having a large number of degrees of freedom, and the computational effort required to carry out the analysis is very large. Conventional methods of solution utilize a subset of the eigenvectors of the system, but for systems with many degrees of freedom, the solution of the eigenproblem is in many cases the most costly phase of the analysis. For this reason, alternate solution methods need to be considered. It is important that the method chosen for the analysis be efficient and that accurate results be obtainable. The load dependent Ritz vector method is presented as an alternative to the classical normal mode methods for obtaining dynamic responses of large space structures. A simplified model of a space station is used to compare results. Results show that the load dependent Ritz vector method predicts the dynamic response better than the classical normal mode method. Even though this alternate method is very promising, further studies are necessary to fully understand its attributes and limitations.
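For readers unfamiliar with the technique, the following is a compact sketch of how load-dependent Ritz vectors can be generated (the standard Wilson-Yuan-Dickens recurrence); the stiffness matrix K, mass matrix M, and spatial load vector f are assumed inputs, and a production structural code would use sparse factorizations rather than dense solves.

```python
import numpy as np

def load_dependent_ritz(K, M, f, n_vectors):
    """Generate M-orthonormal load-dependent Ritz vectors.

    First vector: static response to the spatial load distribution f.
    Later vectors: K^{-1} M times the previous vector, M-orthogonalized
    against all earlier vectors (Gram-Schmidt), then M-normalized.
    """
    n = K.shape[0]
    X = np.zeros((n, n_vectors))

    x = np.linalg.solve(K, f)                 # static solution for the load pattern
    X[:, 0] = x / np.sqrt(x @ M @ x)          # M-normalize

    for i in range(1, n_vectors):
        x = np.linalg.solve(K, M @ X[:, i - 1])
        for j in range(i):                    # M-orthogonalize against earlier vectors
            x -= (X[:, j] @ M @ x) * X[:, j]
        X[:, i] = x / np.sqrt(x @ M @ x)
    return X
```

The reduced matrices Xᵀ K X and Xᵀ M X can then replace the truncated eigenvector basis in the response calculation, which is why the method avoids the costly eigenproblem mentioned above.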
Efficient calibration for imperfect computer models
Tuo, Rui; Wu, C. F. Jeff
2015-12-01
Many computer models contain unknown parameters that need to be estimated using physical observations. Moreover, the standard calibration method based on Gaussian process models may lead to unreasonable estimates for imperfect computer models. In this work, we extend earlier work on this problem to calibration problems with stochastic physical data. We propose a novel method, called L2 calibration, and show its semiparametric efficiency. The conventional method of ordinary least squares is also studied; theoretical analysis shows that it is consistent but not efficient. Numerical examples show that the proposed method outperforms the existing ones.
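One way to read the L2 calibration idea is sketched below: a nonparametric estimate of the true process is formed from the physical data, and the calibration parameter is chosen to minimize the L2 distance between that estimate and the computer model output. The spline smoother, the grid, and the toy one-parameter model are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.optimize import minimize_scalar

def l2_calibrate(x_phys, y_phys, model, theta_bounds, n_grid=200):
    """Estimate theta by minimizing the (discretized) L2 distance between a
    nonparametric fit to the physical data and the computer model output."""
    y_hat = UnivariateSpline(x_phys, y_phys)   # nonparametric estimate of the truth
    grid = np.linspace(x_phys.min(), x_phys.max(), n_grid)

    def l2_distance(theta):
        # Mean squared discrepancy on a uniform grid, proportional to the
        # integrated squared L2 distance over the input domain.
        return np.mean((y_hat(grid) - model(grid, theta)) ** 2)

    return minimize_scalar(l2_distance, bounds=theta_bounds, method='bounded').x

# Toy usage with a hypothetical one-parameter computer model.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, 50)   # noisy physical observations
model = lambda x, theta: theta * np.sin(2 * np.pi * x)
print(l2_calibrate(x, y, model, theta_bounds=(0.0, 5.0)))
```

Contrast this with ordinary least squares, which minimizes the discrepancy at the observed points only and, as the abstract notes, is consistent but not efficient.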
Vina, Andres; Peters, Albert J.; Ji, Lei
2003-01-01
There is a global concern about the increase in atmospheric concentrations of greenhouse gases. One method being discussed to encourage greenhouse gas mitigation efforts is based on a trading system whereby carbon emitters can buy effective mitigation efforts from farmers implementing conservation tillage practices. These practices sequester carbon from the atmosphere, and such a trading system would require a low-cost and accurate method of verification. Remote sensing technology can offer such a verification technique. This paper focuses on the use of standard image processing procedures applied to a multispectral Ikonos image to determine whether it is possible to verify that farmers have complied with agreements to implement conservation tillage practices. A principal component analysis (PCA) was performed in order to isolate image variance in cropped fields. Analysis of variance (ANOVA) statistical procedures were used to evaluate the capability of each Ikonos band and each principal component to discriminate between conventional and conservation tillage practices. A logistic regression model was implemented on the principal component most effective in discriminating between conventional and conservation tillage, in order to produce a map of the probability of conventional tillage. The Ikonos imagery, in combination with ground-reference information, proved to be a useful tool for verification of conservation tillage practices.
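The processing chain described (PCA on the multispectral bands, then a logistic model on the most discriminative component) can be sketched with scikit-learn as below; the array shapes, and the use of the first component as the discriminative one, are illustrative assumptions rather than the paper's exact pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

def tillage_probability_map(bands, labels, full_scene):
    """Map the probability of conventional tillage across a scene.

    bands: (n_pixels, 4) Ikonos band values for ground-referenced field pixels
    labels: 1 = conventional tillage, 0 = conservation tillage
    full_scene: (height, width, 4) full multispectral image
    """
    pca = PCA(n_components=4)
    scores = pca.fit_transform(bands)

    # Assume the most discriminative component has been identified (e.g., via
    # ANOVA); here we illustratively take the first component.
    pc = scores[:, [0]]
    model = LogisticRegression().fit(pc, labels)

    # Apply to every pixel to produce the probability map.
    scene_pc = pca.transform(full_scene.reshape(-1, full_scene.shape[-1]))[:, [0]]
    return model.predict_proba(scene_pc)[:, 1].reshape(full_scene.shape[:-1])
```

The resulting array plays the role of the probability-of-conventional-tillage map described in the abstract, to be checked against ground-reference data.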
Simpson, Michael R.; Oltmann, Richard N.
1993-01-01
Discharge measurement of large rivers and estuaries is difficult, time consuming, and sometimes dangerous. Frequently, discharge measurements cannot be made in tide-affected rivers and estuaries using conventional discharge-measurement techniques because of dynamic discharge conditions. The acoustic Doppler discharge-measurement system (ADDMS) was developed by the U.S. Geological Survey using a vessel-mounted acoustic Doppler current profiler coupled with specialized computer software to measure horizontal water velocity at 1-meter vertical intervals in the water column. The system computes discharge from water- and vessel-velocity data supplied by the ADDMS using vector-algebra algorithms included in the discharge-measurement software. With this system, a discharge measurement can be obtained by engaging the computer software and traversing a river or estuary from bank to bank; discharge in parts of the river or estuarine cross sections that cannot be measured because of ADDMS depth limitations is estimated by the system. Comparisons of ADDMS-measured discharges with ultrasonic-velocity-meter-measured discharges, along with error-analysis data, have confirmed that discharges provided by the ADDMS are at least as accurate as those produced using conventional methods. In addition, the advantage of a much shorter measurement time (2 minutes using the ADDMS compared with 1 hour or longer using conventional methods) has enabled use of the ADDMS for several applications where conventional discharge methods could not have been used with the required accuracy because of dynamic discharge conditions.
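A minimal sketch of one common vector-algebra formulation of the moving-vessel discharge computation is given below: for each ensemble and depth bin, the vertical component of the cross product of water velocity and boat velocity contributes an increment of discharge. The variable layout and units are assumptions for illustration, not the ADDMS software's internals, and the unmeasured near-surface, near-bed, and edge zones that the system estimates are omitted here.

```python
import numpy as np

def transect_discharge(v_water, v_boat, bin_height, dt):
    """Measured-layer discharge from a moving-vessel ADCP transect.

    v_water: (n_ensembles, n_bins, 2) horizontal water velocity (east, north), m/s
    v_boat:  (n_ensembles, 2) horizontal vessel velocity (east, north), m/s
    bin_height: vertical size of each depth bin, m
    dt: time between successive ensembles, s
    Returns discharge in m^3/s through the surface swept by the transect.
    """
    # Vertical component of (v_water x v_boat): flow crossing the vessel track.
    cross = (v_water[..., 0] * v_boat[:, None, 1]
             - v_water[..., 1] * v_boat[:, None, 0])
    return np.sum(cross) * bin_height * dt
```

Because the boat track itself carries the integration path, no fixed tagline or metered cross section is needed, which is what makes the 2-minute bank-to-bank measurement quoted above possible.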
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janssens, Geert O., E-mail: g.janssens@rther.umcn.nl; Jansen, Marc H.; Lauwers, Selmer J.
2013-02-01
Purpose: Despite conventional radiation therapy, 54 Gy in single doses of 1.8 Gy (54/1.8 Gy) over 6 weeks, most children with diffuse intrinsic pontine glioma (DIPG) will die within 1 year after diagnosis. To reduce patient burden, we investigated the role of hypofractionation radiation therapy given over 3 to 4 weeks. A 1:1 matched-cohort analysis with conventional radiation therapy was performed to assess response and survival. Methods and Materials: Twenty-seven children, aged 3 to 14, were treated according to 1 of 2 hypofractionation regimens over 3 to 4 weeks (39/3 Gy, n=16 or 44.8/2.8 Gy, n=11). All patients had symptoms for ≤3 months, ≥2 signs of the neurologic triad (cranial nerve deficit, ataxia, long tract signs), and characteristic features of DIPG on magnetic resonance imaging. Twenty-seven patients fulfilling the same diagnostic criteria and receiving at least 50/1.8 to 2.0 Gy were eligible for the matched-cohort analysis. Results: With hypofractionation radiation therapy, the overall survival at 6, 9, and 12 months was 74%, 44%, and 22%, respectively. Progression-free survival at 3, 6, and 9 months was 77%, 43%, and 12%, respectively. Temporary discontinuation of steroids was observed in 21 of 27 (78%) patients. No significant difference in median overall survival (9.0 vs 9.4 months; P=.84) and time to progression (5.0 vs 7.6 months; P=.24) was observed between hypofractionation vs conventional radiation therapy, respectively. Conclusions: For patients with newly diagnosed DIPG, a hypofractionation regimen, given over 3 to 4 weeks, offers equal overall survival with less treatment burden compared with a conventional regimen of 6 weeks.
Comparison of Accuracy Between a Conventional and Two Digital Intraoral Impression Techniques.
Malik, Junaid; Rodriguez, Jose; Weisbloom, Michael; Petridis, Haralampos
To compare the accuracy (ie, precision and trueness) of full-arch impressions fabricated using either a conventional polyvinyl siloxane (PVS) material or one of two intraoral optical scanners. Full-arch impressions of a reference model were obtained using addition silicone impression material (Aquasil Ultra; Dentsply Caulk) and two optical scanners (Trios, 3Shape, and CEREC Omnicam, Sirona). Surface matching software (Geomagic Control, 3D Systems) was used to superimpose the scans within groups to determine the mean deviations in precision and trueness (μm) between the scans, which were calculated for each group and compared statistically using one-way analysis of variance with post hoc Bonferroni (trueness) and Games-Howell (precision) tests (IBM SPSS ver 24, IBM UK). Qualitative analysis was also carried out from three-dimensional maps of differences between scans. Means and standard deviations (SD) of deviations in precision for conventional, Trios, and Omnicam groups were 21.7 (± 5.4), 49.9 (± 18.3), and 36.5 (± 11.12) μm, respectively. Means and SDs for deviations in trueness were 24.3 (± 5.7), 87.1 (± 7.9), and 80.3 (± 12.1) μm, respectively. The conventional impression showed statistically significantly improved mean precision (P < .006) and mean trueness (P < .001) compared to both digital impression procedures. There were no statistically significant differences in precision (P = .153) or trueness (P = .757) between the digital impressions. The qualitative analysis revealed local deviations along the palatal surfaces of the molars and incisal edges of the anterior teeth of < 100 μm. Conventional full-arch PVS impressions exhibited improved mean accuracy compared to two direct optical scanners. No significant differences were found between the two digital impression methods.
Effect of different mixing methods on the bacterial microleakage of calcium-enriched mixture cement.
Shahi, Shahriar; Jeddi Khajeh, Soniya; Rahimi, Saeed; Yavari, Hamid R; Jafari, Farnaz; Samiei, Mohammad; Ghasemi, Negin; Milani, Amin S
2016-10-01
Calcium-enriched mixture (CEM) cement is used in the field of endodontics. It is similar to mineral trioxide aggregate in its main ingredients. The present study investigated the effect of different mixing methods on the bacterial microleakage of CEM cement. A total of 55 single-rooted human permanent teeth were decoronated so that 14-mm-long samples were obtained and obturated with AH26 sealer and gutta-percha using the lateral condensation technique. Three millimeters of the root end were cut off, and the samples were randomly divided into 3 groups of 15 each (3 mixing methods: amalgamator, ultrasonic and conventional) and 2 negative and positive control groups (each containing 5 samples). A BHI (brain-heart infusion) suspension containing Enterococcus faecalis was used for bacterial leakage assessment. Statistical analysis was carried out using descriptive statistics, Kaplan-Meier survival analysis with censored data and the log rank test. Statistical significance was set at P<0.05. The survival means for the conventional, amalgamator and ultrasonic methods were 62.13±12.44, 68.87±12.79 and 77.53±12.52 days, respectively. The log rank test showed no significant differences between the groups. Based on the results of the present study, it can be concluded that different mixing methods had no significant effect on the bacterial microleakage of CEM cement.
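The survival comparison reported here follows a standard Kaplan-Meier plus log-rank pattern; a hedged sketch using the lifelines package is given below. The durations, event flags, and group assignments are made-up toy data, not the study's measurements.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

# Hypothetical layout: one row per sample, with time-to-leakage in days,
# an event flag (1 = leakage observed, 0 = censored), and the mixing method.
df = pd.DataFrame({
    'days':   [62, 70, 80, 55, 90, 75, 68, 77, 85, 60],
    'leaked': [1, 1, 0, 1, 0, 1, 1, 0, 0, 1],
    'method': ['conventional', 'amalgamator', 'ultrasonic', 'conventional',
               'ultrasonic', 'amalgamator', 'conventional', 'ultrasonic',
               'amalgamator', 'conventional'],
})

# Kaplan-Meier survival estimate per mixing method.
km = KaplanMeierFitter()
for name, group in df.groupby('method'):
    km.fit(group['days'], group['leaked'], label=name)
    print(name, 'median survival:', km.median_survival_time_)

# Log-rank test across the three groups (censored samples handled automatically).
result = multivariate_logrank_test(df['days'], df['method'], df['leaked'])
print('log-rank p-value:', result.p_value)
```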
Pérez-Olmos, R; Rios, A; Fernández, J R; Lapa, R A; Lima, J L
2001-01-05
In this paper, the construction and evaluation of a nitrate-selective electrode with improved sensitivity is described; it is constructed like a conventional electrode (ISE) but uses an operational amplifier to sum the potentials supplied by four membranes (ESOA). The two types of electrodes, without an inner reference solution, were constructed using tetraoctylammonium bromide as the sensor, dibutylphthalate as the solvent mediator and PVC as the plastic matrix, with the membranes applied directly onto a conductive epoxy resin support. After comparative evaluation of their working characteristics, they were used in the determination of nitrate in different types of tobacco. The limit of detection of the direct potentiometric method developed was found to be 0.18 g kg(-1), and the precision and accuracy of the method, when applied to eight different samples of tobacco, expressed in terms of mean R.S.D. and average percentage of spike recovery, were 0.6 and 100.3%, respectively. The comparison of variances showed, on all occasions, that the results obtained by the ESOA were similar to those obtained by the conventional ISE, but with higher precision. Linear regression analysis showed good agreement (r=0.9994) between the results obtained by the developed potentiometric method and those of a spectrophotometric method based on brucine, adopted as the reference method, when applied simultaneously to 32 samples of different types of tobacco.
Seixas, Fábio Heredia; Estrela, Carlos; Bueno, Mike Reis; Sousa-Neto, Manoel Damião; Pécora, Jesus Djalma
2015-06-01
The aim of this study was to determine the root canal area before and after instrumentation 1 mm short of the apical foramen by clinical and cone beam computed tomography (CBCT) methods, and to evaluate the cleanliness of the apical region in mesiodistally flattened teeth using optical microscopy. Forty-two human single-canal mandibular incisors were instrumented using the Free Tip Preparation technique up to three, four or five instruments beyond the initial one. Cone beam computed tomography scans of the samples were acquired before and after root canal preparation (RCP). Irrigation was performed by conventional or hydrodynamic means, using 2.5% sodium hypochlorite. The samples were prepared for observation under an optical microscope. Images were digitally obtained and analyzed, and the results were submitted to statistical analysis (two-way ANOVA complemented by Bonferroni's post-test). There was no significant difference between the studied anatomical areas with either the CBCT or the clinical method, and no differences between irrigation methods. Differences were verified between instrumentation techniques: instrumentation with four instruments beyond the initial one produced a significant increase in the contact area when compared to preparation with three instruments, but RCP with five instruments did not result in better cleanliness. The CBCT analysis was not capable of determining the precise shape of the surgical apical area compared to the clinical method. Neither the conventional nor the hydrodynamic irrigation technique was able to render the root canals debris-free. The action of the instruments on the root canal walls was proportional to the number of instruments used beyond the initial apical instrument.
Franco, Érika Mendonça Fernandes; Valarelli, Fabrício Pinelli; Fernandes, João Batista; Cançado, Rodrigo Hermont; de Freitas, Karina Maria Salvatore
2015-01-01
Abstract Objective: The aim of this study was to compare torque expression in active and passive self-ligating and conventional brackets. Methods: A total of 300 segments of stainless steel wire 0.019 x 0.025-in and six different brands of brackets (Damon 3MX, Portia, In-Ovation R, Bioquick, Roth SLI and Roth Max) were used. Torque moments were measured at 12°, 24°, 36° and 48°, using a wire torsion device associated with a universal testing machine. The data obtained were compared by analysis of variance followed by Tukey test for multiple comparisons. Regression analysis was performed by the least-squares method to generate the mathematical equation of the optimal curve for each brand of bracket. Results: Statistically significant differences were observed in the expression of torque among all evaluated bracket brands in all evaluated torsions (p < 0.05). It was found that Bioquick presented the lowest torque expression in all tested torsions; in contrast, Damon 3MX bracket presented the highest torque expression up to 36° torsion. Conclusions: The connection system between wire/bracket (active, passive self-ligating or conventional with elastic ligature) seems not to interfere in the final torque expression, the latter being probably dependent on the interaction between the wire and the bracket chosen for orthodontic mechanics. PMID:26691972
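The least-squares curve fitting mentioned above, generating a torque-expression equation per bracket brand, can be sketched with numpy's polynomial fit; the quadratic degree and the torque values below are illustrative assumptions, not the study's data.

```python
import numpy as np

# Torsion angles tested (degrees) and hypothetical measured torque moments
# (N.mm) for one bracket brand; real values come from the torsion device.
torsion = np.array([12.0, 24.0, 36.0, 48.0])
torque = np.array([2.1, 7.8, 15.4, 24.9])

# Least-squares fit of a quadratic torque-expression curve.
coeffs = np.polyfit(torsion, torque, deg=2)
curve = np.poly1d(coeffs)

print('fitted equation:\n', curve)           # a*x^2 + b*x + c
print('predicted torque at 30 deg:', curve(30.0))
```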
Momose, Haruka; Mizukami, Takuo; Kuramitsu, Madoka; Takizawa, Kazuya; Masumi, Atsuko; Araki, Kumiko; Furuhata, Keiko; Yamaguchi, Kazunari; Hamaguchi, Isao
2015-01-01
We have previously identified 17 biomarker genes which were upregulated by whole virion influenza vaccines, and reported that gene expression profiles of these biomarker genes had a good correlation with conventional animal safety tests checking body weight and leukocyte counts. In this study, we have shown that conventional animal tests showed varied and no dose-dependent results in serially diluted bulk materials of influenza HA vaccines. In contrast, dose dependency was clearly shown in the expression profiles of biomarker genes, demonstrating higher sensitivity of gene expression analysis than the current animal safety tests of influenza vaccines. The introduction of branched DNA based-concurrent expression analysis could simplify the complexity of multiple gene expression approach, and could shorten the test period from 7 days to 3 days. Furthermore, upregulation of 10 genes, Zbp1, Mx2, Irf7, Lgals9, Ifi47, Tapbp, Timp1, Trafd1, Psmb9, and Tap2, was seen upon virosomal-adjuvanted vaccine treatment, indicating that these biomarkers could be useful for the safety control of virosomal-adjuvanted vaccines. In summary, profiling biomarker gene expression could be a useful, rapid, and highly sensitive method of animal safety testing compared with conventional methods, and could be used to evaluate the safety of various types of influenza vaccines, including adjuvanted vaccine. PMID:25909814
A dose-response model for the conventional phototherapy of the newborn.
Osaku, Nelson Ossamu; Lopes, Heitor Silvério
2006-06-01
Jaundice of the newborn is a common problem arising from the rapid increment of blood bilirubin in the first days of life. In most cases, it is considered a transient physiological situation, but unmanaged hyperbilirubinemia can lead to death or serious injuries for the survivors. For decades, phototherapy has been used as the main method for prevention and treatment of hyperbilirubinemia of the newborn. This work aims at finding a predictive model for the decrement of blood bilirubin for patients submitted to conventional phototherapy. Data from the phototherapy of 90 term newborns were collected and used in a multiple regression method. A rigorous statistical analysis was done in order to guarantee a correct and valid model. The obtained model was able to explain 78% of the variation of the dependent variable. We show that it is possible to predict the total serum bilirubin of a patient under conventional phototherapy by knowing the birth weight, the bilirubin level at the beginning of treatment, and the radiant energy density (dose). Moreover, it is possible to infer the time necessary for a given decrement of bilirubin under approximately constant irradiance. Statistical analysis of the obtained model shows that it is valid for several ranges of birth weight, initial bilirubin level, and radiant energy density. It is expected that the proposed model can be useful in the clinical management of hyperbilirubinemia of the newborn.
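The predictive model described corresponds to an ordinary multiple regression on three predictors; a sketch with scikit-learn follows, with all variable values being made-up placeholders rather than the study's patient data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical predictors: birth weight (g), initial total serum bilirubin
# (mg/dL), and radiant energy density delivered (dose).
# Response: total serum bilirubin after conventional phototherapy.
X = np.array([
    [3100, 15.2, 12.0],
    [2800, 17.8, 18.5],
    [3500, 14.1, 10.2],
    [3050, 16.5, 15.0],
    [2600, 18.9, 20.3],
])
y = np.array([11.8, 12.5, 11.6, 12.0, 13.1])

model = LinearRegression().fit(X, y)
print('R^2:', model.score(X, y))     # proportion of variance explained
print('coefficients:', model.coef_)

# Predict post-phototherapy bilirubin for a new patient (hypothetical values).
print(model.predict([[3000, 16.0, 14.0]]))
```

With the dose coefficient in hand, the same equation can be inverted to estimate the dose, and hence the exposure time at constant irradiance, needed for a desired bilirubin decrement, which is the clinical use the abstract anticipates.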
Direct transesterification of fresh microalgal cells.
Liu, Jiao; Liu, Yanan; Wang, Haitao; Xue, Song
2015-01-01
Transesterification of lipids is a vital step in both biodiesel production and fatty acid analysis. By comparing the yields and fatty acid profiles obtained from microalgal oil and dry microalgal cells, the reliability of the method for the transesterification of micro-scale samples was tested. The minimum amount of microalgal cells needed for accurate analysis was found to be approximately 300 μg of dry cells. This direct transesterification method for fresh cells was applied to eight microalgal species, and the results indicate that the efficiency of the developed method is identical to that of the conventional method, except for Spirulina, whose lipid content is very low, which means the total lipid content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
A method based on coffee-ring deposition and confocal Raman spectroscopy for analysis of melamine in milk
NASA Astrophysics Data System (ADS)
Tan, Zong; Chen, Da
2016-10-01
In this work, an economical and high-efficiency method for the detection of melamine in milk was developed. The enrichment effect of the coffee ring was combined with the micro-region analysis of confocal Raman spectroscopy, assisted by a chemometric algorithm. The resulting limit of detection (LOD) for melamine was 1 ppm, a notable result given that the sensitivity of conventional Raman detection is generally low. Furthermore, the whole procedure was performed under readily available conditions with almost no consumption of chemical reagents, and the substrates chosen for the formation of the coffee ring were reusable. Thus, the method is environmentally friendly and has great potential for application in food safety inspection.
ERIC Educational Resources Information Center
Eshleman, Winston Hull
Compared were programed materials and conventional methods for teaching two units of eighth grade science. Programed materials used were linear programed books requiring constructed responses. The conventional methods included textbook study, written exercises, lectures, discussions, demonstrations, experiments, chalkboard drawings, films,…
Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals
NASA Astrophysics Data System (ADS)
Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam
A power analysis attack is a well-known side-channel attack, but the efficiency of the attack is frequently degraded by power components irrelevant to the encryption that are included in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts from the measured power signals. Experimental results show that attacks with the preprocessed signals detect correct keys with far fewer signals, compared to conventional power analysis attacks.
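One simple reading of such preprocessing, sketched under stated assumptions below, is to locate the encryption-related segment of each trace by cross-correlation with a reference pattern and keep only that window before running the attack; the function and its parameters are illustrative, not the authors' algorithm.

```python
import numpy as np

def extract_encryption_window(traces, reference, window):
    """Keep only the encryption-related part of each power trace.

    traces: (n_traces, n_samples) measured power signals
    reference: 1-D pattern known to occur during encryption
    window: samples to keep from the located position (window <= len(reference))
    """
    trimmed = np.empty((traces.shape[0], window))
    for i, trace in enumerate(traces):
        # Cross-correlate (mean-removed) to find where the pattern starts.
        corr = np.correlate(trace - trace.mean(),
                            reference - reference.mean(), mode='valid')
        start = int(np.argmax(corr))
        trimmed[i] = trace[start:start + window]
    return trimmed
```

Discarding samples unrelated to the encryption both aligns the traces and raises the signal-to-noise ratio of the key-dependent leakage, which is consistent with the reported reduction in the number of traces needed.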
Polychromatic spectral pattern analysis of ultra-weak photon emissions from a human body.
Kobayashi, Masaki; Iwasa, Torai; Tada, Mika
2016-06-01
Ultra-weak photon emission (UPE), often designated as biophoton emission, is generally observed in a wide range of living organisms, including human beings. This phenomenon is closely associated with reactive oxygen species (ROS) generated during normal metabolic processes and in pathological states induced by oxidative stress. Application of UPE to extract pathophysiological information has long been anticipated because of its potential non-invasiveness, facilitating its diagnostic use. Nevertheless, its weak intensity and the complexity of the UPE mechanism hinder its use in practical applications. Spectroscopy is crucially important for UPE analysis. However, the filter-type spectroscopy technique, used as a conventional method for UPE analysis, intrinsically limits performance because of its monochromatic scheme. To overcome the shortcomings of conventional methods, the authors developed a polychromatic spectroscopy system for UPE spectral pattern analysis. It is based on a highly efficient lens system and a transmission-type diffraction grating with a highly sensitive, cooled, charge-coupled-device (CCD) camera. Spectral pattern analysis of the human body was performed on a fingertip using the developed system. The UPE spectrum covers the spectral range of 450-750 nm, with a dominant emission region of 570-670 nm. The primary peak is located in the 600-650 nm region. Furthermore, an application to UPE source exploration was demonstrated with the chemiluminescence spectrum of melanin coexisting with oxidized linoleic acid. Copyright © 2016 Elsevier B.V. All rights reserved.
Qiu, Jin; Cheng, Jiajing; Wang, Qingying; Hua, Jie
2014-01-01
Background The aim of this study was to compare the effects of the levonorgestrel-releasing intrauterine system (LNG-IUS) with conventional medical treatment in reducing heavy menstrual bleeding. Material/Methods Relevant studies were identified by a search of MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, and clinical trials registries (from inception to April 2014). Randomized controlled trials comparing the LNG-IUS with conventional medical treatment (mefenamic acid, tranexamic acid, norethindrone, medroxyprogesterone acetate injection, or combined oral contraceptive pills) in patients with menorrhagia were included. Results Eight randomized controlled trials that included 1170 women (LNG-IUS, n=562; conventional medical treatment, n=608) met inclusion criteria. The LNG-IUS was superior to conventional medical treatment in reducing menstrual blood loss (as measured by the alkaline hematin method or estimated by pictorial bleeding assessment chart scores). More women were satisfied with the LNG-IUS than with the use of conventional medical treatment (odds ratio [OR] 5.19, 95% confidence interval [CI] 2.73–9.86). Compared with conventional medical treatment, the LNG-IUS was associated with a lower rate of discontinuation (14.6% vs. 28.9%, OR 0.39, 95% CI 0.20–0.74) and fewer treatment failures (9.2% vs. 31.0%, OR 0.18, 95% CI 0.10–0.34). Furthermore, quality of life assessment favored LNG-IUS over conventional medical treatment, although use of various measurements limited our ability to pool the data for more powerful evidence. Serious adverse events were statistically comparable between treatments. Conclusions The LNG-IUS was the more effective first choice for management of menorrhagia compared with conventional medical treatment. Long-term, randomized trials are required to further investigate patient-based outcomes and evaluate the cost-effectiveness of the LNG-IUS and other medical treatments. PMID:25245843
Method and apparatus for chromatographic quantitative analysis
Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella
1981-06-09
An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.
Woynaroski, Tiffany; Oller, D Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul
2017-03-01
Theory and research suggest that vocal development predicts "useful speech" in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently "in development" and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. Autism Res 2017, 10: 508-519. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.
Risk analysis for veterinary biologicals released into the environment.
Silva, S V; Samagh, B S; Morley, R S
1995-12-01
All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
Cost-utility analysis on telemonitoring of users with pacemakers: The PONIENTE study.
Lopez-Villegas, Antonio; Catalan-Matamoros, Daniel; Robles-Musso, Emilio; Bautista-Mesa, Rafael; Peiro, Salvador
2018-01-01
Introduction Few studies have confirmed the cost savings of telemonitoring of users with pacemakers (PMs). The purpose of this controlled, non-randomised, non-masked clinical trial was to perform an economic assessment of telemonitoring (TM) of users with PMs and check whether TM offers a cost-utility alternative to conventional follow-up in hospital. Methods Eighty-two patients implanted with an internet-based transmission PM were selected to receive either conventional follow-up in hospital (n=52) or TM (n=30) from their homes. The data were collected during 12 months while patients were being monitored. The economic assessment of the PONIENTE study was performed from the perspectives of the National Health Service (NHS) and patients. A cost-utility analysis was conducted to measure whether the TM of patients with PMs is cost-effective in terms of costs per gained quality-adjusted life year (QALY). Results There was a significant cost saving for participants in the TM group in comparison with participants in the conventional follow-up group. From the NHS's perspective, the patients in the TM group gained 0.09 QALYs more than the patients in the conventional follow-up group over 12 months, with a cost saving of 57.64% (€46.51 versus €109.79, respectively; p < 0.001) per participant per year. In-office visits were reduced by 52.49% in the TM group. The costs related to the patient perspective were lower in the TM group than in the conventional follow-up group (€31.82 versus €73.48, respectively; p < 0.005). The costs per QALY were 61.68% higher in the in-office monitoring group. Discussion The cost-utility analysis performed in the PONIENTE study showed that the TM of users with PMs appears to be a cost-effective alternative to conventional follow-up in hospital.
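The cost-utility comparison reduces to simple cost-per-QALY arithmetic; a minimal sketch using the per-patient NHS figures quoted above is given below (the variable names are ours, the numbers are the abstract's).

```python
# Per-patient annual NHS costs (EUR) and incremental QALYs from the figures above.
cost_tm, cost_conventional = 46.51, 109.79
qaly_gain_tm = 0.09  # extra QALYs gained by the telemonitoring group

saving = cost_conventional - cost_tm
print(f'Annual saving per patient: EUR {saving:.2f} '
      f'({saving / cost_conventional:.1%} of conventional cost)')  # ~57.6%

# Telemonitoring is both cheaper and more effective here (it "dominates"),
# so the incremental cost-effectiveness ratio comes out negative.
icer = (cost_tm - cost_conventional) / qaly_gain_tm
print(f'ICER: EUR {icer:.2f} per QALY gained')
```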
Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J
2017-11-24
Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse separate control hospital SSI data. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
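A hedged sketch of the two chart types applied to monthly SSI counts follows; the baseline calculation, the smoothing factor, the control-limit multipliers, and the data layout are generic textbook choices, not the study's calibrated parameters.

```python
import numpy as np

def shewhart_u_chart(counts, procedures):
    """Shewhart u-chart for monthly SSI rates (infections per procedure)."""
    u = counts / procedures
    u_bar = counts.sum() / procedures.sum()          # baseline rate
    ucl = u_bar + 3 * np.sqrt(u_bar / procedures)    # 3-sigma upper limit per month
    return u > ucl                                   # True where the chart signals

def ewma_chart(counts, procedures, lam=0.2, L=2.7):
    """EWMA chart on monthly SSI rates with smoothing factor lam."""
    u = counts / procedures
    u_bar = counts.sum() / procedures.sum()
    z = u_bar                                        # start the EWMA at the baseline
    signals = np.zeros(len(u), dtype=bool)
    for i, x in enumerate(u):
        z = lam * x + (1 - lam) * z                  # exponentially weighted average
        sigma_z = np.sqrt(u_bar / procedures[i]) * np.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * (i + 1))))
        signals[i] = z > u_bar + L * sigma_z
    return signals
```

The Shewhart chart reacts to single large excursions, while the EWMA accumulates small sustained shifts, which is why the two charts detected the outbreaks at different lead times in the study.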
Yang, Li; Wang, Guobao; Qi, Jinyi
2016-04-01
Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
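For reference, the Patlak analysis step in the indirect route is a straight-line fit per voxel; a sketch under stated assumptions (late-frame linearity, a known plasma input function, trapezoidal integration) is given below.

```python
import numpy as np

def patlak_fit(tac, cp, t, t_star_idx):
    """Estimate the Patlak slope (Ki) and intercept (V) for one voxel TAC.

    Patlak model: C(t)/Cp(t) = Ki * (integral of Cp up to t)/Cp(t) + V,
    valid for frames after equilibration (index t_star_idx onward).

    tac: tissue time-activity curve; cp: plasma input; t: frame mid-times.
    """
    # Running integral of the plasma input (trapezoidal rule).
    int_cp = np.concatenate(([0.0],
                             np.cumsum((cp[1:] + cp[:-1]) / 2 * np.diff(t))))
    x = (int_cp / cp)[t_star_idx:]      # "Patlak time"
    y = (tac / cp)[t_star_idx:]
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept             # Ki, V
```

In the direct approach this linear model is folded into the reconstruction itself rather than applied to reconstructed TACs, which is the distinction the abstract draws.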
Size and shape measurement in contemporary cephalometrics.
McIntyre, Grant T; Mossey, Peter A
2003-06-01
The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.
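Of the geometric morphometric tools listed, Procrustes superimposition is the most directly available in common scientific software; a sketch comparing two cephalometric landmark configurations with SciPy is shown below. The landmark names and coordinates are hypothetical, for illustration only.

```python
import numpy as np
from scipy.spatial import procrustes

# Two hypothetical cephalometric landmark configurations, (n_landmarks, 2),
# e.g., sella, nasion, A-point, B-point, menton in digitized coordinates.
patient = np.array([[0.0, 0.0], [92.0, 31.0], [85.0, -42.0],
                    [80.0, -71.0], [74.0, -95.0]])
reference = np.array([[0.0, 0.0], [90.0, 30.0], [84.0, -40.0],
                      [78.0, -70.0], [73.0, -93.0]])

# Procrustes superimposition removes location, scale, and rotation,
# leaving a pure shape difference (the "disparity").
ref_std, pat_std, disparity = procrustes(reference, patient)
print('shape disparity:', disparity)
```

Because the disparity is size-free, it captures exactly the shape information that the linear and angular measurements of CCA cannot, which is the clinical distinction this article draws between the two families of methods.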