Sample records for conventional methods require

  1. Some Findings Concerning Requirements in Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases within the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this area is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.

  2. A Comparison of Programed Instruction with Conventional Methods for Teaching Two Units of Eighth Grade Science.

    ERIC Educational Resources Information Center

    Eshleman, Winston Hull

    Compared were programed materials and conventional methods for teaching two units of eighth grade science. Programed materials used were linear programed books requiring constructed responses. The conventional methods included textbook study, written exercises, lectures, discussions, demonstrations, experiments, chalkboard drawings, films,…

  3. Emergent surgical airway: comparison of the three-step method and conventional cricothyroidotomy utilizing high-fidelity simulation.

    PubMed

    Quick, Jacob A; MacIntyre, Allan D; Barnes, Stephen L

    2014-02-01

    Surgical airway creation has a high potential for disaster. Conventional methods can be cumbersome and require special instruments. A simple method utilizing three steps and readily available equipment exists, but has yet to be adequately tested. Our objective was to compare conventional cricothyroidotomy with the three-step method utilizing high-fidelity simulation. Utilizing a high-fidelity simulator, 12 experienced flight nurses and paramedics performed both methods after a didactic lecture, simulator briefing, and demonstration of each technique. Six participants performed the three-step method first, and the remaining 6 performed the conventional method first. Each participant was filmed and timed. We analyzed videos with respect to the number of hand repositions, number of airway instrumentations, and technical complications. Times to successful completion were measured from incision to balloon inflation. The three-step method was completed faster (52.1 s vs. 87.3 s; p = 0.007) as compared with conventional surgical cricothyroidotomy. The two methods did not differ statistically regarding number of hand movements (3.75 vs. 5.25; p = 0.12) or instrumentations of the airway (1.08 vs. 1.33; p = 0.07). The three-step method resulted in 100% successful airway placement on the first attempt, compared with 75% of the conventional method (p = 0.11). Technical complications occurred more with the conventional method (33% vs. 0%; p = 0.05). The three-step method, using an elastic bougie with an endotracheal tube, was shown to require fewer total hand movements, took less time to complete, resulted in more successful airway placement, and had fewer complications compared with traditional cricothyroidotomy. Published by Elsevier Inc.

  4. The Sine Method: An Alternative Height Measurement Technique

    Treesearch

    Don C. Bragg; Lee E. Frelich; Robert T. Leverett; Will Blozan; Dale J. Luthringer

    2011-01-01

    Height is one of the most important dimensions of trees, but few observers are fully aware of the consequences of the misapplication of conventional height measurement techniques. A new approach, the sine method, can improve height measurement by being less sensitive to the requirements of conventional techniques (similar triangles and the tangent method). We studied...

  5. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  6. A simple, less invasive stripper micropipetter-based technique for day 3 embryo biopsy.

    PubMed

    Cedillo, Luciano; Ocampo-Bárcenas, Azucena; Maldonado, Israel; Valdez-Morales, Francisco J; Camargo, Felipe; López-Bayghen, Esther

    2016-01-01

    Preimplantation genetic screening (PGS) is an important procedure for in vitro fertilization (IVF). A key step of PGS, blastomere removal, is beset by many technical issues. The aim of this study was to compare a simpler procedure based on the Stripper Micropipetter, named S-biopsy, to the conventional aspiration method. On Day 3, 368 high-quality embryos (>7 cells on Day 3 with <10% fragmentation) were collected from 38 women. For each patient, the embryos were divided equally between the conventional method (n = 188) and the S-biopsy method (n = 180). The conventional method was performed using a standardized protocol. For the S-biopsy method, a laser was used to remove a significantly smaller portion of the zona pellucida. Afterwards, the complete embryo was aspirated with a Stripper Micropipetter, forcing the removal of the blastomere. Selected blastomeres underwent PGS using CGH microarrays. Embryo integrity and blastocyst formation were assessed on Day 5. Differences between groups were assessed with either the Mann-Whitney test or the Fisher exact test. Both methods resulted in the removal of only one blastomere. The S-biopsy and conventional methods did not differ in terms of embryo integrity (95.0% vs. 95.7%) or blastocyst formation (72.7% vs. 70.7%). PGS analysis indicated that aneuploidy rates were similar between the two methods (63.1% vs. 65.2%). However, the time required to perform the S-biopsy method (179.2 ± 17.5 s) was significantly shorter (5-fold) than for the conventional method. The S-biopsy method is comparable to the conventional method used to remove a blastomere for PGS, but requires less time. Furthermore, due to its simplicity, the S-biopsy technique is better suited to IVF laboratories.

  7. Self-calibration method without joint iteration for distributed small satellite SAR systems

    NASA Astrophysics Data System (ADS)

    Xu, Qing; Liao, Guisheng; Liu, Aifei; Zhang, Juan

    2013-12-01

    The performance of distributed small satellite synthetic aperture radar systems degrades significantly due to the unavoidable array errors, including gain, phase, and position errors, in real operating scenarios. In the conventional method proposed in (IEEE T Aero. Elec. Sys. 42:436-451, 2006), the spectrum components within one Doppler bin are considered as calibration sources. However, it is found in this article that the gain error estimation and the position error estimation in the conventional method can interact with each other. The conventional method may converge to suboptimal solutions in the presence of large position errors since it requires joint iteration between gain-phase error estimation and position error estimation. In addition, it is also found that phase errors can be estimated well regardless of position errors when the zero Doppler bin is chosen. In this article, we propose a method obtained by modifying the conventional one, based on these two observations. In this modified method, gain errors are first estimated and compensated, which eliminates the interaction between gain error estimation and position error estimation. Then, by using the zero Doppler bin data, the phase error estimation can be performed well independently of position errors. Finally, position errors are estimated based on the Taylor-series expansion. Meanwhile, the joint iteration between gain-phase error estimation and position error estimation is not required. Therefore, the problem of suboptimal convergence, which occurs in the conventional method, can be avoided at low computational cost. The modified method has the merits of faster convergence and lower estimation error compared to the conventional one. Theoretical analysis and computer simulation results verify the effectiveness of the modified method.

  8. K-space data processing for magnetic resonance elastography (MRE).

    PubMed

    Corbin, Nadège; Breton, Elodie; de Mathelin, Michel; Vappou, Jonathan

    2017-04-01

    Magnetic resonance elastography (MRE) requires substantial data processing based on phase image reconstruction, wave enhancement, and inverse problem solving. The objective of this study is to propose a new, fast MRE method based on MR raw data processing, particularly adapted to applications requiring fast MRE measurement or high elastogram update rate. The proposed method allows measuring tissue elasticity directly from raw data without prior phase image reconstruction and without phase unwrapping. Experimental feasibility is assessed both in a gelatin phantom and in the liver of a porcine model in vivo. Elastograms are reconstructed with the raw MRE method and compared to those obtained using conventional MRE. In a third experiment, changes in elasticity are monitored in real-time in a gelatin phantom during its solidification by using both conventional MRE and raw MRE. The raw MRE method shows promising results by providing similar elasticity values to the ones obtained with conventional MRE methods while decreasing the number of processing steps and circumventing the delicate step of phase unwrapping. Limitations of the proposed method are the influence of the magnitude on the elastogram and the requirement for a minimum number of phase offsets. This study demonstrates the feasibility of directly reconstructing elastograms from raw data.

  9. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  10. Confocal laser induced fluorescence with comparable spatial localization to the conventional method

    NASA Astrophysics Data System (ADS)

    Thompson, Derek S.; Henriquez, Miguel F.; Scime, Earl E.; Good, Timothy N.

    2017-10-01

    We present measurements of ion velocity distributions obtained by laser induced fluorescence (LIF) using a single viewport in an argon plasma. A patent pending design, which we refer to as the confocal fluorescence telescope, combines large objective lenses with a large central obscuration and a spatial filter to achieve high spatial localization along the laser injection direction. Models of the injection and collection optics of the two assemblies are used to provide a theoretical estimate of the spatial localization of the confocal arrangement, which is taken to be the full width at half maximum of the spatial optical response. The new design achieves approximately 1.4 mm localization at a focal length of 148.7 mm, improving on previously published designs by an order of magnitude and approaching the localization achieved by the conventional method. The confocal method, however, does so without requiring a pair of separated, perpendicular optical paths. The confocal technique therefore eases the two window access requirement of the conventional method, extending the application of LIF to experiments where conventional LIF measurements have been impossible or difficult, or where multiple viewports are scarce.

  11. Genetic improvement of olive (Olea europaea L.) by conventional and in vitro biotechnology methods.

    PubMed

    Rugini, E; Cristofori, V; Silvestri, C

    2016-01-01

    In olive (Olea europaea L.), traditional methods of genetic improvement have up to now produced limited results. Intensification of olive growing requires appropriate new cultivars for fully mechanized groves, but among the large number of traditional varieties very few are suitable. High-density and super high-density hedge row orchards require genotypes with reduced size, reduced apical dominance, a semi-erect growth habit, easy propagation, resistance to abiotic and biotic stresses, and reliably high productivity and quality of both fruits and oil. Innovative strategies supported by molecular and biotechnological techniques are required to speed up novel hybridisation methods. Among traditional approaches the Gene Pool Method seems a reasonable option, but it requires the availability of widely diverse germplasm from both cultivated and wild genotypes, supported by a detailed knowledge of their genetic relationships. The practice of "gene therapy" for the most important existing cultivars, combined with conventional methods, could accelerate achievement of the main goals, but efforts to overcome some technical and ideological obstacles are needed. The present review describes the benefits that olive and its products may obtain from genetic improvement using state-of-the-art conventional and unconventional methods, and includes progress made in the field of in vitro techniques. The uses of both traditional and modern technologies are discussed with recommendations. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. A new analytical method for characterizing nonlinear visual processes with stimuli of arbitrary distribution: Theory and applications.

    PubMed

    Hayashi, Ryusuke; Watanabe, Osamu; Yokoyama, Hiroki; Nishida, Shin'ya

    2017-06-01

    Characterization of the functional relationship between sensory inputs and neuronal or observers' perceptual responses is one of the fundamental goals of systems neuroscience and psychophysics. Conventional methods, such as reverse correlation and spike-triggered data analyses are limited in their ability to resolve complex and inherently nonlinear neuronal/perceptual processes because these methods require input stimuli to be Gaussian with a zero mean. Recent studies have shown that analyses based on a generalized linear model (GLM) do not require such specific input characteristics and have advantages over conventional methods. GLM, however, relies on iterative optimization algorithms and its calculation costs become very expensive when estimating the nonlinear parameters of a large-scale system using large volumes of data. In this paper, we introduce a new analytical method for identifying a nonlinear system without relying on iterative calculations and yet also not requiring any specific stimulus distribution. We demonstrate the results of numerical simulations, showing that our noniterative method is as accurate as GLM in estimating nonlinear parameters in many cases and outperforms conventional, spike-triggered data analyses. As an example of the application of our method to actual psychophysical data, we investigated how different spatiotemporal frequency channels interact in assessments of motion direction. The nonlinear interaction estimated by our method was consistent with findings from previous vision studies and supports the validity of our method for nonlinear system identification.
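
    To illustrate the iterative GLM baseline that this abstract contrasts with its noniterative estimator, the sketch below fits a Poisson GLM with an exponential link by numerical optimization on simulated data. The stimulus matrix, filter, and spike counts are hypothetical placeholders and this is not the paper's method or code.

    ```python
    # Minimal sketch, assuming a Poisson GLM with exponential link fit iteratively.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    T, D = 2000, 8
    X = rng.standard_normal((T, D))              # stimulus need not be Gaussian for a GLM
    w_true = 0.3 * rng.standard_normal(D)        # hypothetical "true" filter
    y = rng.poisson(np.exp(X @ w_true))          # simulated spike counts

    def neg_log_likelihood(w):
        # Poisson NLL up to an additive constant: sum(rate) - sum(y * log(rate))
        return np.sum(np.exp(X @ w)) - np.sum(y * (X @ w))

    res = minimize(neg_log_likelihood, np.zeros(D), method="L-BFGS-B")  # iterative fit
    print("filter estimation error:", np.linalg.norm(res.x - w_true))
    ```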

  13. Supercritical Fluid Technologies to Fabricate Proliposomes.

    PubMed

    Falconer, James R; Svirskis, Darren; Adil, Ali A; Wu, Zimei

    2015-01-01

    Proliposomes are stable drug carrier systems designed to form liposomes upon addition of an aqueous phase. In this review, current trends in the use of supercritical fluid (SCF) technologies to prepare proliposomes are discussed. SCF methods are used in pharmaceutical research and industry to address limitations associated with conventional methods of pro/liposome fabrication. The SCF solvent methods of proliposome preparation are eco-friendly (known as green technology) and, along with the SCF anti-solvent methods, could be advantageous over conventional methods; enabling better design of particle morphology (size and shape). The major hurdles of SCF methods include poor scalability to industrial manufacturing which may result in variable particle characteristics. In the case of SCF anti-solvent methods, another hurdle is the reliance on organic solvents. However, the amount of solvent required is typically less than that used by the conventional methods. Another hurdle is that most of the SCF methods used have complicated manufacturing processes, although once the setup has been completed, SCF technologies offer a single-step process in the preparation of proliposomes compared to the multiple steps required by many other methods. Furthermore, there is limited research into how proliposomes will be converted into liposomes for the end-user, and how such a product can be prepared reproducibly in terms of vesicle size and drug loading. These hurdles must be overcome and with more research, SCF methods, especially where the SCF acts as a solvent, have the potential to offer a strong alternative to the conventional methods to prepare proliposomes.

  14. Comparison of the lysis centrifugation method with the conventional blood culture method in cases of sepsis in a tertiary care hospital.

    PubMed

    Parikh, Harshal R; De, Anuradha S; Baveja, Sujata M

    2012-07-01

    Physicians and microbiologists have long recognized that the presence of living microorganisms in the blood of a patient carries with it considerable morbidity and mortality. Hence, blood cultures have become a critically important and frequently performed test in clinical microbiology laboratories for the diagnosis of sepsis. The aim was to compare the conventional blood culture method with the lysis centrifugation method in cases of sepsis. Two hundred nonduplicate blood cultures from cases of sepsis were analyzed concurrently using two blood culture methods for recovery of bacteria from patients diagnosed clinically with sepsis - the conventional blood culture method using trypticase soy broth and the lysis centrifugation method using saponin, centrifuging at 3000 g for 30 minutes. Overall, bacteria were recovered from 17.5% of the 200 blood cultures. The conventional blood culture method had a higher yield of organisms, especially Gram-positive cocci. The lysis centrifugation method was comparable with the former method with respect to Gram-negative bacilli. In this study, the sensitivity of the lysis centrifugation method in comparison to the conventional blood culture method was 49.75%, specificity was 98.21% and diagnostic accuracy was 89.5%. In almost every instance, the time required for detection of growth was shorter with the lysis centrifugation method, a difference that was statistically significant (P value 0.000). Contamination with lysis centrifugation was minimal, while that with the conventional method was high. For the diagnosis of sepsis, a combination of the lysis centrifugation method and the conventional blood culture method with trypticase soy broth or biphasic media is advisable, in order to achieve faster recovery and a better yield of microorganisms.
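
    For readers unfamiliar with the reported metrics, the sketch below shows how sensitivity, specificity, and diagnostic accuracy follow from a 2x2 comparison against the reference method. The counts are hypothetical placeholders, not reconstructed from the study's data.

    ```python
    # Minimal sketch of diagnostic metrics from a 2x2 table (hypothetical counts).
    def diagnostic_metrics(tp, fp, fn, tn):
        """Return (sensitivity, specificity, accuracy) as fractions."""
        sensitivity = tp / (tp + fn)                  # positives detected by the new method
        specificity = tn / (tn + fp)                  # negatives correctly excluded
        accuracy = (tp + tn) / (tp + fp + fn + tn)    # overall agreement
        return sensitivity, specificity, accuracy

    sens, spec, acc = diagnostic_metrics(tp=17, fp=2, fn=18, tn=163)  # hypothetical counts
    print(f"sensitivity={sens:.1%}, specificity={spec:.1%}, accuracy={acc:.1%}")
    ```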

  15. A NEW METHOD OF SWEAT TESTING: THE CF QUANTUM® SWEAT TEST

    PubMed Central

    Rock, Michael J.; Makholm, Linda; Eickhoff, Jens

    2015-01-01

    Background Conventional methods of sweat testing are time consuming and have many steps that can and do lead to errors. This study compares conventional sweat testing to a new quantitative method, the CF Quantum® (CFQT) sweat test. This study tests the diagnostic accuracy and analytic validity of the CFQT. Methods Previously diagnosed CF patients and patients who required a sweat test for clinical indications were invited to have the CFQT test performed. Both conventional sweat testing and the CFQT were performed bilaterally on the same day. Pairs of data from each test are plotted as a correlation graph and Bland-Altman plot. Sensitivity and specificity were calculated as well as the means and coefficient of variation by test and by extremity. After completing the study, subjects or their parents were asked for their preference of the CFQT and conventional sweat testing. Results The correlation coefficient between the CFQT and conventional sweat testing was 0.98 (95% confidence interval: 0.97–0.99). The sensitivity and specificity of the CFQT in diagnosing CF was 100% (95% confidence interval: 94–100%) and 96% (95% confidence interval: 89–99%), respectively. In one center in this three center multicenter study, there were higher sweat chloride values in patients with CF and also more tests that were invalid due to discrepant values between the two extremities. The percentage of invalid tests was higher in the CFQT method (16.5%) compared to conventional sweat testing (3.8%) (p < 0.001). In the post-test questionnaire, 88% of subjects/parents preferred the CFQT test. Conclusions The CFQT is a fast and simple method of quantitative sweat chloride determination. This technology requires further refinement to improve the analytic accuracy at higher sweat chloride values and to decrease the number of invalid tests. PMID:24862724
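
    The two agreement analyses named here, the correlation graph and the Bland-Altman plot, reduce to a few lines of arithmetic. The sketch below computes the correlation coefficient, bias, and 95% limits of agreement for made-up paired sweat chloride values; the numbers are placeholders, not the study's measurements.

    ```python
    # Minimal sketch of correlation and Bland-Altman agreement statistics.
    import numpy as np

    conventional = np.array([20.0, 35.0, 48.0, 60.0, 95.0, 110.0])  # mmol/L, hypothetical
    cfqt         = np.array([22.0, 33.0, 50.0, 63.0, 99.0, 104.0])  # mmol/L, hypothetical

    r = np.corrcoef(conventional, cfqt)[0, 1]       # correlation coefficient

    diff = cfqt - conventional
    bias = diff.mean()                              # mean difference (bias)
    half_width = 1.96 * diff.std(ddof=1)            # 95% limits of agreement
    print(f"r={r:.3f}, bias={bias:.1f}, LoA=({bias - half_width:.1f}, {bias + half_width:.1f}) mmol/L")
    ```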

  16. Job requirements compared to medical school education: differences between graduates from problem-based learning and conventional curricula

    PubMed Central

    2010-01-01

    Background Problem-based Learning (PBL) has been suggested as a key educational method of knowledge acquisition to improve medical education. We sought to evaluate the differences in medical school education between graduates from PBL-based and conventional curricula and to what extent these curricula fit job requirements. Methods Graduates from all German medical schools who graduated between 1996 and 2002 were eligible for this study. Graduates self-assessed nine competencies as required at their day-to-day work and as taught in medical school on a 6-point Likert scale. Results were compared between graduates from a PBL-based curriculum (University Witten/Herdecke) and conventional curricula. Results Three schools were excluded because of low response rates. Baseline demographics between graduates of the PBL-based curriculum (n = 101, 49% female) and the conventional curricula (n = 4720, 49% female) were similar. No major differences were observed regarding job requirements with priorities for "Independent learning/working" and "Practical medical skills". All competencies were rated to be better taught in PBL-based curriculum compared to the conventional curricula (all p < 0.001), except for "Medical knowledge" and "Research competence". Comparing competencies required at work and taught in medical school, PBL was associated with benefits in "Interdisciplinary thinking" (Δ + 0.88), "Independent learning/working" (Δ + 0.57), "Psycho-social competence" (Δ + 0.56), "Teamwork" (Δ + 0.39) and "Problem-solving skills" (Δ + 0.36), whereas "Research competence" (Δ - 1.23) and "Business competence" (Δ - 1.44) in the PBL-based curriculum needed improvement. Conclusion Among medical graduates in Germany, PBL demonstrated benefits with regard to competencies which were highly required in the job of physicians. Research and business competence deserve closer attention in future curricular development. PMID:20074350

  17. SPACE PROPULSION SYSTEM PHASED-MISSION PROBABILITY ANALYSIS USING CONVENTIONAL PRA METHODS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; James Knudsen

    As part of a series of papers on the topic of advanced probabilistic methods, a benchmark phased-mission problem has been suggested. This problem consists of modeling a space mission using an ion propulsion system, where the mission consists of seven mission phases. The mission requires that the propulsion operate for several phases, where the configuration changes as a function of phase. The ion propulsion system itself consists of five thruster assemblies and a single propellant supply, where each thruster assembly has one propulsion power unit and two ion engines. In this paper, we evaluate the probability of mission failure using the conventional methodology of event tree/fault tree analysis. The event tree and fault trees are developed and analyzed using Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE). While the benchmark problem is nominally a "dynamic" problem, in our analysis the mission phases are modeled in a single event tree to show the progression from one phase to the next. The propulsion system is modeled in fault trees to account for the operation, or in this case the failure, of the system. Specifically, the propulsion system is decomposed into each of the five thruster assemblies and fed into the appropriate N-out-of-M gate to evaluate mission failure. A separate fault tree for the propulsion system is developed to account for the different success criteria of each mission phase. Common-cause failure modeling is treated using traditional (i.e., parametric) methods. As part of this paper, we discuss the overall results in addition to the positive and negative aspects of modeling dynamic situations with non-dynamic modeling techniques. One insight from the use of this conventional method for analyzing the benchmark problem is that it requires significant manual manipulation of the fault trees and of how they are linked into the event tree. The conventional method also requires editing the resultant cut sets to obtain the correct results. While conventional methods may be used to evaluate a dynamic system like that in the benchmark, the level of effort required may preclude their use on real-world problems.
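
    As a rough illustration of the N-out-of-M success criteria described above, the sketch below evaluates a phased mission by multiplying per-phase k-out-of-5 probabilities under an independence assumption. The failure probabilities and per-phase requirements are hypothetical, and the simple multiplication deliberately ignores the cross-phase component dependencies that make the benchmark "dynamic" and that force cut-set editing in the conventional approach; it is not the SAPHIRE model from the report.

    ```python
    # Minimal sketch: k-out-of-5 thruster-assembly success, phase by phase.
    from math import comb

    def prob_k_of_n_working(n, k, p_fail):
        """P(at least k of n independent assemblies work), each failing with p_fail."""
        p_work = 1.0 - p_fail
        return sum(comb(n, m) * p_work**m * p_fail**(n - m) for m in range(k, n + 1))

    # Hypothetical per-phase inputs: (assemblies required, assembly failure prob in that phase)
    phases = [(3, 0.02), (2, 0.05), (3, 0.01), (2, 0.04)]

    p_mission_success = 1.0
    for k_required, p_fail in phases:
        p_mission_success *= prob_k_of_n_working(5, k_required, p_fail)

    print(f"P(mission failure) ≈ {1.0 - p_mission_success:.4f}")
    ```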

  18. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail in the study, in the hope of providing useful experience for other civil jet product designs.

  19. A comparison between conventional and LANDSAT based hydrologic modeling: The Four Mile Run case study

    NASA Technical Reports Server (NTRS)

    Ragan, R. M.; Jackson, T. J.; Fitch, W. N.; Shubinski, R. P.

    1976-01-01

    Models designed to support the hydrologic studies associated with urban water resources planning require input parameters that are defined in terms of land cover. Estimating the land cover is a difficult and expensive task when drainage areas larger than a few sq. km are involved. Conventional and LANDSAT based methods for estimating the land cover based input parameters required by hydrologic planning models were compared in a case study of the 50.5 sq. km (19.5 sq. mi) Four Mile Run Watershed in Virginia. Results of the study indicate that the LANDSAT based approach is highly cost effective for planning model studies. The conventional approach to define inputs was based on 1:3600 aerial photos, required 110 man-days and a total cost of $14,000. The LANDSAT based approach required 6.9 man-days and cost $2,350. The conventional and LANDSAT based models gave similar results relative to discharges and estimated annual damages expected from no flood control, channelization, and detention storage alternatives.

  20. OPTiM: Optical projection tomography integrated microscope using open-source hardware and software

    PubMed Central

    Andrews, Natalie; Davis, Samuel; Bugeon, Laurence; Dallman, Margaret D.; McGinty, James

    2017-01-01

    We describe the implementation of an OPT plate to perform optical projection tomography (OPT) on a commercial wide-field inverted microscope, using our open-source hardware and software. The OPT plate includes a tilt adjustment for alignment and a stepper motor for sample rotation as required by standard projection tomography. Depending on magnification requirements, three methods of performing OPT are detailed using this adaptor plate: a conventional direct OPT method requiring only the addition of a limiting aperture behind the objective lens; an external optical-relay method allowing conventional OPT to be performed at magnifications >4x; a remote focal scanning and region-of-interest method for improved spatial resolution OPT (up to ~1.6 μm). All three methods use the microscope’s existing incoherent light source (i.e. arc-lamp) and all of its inherent functionality is maintained for day-to-day use. OPT acquisitions are performed on in vivo zebrafish embryos to demonstrate the implementations’ viability. PMID:28700724

  1. The Misgav Ladach method: a step forward in the operative technique of caesarean section.

    PubMed

    Poonam; Banerjee, B; Singh, S N; Raina, A

    2006-01-01

    Caesarean delivery remains the most common intraperitoneal surgical procedure in obstetric and gynaecologic practice. Since time immemorial there have been countless efforts to improve the technique of caesarean section. One such innovative breakthrough technique is the Misgav Ladach method of caesarean section. The objective of this trial was to compare the intraoperative and short-term postoperative outcomes between the conventional and the Misgav Ladach technique for caesarean section. The randomized prospective comparative study was carried out in the department of Obstetrics and Gynaecology, B.P. Koirala Institute of Health Sciences, Dharan, Nepal. Four hundred patients were randomized to either the Misgav Ladach or the conventional method of caesarean section. Only term pregnancies with singleton foetuses were included, whereas pregnancies with previous caesarean section were excluded from the study. The study period was from September 2001 to September 2004. There was not much difference in the demographic variables between the two groups. The age of the patients ranged between 18-40 years. The mean age of patients in the Misgav Ladach and conventional groups was 24.5 and 23.6 years respectively. Foetal distress was the commonest indication for caesarean section, followed by non-progress of labour. The mean incision-to-delivery interval, operating time and blood loss in the Misgav Ladach group were 1 minute 30 seconds, 16 minutes and 350 ml, as compared to 3 minutes, 28 minutes and 600 ml in the conventional group respectively. 3.5% of patients in the Misgav Ladach group showed febrile morbidity as compared to 7% in the conventional group. 19% from the conventional group and only 4% from the Misgav Ladach group required added analgesia. An almost equal number of patients (10-12) in each group experienced significant headache. 0.1% in the Misgav group and 5% in the conventional group required postoperative blood transfusion. Four patients from the conventional group had wound gaping. The number of neonates requiring intensive care was sixteen (8%) in the conventional group and 3 (1.5%) in the Misgav group. 6.5% from the conventional group and 2% from the Misgav Ladach group required maternal intensive care admission. The Misgav Ladach technique has been shown to be associated with shorter operative time, quicker recovery, and a lesser need for postoperative medications when compared with traditional caesarean section. It has also been shown to be more cost-effective. A further advantage of the technique may be the shorter time taken for the delivery of the child.

  2. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  3. Pain score of patients undergoing single spot, short pulse laser versus conventional laser for diabetic retinopathy.

    PubMed

    Mirshahi, Ahmad; Lashay, Alireza; Roozbahani, Mehdi; Fard, Masoud Aghsaei; Molaie, Saber; Mireshghi, Meysam; Zaferani, Mohamad Mehdi

    2013-04-01

    To compare pain score of single spot short duration time (20 milliseconds) panretinal photocoagulation (PRP) with conventional (100 milliseconds) PRP in diabetic retinopathy. Sixty-six eyes from 33 patients with symmetrical severe non-proliferative diabetic retinopathy (non-PDR) or proliferative diabetic retinopathy (PDR) were enrolled in this prospective randomized controlled trial. One eye of each patient was randomized to undergo conventional and the other eye to undergo short time PRP. Spot size of 200 μm was used in both laser types, and energy was adjusted to achieve moderate burn on the retina. Patients were asked to mark the level of pain felt during the PRP session for each eye on the visual analog scale (VAS) and were examined at 1 week, and at 1, 2, 4 and 6 months. Sixteen women and 17 men with mean age 58.9 ± 7.8 years were evaluated. The conventional method required a mean power of 273 ± 107 mW, whereas the short duration method needed 721 ± 406 mW (P = 0.001). An average of 1,218 ± 441 spots were delivered with the conventional method and an average of 2,125 ± 503 spots were required with the short duration method (P = 0.001). Average pain score was 7.5 ± 1.14 in conventional group and 1.75 ± 0.87 in the short duration group (P = 0.001). At 1 week, 1 month, and 4 months following PRP, the mean changes of central macular thickness (CMT) from baseline in the conventional group remained 29.2 μm (P = 0.008), 40.0 μm (P = 0.001), and 40.2 μm (P = 0.007) greater than the changes in CMT for short time group. Patient acceptance of short time single spot PRP was high, and well-tolerated in a single session by all patients. Moreover, this method is significantly less painful than but just as effective as conventional laser during 6 months of follow-up. The CMT change was more following conventional laser than short time laser.

  4. Comparison between piezosurgery and conventional osteotomy in cranioplasty with fronto-orbital advancement.

    PubMed

    Martini, Markus; Röhrig, Andreas; Reich, Rudolf Hermann; Messing-Jünger, Martina

    2017-03-01

    Cranioplasty of patients with craniosynostosis requires rapid, precise and gentle osteotomy of the skull to avoid complications and benefit the healing process. The aim of this prospective clinical study was to compare two different methods of osteotomy. Piezosurgery and conventional osteotomy were compared using an oscillating saw and high speed drill while performing cranioplasties with fronto-orbital advancement. Thirty-four children who required cranioplasty with fronto-orbital advancement were recruited consecutively. The operations were conducted using piezosurgery or a conventional surgical technique, alternately. Operative time, blood count, CRP and transfusion rate, as well as soft tissue injuries, postoperative edema, pain development and secondary bone healing were investigated. The average age of patients was 9.7 months. The following indications for craniosynostosis were surgically corrected: trigonocephaly (23), anterior plagiocephaly (8), brachycephaly (1), and syndromic craniosynostosis (2). Piezosurgery was utilized in 18 cases. There were no group differences with regard to the incidence of soft tissue injuries (dura, periorbita), pain, swelling, blood loss or bony integration. The duration of osteotomy was significantly longer in the piezosurgery group, leading to slightly increased blood loss, while the postoperative CRP increase was higher using the conventional method. The piezosurgery method is a comparatively safe surgical method for conducting osteotomy during cranioplasty. With regard to soft tissue protection and postoperative clinical course, the same procedural precautions and controls are necessary as those needed for conventional methods. The osteotomy duration is considerably longer using piezosurgery, although it is accompanied by lower initial postoperative CRP values. Copyright © 2016 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  5. Novel strategy of endoscopic submucosal dissection using an insulation-tipped knife for early gastric cancer: near-side approach method

    PubMed Central

    Mori, Genki; Nonaka, Satoru; Oda, Ichiro; Abe, Seiichiro; Suzuki, Haruhisa; Yoshinaga, Shigetaka; Nakajima, Takeshi; Saito, Yutaka

    2015-01-01

    Background and study aims: Endoscopic submucosal dissection (ESD) using insulation-tipped knives (IT knives) to treat gastric lesions located on the greater curvature of the gastric body remains technically challenging because of the associated bleeding, control of which can be difficult and time consuming. To eliminate these difficulties, we developed a novel strategy which we have called the “near-side approach method” and assessed its utility. Patients and methods: We reviewed patients who underwent ESD for solitary early gastric cancer located on the greater curvature of the gastric body from January 2003 to September 2014. The technical results of ESD were compared between the group treated with the novel near-side approach method and the group treated with the conventional method. Results: This study included 238 patients with 238 lesions, 118 of which were removed using the near-side approach method and 120 of which were removed using the conventional method. The median procedure time was 92 minutes for the near-side approach method and 120 minutes for the conventional method. The procedure time was significantly shorter in the near-side approach method arm. Although the procedure time required by an experienced endoscopist was not significantly different between the two groups (100 vs. 110 minutes), the near-side approach group showed a significantly shorter procedure time for less-experienced endoscopists (90 vs. 120 minutes). Conclusions: The near-side approach method appears to require less time to complete gastric ESD than the conventional method using IT knives for technically challenging lesions located on the greater curvature of the gastric body, especially if the procedure is performed by less-experienced endoscopists. PMID:26528496

  6. Quantitative photoacoustic microscopy of optical absorption coefficients from acoustic spectra in the optical diffusive regime

    NASA Astrophysics Data System (ADS)

    Guo, Zijian; Favazza, Christopher; Garcia-Uribe, Alejandro; Wang, Lihong V.

    2012-06-01

    Photoacoustic (PA) microscopy (PAM) can image optical absorption contrast with ultrasonic spatial resolution in the optical diffusive regime. Conventionally, accurate quantification in PAM requires knowledge of the optical fluence attenuation, acoustic pressure attenuation, and detection bandwidth. We circumvent this requirement by quantifying the optical absorption coefficients from the acoustic spectra of PA signals acquired at multiple optical wavelengths. With the acoustic spectral method, the absorption coefficients of an oxygenated bovine blood phantom at 560, 565, 570, and 575 nm were quantified with errors of <3%. We also quantified the total hemoglobin concentration and hemoglobin oxygen saturation in a live mouse. Compared with the conventional amplitude method, the acoustic spectral method provides greater quantification accuracy in the optical diffusive regime. The limitations of the acoustic spectral method were also discussed.
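
    Once absorption coefficients at two or more wavelengths are available, total hemoglobin and oxygen saturation follow from linear spectral unmixing, as sketched below. The extinction coefficients and measured values here are illustrative placeholders, not calibrated constants, and this is not the paper's acoustic spectral fitting step itself.

    ```python
    # Minimal sketch: two-wavelength linear unmixing of oxy-/deoxy-hemoglobin.
    import numpy as np

    # Rows: wavelengths; columns: [HbO2, Hb] extinction coefficients (placeholder values/units)
    E = np.array([[0.9, 1.1],
                  [1.2, 0.8]])
    mu_a = np.array([1.00, 1.05])            # absorption coefficients at the two wavelengths (placeholder)

    c_hbo2, c_hb = np.linalg.solve(E, mu_a)  # concentrations of oxy- and deoxy-hemoglobin
    total_hb = c_hbo2 + c_hb
    so2 = c_hbo2 / total_hb                  # hemoglobin oxygen saturation
    print(f"total Hb = {total_hb:.3f}, sO2 = {so2:.1%}")
    ```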

  7. Quantitative photoacoustic microscopy of optical absorption coefficients from acoustic spectra in the optical diffusive regime

    PubMed Central

    Guo, Zijian; Favazza, Christopher; Garcia-Uribe, Alejandro

    2012-01-01

    Photoacoustic (PA) microscopy (PAM) can image optical absorption contrast with ultrasonic spatial resolution in the optical diffusive regime. Conventionally, accurate quantification in PAM requires knowledge of the optical fluence attenuation, acoustic pressure attenuation, and detection bandwidth. We circumvent this requirement by quantifying the optical absorption coefficients from the acoustic spectra of PA signals acquired at multiple optical wavelengths. With the acoustic spectral method, the absorption coefficients of an oxygenated bovine blood phantom at 560, 565, 570, and 575 nm were quantified with errors of <3%. We also quantified the total hemoglobin concentration and hemoglobin oxygen saturation in a live mouse. Compared with the conventional amplitude method, the acoustic spectral method provides greater quantification accuracy in the optical diffusive regime. The limitations of the acoustic spectral method were also discussed. PMID:22734767

  8. Quantitative photoacoustic microscopy of optical absorption coefficients from acoustic spectra in the optical diffusive regime.

    PubMed

    Guo, Zijian; Favazza, Christopher; Garcia-Uribe, Alejandro; Wang, Lihong V

    2012-06-01

    Photoacoustic (PA) microscopy (PAM) can image optical absorption contrast with ultrasonic spatial resolution in the optical diffusive regime. Conventionally, accurate quantification in PAM requires knowledge of the optical fluence attenuation, acoustic pressure attenuation, and detection bandwidth. We circumvent this requirement by quantifying the optical absorption coefficients from the acoustic spectra of PA signals acquired at multiple optical wavelengths. With the acoustic spectral method, the absorption coefficients of an oxygenated bovine blood phantom at 560, 565, 570, and 575 nm were quantified with errors of <3%. We also quantified the total hemoglobin concentration and hemoglobin oxygen saturation in a live mouse. Compared with the conventional amplitude method, the acoustic spectral method provides greater quantification accuracy in the optical diffusive regime. The limitations of the acoustic spectral method were also discussed.

  9. Injected polyurethane slab jacking : interim report

    DOT National Transportation Integrated Search

    2000-09-01

    Conventional methods for raising in-place concrete slabs to align roadway sections or to counteract subsidence require pressure-injecting grout under the slab. As other transportation organizations have had success with the URETEK Method, which util...

  10. Injected polyurethane slab jacking : final report.

    DOT National Transportation Integrated Search

    2002-06-01

    Conventional methods for raising in-place concrete slabs to align roadway sections or to counteract subsidence require pressure-injecting grout under the slab. As other transportation organizations have had success with the URETEK Method, which util...

  11. Optical data transmission technology for fixed and drag-on STS payloads umbilicals. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    St.denis, R. W.

    1981-01-01

    The feasibility of using optical data handling methods to transmit payload checkout and telemetry is discussed. Optical communications are superior to conventional communication systems for the following reasons: high data capacity optical channels; small and lightweight optical cables; and optical signal immunity to electromagnetic interference. Task number one analyzed the ground checkout data requirements that may be expected from the payload community. Task number two selected the optical approach based on the interface requirements, the location of the interface, the amount of time required to reconfigure hardware, and the method of transporting the optical signal. Task number three surveyed and selected optical components for the two payload data links. Task number four made a qualitative comparison of the conventional electrical communication system and the proposed optical communication system.

  12. A new method of sweat testing: the CF Quantum® sweat test.

    PubMed

    Rock, Michael J; Makholm, Linda; Eickhoff, Jens

    2014-09-01

    Conventional methods of sweat testing are time consuming and have many steps that can and do lead to errors. This study compares conventional sweat testing to a new quantitative method, the CF Quantum® (CFQT) sweat test. This study tests the diagnostic accuracy and analytic validity of the CFQT. Previously diagnosed CF patients and patients who required a sweat test for clinical indications were invited to have the CFQT test performed. Both conventional sweat testing and the CFQT were performed bilaterally on the same day. Pairs of data from each test are plotted as a correlation graph and Bland-Altman plot. Sensitivity and specificity were calculated as well as the means and coefficient of variation by test and by extremity. After completing the study, subjects or their parents were asked for their preference of the CFQT and conventional sweat testing. The correlation coefficient between the CFQT and conventional sweat testing was 0.98 (95% confidence interval: 0.97-0.99). The sensitivity and specificity of the CFQT in diagnosing CF was 100% (95% confidence interval: 94-100%) and 96% (95% confidence interval: 89-99%), respectively. In one center in this three center multicenter study, there were higher sweat chloride values in patients with CF and also more tests that were invalid due to discrepant values between the two extremities. The percentage of invalid tests was higher in the CFQT method (16.5%) compared to conventional sweat testing (3.8%) (p < 0.001). In the post-test questionnaire, 88% of subjects/parents preferred the CFQT test. The CFQT is a fast and simple method of quantitative sweat chloride determination. This technology requires further refinement to improve the analytic accuracy at higher sweat chloride values and to decrease the number of invalid tests. Copyright © 2014 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  13. Fibre Optic Sensors for Selected Wastewater Characteristics

    PubMed Central

    Chong, Su Sin; Abdul Aziz, A. R.; Harun, Sulaiman W.

    2013-01-01

    Demand for online and real-time measurement techniques to meet environmental regulation and treatment compliance is increasing. However, the conventional techniques, which involve scheduled sampling and chemical analysis, can be expensive and time consuming. Therefore cheaper and faster alternatives to conventional methods are required to monitor wastewater characteristics. This paper reviews existing conventional techniques and optical and fibre optic sensors for determining selected wastewater characteristics, namely colour, Chemical Oxygen Demand (COD) and Biological Oxygen Demand (BOD). The review confirms that, with appropriate configuration, calibration and fibre features, these parameters can be determined with accuracy comparable to conventional methods. With more research in this area, the potential for using FOS for online and real-time measurement of more wastewater parameters for various types of industrial effluent is promising. PMID:23881131

  14. BIOMASS DRYING TECHNOLOGIES

    EPA Science Inventory

    The report examines the technologies used for drying of biomass and the energy requirements of biomass dryers. Biomass drying processes, drying methods, and the conventional types of dryers are surveyed generally. Drying methods and dryer studies using superheated steam as the d...

  15. A randomized controlled trial of the different impression methods for the complete denture fabrication: Patient reported outcomes.

    PubMed

    Jo, Ayami; Kanazawa, Manabu; Sato, Yusuke; Iwaki, Maiko; Akiba, Norihisa; Minakuchi, Shunsuke

    2015-08-01

    To compare the effect of conventional complete dentures (CD) fabricated using two different impression methods on patient-reported outcomes in a randomized controlled trial (RCT). A cross-over RCT was performed with edentulous patients who required maxillomandibular CDs. Mandibular CDs were fabricated using two different methods. The conventional method used a custom tray border moulded with impression compound and a silicone; the simplified method used a stock tray and an alginate. Participants were randomly divided into two groups. The C-S group had the conventional method used first, followed by the simplified; the S-C group was in the reverse order. Adjustment was performed four times. A wash-out period of 1 month was set. The primary outcome was general patient satisfaction, measured using visual analogue scales, and the secondary outcome was oral health-related quality of life, measured using the Japanese version of the Oral Health Impact Profile for edentulous patients (OHIP-EDENT-J) questionnaire scores. Twenty-four participants completed the trial. With regard to general patient satisfaction, the conventional method was rated significantly more acceptable than the simplified one. No significant differences were observed between the two methods in the OHIP-EDENT-J scores. This study showed that CDs fabricated with the conventional method were rated significantly more highly for general patient satisfaction than those fabricated with the simplified method. CDs fabricated with the conventional method, which included a preliminary impression made using alginate in a stock tray and subsequently a final impression made using silicone in a border-moulded custom tray, resulted in higher general patient satisfaction. UMIN000009875. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. An Improved Azimuth Angle Estimation Method with a Single Acoustic Vector Sensor Based on an Active Sonar Detection System.

    PubMed

    Zhao, Anbang; Ma, Lin; Ma, Xuefei; Hui, Juan

    2017-02-20

    In this paper, an improved azimuth angle estimation method with a single acoustic vector sensor (AVS) is proposed based on matched filtering theory. The proposed method is mainly applied in an active sonar detection system. Starting from the conventional passive method based on complex acoustic intensity measurement, the mathematical and physical model of the proposed method is described in detail. Computer simulation and lake experiment results indicate that this method can realize azimuth angle estimation with high precision using only a single AVS. Compared with the conventional method, the proposed method achieves better estimation performance. Moreover, the proposed method does not require complex operations in the frequency domain and achieves a reduction in computational complexity.
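
    For context, the conventional passive bearing estimate referred to here forms the time-averaged acoustic intensity from the pressure and particle-velocity channels of a single AVS and takes the arctangent of its components. The sketch below illustrates that baseline on a simulated tone; the signal parameters are hypothetical, and this is not the paper's matched-filtering method.

    ```python
    # Minimal sketch: single-AVS azimuth from time-averaged acoustic intensity.
    import numpy as np

    fs, f0, theta_true = 10_000.0, 500.0, np.deg2rad(40.0)   # sample rate, tone, true bearing
    t = np.arange(0, 0.2, 1.0 / fs)
    s = np.cos(2 * np.pi * f0 * t)

    rng = np.random.default_rng(0)
    p  = s + 0.1 * rng.standard_normal(t.size)                        # pressure channel
    vx = np.cos(theta_true) * s + 0.1 * rng.standard_normal(t.size)   # x particle-velocity channel
    vy = np.sin(theta_true) * s + 0.1 * rng.standard_normal(t.size)   # y particle-velocity channel

    Ix = np.mean(p * vx)                 # time-averaged intensity components
    Iy = np.mean(p * vy)
    theta_hat = np.arctan2(Iy, Ix)       # conventional intensity-based azimuth estimate
    print(f"estimated azimuth ≈ {np.degrees(theta_hat):.1f} deg")
    ```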

  17. Development of an Active Flow Control Technique for an Airplane High-Lift Configuration

    NASA Technical Reports Server (NTRS)

    Shmilovich, Arvin; Yadlin, Yoram; Dickey, Eric D.; Hartwich, Peter M.; Khodadoust, Abdi

    2017-01-01

    This study focuses on Active Flow Control methods used in conjunction with airplane high-lift systems. The project is motivated by the simplified high-lift system, which offers enhanced airplane performance compared to conventional high-lift systems. Computational simulations are used to guide the implementation of preferred flow control methods, which require a fluidic supply. It is first demonstrated that flow control applied to a high-lift configuration that consists of simple hinge flaps is capable of attaining the performance of the conventional high-lift counterpart. A set of flow control techniques has been subsequently considered to identify promising candidates, where the central requirement is that the mass flow for actuation has to be within available resources onboard. The flow control methods are based on constant blowing, fluidic oscillators, and traverse actuation. The simulations indicate that the traverse actuation offers a substantial reduction in required mass flow, and it is especially effective when the frequency of actuation is consistent with the characteristic time scale of the flow.

  18. Precision chemical heating for diagnostic devices.

    PubMed

    Buser, J R; Diesburg, S; Singleton, J; Guelig, D; Bishop, J D; Zentner, C; Burton, R; LaBarre, P; Yager, P; Weigl, B H

    2015-12-07

    Decoupling nucleic acid amplification assays from infrastructure requirements such as grid electricity is critical for providing effective diagnosis and treatment at the point of care in low-resource settings. Here, we outline a complete strategy for the design of electricity-free precision heaters compatible with medical diagnostic applications requiring isothermal conditions, including nucleic acid amplification and lysis. Low-cost, highly energy dense components with better end-of-life disposal options than conventional batteries are proposed as an alternative to conventional heating methods to satisfy the unique needs of point of care use.

  19. THE PSTD ALGORITHM: A TIME-DOMAIN METHOD REQUIRING ONLY TWO CELLS PER WAVELENGTH. (R825225)

    EPA Science Inventory

    A pseudospectral time-domain (PSTD) method is developed for solutions of Maxwell's equations. It uses the fast Fourier transform (FFT), instead of finite differences on conventional finite-difference-time-domain (FDTD) methods, to represent spatial derivatives. Because the Fourie...
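
    The key mechanism behind the two-cells-per-wavelength claim is that FFT-based spatial derivatives are essentially exact for band-limited fields, unlike the low-order finite differences of FDTD. The sketch below demonstrates such a spectral derivative on a simple test function; the grid size and test field are arbitrary illustrative choices, not part of the cited work.

    ```python
    # Minimal sketch: spectral (FFT-based) spatial derivative, the core PSTD operation.
    import numpy as np

    N, L = 64, 1.0                               # grid points, domain length
    x = np.arange(N) * L / N
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)   # spectral wavenumbers

    u = np.sin(6 * 2 * np.pi * x / L)            # band-limited test field (6 cycles across the domain)
    du_spectral = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))    # derivative via FFT
    du_exact = 6 * 2 * np.pi / L * np.cos(6 * 2 * np.pi * x / L)  # analytic derivative

    print("max error:", np.max(np.abs(du_spectral - du_exact)))   # ~machine precision
    ```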

  20. Generally astigmatic Gaussian beam representation and optimization using skew rays

    NASA Astrophysics Data System (ADS)

    Colbourne, Paul D.

    2014-12-01

    Methods are presented of using skew rays to optimize a generally astigmatic optical system to obtain the desired Gaussian beam focus and minimize aberrations, and to calculate the propagating generally astigmatic Gaussian beam parameters at any point. The optimization method requires very little computation beyond that of a conventional ray optimization, and requires no explicit calculation of the properties of the propagating Gaussian beam. Unlike previous methods, the calculation of beam parameters does not require matrix calculations or the introduction of non-physical concepts such as imaginary rays.

  1. 48 CFR 14.501 - General.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... unduly restrictive statement of the Government's requirements, including an adequate technical data package, so that subsequent acquisitions may be made by conventional sealed bidding. This method is...

  2. 48 CFR 14.501 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... unduly restrictive statement of the Government's requirements, including an adequate technical data package, so that subsequent acquisitions may be made by conventional sealed bidding. This method is...

  3. 48 CFR 14.501 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... unduly restrictive statement of the Government's requirements, including an adequate technical data package, so that subsequent acquisitions may be made by conventional sealed bidding. This method is...

  4. 48 CFR 14.501 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... unduly restrictive statement of the Government's requirements, including an adequate technical data package, so that subsequent acquisitions may be made by conventional sealed bidding. This method is...

  5. 48 CFR 14.501 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... unduly restrictive statement of the Government's requirements, including an adequate technical data package, so that subsequent acquisitions may be made by conventional sealed bidding. This method is...

  6. Manufacturing implant supported auricular prostheses by rapid prototyping techniques.

    PubMed

    Karatas, Meltem Ozdemir; Cifter, Ebru Demet; Ozenen, Didem Ozdemir; Balik, Ali; Tuncer, Erman Bulent

    2011-08-01

Maxillofacial prostheses are usually fabricated on models obtained from impression procedures. Disadvantages of the conventional impression techniques used to produce facial prostheses include deformation of soft tissues caused by the impression material and disturbance of the patient. In addition, production of a prosthesis by conventional methods takes longer. Recently, rapid prototyping techniques have been developed for extraoral prostheses in order to reduce these disadvantages of conventional methods. Rapid prototyping has the potential to simplify the procedure and decrease the laboratory work required. It eliminates the impression procedures and the preparation of a wax model that prosthodontists would otherwise perform themselves. In the near future this technology will become a standard for fabricating maxillofacial prostheses.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkler, Jon; Booten, Chuck

Residential building codes and voluntary labeling programs are continually increasing the energy efficiency requirements of residential buildings. Improving a building's thermal enclosure and installing energy-efficient appliances and lighting can result in significant reductions in sensible cooling loads, leading to smaller air conditioners and shorter cooling seasons. However, due to fresh air ventilation requirements and internal gains, latent cooling loads are not reduced by the same proportion. Thus, it is becoming more challenging for conventional cooling equipment to control indoor humidity at part-load cooling conditions, and using conventional cooling equipment in a non-conventional building poses a potential risk of high indoor humidity. The objective of this project was to investigate the impact the chosen design condition has on the calculated part-load cooling moisture load, and to compare calculated moisture loads and the required dehumidification capacity to whole-building simulations. Procedures for sizing whole-house supplemental dehumidification equipment have yet to be formalized; however, minor modifications to current Air Conditioning Contractors of America (ACCA) Manual J load calculation procedures are appropriate for calculating residential part-load cooling moisture loads. Though ASHRAE 1% DP design conditions are commonly used to determine the dehumidification requirements for commercial buildings, an appropriate DP design condition for residential buildings has not been investigated. Two methods for sizing supplemental dehumidification equipment were developed and tested. The first method closely followed Manual J cooling load calculations, whereas the second method made more conservative assumptions impacting both sensible and latent loads.
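
    As a rough illustration of a residential part-load latent (moisture) load estimate of the kind discussed above, the sketch below combines a ventilation term with an internal latent gain. The 0.68 grains-based factor is a common HVAC rule of thumb and the default internal gain is an assumption; neither is taken from the report's actual procedure.

        def latent_load_btuh(ventilation_cfm, outdoor_gr_per_lb, indoor_gr_per_lb,
                             internal_latent_btuh=800.0):
            """Rough part-load latent load: ventilation moisture plus internal gains.

            Uses the common approximation Q_lat ~= 0.68 * CFM * delta_W (grains/lb);
            the factor and the internal-gain default are illustrative assumptions."""
            delta_w = max(outdoor_gr_per_lb - indoor_gr_per_lb, 0.0)
            return 0.68 * ventilation_cfm * delta_w + internal_latent_btuh

        # Illustrative inputs: 60 cfm of ventilation air, humid outdoor air, drier indoor air.
        print(f"{latent_load_btuh(60.0, 110.0, 64.0):.0f} Btu/h latent")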

  8. Applying high resolution remote sensing image and DEM to falling boulder hazard assessment

    NASA Astrophysics Data System (ADS)

    Huang, Changqing; Shi, Wenzhong; Ng, K. C.

    2005-10-01

Assessing boulder fall hazard generally requires obtaining information about the boulders. The conventional approach, extensive mapping and surveying fieldwork, is time-consuming, laborious and dangerous. This paper therefore proposes applying image processing technology to extract boulder information and assess boulder fall hazard from high resolution remote sensing imagery. The method can replace the conventional approach and extract boulder information with high accuracy, including boulder size, shape, height, and the slope and aspect of its position. With this information, boulder fall hazards can be assessed, prevented and mitigated.

  9. Method for rapidly producing microporous and mesoporous materials

    DOEpatents

    Coronado, Paul R.; Poco, John F.; Hrubesh, Lawrence W.; Hopper, Robert W.

    1997-01-01

    An improved, rapid process is provided for making microporous and mesoporous materials, including aerogels and pre-ceramics. A gel or gel precursor is confined in a sealed vessel to prevent structural expansion of the gel during the heating process. This confinement allows the gelation and drying processes to be greatly accelerated, and significantly reduces the time required to produce a dried aerogel compared to conventional methods. Drying may be performed either by subcritical drying with a pressurized fluid to expel the liquid from the gel pores or by supercritical drying. The rates of heating and decompression are significantly higher than for conventional methods.

  10. Performance of two alternative methods for Listeria detection throughout Serro Minas cheese ripening.

    PubMed

    Mata, Gardênia Márcia Silva Campos; Martins, Evandro; Machado, Solimar Gonçalves; Pinto, Maximiliano Soares; de Carvalho, Antônio Fernandes; Vanetti, Maria Cristina Dantas

    2016-01-01

The ability of pathogens to survive cheese ripening is a food-safety concern. Therefore, this study aimed to evaluate the performance of two alternative methods of analysis of Listeria during the ripening of artisanal Minas cheese. These methods were tested and compared with the conventional method: the Lateral Flow System™, in cheeses produced on laboratory scale using raw milk collected from different farms and inoculated with Listeria innocua; and VIDAS(®)-LMO, in cheese samples collected from different manufacturers in Serro, Minas Gerais, Brazil. These samples were also characterized in terms of lactic acid bacteria, coliforms and physical-chemical analysis. In the inoculated samples, L. innocua was detected by the Lateral Flow System™ method with 33% false-negative results and 68% accuracy. L. innocua was detected in the inoculated samples by the conventional method only at 60 days of cheese ripening. L. monocytogenes was not detected by either the conventional or the VIDAS(®)-LMO method in cheese samples collected from different manufacturers, which impairs evaluating the performance of this alternative method. We concluded that the conventional method provided a better recovery of L. innocua throughout cheese ripening, being able to detect L. innocua at 60 days, the aging period required by current legislation. Copyright © 2016 Sociedade Brasileira de Microbiologia. Published by Elsevier Editora Ltda. All rights reserved.

  11. Economic comparison of conventional maintenance and electrochemical oxidation to warrant water safety in dental unit water lines

    PubMed Central

    Fischer, Sebastian; Meyer, Georg; Kramer, Axel

    2012-01-01

Background: In preparation for implementation of a central water processing system at a dental department, we analyzed the costs of conventional decentralized disinfection of dental units against a central water treatment concept based on electrochemical disinfection. Methods: The cost evaluation included only the costs of annually required antimicrobial consumables and additional water usage of a decentralized conventional maintenance system for dental water lines built into the respective dental units and of the central electrochemical water disinfection system, BLUE SAFETY™ Technologies. Results: In total, analysis of the costs of 6 dental departments revealed additional annual costs for hygienic preventive measures of € 4,448.37. For the BLUE SAFETY™ Technology, the additional annual total agent consumption costs were € 2.18, accounting for approximately 0.05% of the annual total agent consumption costs of the conventional maintenance system. For both water processing concepts, the additional costs for energy could not be calculated, since the required data were not obtainable from the manufacturers. Discussion: For both concepts, the investment and maintenance costs were not calculated due to lack of manufacturer's data. Therefore, the results indicate the difference in costs for the required consumables only. Aside from the significantly lower annual costs for required consumables and disinfectants, a second advantage of the BLUE SAFETY™ Technology is its constant and automatic operation, which does not require additional staff resources. This not only saves human resources but also adds to the cost savings. Conclusion: Since the antimicrobial disinfection capacity of the BLUE SAFETY™ was demonstrated previously and is well known, this technology, which is comparable or even superior in its non-corrosive effect, may be regarded as the method of choice for continuous disinfection and prevention of biofilm formation in dental units' water lines. PMID:22558042

  12. A study of methods to estimate debris flow velocity

    USGS Publications Warehouse

    Prochaska, A.B.; Santi, P.M.; Higgins, J.D.; Cannon, S.H.

    2008-01-01

    Debris flow velocities are commonly back-calculated from superelevation events which require subjective estimates of radii of curvature of bends in the debris flow channel or predicted using flow equations that require the selection of appropriate rheological models and material property inputs. This research investigated difficulties associated with the use of these conventional velocity estimation methods. Radii of curvature estimates were found to vary with the extent of the channel investigated and with the scale of the media used, and back-calculated velocities varied among different investigated locations along a channel. Distinct populations of Bingham properties were found to exist between those measured by laboratory tests and those back-calculated from field data; thus, laboratory-obtained values would not be representative of field-scale debris flow behavior. To avoid these difficulties with conventional methods, a new preliminary velocity estimation method is presented that statistically relates flow velocity to the channel slope and the flow depth. This method presents ranges of reasonable velocity predictions based on 30 previously measured velocities. ?? 2008 Springer-Verlag.
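
    For context, the conventional superelevation back-calculation referred to above is usually based on the forced-vortex equation, which ties velocity to the radius of curvature that must be estimated subjectively. The sketch below implements that textbook relation; the correction factor k and the sample numbers are illustrative assumptions, not values from the study.

        import math

        def superelevation_velocity(radius_m, delta_h_m, flow_width_m, k=1.0, g=9.81):
            """Forced-vortex estimate of mean flow velocity from superelevation:
            v = sqrt(g * Rc * delta_h / (k * b)), with k an empirical correction factor."""
            return math.sqrt(g * radius_m * delta_h_m / (k * flow_width_m))

        # Illustrative bend: 40 m radius, 1.2 m cross-channel flow-surface difference, 8 m width.
        print(f"{superelevation_velocity(40.0, 1.2, 8.0):.1f} m/s")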

  13. Detection of shigella in lettuce by the use of a rapid molecular assay with increased sensitivity

    PubMed Central

Jiménez, Kenia Barrantes; McCoy, Clyde B.; Achí, Rosario

    2010-01-01

A Multiplex Polymerase Chain Reaction (PCR) assay to be used as an alternative to the conventional culture method for detecting the Shigella and enteroinvasive Escherichia coli (EIEC) virulence genes ipaH and ial in lettuce was developed. The efficacy and rapidity of the molecular method were determined by comparison with the conventional culture. Lettuce samples were inoculated with different Shigella flexneri concentrations (from 10 CFU/ml to 10⁷ CFU/ml). DNA was extracted directly from lettuce after inoculation (direct PCR) and after an enrichment step (enrichment PCR). The multiplex PCR detection limit was 10⁴ CFU/ml, and diagnostic sensitivity and specificity were both 100%. An internal amplification control (IAC) of 100 bp was used in order to avoid false negative results. This method produced results in 1 to 2 days while the conventional culture method required 5 to 6 days. Also, the culture method detection limit was 10⁶ CFU/ml, diagnostic sensitivity was 53% and diagnostic specificity was 100%. In this study a multiplex PCR method for detection of virulence genes in Shigella and EIEC was shown to be effective in terms of diagnostic sensitivity, detection limit and amount of time as compared to the Shigella conventional culture. PMID:24031579

  14. Fast measurement of bacterial susceptibility to antibiotics

    NASA Technical Reports Server (NTRS)

    Chappelle, E. W.; Picciolo, G. L.; Schrock, C. G.

    1977-01-01

    Method, based on photoanalysis of adenosine triphosphate using light-emitting reaction with luciferase-luciferin technique, saves time by eliminating isolation period required by conventional methods. Technique is also used to determine presence of infection as well as susceptibilities to several antibiotics.

  15. Economics of hardwood silviculture using skyline and conventional logging

    Treesearch

    John E. Baumgras; Gary W. Miller; Chris B. LeDoux

    1995-01-01

    Managing Appalachian hardwood forests to satisfy the growing and diverse demands on this resource will require alternatives to traditional silvicultural methods and harvesting systems. Determining the relative economic efficiency of these alternative methods and systems with respect to harvest cash flows is essential. The effects of silvicultural methods and roundwood...

  16. An Improved Azimuth Angle Estimation Method with a Single Acoustic Vector Sensor Based on an Active Sonar Detection System

    PubMed Central

    Zhao, Anbang; Ma, Lin; Ma, Xuefei; Hui, Juan

    2017-01-01

In this paper, an improved azimuth angle estimation method with a single acoustic vector sensor (AVS) is proposed based on matched filtering theory. The proposed method is mainly applied in an active sonar detection system. Building on the conventional passive method based on complex acoustic intensity measurement, the mathematical and physical model of the proposed method is described in detail. The computer simulation and lake experiment results indicate that this method can realize azimuth angle estimation with high precision using only a single AVS. Compared with the conventional method, the proposed method achieves better estimation performance. Moreover, the proposed method does not require complex operations in the frequency domain and achieves a reduction in computational complexity. PMID:28230763
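
    The conventional passive estimator that the proposed method builds on can be summarized compactly: the time-averaged acoustic intensity components measured by the vector sensor give the bearing directly. The sketch below illustrates that baseline (not the paper's matched-filtering improvement) on a synthetic plane wave; all signal parameters are assumptions for the demonstration.

        import numpy as np

        def intensity_azimuth(p, vx, vy):
            """Azimuth (degrees) from a single acoustic vector sensor using the
            time-averaged intensity components Ix = <p*vx> and Iy = <p*vy>."""
            return np.degrees(np.arctan2(np.mean(p * vy), np.mean(p * vx)))

        # Synthetic 500 Hz plane wave arriving from 30 degrees, with additive noise.
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 8000)
        s = np.cos(2.0 * np.pi * 500.0 * t)
        theta = np.radians(30.0)
        p = s + 0.1 * rng.standard_normal(t.size)
        vx = s * np.cos(theta) + 0.1 * rng.standard_normal(t.size)
        vy = s * np.sin(theta) + 0.1 * rng.standard_normal(t.size)
        print(f"estimated azimuth: {intensity_azimuth(p, vx, vy):.1f} deg")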

  17. Comparison of Different Recruitment Methods for Sexual and Reproductive Health Research: Social Media-Based Versus Conventional Methods.

    PubMed

    Motoki, Yoko; Miyagi, Etsuko; Taguri, Masataka; Asai-Sato, Mikiko; Enomoto, Takayuki; Wark, John Dennis; Garland, Suzanne Marie

    2017-03-10

    Prior research about the sexual and reproductive health of young women has relied mostly on self-reported survey studies. Thus, participant recruitment using Web-based methods can improve sexual and reproductive health research about cervical cancer prevention. In our prior study, we reported that Facebook is a promising way to reach young women for sexual and reproductive health research. However, it remains unknown whether Web-based or other conventional recruitment methods (ie, face-to-face or flyer distribution) yield comparable survey responses from similar participants. We conducted a survey to determine whether there was a difference in the sexual and reproductive health survey responses of young Japanese women based on recruitment methods: social media-based and conventional methods. From July 2012 to March 2013 (9 months), we invited women of ages 16-35 years in Kanagawa, Japan, to complete a Web-based questionnaire. They were recruited through either a social media-based (social networking site, SNS, group) or by conventional methods (conventional group). All participants enrolled were required to fill out and submit their responses through a Web-based questionnaire about their sexual and reproductive health for cervical cancer prevention. Of the 243 participants, 52.3% (127/243) were recruited by SNS, whereas 47.7% (116/243) were recruited by conventional methods. We found no differences between recruitment methods in responses to behaviors and attitudes to sexual and reproductive health survey, although more participants from the conventional group (15%, 14/95) chose not to answer the age of first intercourse compared with those from the SNS group (5.2%, 6/116; P=.03). No differences were found between recruitment methods in the responses of young Japanese women to a Web-based sexual and reproductive health survey. ©Yoko Motoki, Etsuko Miyagi, Masataka Taguri, Mikiko Asai-Sato, Takayuki Enomoto, John Dennis Wark, Suzanne Marie Garland. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 10.03.2017.

  18. Prediction of crude protein and oil content of soybeans using Raman spectroscopy

    USDA-ARS?s Scientific Manuscript database

    While conventional chemical analysis methods for food nutrients require time-consuming, labor-intensive, and invasive pretreatment procedures, Raman spectroscopy can be used to measure a variety of food components rapidly and non-destructively and does not require supervision from experts. The purpo...

  19. Method to enhance the performance of synthetic origin-destination (O-D) trip table estimation models.

    DOT National Transportation Integrated Search

    1998-01-01

    The conventional methods of determining origin-destination (O-D) trip tables involve elaborate surveys, e.g., home interviews, that require considerable time, staff, and funds. To overcome this drawback, a number of theoretical models that synthesize...

  20. A Novel, Low-Volume Method for Organ Culture of Embryonic Kidneys That Allows Development of Cortico-Medullary Anatomical Organization

    PubMed Central

    Sebinger, David D. R.; Unbekandt, Mathieu; Ganeva, Veronika V.; Ofenbauer, Andreas; Werner, Carsten; Davies, Jamie A.

    2010-01-01

    Here, we present a novel method for culturing kidneys in low volumes of medium that offers more organotypic development compared to conventional methods. Organ culture is a powerful technique for studying renal development. It recapitulates many aspects of early development very well, but the established techniques have some disadvantages: in particular, they require relatively large volumes (1–3 mls) of culture medium, which can make high-throughput screens expensive, they require porous (filter) substrates which are difficult to modify chemically, and the organs produced do not achieve good cortico-medullary zonation. Here, we present a technique of growing kidney rudiments in very low volumes of medium–around 85 microliters–using silicone chambers. In this system, kidneys grow directly on glass, grow larger than in conventional culture and develop a clear anatomical cortico-medullary zonation with extended loops of Henle. PMID:20479933

  1. Comparing errors in ED computer-assisted vs conventional pediatric drug dosing and administration.

    PubMed

    Yamamoto, Loren; Kanemori, Joan

    2010-06-01

    Compared to fixed-dose single-vial drug administration in adults, pediatric drug dosing and administration requires a series of calculations, all of which are potentially error prone. The purpose of this study is to compare error rates and task completion times for common pediatric medication scenarios using computer program assistance vs conventional methods. Two versions of a 4-part paper-based test were developed. Each part consisted of a set of medication administration and/or dosing tasks. Emergency department and pediatric intensive care unit nurse volunteers completed these tasks using both methods (sequence assigned to start with a conventional or a computer-assisted approach). Completion times, errors, and the reason for the error were recorded. Thirty-eight nurses completed the study. Summing the completion of all 4 parts, the mean conventional total time was 1243 seconds vs the mean computer program total time of 879 seconds (P < .001). The conventional manual method had a mean of 1.8 errors vs the computer program with a mean of 0.7 errors (P < .001). Of the 97 total errors, 36 were due to misreading the drug concentration on the label, 34 were due to calculation errors, and 8 were due to misplaced decimals. Of the 36 label interpretation errors, 18 (50%) occurred with digoxin or insulin. Computerized assistance reduced errors and the time required for drug administration calculations. A pattern of errors emerged, noting that reading/interpreting certain drug labels were more error prone. Optimizing the layout of drug labels could reduce the error rate for error-prone labels. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  2. Solving coupled groundwater flow systems using a Jacobian Free Newton Krylov method

    NASA Astrophysics Data System (ADS)

    Mehl, S.

    2012-12-01

Jacobian Free Newton Krylov (JFNK) methods can have several advantages for simulating coupled groundwater flow processes versus conventional methods. Conventional methods are defined here as those based on an iterative coupling (rather than a direct coupling) and/or that use Picard iteration rather than Newton iteration. In an iterative coupling, the systems are solved separately, coupling information is updated and exchanged between the systems, and the systems are re-solved, etc., until convergence is achieved. Trusted simulators, such as Modflow, are based on these conventional methods of coupling and work well in many cases. An advantage of the JFNK method is that it only requires calculation of the residual vector of the system of equations and thus can make use of existing simulators regardless of how the equations are formulated. This opens the possibility of coupling different process models via augmentation of a residual vector by each separate process, which often requires substantially fewer changes to the existing source code than if the processes were directly coupled. However, appropriate perturbation sizes need to be determined for accurate approximations of the Frechet derivative, which is not always straightforward. Furthermore, preconditioning is necessary for reasonable convergence of the linear solution required at each Krylov iteration. Existing preconditioners can be used and applied separately to each process, which maximizes use of existing code and robust preconditioners. In this work, iteratively coupled parent-child local grid refinement models of groundwater flow and groundwater flow models with nonlinear exchanges to streams are used to demonstrate the utility of the JFNK approach for Modflow models. Use of incomplete Cholesky preconditioners with various levels of fill is examined on a suite of nonlinear and linear models to analyze the effect of the preconditioner. Comparisons of convergence and computer simulation time are made between conventional iteratively coupled methods based on Picard iteration and those formulated with JFNK, to gain insights on the types of nonlinearities and system features that make one approach advantageous. Results indicate that nonlinearities associated with stream/aquifer exchanges are more problematic than those resulting from unconfined flow.
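
    The appeal of JFNK, that only residual evaluations are needed, can be illustrated with an off-the-shelf solver. The sketch below uses SciPy's newton_krylov on a toy steady 1-D unconfined-flow balance (h² varies linearly between fixed boundary heads); the grid, boundary values, and tolerance are illustrative assumptions, and the problem is far simpler than the coupled Modflow models discussed above.

        import numpy as np
        from scipy.optimize import newton_krylov

        def residual(h):
            """Residual of a toy steady 1-D unconfined-flow problem, d/dx(h dh/dx) = 0,
            on a uniform grid with fixed heads at both ends (illustrative only)."""
            r = np.zeros_like(h)
            r[0] = h[0] - 10.0                                        # upstream boundary head
            r[-1] = h[-1] - 5.0                                       # downstream boundary head
            r[1:-1] = h[2:] ** 2 - 2.0 * h[1:-1] ** 2 + h[:-2] ** 2   # interior balance of h^2
            return r

        h0 = np.full(21, 7.5)                                         # initial guess
        h = newton_krylov(residual, h0, f_tol=1e-10)                  # Jacobian-free Newton-Krylov
        print(h[:5])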

  3. The challenges and promises of genetic approaches for ballast water management

    NASA Astrophysics Data System (ADS)

    Rey, Anaïs; Basurko, Oihane C.; Rodríguez-Ezpeleta, Naiara

    2018-03-01

Ballast water is a main vector of introduction of Harmful Aquatic Organisms and Pathogens, which include Non-Indigenous Species. Numerous and diverse organisms are transferred daily from a donor to a recipient port. Developed to prevent these introduction events, the International Convention for the Control and Management of Ships' Ballast Water and Sediments will enter into force in 2017. This international convention calls for the monitoring of Harmful Aquatic Organisms and Pathogens. In this review, we highlight the urgent need to develop cost-effective methods to: (1) perform the biological analyses required by the convention; and (2) assess the effectiveness of two main ballast water management strategies, i.e. ballast water exchange and the use of ballast water treatment systems. We have compiled the biological analyses required by the convention and performed a comprehensive evaluation of the potential and challenges of the use of genetic tools in this context. Following an overview of the studies applying genetic tools to ballast water related research, we present metabarcoding as a relevant approach for early detection of Harmful Aquatic Organisms and Pathogens in general and for ballast water monitoring and port risk assessment in particular. Nonetheless, before implementation of genetic tools in the context of the ballast water management convention, benchmark tests against traditional methods should be performed, and standard, reproducible and easy-to-apply protocols should be developed.

  4. Meta-analysis of Odds Ratios: Current Good Practices

    PubMed Central

    Chang, Bei-Hung; Hoaglin, David C.

    2016-01-01

    Background Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more-reliable results. Methods We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results In the example, point estimates and confidence intervals for the overall log-odds-ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued. PMID:28169977
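
    For reference, the conventional approach criticized above is the inverse-variance weighted average of per-trial log odds ratios, often with a 0.5 continuity correction. The sketch below implements that fixed-effect version on made-up counts; it illustrates precisely the kind of approximation the authors recommend replacing with methods that model the counts directly.

        import math

        def fixed_effect_log_or(trials):
            """Conventional fixed-effect inverse-variance pooling of log odds ratios.

            trials: iterable of (events_treated, n_treated, events_control, n_control).
            Adds a 0.5 continuity correction to every cell (one of the approximations
            the alternative methods avoid)."""
            num = den = 0.0
            for a, n1, c, n2 in trials:
                b, d = n1 - a, n2 - c
                a, b, c, d = (x + 0.5 for x in (a, b, c, d))
                log_or = math.log((a * d) / (b * c))
                var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d
                num += log_or / var
                den += 1.0 / var
            pooled, se = num / den, math.sqrt(1.0 / den)
            return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

        # Made-up counts for two trials, for illustration only.
        print(fixed_effect_log_or([(12, 100, 20, 100), (8, 80, 15, 80)]))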

  5. Defense Small Business Innovation Research Program (SBIR) FY 1984.

    DTIC Science & Technology

    1984-01-12

nuclear submarine non-metallic, lightweight, high-strength piping. Includes the development of adequate fabrication procedures for attaching pipe ... waste heat economizer methods require development. Improved conventional and hybrid heat pipes and/or two-phase transport devices are required ... DESCRIPTION: A need exists to conceive, design, fabricate and test a method of adjusting the length of the individual legs of a nylon or Kevlar rope sling

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slabodchikov, Vladimir A., E-mail: dipis1991@mail.ru; Borisov, Dmitry P., E-mail: borengin@mail.ru; Kuznetsov, Vladimir M., E-mail: kuznetsov@rec.tsu.ru

The paper reports on a new method of plasma immersion ion implantation for the surface modification of medical materials using the example of nickel-titanium (NiTi) alloys much used for manufacturing medical implants. The chemical composition and surface properties of NiTi alloys doped with silicon by conventional ion implantation and by the proposed plasma immersion method are compared. It is shown that the new plasma immersion method is more efficient than conventional ion beam treatment and provides Si implantation into NiTi surface layers through a depth of a hundred nanometers at low bias voltages (400 V) and temperatures (≤150°C) of the substrate. The research results suggest that the chemical composition and surface properties of materials required for medicine, e.g., NiTi alloys, can be successfully attained through modification by the proposed method of plasma immersion ion implantation and by other methods based on the proposed vacuum equipment without using any conventional ion beam treatment.

  7. An Upgrade Pinning Block: A Mechanical Practical Aid for Fast Labelling of the Insect Specimens.

    PubMed

    Ghafouri Moghaddam, Mohammad Hossein; Ghafouri Moghaddam, Mostafa; Rakhshani, Ehsan; Mokhtari, Azizollah

    2017-01-01

A new mechanical innovation is described to deal with standard labelling of dried specimens on triangular cards and/or pinned specimens in personal and public collections. It works quickly, precisely, and easily and is very useful for maintaining label uniformity in collections. The tool accurately sets the position of labels in the shortest possible time. It has advantages including rapid processing, cost effectiveness, light weight, and high accuracy compared to conventional methods. It is fully customisable, compact, and does not require specialist equipment to assemble. Conventional methods generally require locating holes on the pinning block surface when labelling, with a resulting risk of damage to the specimens. Insects of different orders can be labelled with this simple and effective tool.

  8. An Upgrade Pinning Block: A Mechanical Practical Aid for Fast Labelling of the Insect Specimens

    PubMed Central

    Ghafouri Moghaddam, Mohammad Hossein; Rakhshani, Ehsan; Mokhtari, Azizollah

    2017-01-01

A new mechanical innovation is described to deal with standard labelling of dried specimens on triangular cards and/or pinned specimens in personal and public collections. It works quickly, precisely, and easily and is very useful for maintaining label uniformity in collections. The tool accurately sets the position of labels in the shortest possible time. It has advantages including rapid processing, cost effectiveness, light weight, and high accuracy compared to conventional methods. It is fully customisable, compact, and does not require specialist equipment to assemble. Conventional methods generally require locating holes on the pinning block surface when labelling, with a resulting risk of damage to the specimens. Insects of different orders can be labelled with this simple and effective tool. PMID:29104440

  9. Manufacturing Implant Supported Auricular Prostheses by Rapid Prototyping Techniques

    PubMed Central

    Karatas, Meltem Ozdemir; Cifter, Ebru Demet; Ozenen, Didem Ozdemir; Balik, Ali; Tuncer, Erman Bulent

    2011-01-01

Maxillofacial prostheses are usually fabricated on models obtained from impression procedures. Disadvantages of the conventional impression techniques used to produce facial prostheses include deformation of soft tissues caused by the impression material and disturbance of the patient. In addition, production of a prosthesis by conventional methods takes longer. Recently, rapid prototyping techniques have been developed for extraoral prostheses in order to reduce these disadvantages of conventional methods. Rapid prototyping has the potential to simplify the procedure and decrease the laboratory work required. It eliminates the impression procedures and the preparation of a wax model that prosthodontists would otherwise perform themselves. In the near future this technology will become a standard for fabricating maxillofacial prostheses. PMID:21912504

  10. A method to enhance the performance of synthetic origin-destination (O-D) trip table estimation models.

    DOT National Transportation Integrated Search

    1998-01-01

    The conventional methods of determining origin-destination (O-D) trip tables involve elaborate surveys, e.g., home interviews, that require considerable time, staff, and funds. To overcome this drawback, a number of theoretical models that synthesize...

  11. Method for rapidly producing microporous and mesoporous materials

    DOEpatents

    Coronado, P.R.; Poco, J.F.; Hrubesh, L.W.; Hopper, R.W.

    1997-11-11

    An improved, rapid process is provided for making microporous and mesoporous materials, including aerogels and pre-ceramics. A gel or gel precursor is confined in a sealed vessel to prevent structural expansion of the gel during the heating process. This confinement allows the gelation and drying processes to be greatly accelerated, and significantly reduces the time required to produce a dried aerogel compared to conventional methods. Drying may be performed either by subcritical drying with a pressurized fluid to expel the liquid from the gel pores or by supercritical drying. The rates of heating and decompression are significantly higher than for conventional methods. 3 figs.

  12. 15 CFR 745.2 - End-Use Certificate reporting requirements under the Chemical Weapons Convention.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... requirements under the Chemical Weapons Convention. 745.2 Section 745.2 Commerce and Foreign Trade Regulations... EXPORT ADMINISTRATION REGULATIONS CHEMICAL WEAPONS CONVENTION REQUIREMENTS § 745.2 End-Use Certificate reporting requirements under the Chemical Weapons Convention. Note: The End-Use Certificate requirement of...

  13. 15 CFR 745.2 - End-Use Certificate reporting requirements under the Chemical Weapons Convention.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... requirements under the Chemical Weapons Convention. 745.2 Section 745.2 Commerce and Foreign Trade Regulations... EXPORT ADMINISTRATION REGULATIONS CHEMICAL WEAPONS CONVENTION REQUIREMENTS § 745.2 End-Use Certificate reporting requirements under the Chemical Weapons Convention. Note: The End-Use Certificate requirement of...

  14. 15 CFR 745.2 - End-Use Certificate reporting requirements under the Chemical Weapons Convention.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... requirements under the Chemical Weapons Convention. 745.2 Section 745.2 Commerce and Foreign Trade Regulations... EXPORT ADMINISTRATION REGULATIONS CHEMICAL WEAPONS CONVENTION REQUIREMENTS § 745.2 End-Use Certificate reporting requirements under the Chemical Weapons Convention. Note: The End-Use Certificate requirement of...

  15. 15 CFR 745.2 - End-Use Certificate reporting requirements under the Chemical Weapons Convention.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... requirements under the Chemical Weapons Convention. 745.2 Section 745.2 Commerce and Foreign Trade Regulations... EXPORT ADMINISTRATION REGULATIONS CHEMICAL WEAPONS CONVENTION REQUIREMENTS § 745.2 End-Use Certificate reporting requirements under the Chemical Weapons Convention. Note: The End-Use Certificate requirement of...

  16. Prospective randomized comparison of cold snare polypectomy and conventional polypectomy for small colorectal polyps.

    PubMed

    Ichise, Yasuyuki; Horiuchi, Akira; Nakayama, Yoshiko; Tanaka, Naoki

    2011-01-01

    The ideal method to remove small colorectal polyps is unknown. We compared removal by colon snare transection without electrocautery (cold snare polypectomy) with conventional electrocautery snare polypectomy (hot polypectomy) in terms of procedure duration, difficulty in retrieving polyps, bleeding, and post-polypectomy symptoms. Patients with colorectal polyps up to 8 mm in diameter were randomized to polypectomy by cold snare technique (cold group) or conventional polypectomy (conventional group). The principal outcome measures were abdominal symptoms within 2 weeks after polypectomy. Secondary outcome measures were the rates of retrieval of colorectal polyps and bleeding. Eighty patients were randomized: cold group, n = 40 (101 polyps) and conventional group, n = 40 (104 polyps). The patients' demographic characteristics and the number and size of polyps removed were similar between the two techniques. Procedure time was significantly shorter with cold polypectomy vs. conventional polypectomy (18 vs. 25 min, p < 0.0001). Complete polyp retrieval rates were identical [96% (97/101) vs. 96% (100/104)]. No bleeding requiring hemostasis occurred in either group. Abdominal symptoms shortly after polypectomy were more common with conventional polypectomy (i.e. 20%; 8/40) than with cold polypectomy (i.e. 2.5%; 1/40; p = 0.029). Cold polypectomy was superior to conventional polypectomy in terms of procedure time and post-polypectomy abdominal symptoms. The two methods were otherwise essentially identical in terms of bleeding risk and complete polyp retrieval. Cold polypectomy is therefore the preferred method for removal of small colorectal polyps. Copyright © 2011 S. Karger AG, Basel.

  17. Job requirements compared to medical school education: differences between graduates from problem-based learning and conventional curricula.

    PubMed

    Schlett, Christopher L; Doll, Hinnerk; Dahmen, Janosch; Polacsek, Ole; Federkeil, Gero; Fischer, Martin R; Bamberg, Fabian; Butzlaff, Martin

    2010-01-14

Problem-based Learning (PBL) has been suggested as a key educational method of knowledge acquisition to improve medical education. We sought to evaluate the differences in medical school education between graduates from PBL-based and conventional curricula and to what extent these curricula fit job requirements. Graduates from all German medical schools who graduated between 1996 and 2002 were eligible for this study. Graduates self-assessed nine competencies as required at their day-to-day work and as taught in medical school on a 6-point Likert scale. Results were compared between graduates from a PBL-based curriculum (University Witten/Herdecke) and conventional curricula. Three schools were excluded because of low response rates. Baseline demographics between graduates of the PBL-based curriculum (n = 101, 49% female) and the conventional curricula (n = 4720, 49% female) were similar. No major differences were observed regarding job requirements, with priorities for "Independent learning/working" and "Practical medical skills". All competencies were rated to be better taught in the PBL-based curriculum compared to the conventional curricula (all p < 0.001), except for "Medical knowledge" and "Research competence". Comparing competencies required at work and taught in medical school, PBL was associated with benefits in "Interdisciplinary thinking" (Delta +0.88), "Independent learning/working" (Delta +0.57), "Psycho-social competence" (Delta +0.56), "Teamwork" (Delta +0.39) and "Problem-solving skills" (Delta +0.36), whereas "Research competence" (Delta -1.23) and "Business competence" (Delta -1.44) in the PBL-based curriculum needed improvement. Among medical graduates in Germany, PBL demonstrated benefits with regard to competencies which were highly required in the job of physicians. Research and business competence deserve closer attention in future curricular development.

  18. Rapid Antimicrobial Susceptibility Testing of Bacillus anthracis, Yersinia pestis, and Burkholderia pseudomallei by Use of Laser Light Scattering Technology.

    PubMed

    Bugrysheva, Julia V; Lascols, Christine; Sue, David; Weigel, Linda M

    2016-06-01

Rapid methods to determine antimicrobial susceptibility would assist in the timely distribution of effective treatment or postexposure prophylaxis in the aftermath of the release of bacterial biothreat agents such as Bacillus anthracis, Yersinia pestis, or Burkholderia pseudomallei. Conventional susceptibility tests require 16 to 48 h of incubation, depending on the bacterial species. We evaluated a method that is based on laser light scattering technology that measures cell density in real time. We determined that it has the ability to rapidly differentiate between growth (resistant) and no growth (susceptible) of several bacterial threat agents in the presence of clinically relevant antimicrobials. Results were available in <4 h for B. anthracis and <6 h for Y. pestis and B. pseudomallei. One exception was B. pseudomallei in the presence of ceftazidime, which required >10 h of incubation. Use of laser scattering technology decreased the time required to determine antimicrobial susceptibility by 50% to 75% for B. anthracis, Y. pestis, and B. pseudomallei compared to conventional methods. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  19. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    PubMed

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
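
    A minimal sketch of the residual rule underlying sparse representation based classification: code the test signal over a dictionary of training epochs, then assign the class whose atoms reconstruct it best. This uses orthogonal matching pursuit from scikit-learn on synthetic data; the paper's supervised/unsupervised dictionary-update schemes and incoherence-based modification are not reproduced here.

        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        def src_classify(dictionary, labels, x, n_nonzero=10):
            """Sparse-code x over the training dictionary (columns = training epochs),
            then pick the class whose atoms give the smallest reconstruction residual."""
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False)
            omp.fit(dictionary, x)
            coef = omp.coef_
            best_class, best_err = None, np.inf
            for c in np.unique(labels):
                mask = labels == c
                err = np.linalg.norm(x - dictionary[:, mask] @ coef[mask])
                if err < best_err:
                    best_class, best_err = c, err
            return best_class

        rng = np.random.default_rng(1)
        D = rng.standard_normal((64, 40))                 # 40 training epochs (features x atoms)
        D /= np.linalg.norm(D, axis=0)                    # unit-norm atoms
        y = np.repeat([0, 1], 20)                         # two classes (synthetic labels)
        test = D[:, 3] + 0.05 * rng.standard_normal(64)   # noisy copy of a class-0 training epoch
        print(src_classify(D, y, test))                   # expected: 0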

  20. Enhancement of Lipid Extraction from Marine Microalga, Scenedesmus Associated with High-Pressure Homogenization Process

    PubMed Central

    Cho, Seok-Cheol; Choi, Woon-Yong; Oh, Sung-Ho; Lee, Choon-Geun; Seo, Yong-Chang; Kim, Ji-Seon; Song, Chi-Ho; Kim, Ga-Vin; Lee, Shin-Young; Kang, Do-Hyung; Lee, Hyeon-Yong

    2012-01-01

Marine microalga Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with a high-pressure homogenization pretreatment process at 1200 psi and 35°C. The algal lipid yield was about 24.9% with this process, whereas only 19.8% lipid could be obtained by following the conventional lipid extraction procedure using the solvent chloroform : methanol (2 : 1, v/v). The present approach requires 30 min of process time and a moderate working temperature of 35°C, compared to the conventional extraction method which usually requires >5 hrs and a temperature of 65°C. It was found that this combined extraction process followed second-order reaction kinetics, which means most of the cellular lipids were extracted during the initial period of extraction, mostly within 30 min. In contrast, during the conventional extraction process, the cellular lipids were slowly and continuously extracted for >5 hrs, following first-order kinetics. Confocal and scanning electron microscopy revealed altered texture of algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily disrupt the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good quality of the lipids for biodiesel production. PMID:22969270
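
    The kinetic contrast described above can be made concrete by fitting the integrated second-order model to yield-versus-time data. The sketch below uses SciPy's curve_fit with made-up data points; the model form C(t) = Ce²·k·t / (1 + Ce·k·t) is the standard integrated second-order extraction equation, but the numbers are not the study's measurements.

        import numpy as np
        from scipy.optimize import curve_fit

        def second_order_yield(t, c_e, k):
            """Integrated second-order extraction kinetics: C(t) = Ce^2*k*t / (1 + Ce*k*t)."""
            return (c_e ** 2 * k * t) / (1.0 + c_e * k * t)

        # Illustrative lipid yield (%) versus extraction time (min); not the study's data.
        t = np.array([5.0, 10.0, 15.0, 20.0, 30.0, 45.0, 60.0])
        c = np.array([14.0, 19.0, 21.5, 23.0, 24.5, 25.0, 25.2])

        (c_e, k), _ = curve_fit(second_order_yield, t, c, p0=(25.0, 0.01))
        print(f"equilibrium yield ~ {c_e:.1f}%, rate constant k ~ {k:.3f} per (% * min)")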

  1. A simplified and efficient method for the analysis of fatty acid methyl esters suitable for large clinical studies.

    PubMed

    Masood, Athar; Stark, Ken D; Salem, Norman

    2005-10-01

    Conventional sample preparation for fatty acid analysis is a complicated, multiple-step process, and gas chromatography (GC) analysis alone can require >1 h per sample to resolve fatty acid methyl esters (FAMEs). Fast GC analysis was adapted to human plasma FAME analysis using a modified polyethylene glycol column with smaller internal diameters, thinner stationary phase films, increased carrier gas linear velocity, and faster temperature ramping. Our results indicated that fast GC analyses were comparable to conventional GC in peak resolution. A conventional transesterification method based on Lepage and Roy was simplified to a one-step method with the elimination of the neutralization and centrifugation steps. A robotics-amenable method was also developed, with lower methylation temperatures and in an open-tube format using multiple reagent additions. The simplified methods produced results that were quantitatively similar and with similar coefficients of variation as compared with the original Lepage and Roy method. The present streamlined methodology is suitable for the direct fatty acid analysis of human plasma, is appropriate for research studies, and will facilitate large clinical trials and make possible population studies.

  2. A Simplified Diagnostic Method for Elastomer Bond Durability

    NASA Technical Reports Server (NTRS)

    White, Paul

    2009-01-01

A simplified method has been developed for determining bond durability under exposure to water or high humidity conditions. It uses a small number of test specimens with relatively short times of water exposure at elevated temperature. The method is also gravimetric; the only equipment required is an oven, specimen jars, and a conventional laboratory balance.

  3. Edu-Mining for Book Recommendation for Pupils

    ERIC Educational Resources Information Center

    Nagata, Ryo; Takeda, Keigo; Suda, Koji; Kakegawa, Junichi; Morihiro, Koichiro

    2009-01-01

    This paper proposes a novel method for recommending books to pupils based on a framework called Edu-mining. One of the properties of the proposed method is that it uses only loan histories (pupil ID, book ID, date of loan) whereas the conventional methods require additional information such as taste information from a great number of users which…
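
    The idea of recommending from loan histories alone can be sketched with a simple item co-occurrence count: books frequently borrowed by the same pupils score highly for pupils who have borrowed one of them. This is only a minimal illustration of recommendation from (pupil ID, book ID) pairs, not the Edu-mining method of the paper, which also uses the date of loan.

        from collections import defaultdict
        from itertools import combinations

        def cooccurrence_recommend(loans, pupil_id, top_n=3):
            """Rank unseen books by how often they co-occur, across pupils,
            with the books this pupil has already borrowed."""
            by_pupil = defaultdict(set)
            for pid, book in loans:
                by_pupil[pid].add(book)
            pair_counts = defaultdict(int)
            for books in by_pupil.values():
                for a, b in combinations(sorted(books), 2):
                    pair_counts[(a, b)] += 1
                    pair_counts[(b, a)] += 1
            seen = by_pupil[pupil_id]
            scores = defaultdict(int)
            for mine in seen:
                for (a, b), n in pair_counts.items():
                    if a == mine and b not in seen:
                        scores[b] += n
            return sorted(scores, key=scores.get, reverse=True)[:top_n]

        # Toy loan history: (pupil ID, book ID) pairs.
        loans = [(1, "A"), (1, "B"), (2, "A"), (2, "C"), (3, "B"), (3, "C"),
                 (4, "A"), (4, "B"), (4, "D")]
        print(cooccurrence_recommend(loans, pupil_id=2))   # expected: ['B', 'D']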

  4. 76 FR 45281 - Notice of Submission of Proposed Information Collection to OMB; Public Housing Admissions...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-28

    ... project development is done in accordance with State laws and HUD requirements. The forms are prepared by a general contractor constructing a public housing development under the conventional bid method in... work for project development is done in accordance with State laws and HUD requirements. The forms are...

  5. Quantifying the accuracy of the tumor motion and area as a function of acceleration factor for the simulation of the dynamic keyhole magnetic resonance imaging method.

    PubMed

    Lee, Danny; Greer, Peter B; Pollock, Sean; Kim, Taeho; Keall, Paul

    2016-05-01

The dynamic keyhole is a new MR image reconstruction method for thoracic and abdominal MR imaging. To date, this method had not been investigated with cancer patient magnetic resonance imaging (MRI) data. The goal of this study was to assess the dynamic keyhole method for the task of lung tumor localization using cine-MR images reconstructed in the presence of respiratory motion. The dynamic keyhole method utilizes a previously acquired library of peripheral k-space datasets, binned by similar displacement and phase (where phase is simply used to determine whether the breathing is inhale-to-exhale or exhale-to-inhale), in conjunction with the newly acquired central k-space datasets (keyhole). External respiratory signals drive the process of sorting, matching, and combining the two k-space streams for each respiratory bin, thereby achieving faster image acquisition without substantial motion artifacts. This study was the first to investigate the impact of k-space undersampling on lung tumor motion and area assessment across clinically available techniques (zero-filling and conventional keyhole). In this study, the dynamic keyhole, conventional keyhole and zero-filling methods were compared to full k-space dataset acquisition by quantifying (1) the keyhole size required for central k-space datasets for constant image quality across sixty-four cine-MRI datasets from nine lung cancer patients, (2) the intensity difference between the original and reconstructed images for a constant keyhole size, and (3) the accuracy of tumor motion and area directly measured by tumor autocontouring. For constant image quality, the dynamic keyhole, conventional keyhole, and zero-filling methods required 22%, 34%, and 49% of the keyhole size (P < 0.0001), respectively, compared to the full k-space image acquisition method. Compared to the conventional keyhole and zero-filling reconstructed images with the keyhole size utilized in the dynamic keyhole method, the average intensity difference of the dynamic keyhole reconstructed images was minimal (P < 0.0001), resulting in tumor motion accuracy within 99.6% (P < 0.0001) and tumor area accuracy within 98.0% (P < 0.0001) for lung tumor monitoring applications. This study demonstrates that the dynamic keyhole method is a promising technique for clinical applications such as image-guided radiation therapy requiring MR monitoring of thoracic tumors. Based on the results from this study, the dynamic keyhole method could increase the imaging frequency by up to a factor of five compared with full k-space methods for real-time lung tumor MRI.
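
    The keyhole idea itself, splicing a freshly acquired central band of k-space into peripheral k-space taken from a matched library frame, can be sketched in a few lines. The example below is only a conceptual illustration on random data; the respiratory binning and displacement/phase matching of the dynamic keyhole method are not reproduced, and the keyhole fraction used here simply echoes the 22% figure reported above.

        import numpy as np

        def keyhole_reconstruct(central_kspace, library_kspace, keyhole_fraction=0.22):
            """Replace the central phase-encode band of a library k-space frame with the
            newly acquired central band, then inverse-FFT to an image (axis 0 = phase encode)."""
            ny = central_kspace.shape[0]
            half = max(1, int(round(keyhole_fraction * ny / 2.0)))
            centre = ny // 2
            combined = library_kspace.copy()
            combined[centre - half:centre + half, :] = central_kspace[centre - half:centre + half, :]
            return np.abs(np.fft.ifft2(np.fft.ifftshift(combined)))

        rng = np.random.default_rng(2)
        current = np.fft.fftshift(np.fft.fft2(rng.random((128, 128))))   # stand-in current frame
        library = np.fft.fftshift(np.fft.fft2(rng.random((128, 128))))   # stand-in library frame
        print(keyhole_reconstruct(current, library).shape)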

  6. Improved Dielectric Properties via Mechano-Chemical Activation in Ba0.80Pb0.20TiO3 Ceramics

    NASA Astrophysics Data System (ADS)

    Kumar, Parveen; Rani, Renu; Singh, Sangeeta; Juneja, J. K.; Prakash, Chandra; Raina, K. K.

    2011-12-01

This report concerns the preparation and dielectric properties of the commonly used Ba0.80Pb0.20TiO3 (BPT) ferroelectric ceramic prepared via Mechano-Chemical Activation (MCA). Results were compared with a BPT sample prepared by the conventional solid state method. The BPT sample prepared via the MCA technique was found to have decreased tetragonality, and its dielectric constant values (εRT = 450 and εmax = 6170) were approximately double those of the sample prepared by the conventional method (εRT = 260 and εmax = 3275). Also, the sample prepared by MCA was found to be less frequency dependent. Thus, the BPT sample prepared via MCA is more suitable than the conventionally prepared BPT sample for capacitor applications requiring lower frequency dependence.

  7. Patch-based generation of a pseudo CT from conventional MRI sequences for MRI-only radiotherapy of the brain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andreasen, Daniel, E-mail: dana@dtu.dk; Van Leemput, Koen; Hansen, Rasmus H.

Purpose: In radiotherapy (RT) based on magnetic resonance imaging (MRI) as the only modality, the information on electron density must be derived from the MRI scan by creating a so-called pseudo computed tomography (pCT). This is a nontrivial task, since the voxel intensities in an MRI scan are not uniquely related to electron density. To solve the task, voxel-based or atlas-based models have typically been used. The voxel-based models require a specialized dual ultrashort echo time MRI sequence for bone visualization and the atlas-based models require deformable registrations of conventional MRI scans. In this study, we investigate the potential of a patch-based method for creating a pCT based on conventional T1-weighted MRI scans without using deformable registrations. We compare this method against two state-of-the-art methods within the voxel-based and atlas-based categories. Methods: The data consisted of CT and MRI scans of five cranial RT patients. To compare the performance of the different methods, a nested cross validation was done to find optimal model parameters for all the methods. Voxel-wise and geometric evaluations of the pCTs were done. Furthermore, a radiologic evaluation based on water equivalent path lengths was carried out, comparing the upper hemisphere of the head in the pCT and the real CT. Finally, the dosimetric accuracy was tested and compared for a photon treatment plan. Results: The pCTs produced with the patch-based method had the best voxel-wise, geometric, and radiologic agreement with the real CT, closely followed by the atlas-based method. In terms of the dosimetric accuracy, the patch-based method had average deviations of less than 0.5% in measures related to target coverage. Conclusions: We showed that a patch-based method could generate an accurate pCT based on conventional T1-weighted MRI sequences and without deformable registrations. In our evaluations, the method performed better than existing voxel-based and atlas-based methods and showed a promising potential for RT of the brain based only on MRI.
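
    The patch-matching step at the heart of such a method can be illustrated with a nearest-neighbour lookup: each subject MRI patch is assigned a CT number derived from the most similar atlas patches, with no deformable registration. The sketch below uses scikit-learn on synthetic patches and a distance-weighted average; the patch size, k, and weighting are assumptions, not the parameters of the paper.

        import numpy as np
        from sklearn.neighbors import NearestNeighbors

        def pseudo_ct_from_patches(atlas_patches, atlas_ct, subject_patches, k=5):
            """Assign each subject MRI patch the distance-weighted mean CT number
            of its k most similar atlas MRI patches."""
            nn = NearestNeighbors(n_neighbors=k).fit(atlas_patches)
            dist, idx = nn.kneighbors(subject_patches)
            w = 1.0 / (dist + 1e-6)
            return np.sum(w * atlas_ct[idx], axis=1) / np.sum(w, axis=1)

        rng = np.random.default_rng(3)
        atlas_patches = rng.random((5000, 27))         # flattened 3x3x3 MRI patches (synthetic)
        atlas_ct = rng.uniform(-1000.0, 1500.0, 5000)  # central-voxel CT numbers (synthetic)
        subject_patches = rng.random((10, 27))
        print(pseudo_ct_from_patches(atlas_patches, atlas_ct, subject_patches))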

  8. Synthetic Hounsfield units from spectral CT data

    NASA Astrophysics Data System (ADS)

    Bornefalk, Hans

    2012-04-01

    Beam-hardening-free synthetic images with absolute CT numbers that radiologists are used to can be constructed from spectral CT data by forming ‘dichromatic’ images after basis decomposition. The CT numbers are accurate for all tissues and the method does not require additional reconstruction. This method prevents radiologists from having to relearn new rules-of-thumb regarding absolute CT numbers for various organs and conditions as conventional CT is replaced by spectral CT. Displaying the synthetic Hounsfield unit images side-by-side with images reconstructed for optimal detectability for a certain task can ease the transition from conventional to spectral CT.

  9. Meshless Local Petrov-Galerkin Euler-Bernoulli Beam Problems: A Radial Basis Function Approach

    NASA Technical Reports Server (NTRS)

    Raju, I. S.; Phillips, D. R.; Krishnamurthy, T.

    2003-01-01

    A radial basis function implementation of the meshless local Petrov-Galerkin (MLPG) method is presented to study Euler-Bernoulli beam problems. Radial basis functions, rather than generalized moving least squares (GMLS) interpolations, are used to develop the trial functions. This choice yields a computationally simpler method as fewer matrix inversions and multiplications are required than when GMLS interpolations are used. Test functions are chosen as simple weight functions as in the conventional MLPG method. Compactly and noncompactly supported radial basis functions are considered. The non-compactly supported cubic radial basis function is found to perform very well. Results obtained from the radial basis MLPG method are comparable to those obtained using the conventional MLPG method for mixed boundary value problems and problems with discontinuous loading conditions.

  10. A hybrid microfluidic-vacuum device for direct interfacing with conventional cell culture methods

    PubMed Central

    Chung, Bong Geun; Park, Jeong Won; Hu, Jia Sheng; Huang, Carlos; Monuki, Edwin S; Jeon, Noo Li

    2007-01-01

    Background Microfluidics is an enabling technology with a number of advantages over traditional tissue culture methods when precise control of cellular microenvironment is required. However, there are a number of practical and technical limitations that impede wider implementation in routine biomedical research. Specialized equipment and protocols required for fabrication and setting up microfluidic experiments present hurdles for routine use by most biology laboratories. Results We have developed and validated a novel microfluidic device that can directly interface with conventional tissue culture methods to generate and maintain controlled soluble environments in a Petri dish. It incorporates separate sets of fluidic channels and vacuum networks on a single device that allows reversible application of microfluidic gradients onto wet cell culture surfaces. Stable, precise concentration gradients of soluble factors were generated using simple microfluidic channels that were attached to a perfusion system. We successfully demonstrated real-time optical live/dead cell imaging of neural stem cells exposed to a hydrogen peroxide gradient and chemotaxis of metastatic breast cancer cells in a growth factor gradient. Conclusion This paper describes the design and application of a versatile microfluidic device that can directly interface with conventional cell culture methods. This platform provides a simple yet versatile tool for incorporating the advantages of a microfluidic approach to biological assays without changing established tissue culture protocols. PMID:17883868

  11. Cat-eye effect target recognition with single-pixel detectors

    NASA Astrophysics Data System (ADS)

    Jian, Weijian; Li, Li; Zhang, Xiaoyue

    2015-12-01

    A prototype of cat-eye effect target recognition with single-pixel detectors is proposed. Based on the framework of compressive sensing, cat-eye effect targets can be recognized by projecting a series of known random patterns and measuring the backscattered light with three single-pixel detectors at different locations. The prototype requires only simpler, less expensive detectors and extends well beyond the visible spectrum. Simulations were performed to evaluate the feasibility of the proposed prototype, and the results were compared with those obtained from conventional cat-eye effect target recognition methods using an area array sensor. The experimental results show that the method is feasible and superior to the conventional method in dynamic and complicated backgrounds.
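
    A hedged sketch of the recovery step, a single-pixel measurement model with random binary patterns solved by plain iterative soft-thresholding (ISTA), illustrates the compressive-sensing framework; the scene size, number of patterns, sparsity and threshold are made-up values, not the prototype's parameters:

      import numpy as np

      rng = np.random.default_rng(0)
      n, m, k = 256, 80, 8                      # scene pixels, patterns, sparsity

      x_true = np.zeros(n)                      # sparse scene: a few bright returns
      x_true[rng.choice(n, k, replace=False)] = rng.uniform(1.0, 2.0, k)

      A = rng.integers(0, 2, (m, n)).astype(float)   # random projection patterns
      y = A @ x_true                                  # single-pixel measurements

      # Iterative soft-thresholding for  min 0.5*||Ax - y||^2 + lam*||x||_1
      lam, step = 0.1, 1.0 / np.linalg.norm(A, 2) ** 2
      x = np.zeros(n)
      for _ in range(500):
          g = x - step * A.T @ (A @ x - y)
          x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)

      print("relative recovery error:",
            np.linalg.norm(x - x_true) / np.linalg.norm(x_true))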

  12. The Ultimate Pile Bearing Capacity from Conventional and Spectral Analysis of Surface Wave (SASW) Measurements

    NASA Astrophysics Data System (ADS)

    Faizah Bawadi, Nor; Anuar, Shamilah; Rahim, Mustaqqim A.; Mansor, A. Faizal

    2018-03-01

    A conventional and seismic method for determining the ultimate pile bearing capacity was proposed and compared. The Spectral Analysis of Surface Wave (SASW) method is one of the non-destructive seismic techniques that do not require drilling and sampling of soils, was used in the determination of shear wave velocity (Vs) and damping (D) profile of soil. The soil strength was found to be directly proportional to the Vs and its value has been successfully applied to obtain shallow bearing capacity empirically. A method is proposed in this study to determine the pile bearing capacity using Vs and D measurements for the design of pile and also as an alternative method to verify the bearing capacity from the other conventional methods of evaluation. The objectives of this study are to determine Vs and D profile through frequency response data from SASW measurements and to compare pile bearing capacities obtained from the method carried out and conventional methods. All SASW test arrays were conducted near the borehole and location of conventional pile load tests. In obtaining skin and end bearing pile resistance, the Hardin and Drnevich equation has been used with reference strains obtained from the method proposed by Abbiss. Back analysis results of pile bearing capacities from SASW were found to be 18981 kN and 4947 kN compared to 18014 kN and 4633 kN of IPLT with differences of 5% and 6% for Damansara and Kuala Lumpur test sites, respectively. The results of this study indicate that the seismic method proposed in this study has the potential to be used in estimating the pile bearing capacity.

  13. Comparison of Land, Water, and Energy Requirements of Lettuce Grown Using Hydroponic vs. Conventional Agricultural Methods

    PubMed Central

    Lages Barbosa, Guilherme; Almeida Gadelha, Francisca Daiane; Kublik, Natalya; Proctor, Alan; Reichelm, Lucas; Weissinger, Emily; Wohlleb, Gregory M.; Halden, Rolf U.

    2015-01-01

    The land, water, and energy requirements of hydroponics were compared to those of conventional agriculture by example of lettuce production in Yuma, Arizona, USA. Data were obtained from crop budgets and governmental agricultural statistics, and contrasted with theoretical data for hydroponic lettuce production derived by using engineering equations populated with literature values. Yields of lettuce per greenhouse unit (815 m2) of 41 ± 6.1 kg/m2/y had water and energy demands of 20 ± 3.8 L/kg/y and 90,000 ± 11,000 kJ/kg/y (±standard deviation), respectively. In comparison, conventional production yielded 3.9 ± 0.21 kg/m2/y of produce, with water and energy demands of 250 ± 25 L/kg/y and 1100 ± 75 kJ/kg/y, respectively. Hydroponics offered 11 ± 1.7 times higher yields but required 82 ± 11 times more energy compared to conventionally produced lettuce. To the authors’ knowledge, this is the first quantitative comparison of conventional and hydroponic produce production by example of lettuce grown in the southwestern United States. It identified energy availability as a major factor in assessing the sustainability of hydroponics, and it points to water-scarce settings offering an abundance of renewable energy (e.g., from solar, geothermal, or wind power) as particularly attractive regions for hydroponic agriculture. PMID:26086708
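
    A quick arithmetic check of the reported ratios from the point estimates alone (ignoring the propagated uncertainties) reproduces the stated factors:

      # Point-estimate check of the reported yield and energy ratios
      hydro_yield, conv_yield = 41.0, 3.9               # kg/m2/y
      hydro_energy, conv_energy = 90_000.0, 1_100.0     # kJ/kg/y

      print(hydro_yield / conv_yield)     # ~10.5, consistent with 11 +/- 1.7
      print(hydro_energy / conv_energy)   # ~81.8, consistent with 82 +/- 11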

  14. Comparison of Land, Water, and Energy Requirements of Lettuce Grown Using Hydroponic vs. Conventional Agricultural Methods.

    PubMed

    Barbosa, Guilherme Lages; Gadelha, Francisca Daiane Almeida; Kublik, Natalya; Proctor, Alan; Reichelm, Lucas; Weissinger, Emily; Wohlleb, Gregory M; Halden, Rolf U

    2015-06-16

    The land, water, and energy requirements of hydroponics were compared to those of conventional agriculture by example of lettuce production in Yuma, Arizona, USA. Data were obtained from crop budgets and governmental agricultural statistics, and contrasted with theoretical data for hydroponic lettuce production derived by using engineering equations populated with literature values. Yields of lettuce per greenhouse unit (815 m2) of 41 ± 6.1 kg/m2/y had water and energy demands of 20 ± 3.8 L/kg/y and 90,000 ± 11,000 kJ/kg/y (±standard deviation), respectively. In comparison, conventional production yielded 3.9 ± 0.21 kg/m2/y of produce, with water and energy demands of 250 ± 25 L/kg/y and 1100 ± 75 kJ/kg/y, respectively. Hydroponics offered 11 ± 1.7 times higher yields but required 82 ± 11 times more energy compared to conventionally produced lettuce. To the authors' knowledge, this is the first quantitative comparison of conventional and hydroponic produce production by example of lettuce grown in the southwestern United States. It identified energy availability as a major factor in assessing the sustainability of hydroponics, and it points to water-scarce settings offering an abundance of renewable energy (e.g., from solar, geothermal, or wind power) as particularly attractive regions for hydroponic agriculture.

  15. A stiffness derivative finite element technique for determination of crack tip stress intensity factors

    NASA Technical Reports Server (NTRS)

    Parks, D. M.

    1974-01-01

    A finite element technique for determination of elastic crack tip stress intensity factors is presented. The method, based on the energy release rate, requires no special crack tip elements. Further, the solution for only a single crack length is required, and the crack is 'advanced' by moving nodal points rather than by removing nodal tractions at the crack tip and performing a second analysis. The promising straightforward extension of the method to general three-dimensional crack configurations is presented and contrasted with the practical impossibility of conventional energy methods.
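
    For reference, the stiffness-derivative idea can be summarized in the usual notation (a standard statement of the energy-release-rate expression, not a quotation from the report), with the crack advance Delta a realized by moving the crack-tip nodes rather than re-meshing:

      G = -\frac{\partial \Pi}{\partial a} \approx -\frac{1}{2}\,\mathbf{u}^{\mathsf{T}}\,\frac{\Delta \mathbf{K}}{\Delta a}\,\mathbf{u},
      \qquad K_I = \sqrt{E' G} \quad \text{(mode I)},

    where u is the vector of nodal displacements, K the global stiffness matrix, and E' = E for plane stress or E/(1 - \nu^2) for plane strain.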

  16. A novel automatic quantification method for high-content screening analysis of DNA double strand-break response.

    PubMed

    Feng, Jingwen; Lin, Jie; Zhang, Pengquan; Yang, Songnan; Sa, Yu; Feng, Yuanming

    2017-08-29

    High-content screening is commonly used in studies of the DNA damage response. The double-strand break (DSB) is one of the most harmful types of DNA damage lesions. The conventional method used to quantify DSBs is γH2AX foci counting, which requires manual adjustment and preset parameters and is usually regarded as imprecise, time-consuming, poorly reproducible, and inaccurate. Therefore, a robust automatic alternative method is highly desired. In this manuscript, we present a new method for quantifying DSBs which involves automatic image cropping, automatic foci-segmentation and fluorescent intensity measurement. Furthermore, an additional function was added for standardizing the measurement of DSB response inhibition based on co-localization analysis. We tested the method with a well-known inhibitor of DSB response. The new method requires only one preset parameter, which effectively minimizes operator-dependent variations. Compared with conventional methods, the new method detected a higher percentage difference of foci formation between different cells, which can improve measurement accuracy. The effects of the inhibitor on DSB response were successfully quantified with the new method (p = 0.000). The advantages of this method in terms of reliability, automation and simplicity show its potential in quantitative fluorescence imaging studies and high-content screening for compounds and factors involved in DSB response.
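
    A generic sketch of automatic foci segmentation and integrated-intensity measurement with scikit-image conveys the flavor of such a pipeline; the smoothing parameter stands in for the single preset mentioned above, and none of this is the authors' actual implementation:

      import numpy as np
      from skimage.filters import gaussian, threshold_otsu
      from skimage.measure import label, regionprops

      def quantify_foci(nucleus_img, sigma=1.0):
          """Segment bright foci in a cropped nucleus image and return the
          focus count and the total integrated fluorescence intensity."""
          smoothed = gaussian(nucleus_img.astype(float), sigma=sigma)
          mask = smoothed > threshold_otsu(smoothed)       # automatic threshold
          labels = label(mask)
          props = regionprops(labels, intensity_image=nucleus_img.astype(float))
          total_intensity = sum(p.mean_intensity * p.area for p in props)
          return labels.max(), total_intensity

      # Illustrative use on a synthetic image with two bright "foci"
      img = np.random.poisson(5, (64, 64)).astype(float)
      img[20:24, 20:24] += 50.0
      img[40:43, 10:13] += 60.0
      print(quantify_foci(img))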

  17. A Meshless Method Using Radial Basis Functions for Beam Bending Problems

    NASA Technical Reports Server (NTRS)

    Raju, I. S.; Phillips, D. R.; Krishnamurthy, T.

    2004-01-01

    A meshless local Petrov-Galerkin (MLPG) method that uses radial basis functions (RBFs) as trial functions in the study of Euler-Bernoulli beam problems is presented. RBFs, rather than generalized moving least squares (GMLS) interpolations, are used to develop the trial functions. This choice yields a computationally simpler method as fewer matrix inversions and multiplications are required than when GMLS interpolations are used. Test functions are chosen as simple weight functions as they are in the conventional MLPG method. Compactly and noncompactly supported RBFs are considered. Noncompactly supported cubic RBFs are found to be preferable. Patch tests, mixed boundary value problems, and problems with complex loading conditions are considered. Results obtained from the radial basis MLPG method are either of comparable or better accuracy than those obtained when using the conventional MLPG method.

  18. Patch-Based Super-Resolution of MR Spectroscopic Images: Application to Multiple Sclerosis

    PubMed Central

    Jain, Saurabh; Sima, Diana M.; Sanaei Nezhad, Faezeh; Hangel, Gilbert; Bogner, Wolfgang; Williams, Stephen; Van Huffel, Sabine; Maes, Frederik; Smeets, Dirk

    2017-01-01

    Purpose: Magnetic resonance spectroscopic imaging (MRSI) provides complementary information to conventional magnetic resonance imaging. Acquiring high resolution MRSI is time consuming and requires complex reconstruction techniques. Methods: In this paper, a patch-based super-resolution method is presented to increase the spatial resolution of metabolite maps computed from MRSI. The proposed method uses high resolution anatomical MR images (T1-weighted and Fluid-attenuated inversion recovery) to regularize the super-resolution process. The accuracy of the method is validated against conventional interpolation techniques using a phantom, as well as simulated and in vivo acquired human brain images of multiple sclerosis subjects. Results: The method preserves tissue contrast and structural information, and matches well with the trend of acquired high resolution MRSI. Conclusions: These results suggest that the method has potential for clinically relevant neuroimaging applications. PMID:28197066

  19. The sensitivity of an hydroponic lettuce root elongation bioassay to metals, phenol and wastewaters.

    PubMed

    Park, Jihae; Yoon, Jeong-Hyun; Depuydt, Stephen; Oh, Jung-Woo; Jo, Youn-Min; Kim, Kyungtae; Brown, Murray T; Han, Taejun

    2016-04-01

    The root elongation bioassay is one of the most straightforward test methods used for environmental monitoring in terms of simplicity, rapidity and economy since it merely requires filter paper, distilled water and Petri dishes. However, filter paper as a support material is known to be problematic as it can reduce the sensitivity of the test. The newly developed hydroponic method reported here differs from the conventional root elongation method (US EPA filter paper method) in that no support material is used and the exposure time is shorter (48 h in this test versus 120 h in the US EPA test). For metals, the hydroponic test method was 3.3 (for Hg) to 57 (for Cu) times more sensitive than the US EPA method with the rank orders of sensitivity, estimated from EC50 values, being Cu≥Cd>Ni≥Zn≥Hg for the former and Hg≥Cu≥Ni≥Cd≥Zn for the latter methods. For phenol, the results did not differ significantly; EC50 values were 124 mg/L and 108-180 mg/L for the hydroponic and filter paper methods, respectively. Lettuce was less sensitive than daphnids to wastewaters, but the root elongation response appears to be wastewater-specific and is especially sensitive for detecting the presence of fluorine. The new hydroponic test thus provides many practical advantages, especially in terms of cost and time-effectiveness requiring only a well plate, a small volume of distilled water and short exposure period; furthermore, no specialist expertise is required. The method is simpler than the conventional EPA technique in not using filter paper which can influence the sensitivity of the test. Additionally, plant seeds have a long shelf-life and require little or no maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.
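
    A hedged sketch of how an EC50 can be extracted from root-elongation data with a log-logistic fit; the concentrations and root lengths below are invented for illustration and are not data from the study:

      import numpy as np
      from scipy.optimize import curve_fit

      def log_logistic(conc, ec50, hill, top):
          """Root length vs. toxicant concentration (upper plateau only)."""
          return top / (1.0 + (conc / ec50) ** hill)

      # Invented data: mean root length (mm) at increasing Cu concentrations (mg/L)
      conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
      length = np.array([24.0, 20.0, 13.0, 6.0, 2.0])

      popt, _ = curve_fit(log_logistic, conc, length, p0=[1.0, 1.0, 25.0])
      print("EC50 estimate (mg/L):", popt[0])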

  20. A novel approach for in vitro meat production.

    PubMed

    Pandurangan, Muthuraman; Kim, Doo Hwan

    2015-07-01

    The present review describes the possibility of in vitro meat production with the help of advanced co-culturing methods. In vitro meat production method could be a possible alternative for the conventional meat production. Originally, the research on in vitro meat production was initiated by the National Aeronautics and Space Administration (NASA) for space voyages. The required key qualities for accepting in vitro meat for consumption would be good efficiency ratio, increased protein synthesis rate in skeletal muscles, and mimicking the conventional meat qualities. In vitro culturing of meat is possible with the use of skeletal muscle tissue engineering, stem cell, cell co-culture, and tissue culture methods. Co-culture of myoblast and fibroblast is believed as one of the major techniques for in vitro meat production. In our lab, we have co-cultured myoblast and fibroblast. We believe that a billion pounds of in vitro meat could be produced from one animal for consumption. However, we require a great deal of research on in vitro meat production.

  1. Comparison of LED and Conventional Fluorescence Microscopy for Detection of Acid Fast Bacilli in a Low-Incidence Setting

    PubMed Central

    Minion, Jessica; Pai, Madhukar; Ramsay, Andrew; Menzies, Dick; Greenaway, Christina

    2011-01-01

    Introduction Light emitting diode fluorescence microscopes have many practical advantages over conventional mercury vapour fluorescence microscopes, which would make them the preferred choice for laboratories in both low- and high-resource settings, provided performance is equivalent. Methods In a nested case-control study, we compared diagnostic accuracy and time required to read slides with the Zeiss PrimoStar iLED, LW Scientific Lumin, and a conventional fluorescence microscope (Leica DMLS). Mycobacterial culture was used as the reference standard, and subgroup analysis by specimen source and organism isolated were performed. Results There was no difference in sensitivity or specificity between the three microscopes, and agreement was high for all comparisons and subgroups. The Lumin and the conventional fluorescence microscope were equivalent with respect to time required to read smears, but the Zeiss iLED was significantly time saving compared to both. Conclusions Light emitting diode microscopy should be considered by all tuberculosis diagnostic laboratories, including those in high income countries, as a replacement for conventional fluorescence microscopes. Our findings provide support to the recent World Health Organization policy recommending that conventional fluorescence microscopy be replaced by light emitting diode microscopy using auramine staining in all settings where fluorescence microscopy is currently used. PMID:21811622

  2. Spiking neural network simulation: memory-optimal synaptic event scheduling.

    PubMed

    Stewart, Robert D; Gurney, Kevin N

    2011-06-01

    Spiking neural network simulations incorporating variable transmission delays require synaptic events to be scheduled prior to delivery. Conventional methods have memory requirements that scale with the total number of synapses in a network. We introduce novel scheduling algorithms for both discrete and continuous event delivery, where the memory requirement scales instead with the number of neurons. Superior algorithmic performance is demonstrated using large-scale, benchmarking network simulations.
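
    A minimal sketch of delay-based event delivery whose storage scales with the number of neurons (one ring buffer per neuron of length equal to the maximum delay) rather than with the number of synapses; this only illustrates the scaling argument and is not the scheduling algorithms introduced in the paper:

      import numpy as np

      class RingBufferScheduler:
          """Per-neuron input ring buffers: memory ~ n_neurons * max_delay,
          independent of the total number of synapses."""

          def __init__(self, n_neurons, max_delay):
              self.buf = np.zeros((n_neurons, max_delay))
              self.t = 0

          def deliver(self, targets, weights, delays):
              # Schedule the synaptic events caused by one presynaptic spike.
              slots = (self.t + delays) % self.buf.shape[1]
              np.add.at(self.buf, (targets, slots), weights)

          def step(self):
              # Read the inputs arriving now, clear that slot and advance time.
              slot = self.t % self.buf.shape[1]
              inputs = self.buf[:, slot].copy()
              self.buf[:, slot] = 0.0
              self.t += 1
              return inputs

      sched = RingBufferScheduler(n_neurons=1000, max_delay=16)
      sched.deliver(targets=np.array([3, 7]), weights=np.array([0.5, 1.0]),
                    delays=np.array([2, 5]))
      print([sched.step()[[3, 7]] for _ in range(6)])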

  3. Accuracy of delta 18O isotope ratio measurements on the same sample by continuous-flow isotope-ratio mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    The doubly labeled water method is considered the reference method to measure energy expenditure. Conventional mass spectrometry requires a separate aliquot of the same sample to be prepared and analyzed separately. With continuous-flow isotope-ratio mass spectrometry, the same sample could be analy...

  4. A hydrogen gas-water equilibration method produces accurate and precise stable hydrogen isotope ratio measurements in nutrition studies

    USDA-ARS?s Scientific Manuscript database

    Stable hydrogen isotope methodology is used in nutrition studies to measure growth, breast milk intake, and energy requirement. Isotope ratio MS is the best instrumentation to measure the stable hydrogen isotope ratios in physiological fluids. Conventional methods to convert physiological fluids to ...

  5. Using a second‐order differential model to fit data without baselines in protein isothermal chemical denaturation

    PubMed Central

    Tang, Chuanning; Lew, Scott

    2016-01-01

    Abstract In vitro protein stability studies are commonly conducted via thermal or chemical denaturation/renaturation of a protein. Conventional data analyses of protein unfolding/(re)folding require well‐defined pre‐ and post‐transition baselines to evaluate the Gibbs free‐energy change associated with unfolding/(re)folding. This evaluation becomes problematic when there are insufficient data for determining the pre‐ or post‐transition baselines. In this study, fitting of such partial data obtained in protein chemical denaturation is established by introducing second‐order differential (SOD) analysis to overcome the limitations of the conventional fitting method. By reducing the number of baseline‐related fitting parameters, the SOD analysis can successfully fit incomplete chemical denaturation data sets, in high agreement with the conventional evaluation of the equivalent complete data, in cases where the conventional fitting fails. This SOD fitting for abbreviated isothermal chemical denaturation thus extends the data analysis options available for the insufficient data sets encountered in the two prevalent types of protein stability studies. PMID:26757366

  6. A discontinuous control volume finite element method for multi-phase flow in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Salinas, P.; Pavlidis, D.; Xie, Z.; Osman, H.; Pain, C. C.; Jackson, M. D.

    2018-01-01

    We present a new, high-order, control-volume-finite-element (CVFE) method for multiphase porous media flow with discontinuous 1st-order representation for pressure and discontinuous 2nd-order representation for velocity. The method has been implemented using unstructured tetrahedral meshes to discretize space. The method locally and globally conserves mass. However, unlike conventional CVFE formulations, the method presented here does not require the use of control volumes (CVs) that span the boundaries between domains with differing material properties. We demonstrate that the approach accurately preserves discontinuous saturation changes caused by permeability variations across such boundaries, allowing efficient simulation of flow in highly heterogeneous models. Moreover, accurate solutions are obtained at significantly lower computational cost than using conventional CVFE methods. We resolve a long-standing problem associated with the use of classical CVFE methods to model flow in highly heterogeneous porous media.

  7. Purification of anti-Japanese encephalitis virus monoclonal antibody by ceramic hydroxyapatite chromatography without proteins A and G.

    PubMed

    Saito, Maiko; Kurosawa, Yae; Okuyama, Tsuneo

    2012-02-01

    Antibody purification using proteins A and G has been a standard method for research and industrial processes. The conventional method, however, involves a three-step process, including buffer exchange, before chromatography. In addition, proteins A and G require low pH elution, which causes antibody aggregation and inactivation. This report proposes a two-step method using hydroxyapatite chromatography and membrane filtration, without proteins A and G. This novel method shortens the running time to one-third that of the conventional method for each cycle. Using our two-step method, 90.2% of the monoclonal antibodies purified were recovered in the elution fraction, the purity achieved was >90%, and most of the antigen-specific activity was retained. This report suggests that the two-step method using hydroxyapatite chromatography and membrane filtration should be considered as an alternative to purification using proteins A and G.

  8. Development of a quantum mechanics-based free-energy perturbation method: use in the calculation of relative solvation free energies.

    PubMed

    Reddy, M Rami; Singh, U C; Erion, Mark D

    2004-05-26

    Free-energy perturbation (FEP) is considered the most accurate computational method for calculating relative solvation and binding free-energy differences. Despite some success in applying FEP methods to both drug design and lead optimization, FEP calculations are rarely used in the pharmaceutical industry. One factor limiting the use of FEP is its low throughput, which is attributed in part to the dependence of conventional methods on the user's ability to develop accurate molecular mechanics (MM) force field parameters for individual drug candidates and the time required to complete the process. In an attempt to find an FEP method that could eventually be automated, we developed a method that uses quantum mechanics (QM) for treating the solute, MM for treating the solute surroundings, and the FEP method for computing free-energy differences. The thread technique was used in all transformations and proved to be essential for the successful completion of the calculations. Relative solvation free energies for 10 structurally diverse molecular pairs were calculated, and the results were in close agreement with both the calculated results generated by conventional FEP methods and the experimentally derived values. While considerably more CPU demanding than conventional FEP methods, this method (QM/MM-based FEP) alleviates the need for development of molecule-specific MM force field parameters and therefore may enable future automation of FEP-based calculations. Moreover, calculation accuracy should be improved over conventional methods, especially for calculations reliant on MM parameters derived in the absence of experimental data.
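
    For context, the free-energy perturbation identity underlying both the conventional MM and the QM/MM variants is the standard Zwanzig expression (written here in generic notation; the QM/MM partitioning changes only how the Hamiltonians are evaluated):

      \Delta A_{0 \rightarrow 1} = -k_{\mathrm{B}} T \,\ln \left\langle \exp\!\left[ -\frac{H_1 - H_0}{k_{\mathrm{B}} T} \right] \right\rangle_{0},

    with the ensemble average taken over configurations sampled with H_0; in practice the transformation is split into many small windows (here via the thread technique) so that each individual perturbation remains small.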

  9. Method for thermal and structural evaluation of shallow intense-beam deposition in matter

    NASA Astrophysics Data System (ADS)

    Pilan Zanoni, André

    2018-05-01

    The projected range of high-intensity proton and heavy-ion beams in matter at energies below a few tens of MeV/A can be as short as a few micrometers. For the evaluation of temperature and stresses from such a shallow beam energy deposition, conventional numerical 3D models require minuscule element sizes to maintain an acceptable element aspect ratio, as well as extremely short time steps for numerical convergence. In order to simulate the energy deposition with a manageable number of elements, this article presents a method using layered elements. The method is applied to beam stoppers and to accidental intense-beam impact onto UHV sector valves. In those cases the thermal results from the new method agree with those from conventional solid-element and adiabatic models.

  10. A Modified Method for Isolation of Rhein from Senna

    PubMed Central

    Mehta, Namita; Laddha, K. S.

    2009-01-01

    A simple and efficient method for the isolation of rhein from Cassia angustifolia (senna) leaves is described in which the hydrolysis of the sennosides and extraction of the hydrolysis products (free anthraquinones) is carried out in one step. Further isolation of rhein is achieved from the anthraquinone mixture. This method reduces the number of steps required for isolation of rhein as compared to conventional methods. PMID:20336207

  11. Magnetic method for stimulating transport in fluids

    DOEpatents

    Martin, James E.; Solis, Kyle J.

    2016-10-18

    A method for producing mass and heat transport in fluids, wherein the method does not rely on conventional convection, that is, it does not require gravity, a thermal gradient, or a magnetic field gradient. This method gives rise to a unique class of vigorous, field-controllable flow patterns termed advection lattices. The advection lattices can be used to transport heat and/or mass in any desired direction using only magnetic fields.

  12. 8 CFR 204.311 - Convention adoption home study requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 8, Aliens and Nationality (revised as of 2012-01-01). IMMIGRANT PETITIONS, Intercountry Adoption of a Convention Adoptee, § 204.311 Convention adoption home study requirements. (a) Purpose. For immigration purposes, a home study is a process for screening and preparing an...

  13. 8 CFR 204.311 - Convention adoption home study requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    Title 8, Aliens and Nationality (revised as of 2013-01-01). IMMIGRANT PETITIONS, Intercountry Adoption of a Convention Adoptee, § 204.311 Convention adoption home study requirements. (a) Purpose. For immigration purposes, a home study is a process for screening and preparing an...

  14. 8 CFR 204.311 - Convention adoption home study requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Title 8, Aliens and Nationality (revised as of 2014-01-01). IMMIGRANT PETITIONS, Intercountry Adoption of a Convention Adoptee, § 204.311 Convention adoption home study requirements. (a) Purpose. For immigration purposes, a home study is a process for screening and preparing an...

  15. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter

    PubMed Central

    Huang, Lei

    2015-01-01

    To solve the problem that conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using robust Kalman filtering is developed. The ARMA model parameters are employed as the state variables. Unknown time-varying estimators of the observation noise are used to obtain the estimated mean and variance of the observation noise. Using the robust Kalman filter, the ARMA model parameters are estimated accurately. The developed ARMA modeling method has the advantages of rapid convergence and high accuracy, so the required sample size is reduced. It can be applied to modeling of gyro random noise wherever a fast and accurate ARMA modeling method is required. PMID:26437409
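
    One common way to cast ARMA estimation as a filtering problem, sketched here in generic notation and not necessarily the exact formulation of the paper, is to treat the parameter vector as a slowly varying state and the ARMA recursion as a pseudo-linear measurement:

      y_k = \sum_{i=1}^{p} a_i\, y_{k-i} + e_k + \sum_{j=1}^{q} b_j\, e_{k-j}, \qquad
      \mathbf{x}_k = (a_1,\dots,a_p,\, b_1,\dots,b_q)^{\mathsf{T}},

      \mathbf{x}_k = \mathbf{x}_{k-1} + \mathbf{w}_k, \qquad
      y_k = \mathbf{h}_k^{\mathsf{T}} \mathbf{x}_k + e_k, \qquad
      \mathbf{h}_k = (y_{k-1},\dots,y_{k-p},\, \hat{e}_{k-1},\dots,\hat{e}_{k-q})^{\mathsf{T}},

    where the \hat{e} are innovation estimates and the (robust) Kalman filter updates x_k while also estimating the mean and variance of the observation noise e_k.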

  16. Quantifying the accuracy of the tumor motion and area as a function of acceleration factor for the simulation of the dynamic keyhole magnetic resonance imaging method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Danny; Pollock, Sean; Keall, Paul, E-mail: paul.keall@sydney.edu.au

    2016-05-15

    Purpose: The dynamic keyhole is a new MR image reconstruction method for thoracic and abdominal MR imaging. To date, this method has not been investigated with cancer patient magnetic resonance imaging (MRI) data. The goal of this study was to assess the dynamic keyhole method for the task of lung tumor localization using cine-MR images reconstructed in the presence of respiratory motion. Methods: The dynamic keyhole method utilizes a previously acquired library of peripheral k-space datasets from respiratory bins of similar displacement and phase (where phase is simply used to determine whether the breathing is inhale to exhale or exhale to inhale) in conjunction with newly acquired central k-space (keyhole) datasets. External respiratory signals drive the process of sorting, matching, and combining the two k-space streams for each respiratory bin, thereby achieving faster image acquisition without substantial motion artifacts. This study was the first to investigate the impact of k-space undersampling on lung tumor motion and area assessment across clinically available techniques (zero-filling and conventional keyhole). The dynamic keyhole, conventional keyhole, and zero-filling methods were compared to full k-space dataset acquisition by quantifying (1) the keyhole size required for the central k-space datasets to maintain constant image quality across sixty-four cine-MRI datasets from nine lung cancer patients, (2) the intensity difference between the original and reconstructed images at a constant keyhole size, and (3) the accuracy of tumor motion and area directly measured by tumor autocontouring. Results: For constant image quality, the dynamic keyhole, conventional keyhole, and zero-filling methods required 22%, 34%, and 49% of the keyhole size (P < 0.0001), respectively, compared to the full k-space image acquisition method. Compared to the conventional keyhole and zero-filling reconstructed images at the keyhole size utilized in the dynamic keyhole method, the average intensity difference of the dynamic keyhole reconstructed images was minimal (P < 0.0001), and the accuracy of tumor motion was within 99.6% (P < 0.0001) and the accuracy of tumor area within 98.0% (P < 0.0001) for lung tumor monitoring applications. Conclusions: This study demonstrates that the dynamic keyhole method is a promising technique for clinical applications such as image-guided radiation therapy requiring MR monitoring of thoracic tumors. Based on the results from this study, the dynamic keyhole method could increase the imaging frequency by up to a factor of five compared with full k-space methods for real-time lung tumor MRI.
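
    A minimal sketch of the keyhole idea, splicing newly acquired central k-space lines into library peripheral k-space from a matched respiratory bin and reconstructing with an inverse FFT; the array shapes, keyhole fraction and the omitted bin-matching step are illustrative assumptions, not the study's implementation:

      import numpy as np

      def keyhole_reconstruct(central_kspace, library_kspace, keyhole_fraction=0.22):
          """Splice a newly acquired central band of k-space into peripheral
          k-space taken from a library frame of similar respiratory state."""
          ny, nx = library_kspace.shape
          n_keep = int(round(keyhole_fraction * ny))     # central phase-encode lines
          lo = ny // 2 - n_keep // 2
          combined = library_kspace.copy()
          combined[lo:lo + n_keep, :] = central_kspace   # insert the keyhole
          return np.abs(np.fft.ifft2(np.fft.ifftshift(combined)))

      # Illustrative use with synthetic data
      full = np.fft.fftshift(np.fft.fft2(np.random.rand(128, 128)))
      image = keyhole_reconstruct(full[64 - 14:64 + 14, :], full, 28 / 128)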

  17. Laparoendoscopic single-site surgery varicocelectomy versus conventional laparoscopic varicocele ligation: A meta-analysis

    PubMed Central

    Li, Mingchao; Wang, Zhengyun

    2016-01-01

    Objective To perform a meta-analysis of data from available published studies comparing laparoendoscopic single-site surgery varicocelectomy (LESSV) with conventional transperitoneal laparoscopic varicocele ligation. Methods A comprehensive data search was performed in PubMed and Embase to identify randomized controlled trials and comparative studies that compared the two surgical approaches for the treatment of varicoceles. Results Six studies were included in the meta-analysis. LESSV required a significantly longer operative time than conventional laparoscopic varicocelectomy but was associated with significantly less postoperative pain at 6 h and 24 h, a shorter recovery time and greater patient satisfaction with the cosmetic outcome. There was no difference between the two surgical approaches in terms of postoperative semen quality or the incidence of complications. Conclusion These data suggest that LESSV offers a well tolerated and efficient alternative to conventional laparoscopic varicocelectomy, with less pain, a shorter recovery time and better cosmetic satisfaction. Further well-designed studies are required to confirm these findings and update the results of this meta-analysis. PMID:27688686

  18. Simple method to make a supersaturated oxygen fluid.

    PubMed

    Tange, Yoshihiro; Yoshitake, Shigenori; Takesawa, Shingo

    2018-01-22

    Intravenous oxygenation has demonstrated a significant increase in the partial pressure of oxygen (PO2) in animal models. A highly dissolved oxygen solution might be able to provide a sufficient level of oxygen delivery to the tissues and organs of patients with hypoxia. However, conventional fluid oxygenation methods have required purpose-built devices. If simpler oxygenation of a solution is possible, it will be a useful strategy for application in clinical practice. We prepared such a solution simply by injecting either air or oxygen gas into conventional saline. We determined the PO2 values in the solutions in comparison with conventional saline in vitro. To examine the effects of administering the new solutions on the blood gas profile, we diluted bovine blood with either the conventional or the new solutions and analyzed PO2, oxygen saturation (SO2) and total oxygen content. PO2 levels in the blood and new solution mixture increased significantly with each additional injected gas volume. Significant increases in the PO2 and SO2 of the bovine blood were found in the samples mixed with the new solution, as compared with those mixed with the control solution. These results suggest that this solution promotes oxygen delivery to hypoxic tissue and recovery from hypoxia. The method is simpler and easier than previous methods.

  19. The 21st century skills with model eliciting activities on linear program

    NASA Astrophysics Data System (ADS)

    Handajani, Septriana; Pratiwi, Hasih; Mardiyana

    2018-04-01

    Human resources in the 21st century are required to master various skills, including critical thinking and problem solving. Teaching in the 21st century integrates literacy, knowledge, skills, attitudes, and mastery of ICT. This study aims to determine whether applying Model Eliciting Activities (MEAs) that integrate 21st century skills, namely the 4Cs, and conventional learning have different effects on learning outcomes. The research was conducted at a vocational high school in the odd semester of 2017 using an experimental method. The experimental class was taught with MEAs that integrate the 4C skills and the control class was given conventional learning. Data were collected through documentation and tests, and analyzed with a Z-test on data obtained from the experimental and control classes. The results show a difference in the effect on learning outcomes between MEAs that integrate the 4C skills and conventional learning. Classes taught with MEAs integrating the 4C skills achieve better learning outcomes than conventional learning classes. This is because MEAs that integrate the 4C skills can improve creativity, communication, collaboration, and problem-solving skills.

  20. [Evaluation of a Computer-Aided Microscope System and Its Anti-Nuclear Antibody Test Kit for Indirect Immunofluorescence Assay].

    PubMed

    Hayashi, Nobuhide; Saegusa, Jun; Uto, Kenichi; Oyabu, Chinami; Saito, Toshiharu; Sato, Itsuko; Kawano, Seiji; Kumagai, Shunichi

    2016-02-01

    Antinuclear antibody (ANA) testing is indispensable for diagnosing and understanding clinical conditions of autoimmune diseases. The indirect immunofluorescence assay (IFA) is the gold standard for ANA screening, and it can detect more than 100 different antibodies, such as anti-PCNA as well as anti-cytoplasmic antibodies. However, complicated procedures of conventional IFA and visual interpretation require highly skilled laboratory staff. This study evaluates the capability, characteristics, and applicability of the recently developed ANA detection system (EUROPattern Cosmic IFA System, EPA) using HEp20-10 cells and the automated pattern recognition microscope. Findings using EPA and conventional methods were compared in 282 sera obtained from connective tissue disease patients and 250 sera from healthy individuals. The concordance of the positivity rate, antibody titer (within +/- 1 tube difference), and the accurate recognition rate of ANA patterns between the automated EPA method and the microscopic judgement of the EPA image by eye was 98.9, 97.4, and 55.3%, respectively. The EPA method showed concordance of the positivity rate as high as 93.3% and concordance of the antibody titer as high as 94.0% (within +/- 1 titer) compared with the conventional method. Regarding the four typical patterns of ANA (homogeneous, speckled, nucleolar, and centromere), large differences between the EPA and conventional methods were not observed, and the rate of concordance between the final EPA result and the conventional method was from 94.1 to 100%. The positivity rate of ANA using the EPA and conventional methods showed marked agreement among the six connective tissue diseases (SLE, MCTD, SSc, PM/DM, and SS) and healthy individuals. Although the EPA system is not considered a complete system and laboratory staff should verify the results, it is a useful system for routine ANA analysis because it contributes to ANA standardization and an efficient workflow.

  1. Passive wireless strain monitoring of tire using capacitance change

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Ryosuke; Todoroki, Akira

    2004-07-01

    In-service strain monitoring of automobile tires is effective for improving the reliability of tires and anti-lock braking systems (ABS). Since conventional strain gages have high stiffness and require lead wires, they are cumbersome for strain measurements on tires. In a previous study, the authors proposed a wireless strain monitoring method that adopts the tire itself as a sensor, together with an oscillating circuit. This method is simple and useful, but it requires a battery to activate the oscillating circuit. In the present study, the previous method for wireless tire monitoring is improved to produce a passive wireless sensor. A specimen made from a commercially available tire is connected to a tuning circuit comprising an inductance and a capacitance as a condenser. A capacitance change of the tire shifts the tuning frequency, and this shift of the tuned radio wave enables the applied strain of the specimen to be measured wirelessly, without any external power supply. The new passive wireless method is applied to a specimen and the static applied strain is measured. As a result, the method is experimentally shown to be effective for passive wireless strain monitoring of tires.
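
    The underlying relation is the resonance of the LC tuning circuit; to first order a small capacitance change produces a proportional shift of the tuned frequency, which is what the strain readout relies on (standard circuit relations, not formulas quoted from the paper):

      f_0 = \frac{1}{2\pi\sqrt{L C}}, \qquad \frac{\Delta f_0}{f_0} \approx -\frac{1}{2} \frac{\Delta C}{C},

    so a strain-induced capacitance change \Delta C of the tire appears as a measurable shift of the tuned radio frequency without any on-board power source.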

  2. A comparative study of the novel spectrophotometric methods versus conventional ones for the simultaneous determination of Esomeprazole magnesium trihydrate and Naproxen in their binary mixture.

    PubMed

    Lotfy, Hayam M; Amer, Sawsan M; Zaazaa, Hala E; Mostafa, Noha S

    2015-01-01

    Two novel, simple, specific, accurate and precise spectrophotometric methods manipulating ratio spectra, namely absorbance subtraction and ratio difference, are developed and validated for the simultaneous determination of Esomeprazole magnesium trihydrate (ESO) and Naproxen (NAP). The results were compared to those of the conventional spectrophotometric methods, namely dual wavelength and isoabsorptive point coupled with first derivative of ratio spectra and derivative ratio. The suggested methods were validated in compliance with the ICH guidelines and were successfully applied for the determination of ESO and NAP in laboratory-prepared mixtures and a pharmaceutical preparation. No preliminary separation steps are required for the proposed spectrophotometric procedures. The statistical comparison showed no significant difference between the proposed methods and the reported method with respect to either accuracy or precision. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Electroplating eliminates gas leakage in brazed areas

    NASA Technical Reports Server (NTRS)

    Leigh, J. D.

    1966-01-01

    Electroplating method seals brazed or welded joints against gas leakage under high pressure. Any conventional electroplating process with many different metal anodes can be used, as well as the build up of layers of different metals to any required thickness.

  4. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

    Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.

  5. Quality characteristics of battered and fried chicken: comparison of pressure frying and conventional frying.

    PubMed

    Das, Rashmi; Pawar, Deepthi P; Modi, Vinod Kumar

    2013-04-01

    Marinated and battered chicken leg meat and breast meat were pressure fried, and their physico-chemical qualities were compared to conventionally fried products (open-pan deep fat frying). Shrinkage due to the frying process was significantly lower for pressure-fried leg meat (PLM) and breast meat (PBM) than for conventionally fried leg meat (CLM) and breast meat (CBM). The juiciness of the pressure-fried chicken products was also superior (p ≤ 0.05) to that of products obtained by the conventional method. PLM and PBM had lower fat content (p ≤ 0.05) than conventionally fried CLM and CBM. Lipid oxidation was higher (p ≤ 0.05) in conventional frying than in pressure frying. Irrespective of the type of chicken meat, conventionally fried meat required a higher shear force than pressure-fried products. Salmonella, Staphylococcus aureus, Shigella and E. coli were not detected. The study indicates the usefulness and superiority of pressure frying in comparison to conventional deep fat frying.

  6. Speeding up local correlation methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kats, Daniel

    2014-12-28

    We present two techniques that can substantially speed up the local correlation methods. The first one allows one to avoid the expensive transformation of the electron-repulsion integrals from atomic orbitals to virtual space. The second one introduces an algorithm for the residual equations in the local perturbative treatment that, in contrast to the standard scheme, does not require holding the amplitudes or residuals in memory. It is shown that even an interpreter-based implementation of the proposed algorithm in the context of local MP2 method is faster and requires less memory than the highly optimized variants of conventional algorithms.

  7. Application of Grey Relational Analysis to Decision-Making during Product Development

    ERIC Educational Resources Information Center

    Hsiao, Shih-Wen; Lin, Hsin-Hung; Ko, Ya-Chuan

    2017-01-01

    A multi-attribute decision-making (MADM) approach was proposed in this study as a prediction method that differs from the conventional production and design methods for a product. When a client has different dimensional requirements, this approach can quickly provide a company with design decisions for each product. The production factors of a…
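
    A small sketch of the standard grey relational analysis steps (normalization, grey relational coefficients with distinguishing coefficient zeta = 0.5, equal-weight grades) may clarify the mechanics; the decision matrix below is invented and is not data from the article:

      import numpy as np

      def grey_relational_grades(X, zeta=0.5):
          """X: alternatives x criteria matrix (larger-is-better criteria).
          Returns one grey relational grade per alternative (equal weights)."""
          norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
          delta = np.abs(1.0 - norm)            # distance to the ideal sequence
          dmin, dmax = delta.min(), delta.max()
          coeff = (dmin + zeta * dmax) / (delta + zeta * dmax)
          return coeff.mean(axis=1)

      # Illustrative: three design alternatives scored on four benefit criteria
      X = np.array([[7.0, 0.8, 120.0, 3.0],
                    [6.0, 0.9, 150.0, 2.0],
                    [8.0, 0.7, 100.0, 4.0]])
      print(grey_relational_grades(X))          # higher grade = preferred alternative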

  8. Runge-Kutta Methods for Linear Ordinary Differential Equations

    NASA Technical Reports Server (NTRS)

    Zingg, David W.; Chisholm, Todd T.

    1997-01-01

    Three new Runge-Kutta methods are presented for numerical integration of systems of linear inhomogeneous ordinary differential equations (ODES) with constant coefficients. Such ODEs arise in the numerical solution of the partial differential equations governing linear wave phenomena. The restriction to linear ODEs with constant coefficients reduces the number of conditions which the coefficients of the Runge-Kutta method must satisfy. This freedom is used to develop methods which are more efficient than conventional Runge-Kutta methods. A fourth-order method is presented which uses only two memory locations per dependent variable, while the classical fourth-order Runge-Kutta method uses three. This method is an excellent choice for simulations of linear wave phenomena if memory is a primary concern. In addition, fifth- and sixth-order methods are presented which require five and six stages, respectively, one fewer than their conventional counterparts, and are therefore more efficient. These methods are an excellent option for use with high-order spatial discretizations.
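
    For orientation, a classical fourth-order Runge-Kutta step applied to the linear constant-coefficient system u' = A u + f(t) is sketched below (the conventional three-register scheme that the paper improves on; the low-storage and five/six-stage variants themselves are not reproduced):

      import numpy as np

      def rk4_linear(A, f, u0, t0, t1, n_steps):
          """Classical RK4 for the linear system u' = A u + f(t), constant A."""
          u, t = np.array(u0, dtype=float), t0
          h = (t1 - t0) / n_steps
          for _ in range(n_steps):
              k1 = A @ u + f(t)
              k2 = A @ (u + 0.5 * h * k1) + f(t + 0.5 * h)
              k3 = A @ (u + 0.5 * h * k2) + f(t + 0.5 * h)
              k4 = A @ (u + h * k3) + f(t + h)
              u = u + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
              t += h
          return u

      # Illustrative: a lightly damped 2x2 oscillator with constant forcing
      A = np.array([[0.0, 1.0], [-4.0, -0.1]])
      print(rk4_linear(A, lambda t: np.array([0.0, 1.0]), [1.0, 0.0], 0.0, 5.0, 200))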

  9. Damage diagnosis algorithm using a sequential change point detection method with an unknown distribution for damage

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.

    2012-04-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
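
    A hedged sketch of a CUSUM-style sequential detector for a mean shift in a damage-sensitive feature, with the pre-damage distribution assumed known and a simple running estimate standing in for the unknown post-damage mean; this illustrates sequential change-point detection in general, not the authors' algorithm, and all parameter values are illustrative:

      import numpy as np

      def sequential_detect(features, mu0, sigma, threshold=10.0, window=20):
          """CUSUM-like detector: accumulate log-likelihood ratios of a
          shifted-mean Gaussian against the known pre-damage model N(mu0, sigma^2);
          the post-damage mean is re-estimated from recent data (unknown a priori)."""
          score, history = 0.0, []
          for t, x in enumerate(features):
              mu1 = np.mean(history[-window:]) if history else mu0
              llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
              score = max(0.0, score + llr)      # reset keeps false starts cheap
              history.append(x)
              if score > threshold:
                  return t                       # sample index at which damage is declared
          return None

      rng = np.random.default_rng(1)
      feat = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(1.5, 1.0, 100)])
      print(sequential_detect(feat, mu0=0.0, sigma=1.0))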

  10. Colon Capsule Endoscopy: Where Are We and Where Are We Going

    PubMed Central

    Han, Yoo Min; Im, Jong Pil

    2016-01-01

    Colon capsule endoscopy (CCE) is a noninvasive technique for diagnostic imaging of the colon. It does not require air inflation or sedation and allows minimally invasive and painless colonic evaluation. The role of CCE is rapidly evolving; for example, for colorectal cancer (CRC) screening in average-risk patients, in patients with an incomplete colonoscopy, in patients refusing a conventional colonoscopy, and in patients with contraindications for conventional colonoscopy. In this paper, we comprehensively review the technical characteristics and procedure of CCE and compare CCE with conventional methods such as conventional colonoscopy or computed tomographic colonography. Future expansion of CCE in the area of CRC screening for the surveillance of polyps and adenomatous lesions and for assessment of inflammatory bowel disease is also discussed. PMID:27653441

  11. Microbial Burden Approach : New Monitoring Approach for Measuring Microbial Burden

    NASA Technical Reports Server (NTRS)

    Venkateswaran, Kasthuri; Vaishampayan, Parag; Barmatz, Martin

    2013-01-01

    Topics covered include the advantages of a new approach for differentiating live cells/spores from dead cells/spores, four examples of Salmonella outbreaks leading to costly destruction of dairy products, a list of possible collaboration activities between JPL and other industries (for future discussion), the limitations of traditional microbial monitoring approaches, an introduction to the new approach for rapid measurement of viable (live) bacterial cells/spores and its areas of application, and a detailed example of determining live spores using the new approach (a similar procedure applies to live cells). JPL has developed a patented approach for measuring the amount of live and dead cells/spores. This novel "molecular" method takes less than 5 to 7 hrs, compared to the seven days required using conventional techniques. Conventional "molecular" techniques cannot discriminate live cells/spores among dead cells/spores. The JPL-developed novel method eliminates the false positive results obtained from conventional "molecular" techniques that lead to unnecessary delay in processing and to unnecessary destruction of food products.

  12. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them using a retention apparatus. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, and the least retention was seen with the conventional molding technique. PMID:27382542

  13. Joint Contracture Orthosis (JCO)

    NASA Technical Reports Server (NTRS)

    Lunsford, Thomas R.; Parsons, Ken; Krouskop, Thomas; McGee, Kevin

    1997-01-01

    The purpose of this project was to develop an advanced orthosis which is effective in reducing upper and lower limb contractures in significantly less time than currently required with conventional methods. The team that developed the JCO consisted of an engineer, orthotist, therapist, and physician.

  14. A non-randomised, controlled clinical trial of an innovative device for negative pressure wound therapy of pressure ulcers in traumatic paraplegia patients.

    PubMed

    Srivastava, Rajeshwar N; Dwivedi, Mukesh K; Bhagat, Amit K; Raj, Saloni; Agarwal, Rajiv; Chandra, Abhijit

    2016-06-01

    The conventional methods of treatment of pressure ulcers (PUs) by serial debridement and daily dressings require prolonged hospitalisation, associated with considerable morbidity. There is, however, recent evidence to suggest that negative pressure wound therapy (NPWT) accelerates healing. The commercial devices for NPWT are costly, cumbersome, and electricity dependent. We compared PU wound healing in traumatic paraplegia patients by conventional dressing and by an innovative negative pressure device (NPD). In this prospective, non-randomised trial, 48 traumatic paraplegia patients with PUs of stages 3 and 4 were recruited. Patients were divided into two groups: group A (n = 24) received NPWT with our NPD, and group B (n = 24) received conventional methods of dressing. All patients were followed up for 9 weeks. At week 9, all patients on NPD showed a statistically significant improvement in PU healing in terms of slough clearance, granulation tissue formation, wound discharge and culture. A significant reduction in wound size and ulcer depth was observed in NPD as compared with conventional methods at all follow-up time points (P = 0·0001). NPWT by the innovative device heals PUs at a significantly higher rate than conventional treatment. The device is safe, easy to apply and cost-effective. © 2014 The Authors. International Wound Journal © 2014 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  15. Development of Extended Ray-tracing method including diffraction, polarization and wave decay effects

    NASA Astrophysics Data System (ADS)

    Yanagihara, Kota; Kubo, Shin; Dodin, Ilya; Nakamura, Hiroaki; Tsujimura, Toru

    2017-10-01

    Geometrical-optics ray tracing is a reasonable numerical approach for describing the electron cyclotron resonance wave (ECW) in slowly varying, spatially inhomogeneous plasma. It is well known that the results of this conventional method are adequate in most cases. However, in helical fusion plasmas with complicated magnetic structure, strong magnetic shear combined with a large density scale length can cause mode coupling of waves outside the last closed flux surface, and the complicated absorption structure requires a strongly focused wave for ECH. Since the conventional ray equations describing the ECW contain no terms for diffraction, polarization, or wave decay effects, they cannot accurately describe mode coupling of waves, strongly focused waves, the behavior of waves in an inhomogeneous absorption region, and so on. As a fundamental solution to these problems, we consider an extension of the ray-tracing method. The specific procedure is planned as follows. First, calculate the reference ray by the conventional method and define a local ray-base coordinate system along the reference ray. Then, calculate the evolution of the distributions of amplitude and phase on the ray-base coordinates step by step. The progress of our extended method will be presented.
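
    For reference, the conventional geometrical-optics ray equations that the extension builds on take the Hamiltonian form for a dispersion relation D(x, k, omega) = 0 (standard notation; the added evolution of amplitude and phase on the ray-base coordinates is not reproduced here):

      \frac{d\mathbf{x}}{d\tau} = \frac{\partial D}{\partial \mathbf{k}}, \qquad
      \frac{d\mathbf{k}}{d\tau} = -\frac{\partial D}{\partial \mathbf{x}}, \qquad
      D(\mathbf{x}, \mathbf{k}, \omega) = 0.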

  16. A Review of Microwave-Assisted Reactions for Biodiesel Production

    PubMed Central

    Nomanbhay, Saifuddin; Ong, Mei Yin

    2017-01-01

    The conversion of biomass into chemicals and biofuels is an active research area as trends move to replace fossil fuels with renewable resources due to society’s increased concern towards sustainability. In this context, microwave processing has emerged as a tool in organic synthesis and plays an important role in developing a more sustainable world. Integration of processing methods with microwave irradiation has resulted in a great reduction in the time required for many processes, while the reaction efficiencies have been increased markedly. Microwave processing produces a higher yield with a cleaner profile in comparison to other methods. The microwave processing is reported to be a better heating method than the conventional methods due to its unique thermal and non-thermal effects. This paper provides an insight into the theoretical aspects of microwave irradiation practices and highlights the importance of microwave processing. The potential of the microwave technology to accomplish superior outcomes over the conventional methods in biodiesel production is presented. A green process for biodiesel production using a non-catalytic method is still new and very costly because of the supercritical condition requirement. Hence, non-catalytic biodiesel conversion under ambient pressure using microwave technology must be developed, as the energy utilization for microwave-based biodiesel synthesis is reported to be lower and cost-effective. PMID:28952536

  17. A Review of Microwave-Assisted Reactions for Biodiesel Production.

    PubMed

    Nomanbhay, Saifuddin; Ong, Mei Yin

    2017-06-15

    The conversion of biomass into chemicals and biofuels is an active research area as trends move to replace fossil fuels with renewable resources due to society's increased concern towards sustainability. In this context, microwave processing has emerged as a tool in organic synthesis and plays an important role in developing a more sustainable world. Integration of processing methods with microwave irradiation has resulted in a great reduction in the time required for many processes, while the reaction efficiencies have been increased markedly. Microwave processing produces a higher yield with a cleaner profile in comparison to other methods. The microwave processing is reported to be a better heating method than the conventional methods due to its unique thermal and non-thermal effects. This paper provides an insight into the theoretical aspects of microwave irradiation practices and highlights the importance of microwave processing. The potential of the microwave technology to accomplish superior outcomes over the conventional methods in biodiesel production is presented. A green process for biodiesel production using a non-catalytic method is still new and very costly because of the supercritical condition requirement. Hence, non-catalytic biodiesel conversion under ambient pressure using microwave technology must be developed, as the energy utilization for microwave-based biodiesel synthesis is reported to be lower and cost-effective.

  18. Laparoendoscopic single-site surgery (LESS) versus conventional laparoscopic surgery for adnexal preservation: a randomized controlled study

    PubMed Central

    Cho, Yeon Jean; Kim, Mi-La; Lee, Soo Yoon; Lee, Hee Suk; Kim, Joo Myoung; Joo, Kwan Young

    2012-01-01

    Objective To compare the operative outcomes, postoperative pain, and subsequent convalescence after laparoendoscopic single-site surgery (LESS) or conventional laparoscopic surgery for adnexal preservation. Study design From December 2009 to September 2010, 63 patients underwent LESS (n = 33) or conventional laparoscopic surgery (n = 30) for cyst enucleation. The overall operative outcomes, including postoperative pain measured using the visual analog scale (VAS) at 6, 24, and 48 hours, were evaluated. The convalescence data included data obtained from questionnaires on the need for analgesics and on patient-reported time to recovery end points. Results The preoperative characteristics did not significantly differ between the two groups. The postoperative hemoglobin drop was higher in the LESS group than in the conventional laparoscopic surgery group (P = 0.048). Postoperative pain at each VAS time point, oral analgesic requirement, intramuscular analgesic requirement, and the number of days until return to work were similar in both groups. Conclusion In adnexa-preserving surgery performed in reproductive-age women, the operative outcomes, including patient satisfaction and convalescence after surgery, are comparable for LESS and conventional laparoscopy. LESS may be a feasible and promising alternative method for scarless abdominal surgery in the treatment of young women with adnexal cysts. PMID:22448110

  19. Center index method-an alternative for wear measurements with radiostereometry (RSA).

    PubMed

    Dahl, Jon; Figved, Wender; Snorrason, Finnur; Nordsletten, Lars; Röhrl, Stephan M

    2013-03-01

    Radiostereometry (RSA) is considered to be the most precise and accurate method for wear measurements in total hip replacement. Post-operative stereoradiographs have so far been necessary for wear measurement. Hence, the use of RSA has been limited to studies planned for RSA measurements. We compared a new RSA method for wear measurements that does not require previous radiographs with conventional RSA. Instead of comparing present stereoradiographs with post-operative ones, we developed a method for calculating the post-operative position of the center of the femoral head on the present examination and using this as the index measurement. We compared this alternative method to conventional RSA in 27 hips in an ongoing RSA study. We found a high degree of agreement between the methods for both mean proximal (1.19 mm vs. 1.14 mm) and mean 3D wear (1.52 mm vs. 1.44 mm) after 10 years. Intraclass correlation coefficients (ICC) were 0.958 and 0.955, respectively (p<0.001 for both ICCs). The results were also within the limits of agreement when plotted subject-by-subject in a Bland-Altman plot. Our alternative method for wear measurements with RSA offers results comparable to conventional RSA measurements. It allows precise wear measurements without previous radiological examinations. Copyright © 2012 Orthopaedic Research Society.

  20. Microwave-Assisted Synthesis of High Dielectric Constant CaCu3Ti4O12 from Sol-Gel Precursor

    NASA Astrophysics Data System (ADS)

    Ouyang, Xin; Cao, Peng; Huang, Saifang; Zhang, Weijun; Huang, Zhaohui; Gao, Wei

    2015-07-01

    CaCu3Ti4O12 (CCTO) powders derived from sol-gel precursors were calcined and sintered via microwave radiation. The obtained CCTO powders were compared with those obtained via a conventional heating method. For microwave heating, 89.1 wt.% CCTO was achieved from the sol-gel precursor after only 17 min at 950°C. In contrast, the conventional calcination method required 3 h to generate 87.6 wt.% CCTO content at 1100°C. In addition, the CCTO powders prepared through 17 min of microwave calcination exhibited a small particle size distribution of D50 = 3.826 μm. It was found that a lengthy hold time of 1 h by microwave sintering is required to obtain a high dielectric constant (3.14 × 10^3 at 10^2 Hz) and a reasonably low dielectric loss (0.161) in the sintered CCTO ceramic. Based upon the distinct microstructures, the dielectric responses of the CCTO samples sintered by different methods are attributed to space charge polarization and the internal barrier layer capacitor mechanism.

  1. Isothermal separation processes

    NASA Technical Reports Server (NTRS)

    England, C.

    1982-01-01

    The isothermal processes of membrane separation, supercritical extraction and chromatography were examined using availability analysis. The general approach was to derive equations that identified where energy is consumed in these processes and how they compare with conventional separation methods. These separation methods are characterized by pure work inputs, chiefly in the form of a pressure drop which supplies the required energy. Equations were derived for the energy requirement in terms of regular solution theory. This approach is believed to accurately predict the work of separation in terms of the heat of solution and the entropy of mixing. It can form the basis of a convenient calculation method for optimizing membrane and solvent properties for particular applications. Calculations were made on the energy requirements for a membrane process separating air into its components.

  2. Quantifying electrical impacts on redundant wire insertion in 7nm unidirectional designs

    NASA Astrophysics Data System (ADS)

    Mohyeldin, Ahmed; Schroeder, Uwe Paul; Srinivasan, Ramya; Narisetty, Haritez; Malik, Shobhit; Madhavan, Sriram

    2017-04-01

    In nanometer-scale integrated circuits, via failures due to random defects are a well-known yield detractor, and via redundancy insertion is a common method to help enhance semiconductor yield. For the case of Self-Aligned Double Patterning (SADP), which might require unidirectional design layers as in some advanced technology nodes, the conventional methods of inserting redundant vias no longer work. This is because adding redundant vias conventionally requires adding metal shapes in the non-preferred direction, which would violate the SADP design constraints in that case. Therefore, such metal layers fabricated using unidirectional SADP require an alternative method for providing the needed redundancy. This paper proposes a post-layout Design for Manufacturability (DFM) redundancy insertion method tailored to the design requirements introduced by unidirectional metal layers. The proposed method adds redundant wires in the preferred direction - after searching for nearby vacant routing tracks - in order to provide redundant paths for electrical signals. This method opportunistically adds robustness against failures due to silicon defects without impacting area or incurring new design rule violations. Implementation details of this redundancy insertion method will be explained in this paper. One known challenge with similar DFM layout-fixing methods is the possible introduction of undesired electrical impact, causing other unintentional failures in design functionality. In this paper, a study is presented to quantify the electrical impact of such a redundancy insertion scheme and to examine whether that electrical impact can be tolerated. The paper will show results to evaluate DFM insertion rates and the corresponding electrical impact for a given design utilization and maximum inserted wire length. Parasitic extraction and static timing analysis results will be presented. A typical digital design implemented using GLOBALFOUNDRIES 7nm technology is used for demonstration. The provided results can help evaluate such an extensive DFM insertion method from an electrical standpoint. Furthermore, the results could provide guidance on how to implement the proposed method of adding electrical redundancy such that intolerable electrical impacts can be avoided.
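
    To make the track-search step concrete, the toy sketch below models preferred-direction routing tracks as lists of occupied intervals and looks for the nearest track with enough vacant space to host a redundant wire of bounded length. The data layout and function names are illustrative assumptions, not the actual DFM tool's API:

      from typing import Dict, List, Optional, Tuple

      Interval = Tuple[float, float]           # occupied span along the preferred direction

      def is_vacant(track: List[Interval], seg: Interval) -> bool:
          """True if seg does not overlap any occupied interval on the track."""
          return all(seg[1] <= a or seg[0] >= b for a, b in track)

      def find_redundant_track(tracks: Dict[int, List[Interval]], via_track: int,
                               via_pos: float, wire_len: float,
                               max_offset: int = 2) -> Optional[Tuple[int, Interval]]:
          """Search neighbouring tracks (nearest first) for room to add a redundant wire."""
          seg = (via_pos - wire_len / 2, via_pos + wire_len / 2)
          for offset in range(1, max_offset + 1):
              for t in (via_track - offset, via_track + offset):
                  if t in tracks and is_vacant(tracks[t], seg):
                      return t, seg            # vacant track found: insert wire + via here
          return None                          # no legal insertion near this via

      tracks = {10: [(0.0, 2.0)], 11: [], 12: [(1.0, 4.0)]}
      print(find_redundant_track(tracks, via_track=10, via_pos=3.0, wire_len=1.0))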

  3. A comparative analysis of vehicle-related greenhouse gas emissions between organic and conventional dairy production.

    PubMed

    Aggestam, Vivianne; Buick, Jon

    2017-08-01

    Agricultural industrialisation and globalisation have steadily increased the transportation of food across the world. In efforts to promote sustainability and self-sufficiency, organic milk producers in Sweden are required to produce a higher level of cattle feed on-farm in the hope that increased self-sufficiency will reduce reliance on external inputs and reduce transport-related greenhouse gas emissions. Using data collected from 20 conventional and 20 organic milk producers in Sweden, this paper aims to assess the global warming impact of farmyard vehicles and the transportation of feed produced 'off-farm' in order to compare the impact of vehicle-related emissions from the different production methods. The findings show organic and conventional production methods have different vehicle-related emission outputs that vary according to a reliance on either road transportation or increased farmyard machinery use. Mechanical weeding is more fuel demanding than conventional agrichemical spraying. However, artificial fertilising is one of the highest farmyard vehicle-related emitters. The general findings show organic milk production emits higher levels of farm vehicle-related emissions that are not offset by the reduction in international transport emissions. This paper does not propose to cover a comprehensive supply chain carbon footprint for milk production or attempt to determine which method of production has the largest climatic impact. However, it does demonstrate that Sweden's legal requirements for organic producers to produce more feed on-farm to reduce transport emissions have brought emissions back within Sweden's greenhouse gas inventory and raise questions around the effectiveness of policies to reduce vehicle-related emissions. Further research is needed into the effectiveness of climate change mitigation in food production policies, in particular looking at the various trade-offs that affect the entire food supply chain.

  4. An analytical fuzzy-based approach to L2-gain optimal control of input-affine nonlinear systems using Newton-type algorithm

    NASA Astrophysics Data System (ADS)

    Milic, Vladimir; Kasac, Josip; Novakovic, Branko

    2015-10-01

    This paper is concerned with L2-gain optimisation of input-affine nonlinear systems controlled by an analytic fuzzy logic system. Unlike conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without a rule base is a universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for L2-gain optimisation. The proposed algorithm consists of a recursive chain rule for first- and second-order derivatives, Newton's method, the multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.

  5. Method of forming crystalline silicon devices on glass

    DOEpatents

    McCarthy, Anthony M.

    1995-01-01

    A method for fabricating single-crystal silicon microelectronic components on a silicon substrate and transferring same to a glass substrate. This is achieved by utilizing conventional silicon processing techniques for fabricating components of electronic circuits and devices on bulk silicon, wherein a bulk silicon surface is prepared with epitaxial layers prior to the conventional processing. The silicon substrate is bonded to a glass substrate and the bulk silicon is removed leaving the components intact on the glass substrate surface. Subsequent standard processing completes the device and circuit manufacturing. This invention is useful in applications requiring a transparent or insulating substrate, particularly for display manufacturing. Other applications include sensors, actuators, optoelectronics, radiation hard electronics, and high temperature electronics.

  6. Detecting proteins in highly autofluorescent cells using quantum dot antibody conjugates.

    PubMed

    Orcutt, Karen M; Ren, Shanshan; Gundersen, Kjell

    2009-01-01

    We have applied quantum dot (Qdot) antibody conjugates as a biomolecular probe for cellular proteins important in biogeochemical cycling in the sea. Conventional immunological methods have been hampered by the strong autofluorescence found in cyanobacteria cells. Qdot conjugates provide an ideal alternative for studies that require long-term imaging of cells such as detection of low abundance cellular antigens by fluorescence microscopy. The advantage of Qdot labeled probes over conventional immunological methods is the photostability of the probe. Phycoerythrin bleaches in cyanobacterial cells under prolonged UV or blue light excitation, which means that the semiconducting nanocrystal probe, the Qdot, can yield a strong fluorescent signal without interference from cellular pigments.

  7. Progress in electrochemical storage for battery systems

    NASA Technical Reports Server (NTRS)

    Ford, F. E.; Hennigan, T. J.; Palandati, C. F.; Cohn, E.

    1972-01-01

    Efforts to improve electrochemical systems for space use relate to: (1) improvement of conventional systems; (2) development of fuel cells to practical power systems; and (3) a search for new systems that provide gains in energy density but offer comparable life and performance as conventional systems. Improvements in sealed conventional systems resulted in the areas of materials, charge control methods, cell operations and battery control, and specific process controls required during cell manufacture. Fuel-cell systems have been developed for spacecraft but the use of these power plants is limited. For present and planned flights, nickel-cadmium, silver-zinc, and silver-cadmium systems will be used. Improvements in nickel-cadmium batteries have been applied in medical and commercial areas.

  8. Model of Learning Using iLearning on Independent Study Classes at University

    ERIC Educational Resources Information Center

    Sudaryono; Padeli; Febriyanto, Erick

    2017-01-01

    Raharja College is one of the universities that applies a rather different learning method: it does not rely only on the conventional learning system, in which teaching and learning activities require students and lecturers to meet face to face, but also applies an e-learning method better known as iLearning…

  9. Understanding the Learning Experiences of Postgraduate Latin American Students in a UK Context: A Narrative Approach

    ERIC Educational Resources Information Center

    James, Gwyneth

    2013-01-01

    Researching the learning experiences of postgraduate students requires a different type of qualitative research to enable access to areas of their lives which may well remain hidden with more conventional methods of research. Narrative inquiry as both method and methodology allows such access. In this article, I focus on the use, appropriateness,…

  10. GaAs thin films and methods of making and using the same

    DOEpatents

    Boettcher, Shannon; Ritenour, Andrew; Boucher, Jason; Greenaway, Ann

    2016-06-14

    Disclosed herein are embodiments of methods for making GaAs thin films, such as photovoltaic GaAs thin films. The methods disclosed herein utilize sources, precursors, and reagents that do not produce (or require) toxic gas and that are readily available and relatively low in cost. In some embodiments, the methods are readily scalable for industrial applications and can provide GaAs thin films having properties that are at least comparable to or potentially superior to GaAs films obtained from conventional methods.

  11. Autonomous control systems - Architecture and fundamental issues

    NASA Technical Reports Server (NTRS)

    Antsaklis, P. J.; Passino, K. M.; Wang, S. J.

    1988-01-01

    A hierarchical functional autonomous controller architecture is introduced. In particular, the architecture for the control of future space vehicles is described in detail; it is designed to ensure the autonomous operation of the control system, and it allows interaction with the pilot, the crew/ground station, and the systems on board the autonomous vehicle. The fundamental issues in autonomous control system modeling and analysis are discussed. It is proposed to utilize a hybrid approach to modeling and analysis of autonomous systems. This will incorporate conventional control methods based on differential equations and techniques for the analysis of systems described with a symbolic formalism. In this way, the theory of conventional control can be fully utilized. It is stressed that autonomy is the design requirement, and intelligent control methods appear, at present, to offer some of the necessary tools to achieve autonomy. A conventional approach may evolve and replace some or all of the 'intelligent' functions. It is shown that in addition to conventional controllers, the autonomous control system incorporates planning, learning, and FDI (fault detection and identification).

  12. Vascular applications of telepresence surgery: initial feasibility studies in swine.

    PubMed

    Bowersox, J C; Shah, A; Jensen, J; Hill, J; Cordts, P R; Green, P S

    1996-02-01

    Telepresence surgery is a novel technology that will allow procedures to be performed on a patient at locations that are physically remote from the operating surgeon. This new method provides the sensory illusion that the surgeon's hands are in direct contact with the patient. We studied the feasibility of the use of telepresence surgery to perform basic operations in vascular surgery, including tissue dissection, vessel manipulation, and suturing. A prototype telepresence surgery system with bimanual force-reflective manipulators, interchangeable surgical instruments, and stereoscopic video input was used. Arteriotomies created ex vivo in segments of bovine aortae or in vivo in femoral arteries of anesthetized swine were closed with telepresence surgery or by conventional techniques. Time required, technical quality (patency, integrity of suture line), and subjective difficulty were compared for the two methods. All attempted procedures were successfully completed with telepresence surgery. Arteriotomy closures were completed in 192+/-24 sec with conventional techniques and 483+/-118 sec with telepresence surgery, but the precision attained with telepresence surgery was equal to that of conventional techniques. Telepresence surgery was described as intuitive and natural by the surgeons who used the system. Blood-vessel manipulation and suturing with telepresence surgery are feasible. Further instrument development (to increase degrees of freedom) is required to achieve operating times comparable to conventional open surgery, but the system has great potential to extend the expertise of vascular surgeons to locations where specialty care is currently unavailable.

  13. Recent Advances in Mycotoxin Determination for Food Monitoring via Microchip

    PubMed Central

    Man, Yan; Liang, Gang; Li, An; Pan, Ligang

    2017-01-01

    Mycotoxins are one of the main factors impacting food safety. Mycotoxin contamination has threatened the health of humans and animals. Conventional methods for the detection of mycotoxins are gas chromatography (GC) or liquid chromatography (LC) coupled with mass spectrometry (MS), or enzyme-linked immunosorbent assay (ELISA). However, all these methods are time-consuming, require large-scale instruments and skilled technicians, and consume large amounts of hazardous reagents and solvents. Interestingly, a microchip requires less sample consumption and a short analysis time, and can realize the integration, miniaturization, and high-throughput detection of the samples. Hence, the application of a microchip for the detection of mycotoxins can make up for the deficiencies of the conventional detection methods. This review focuses on the application of a microchip to detect mycotoxins in foods. The toxicities of mycotoxins and the materials of the microchip are firstly summarized in turn. Then the application of a microchip that integrates various kinds of detection methods (optical, electrochemical, photo-electrochemical, and label-free detection) to detect mycotoxins is reviewed in detail. Finally, challenges and future research directions in the development of a microchip to detect mycotoxins are previewed. PMID:29036884

  14. Design of a practical model-observer-based image quality assessment method for CT imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana

    2014-03-01

    The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations as well as further system optimizations. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) are estimated by a shuffle method. Both simulation and real-data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
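
    For reference, the sketch below is a minimal conventional (resubstitution) CHO estimate computed from labelled training images; the LOOL covariance estimator studied in the paper would replace the plain sample covariance, and the channel matrix (e.g. Gabor or Laguerre-Gauss channels) is left as an input. All names are illustrative assumptions:

      import numpy as np
      from scipy.stats import norm

      def cho_auc(signal_imgs, noise_imgs, channels):
          """Channelized Hotelling observer AUC from vectorised training images.
          signal_imgs, noise_imgs: (N, npixels) arrays; channels: (npixels, nchannels)."""
          vs = signal_imgs @ channels                   # channelised data, (N, nchannels)
          vn = noise_imgs @ channels
          S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
          w = np.linalg.solve(S, vs.mean(axis=0) - vn.mean(axis=0))   # Hotelling template
          ts, tn = vs @ w, vn @ w                       # decision variables per class
          snr = (ts.mean() - tn.mean()) / np.sqrt(0.5 * (ts.var(ddof=1) + tn.var(ddof=1)))
          return norm.cdf(snr / np.sqrt(2))             # AUC under a binormal assumption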

  15. Wideband Motion Control by Position and Acceleration Input Based Disturbance Observer

    NASA Astrophysics Data System (ADS)

    Irie, Kouhei; Katsura, Seiichiro; Ohishi, Kiyoshi

    The disturbance observer can observe and suppress the disturbance torque within its bandwidth. Motion systems have recently begun to spread throughout society, and they are required to be able to make contact with unknown environments. Such haptic motion requires a much wider bandwidth. However, since the conventional disturbance observer attains the acceleration response by taking the second-order derivative of the position response, its bandwidth is limited by derivative noise. This paper proposes a novel structure for a disturbance observer. The proposed disturbance observer uses an acceleration sensor to enlarge the bandwidth. Generally, the bandwidth of an acceleration sensor extends from 1 Hz to more than 1 kHz. To cover the DC range, the conventional position-sensor-based disturbance observer is integrated. Thus, the performance of the proposed Position and Acceleration input based Disturbance Observer (PADO) is superior to the conventional one. The PADO is applied to position control (infinite stiffness) and force control (zero stiffness). The numerical and experimental results show the viability of the proposed method.

  16. A variational eigenvalue solver on a photonic quantum processor

    PubMed Central

    Peruzzo, Alberto; McClean, Jarrod; Shadbolt, Peter; Yung, Man-Hong; Zhou, Xiao-Qi; Love, Peter J.; Aspuru-Guzik, Alán; O’Brien, Jeremy L.

    2014-01-01

    Quantum computers promise to efficiently solve important problems that are intractable on a conventional computer. For quantum systems, where the physical dimension grows exponentially, finding the eigenvalues of certain operators is one such intractable problem and remains a fundamental challenge. The quantum phase estimation algorithm efficiently finds the eigenvalue of a given eigenvector but requires fully coherent evolution. Here we present an alternative approach that greatly reduces the requirements for coherent evolution and combine this method with a new approach to state preparation based on ansätze and classical optimization. We implement the algorithm by combining a highly reconfigurable photonic quantum processor with a conventional computer. We experimentally demonstrate the feasibility of this approach with an example from quantum chemistry—calculating the ground-state molecular energy for He–H+. The proposed approach drastically reduces the coherence time requirements, enhancing the potential of quantum resources available today and in the near future. PMID:25055053
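
    The classical half of the loop can be illustrated with a one-qubit toy problem: a parameterised state is prepared, the energy expectation is evaluated, and a classical optimiser updates the parameters. The Hamiltonian and ansatz below are illustrative assumptions (not the He–H+ problem), and the quantum expectation value is simulated classically rather than measured on a photonic processor:

      import numpy as np
      from scipy.optimize import minimize

      X = np.array([[0.0, 1.0], [1.0, 0.0]])
      Z = np.diag([1.0, -1.0])
      H = 0.5 * Z + 0.3 * X                      # toy one-qubit Hamiltonian (assumption)

      def ansatz(theta):
          """|psi(theta)> = Rz(t1) Ry(t0) |0>."""
          t0, t1 = theta
          ry = np.array([[np.cos(t0 / 2), -np.sin(t0 / 2)],
                         [np.sin(t0 / 2),  np.cos(t0 / 2)]], dtype=complex)
          rz = np.diag([np.exp(-1j * t1 / 2), np.exp(1j * t1 / 2)])
          return rz @ ry @ np.array([1.0, 0.0], dtype=complex)

      def energy(theta):
          psi = ansatz(theta)
          return np.real(np.vdot(psi, H @ psi))    # on hardware this is a measured expectation value

      res = minimize(energy, x0=[0.1, 0.1], method="Nelder-Mead")
      print(res.fun, np.linalg.eigvalsh(H)[0])     # variational estimate vs exact ground energy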

  17. Robust backstepping control of an interlink converter in a hybrid AC/DC microgrid based on feedback linearisation method

    NASA Astrophysics Data System (ADS)

    Dehkordi, N. Mahdian; Sadati, N.; Hamzeh, M.

    2017-09-01

    This paper presents a robust dc-link voltage as well as a current control strategy for a bidirectional interlink converter (BIC) in a hybrid ac/dc microgrid. To enhance the dc-bus voltage control, conventional methods strive to measure and feedforward the load or source power in the dc-bus control scheme. However, the conventional feedforward-based approaches require remote measurement with communications. Moreover, conventional methods suffer from stability and performance issues, mainly due to the use of the small-signal-based control design method. To overcome these issues, in this paper, the power from DG units of the dc subgrid imposed on the BIC is considered an unmeasurable disturbance signal. In the proposed method, in contrast to existing methods, using the nonlinear model of BIC, a robust controller that does not need the remote measurement with communications effectively rejects the impact of the disturbance signal imposed on the BIC's dc-link voltage. To avoid communication links, the robust controller has a plug-and-play feature that makes it possible to add a DG/load to or remove it from the dc subgrid without distorting the hybrid microgrid stability. Finally, Monte Carlo simulations are conducted to confirm the effectiveness of the proposed control strategy in MATLAB/SimPowerSystems software environment.

  18. Economic comparison of conventional maintenance and electrochemical oxidation to warrant water safety in dental unit water lines.

    PubMed

    Fischer, Sebastian; Meyer, Georg; Kramer, Axel

    2012-01-01

    In preparation for the implementation of a central water processing system at a dental department, we analyzed the costs of conventional decentralized disinfection of dental units against a central water treatment concept based on electrochemical disinfection. The cost evaluation included only the costs of annually required antimicrobial consumables and additional water usage for a decentralized conventional maintenance system for dental water lines built into the respective dental units and for the central electrochemical water disinfection system, BLUE SAFETY™ Technologies. In total, the cost analysis of 6 dental departments revealed additional annual costs for hygienic preventive measures of € 4,448.37. For the BLUE SAFETY™ Technology, the additional annual total agent consumption costs were € 2.18, accounting for approximately 0.05% of the annual total agent consumption costs of the conventional maintenance system. For both water processing concepts, the additional costs for energy could not be calculated, since the required data were not obtainable from the manufacturers. For both concepts, the investment and maintenance costs were also not calculated due to a lack of manufacturers' data. Therefore, the results indicate only the difference in costs for the required consumables. Aside from the significantly lower annual costs for required consumables and disinfectants, a second advantage of the BLUE SAFETY™ Technology is its constant and automatic operation, which does not require additional staff resources. This not only saves human resources but also adds to the cost savings. Since the antimicrobial disinfection capacity of the BLUE SAFETY™ Technology was demonstrated previously and is well known, this technology, which is comparable or even superior in its non-corrosive effect, may be regarded as the method of choice for continuous disinfection and prevention of biofilm formation in dental unit water lines.

  19. Evaluation of counting methods for oceanic radium-228

    NASA Astrophysics Data System (ADS)

    Orr, James C.

    1988-07-01

    Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting method(s), conventional 228Ra counting methods have been compared with some promising techniques which are currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (~30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (~12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation. After allowing for ingrowth and then counting for the 224Ra great-granddaughter, 228Ra could be back calculated, thereby yielding a method with high efficiency, where no sample processing is required. The efficiency and background of each of the three new methods have been estimated and are compared with those of the three methods currently employed to measure oceanic 228Ra. From efficiency and background, the relative figure of merit and the detection limit have been determined for each of the six counters. These data suggest that the new counting methods have the potential to measure most 228Ra samples with just 30 L of seawater, to better than 5% precision. Not only would this reduce the time, effort, and expense involved in sample collection, but 228Ra could then be measured on many small-volume samples (20-30 L) previously collected with only 226Ra in mind. By measuring 228Ra quantitatively on such small-volume samples, three analyses (large-volume 228Ra, large-volume 226Ra, and small-volume 226Ra) could be reduced to one, thereby dramatically improving analytical precision.
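
    As a rough worked comparison, the efficiencies and backgrounds quoted above for the three conventional counters can be condensed into a relative figure of merit. The E^2/B definition below is a common convention assumed here for illustration (the paper's exact definition is not reproduced), and the mid-range 6% is used for alpha spectrometry:

      counters = {
          "alpha spectrometry":     {"eff": 0.06,  "bkg_cpm": 0.0015},   # 3-9% efficiency, mid-range used
          "HR gamma spectrometry":  {"eff": 0.048, "bkg_cpm": 0.16},
          "beta-gamma coincidence": {"eff": 0.053, "bkg_cpm": 0.0054},
      }
      for name, c in counters.items():
          fom = c["eff"] ** 2 / c["bkg_cpm"]     # assumed figure of merit: efficiency^2 / background
          print(f"{name:24s} E^2/B = {fom:6.2f}")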

  20. NOTE: Solving the ECG forward problem by means of a meshless finite element method

    NASA Astrophysics Data System (ADS)

    Li, Z. S.; Zhu, S. A.; He, Bin

    2007-07-01

    The conventional numerical computational techniques such as the finite element method (FEM) and the boundary element method (BEM) require laborious and time-consuming model meshing. The new meshless FEM only uses the boundary description and the node distribution and no meshing of the model is required. This paper presents the fundamentals and implementation of meshless FEM and the meshless FEM method is adapted to solve the electrocardiography (ECG) forward problem. The method is evaluated on a single-layer torso model, in which the analytical solution exists, and tested in a realistic geometry homogeneous torso model, with satisfactory results being obtained. The present results suggest that the meshless FEM may provide an alternative for ECG forward solutions.

  1. Estimating Soil Hydraulic Parameters using Gradient Based Approach

    NASA Astrophysics Data System (ADS)

    Rai, P. K.; Tripathi, S.

    2017-12-01

    The conventional way of estimating parameters of a differential equation is to minimize the error between the observations and their estimates. The estimates are produced from the forward solution (numerical or analytical) of the differential equation assuming a set of parameters. Parameter estimation using the conventional approach requires high computational cost, setting up of initial and boundary conditions, and formation of difference equations when the forward solution is obtained numerically. Gaussian process based approaches like Gaussian Process Ordinary Differential Equation (GPODE) and Adaptive Gradient Matching (AGM) have been developed to estimate the parameters of ordinary differential equations without explicitly solving them. Claims have been made that these approaches can be straightforwardly extended to partial differential equations; however, this has never been demonstrated. This study extends the AGM approach to PDEs and applies it to estimating parameters of the Richards equation. Unlike the conventional approach, the AGM approach does not require setting up initial and boundary conditions explicitly, which is often difficult in real-world applications of the Richards equation. The developed methodology was applied to synthetic soil moisture data. It was seen that the proposed methodology can estimate the soil hydraulic parameters correctly and can be a potential alternative to the conventional method.
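
    The gradient-matching idea can be illustrated on a toy problem: fit a smooth interpolant to noisy observations, differentiate the interpolant, and choose the parameter that makes the interpolant's derivative match the model right-hand side, with no forward solver and no explicit initial or boundary conditions. The sketch below uses a smoothing spline in place of a Gaussian process and a simple decay ODE in place of the Richards equation; all values are illustrative assumptions:

      import numpy as np
      from scipy.interpolate import UnivariateSpline
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 5.0, 40)
      theta_true = 0.4 * np.exp(-0.7 * t)                 # synthetic data from d(theta)/dt = -k*theta, k = 0.7
      obs = theta_true + 0.005 * rng.standard_normal(t.size)

      spl = UnivariateSpline(t, obs, k=4, s=t.size * 0.005 ** 2)   # smooth interpolant of the observations
      theta_hat, dtheta_hat = spl(t), spl.derivative()(t)

      # Match the interpolant's derivative to the model right-hand side (gradient matching).
      loss = lambda k: np.sum((dtheta_hat - (-k * theta_hat)) ** 2)
      k_hat = minimize_scalar(loss, bounds=(0.01, 5.0), method="bounded").x
      print(k_hat)                                         # close to the true value 0.7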

  2. COMBINED SEWER OVERFLOW - BALANCING FLOW FOR CSO ABATEMENT

    EPA Science Inventory

    Instead of using conventional storage units, e.g., reinforced concrete tanks and lined earthen basins, which are relatively expensive and require a lot of urban land area, the in-receiving water flow balance method (FBM) facilities use the receiving water body itself for storage ...

  3. Evaluation of the marginal fit of metal copings fabricated on three different marginal designs using conventional and accelerated casting techniques: an in vitro study.

    PubMed

    Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad

    2014-01-01

    Abundant resources and techniques have been used for complete coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. The study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment with the use of conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites, and measurements of marginal gaps were documented for each. A master chart was prepared for all the data and was analyzed using the Statistical Package for the Social Sciences version. The marginal gap was then evaluated by t-test. Analysis of variance and post-hoc analysis were used to compare the two groups as well as to make comparisons between the three subgroups. Measurements recorded showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with conventional as well as accelerated casting techniques. The accelerated casting technique could be a viable alternative to the time-consuming conventional casting technique. The marginal fit between the two casting techniques showed no statistical difference.

  4. Stapled peptides as a new technology to investigate protein-protein interactions in human platelets.

    PubMed

    Iegre, Jessica; Ahmed, Niaz S; Gaynord, Josephine S; Wu, Yuteng; Herlihy, Kara M; Tan, Yaw Sing; Lopes-Pires, Maria E; Jha, Rupam; Lau, Yu Heng; Sore, Hannah F; Verma, Chandra; O' Donovan, Daniel H; Pugh, Nicholas; Spring, David R

    2018-05-28

    Platelets are blood cells with numerous crucial pathophysiological roles in hemostasis, cardiovascular thrombotic events and cancer metastasis. Platelet activation requires the engagement of intracellular signalling pathways that involve protein-protein interactions (PPIs). A better understanding of these pathways is therefore crucial for the development of selective anti-platelet drugs. New strategies for studying PPIs in human platelets are required to overcome limitations associated with conventional platelet research methods. For example, small molecule inhibitors can lack selectivity and are often difficult to design and synthesise. Additionally, development of transgenic animal models is costly and time-consuming, and conventional recombinant techniques are ineffective due to the lack of a nucleus in platelets. Herein, we describe the generation of a library of novel, functionalised stapled peptides and their first application in the investigation of platelet PPIs. Moreover, the use of platelet-permeable stapled Bim BH3 peptides confirms the role of Bim in phosphatidylserine (PS) exposure and reveals a role for the Bim protein in platelet activatory processes. Our work demonstrates that functionalised stapled peptides are a complementary alternative to conventional platelet research methods, and could make a significant contribution to the understanding of platelet signalling pathways and hence to the development of anti-platelet drugs.

  5. Distributive Distillation Enabled by Microchannel Process Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arora, Ravi

    The application of microchannel technology for distributive distillation was studied to achieve the Grand Challenge goals of 25% energy savings and 10% return on investment. In Task 1, a detailed study was conducted and two distillation systems were identified that would meet the Grand Challenge goals if the microchannel distillation technology was used. Material and heat balance calculations were performed to develop process flow sheet designs for the two distillation systems in Task 2. The process designs were focused on two methods of integrating the microchannel technology: 1) integrating microchannel distillation into an existing conventional column, and 2) microchannel distillation for new plants. A design concept for a modular microchannel distillation unit was developed in Task 3. In Task 4, Ultrasonic Additive Machining (UAM) was evaluated as a manufacturing method for microchannel distillation units. However, it was found that significant development work would be required to establish process parameters for using UAM in commercial distillation manufacturing. Two alternate manufacturing methods were explored. Both manufacturing approaches were experimentally tested to confirm their validity. The conceptual design of the microchannel distillation unit (Task 3) was combined with the manufacturing methods developed in Task 4 and the flowsheet designs in Task 2 to estimate the cost of the microchannel distillation unit, and this was compared to a conventional distillation column. The best results were for a methanol-water separation unit for use in a biodiesel facility. For this application, microchannel distillation was found to be more cost effective than the conventional system and capable of meeting the DOE Grand Challenge performance requirements.

  6. Laser scanning saturated structured illumination microscopy based on phase modulation

    NASA Astrophysics Data System (ADS)

    Huang, Yujia; Zhu, Dazhao; Jin, Luhong; Kuang, Cuifang; Xu, Yingke; Liu, Xu

    2017-08-01

    Wide-field saturated structured illumination microscopy has not been widely used due to its requirement for high laser power. We propose a novel method called laser scanning saturated structured illumination microscopy (LS-SSIM), which introduces high-order harmonic frequencies and greatly reduces the laser power required for SSIM imaging. To accomplish that, an excitation PSF with two peaks is generated and scanned along different directions on the sample. Raw images are recorded cumulatively by a CCD detector and then reconstructed to form a high-resolution image with an extended optical transfer function (OTF). Our theoretical analysis and simulation results show that the LS-SSIM method reaches a resolution of 0.16 λ, a 2.7-fold improvement over conventional wide-field microscopy. In addition, LS-SSIM greatly improves the optical sectioning capability of a conventional wide-field illumination system by diminishing out-of-focus light. Furthermore, this modality has the advantage that it can be implemented in multi-photon microscopy with point-scanning excitation to image samples at greater depths.

  7. 2D and 3D X-ray phase retrieval of multi-material objects using a single defocus distance.

    PubMed

    Beltran, M A; Paganin, D M; Uesugi, K; Kitchen, M J

    2010-03-29

    A method of tomographic phase retrieval is developed for multi-material objects whose components each has a distinct complex refractive index. The phase-retrieval algorithm, based on the Transport-of-Intensity equation, utilizes propagation-based X-ray phase contrast images acquired at a single defocus distance for each tomographic projection. The method requires a priori knowledge of the complex refractive index for each material present in the sample, together with the total projected thickness of the object at each orientation. The requirement of only a single defocus distance per projection simplifies the experimental setup and imposes no additional dose compared to conventional tomography. The algorithm was implemented using phase contrast data acquired at the SPring-8 Synchrotron facility in Japan. The three-dimensional (3D) complex refractive index distribution of a multi-material test object was quantitatively reconstructed using a single X-ray phase-contrast image per projection. The technique is robust in the presence of noise, compared to conventional absorption based tomography.
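
    For the homogeneous-object special case, single-distance TIE phase retrieval reduces to a Paganin-type Fourier filter; the multi-material algorithm described above generalises this idea using the known complex refractive indices and the total projected thickness. The sketch below implements only that single-material special case, with parameter names chosen for illustration:

      import numpy as np

      def single_distance_phase_retrieval(I, I0, pixel_size, dist, delta, mu):
          """Single-material TIE (Paganin-type) retrieval of projected thickness.
          I: propagation-based image; I0: flat field; pixel_size, dist in metres;
          delta: refractive index decrement; mu: linear attenuation coefficient [1/m]."""
          contrast = I / I0
          ny, nx = contrast.shape
          kx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
          ky = 2 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
          KX, KY = np.meshgrid(kx, ky)
          lowpass = 1.0 + dist * delta / mu * (KX ** 2 + KY ** 2)
          retrieved = np.fft.ifft2(np.fft.fft2(contrast) / lowpass).real
          return -np.log(np.clip(retrieved, 1e-9, None)) / mu   # projected thickness [m]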

  8. Guiding principles of USGS methodology for assessment of undiscovered conventional oil and gas resources

    USGS Publications Warehouse

    Charpentier, R.R.; Klett, T.R.

    2005-01-01

    During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of methods that solely use statistical methods without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.

  9. Standard addition with internal standardisation as an alternative to using stable isotope labelled internal standards to correct for matrix effects - Comparison and validation using liquid chromatography-tandem mass spectrometric assay of vitamin D.

    PubMed

    Hewavitharana, Amitha K; Abu Kassim, Nur Sofiah; Shaw, Paul Nicholas

    2018-06-08

    With mass spectrometric detection in liquid chromatography, co-eluting impurities affect the analyte response due to ion suppression/enhancement. The internal standard calibration method, using a co-eluting stable isotope labelled analogue of each analyte as the internal standard, is the most appropriate technique available to correct for these matrix effects. However, this technique is not without drawbacks; it has proved to be expensive because a separate internal standard is required for each analyte, and the labelled compounds are expensive or require synthesis. Traditionally, the standard addition method has been used to overcome matrix effects in atomic spectroscopy and is a well-established method. This paper proposes the same for mass spectrometric detection, and demonstrates that the results are comparable to those obtained with the internal standard method using labelled analogues, for vitamin D assay. As the conventional standard addition procedure does not address procedural errors, we propose the inclusion of an additional internal standard (not co-eluting). Recoveries determined on human serum samples show that the proposed method of standard addition yields more accurate results than internal standardisation using stable isotope labelled analogues. The precision of the proposed method of standard addition is superior to that of the conventional standard addition method. Copyright © 2018 Elsevier B.V. All rights reserved.
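
    A minimal sketch of the combined calculation is shown below: analyte responses from spiked aliquots are first normalised by a constant-level (non co-eluting) internal standard to correct for procedural losses, and the endogenous concentration is then read from the x-intercept of the standard addition regression. All numbers are hypothetical and only illustrate the arithmetic:

      import numpy as np

      added = np.array([0.0, 5.0, 10.0, 20.0])          # concentration of standard added (ng/mL)
      analyte_area = np.array([1.20, 1.95, 2.71, 4.18]) # hypothetical analyte peak areas
      istd_area = np.array([0.98, 1.02, 0.99, 1.01])    # internal standard peak areas (constant spike)

      norm_resp = analyte_area / istd_area              # internal standardisation step
      slope, intercept = np.polyfit(added, norm_resp, 1)
      endogenous = intercept / slope                    # magnitude of the x-intercept
      print(f"estimated endogenous concentration: {endogenous:.1f} ng/mL")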

  10. Reliable use of determinants to solve nonlinear structural eigenvalue problems efficiently

    NASA Technical Reports Server (NTRS)

    Williams, F. W.; Kennedy, D.

    1988-01-01

    The analytical derivation, numerical implementation, and performance of a multiple-determinant parabolic interpolation method (MDPIM) for use in solving transcendental eigenvalue (critical buckling or undamped free vibration) problems in structural mechanics are presented. The overall bounding, eigenvalue-separation, qualified parabolic interpolation, accuracy-confirmation, and convergence-recovery stages of the MDPIM are described in detail, and the numbers of iterations required to solve sample plane-frame problems using the MDPIM are compared with those for a conventional bisection method and for the Newtonian method of Simpson (1984) in extensive tables. The MDPIM is shown to use 31 percent less computation time than bisection when accuracy of 0.0001 is required, but 62 percent less when accuracy of 10^-8 is required; the time savings over the Newtonian method are about 10 percent.
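
    As a point of reference for the bisection baseline, the sketch below bisects on the sign of a single transcendental frequency function, cos(x)cosh(x) - 1 = 0 (the clamped-clamped Euler beam frequency equation), standing in for the determinant of a frequency-dependent stiffness matrix; the MDPIM itself, with its parabolic interpolation and multi-determinant stages, is not reproduced here:

      import math

      def f(x):
          """Clamped-clamped beam frequency equation; a stand-in for det K(lambda)."""
          return math.cos(x) * math.cosh(x) - 1.0

      def bisect(a, b, tol=1e-8):
          fa, iterations = f(a), 0
          while b - a > tol:
              m = 0.5 * (a + b)
              if fa * f(m) <= 0.0:
                  b = m
              else:
                  a, fa = m, f(m)
              iterations += 1
          return 0.5 * (a + b), iterations

      root, n_iter = bisect(4.0, 5.0)
      print(root, n_iter)      # first root ~ 4.7300; the iteration count grows as the tolerance tightens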

  11. A wave superposition method formulated in digital acoustic space

    NASA Astrophysics Data System (ADS)

    Hwang, Yong-Sin

    In this thesis, a new formulation of the Wave Superposition method is proposed wherein the conventional mesh approach is replaced by a simple 3-D digital work space that easily accommodates shape optimization for minimizing or maximizing radiation efficiency. Because sound quality is in demand in almost all product designs, and because of fierce competition between product manufacturers, a faster and accurate computational method for shape optimization is always desired. Because the conventional Wave Superposition method relies solely on mesh geometry, it cannot accommodate the fast shape changes needed in the design stage of a consumer product or machinery, where many iterations of shape changes are required. Since the use of a mesh hinders easy shape changes, a new approach for representing geometry is introduced by constructing a uniform lattice in a 3-D digital work space. A voxel (a portmanteau of the words volumetric and pixel) is essentially a volume element defined by the uniform lattice, and does not require separate connectivity information as a mesh element does. In the presented method, geometry is represented with voxels that can easily adapt to shape changes, and it is therefore more suitable for shape optimization. The new method was validated by computing the radiated sound power of structures with simple and complex geometries and complex mode shapes. It was shown that matching the volume velocity is a key component of an accurate analysis. A sensitivity study showed that at least 6 elements per acoustic wavelength are required, and a complexity study showed a minimal reduction in computational time.
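
    The underlying wave superposition (equivalent source) idea, independent of whether the surface is described by a mesh or by voxels, is sketched below for a pulsating sphere: interior monopole sources are fitted in a least-squares sense to the prescribed surface normal velocity, and the radiated pressure is then evaluated from the fitted sources. Frequency, geometry and source placement are illustrative assumptions:

      import numpy as np

      rho0, c0 = 1.21, 343.0                   # air density [kg/m^3] and sound speed [m/s]
      omega = 2 * np.pi * 500.0                # 500 Hz (assumption)
      k = omega / c0

      def monopole_p(q, src, obs):
          """Pressure at obs due to a monopole of volume velocity q located at src."""
          r = np.linalg.norm(obs - src, axis=-1)
          return 1j * omega * rho0 * q * np.exp(-1j * k * r) / (4 * np.pi * r)

      def monopole_un(src, obs, n):
          """Normal velocity at obs (unit normal n) due to a unit-strength monopole at src."""
          d = obs - src
          r = np.linalg.norm(d, axis=-1)
          ur = (1.0 + 1j * k * r) * np.exp(-1j * k * r) / (4 * np.pi * r ** 2)
          return ur * np.sum(d * n, axis=-1) / r

      a = 0.1                                  # sphere radius [m]
      theta = np.linspace(0.1, np.pi - 0.1, 20)
      surf = a * np.stack([np.sin(theta), np.zeros_like(theta), np.cos(theta)], axis=1)
      normals = surf / a
      vn_target = np.ones(len(surf), dtype=complex)     # uniform (breathing-mode) normal velocity

      sources = np.array([[0.0, 0.0, 0.0]])             # retracted interior source(s)
      A = np.stack([monopole_un(s, surf, normals) for s in sources], axis=1)
      q, *_ = np.linalg.lstsq(A, vn_target, rcond=None) # match the boundary volume velocity
      p = sum(monopole_p(qi, s, np.array([1.0, 0.0, 0.0])) for qi, s in zip(q, sources))
      print(abs(p))                                     # pressure magnitude 1 m from the sphere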

  12. Magnetic constraints on early lunar evolution revisited: Limits on accuracy imposed by methods of paleointensity measurements

    NASA Technical Reports Server (NTRS)

    Banerjee, S. K.

    1984-01-01

    It is impossible to carry out conventional paleointensity experiments requiring repeated heating and cooling to 770 C without chemical, physical or microstructural changes on lunar samples. Non-thermal methods of paleointensity determination have been sought: the two anhysteretic remanent magnetization (ARM) methods, and the saturation isothermal remanent magnetization (IRMS) method. Experimental errors inherent in these alternative approaches have been investigated to estimate the accuracy limits on the calculated paleointensities. Results are indicated in this report.

  13. Confocal laser scanning microscopic photoconversion: a new method to stabilize fluorescently labeled cellular elements for electron microscopic analysis.

    PubMed

    Colello, Raymond J; Tozer, Jordan; Henderson, Scott C

    2012-01-01

    Photoconversion, the method by which a fluorescent dye is transformed into a stable, osmiophilic product that can be visualized by electron microscopy, is the most widely used method to enable the ultrastructural analysis of fluorescently labeled cellular structures. Nevertheless, the conventional method of photoconversion using widefield fluorescence microscopy requires long reaction times and results in low-resolution cell targeting. Accordingly, we have developed a photoconversion method that ameliorates these limitations by adapting confocal laser scanning microscopy to the procedure. We have found that this method greatly reduces photoconversion times, as compared to conventional wide field microscopy. Moreover, region-of-interest scanning capabilities of a confocal microscope facilitate the targeting of the photoconversion process to individual cellular or subcellular elements within a fluorescent field. This reduces the area of the cell exposed to light energy, thereby reducing the ultrastructural damage common to this process when widefield microscopes are employed. © 2012 by John Wiley & Sons, Inc.

  14. Characterization of nanoporous shales with gas sorption

    NASA Astrophysics Data System (ADS)

    Joewondo, N.; Prasad, M.

    2017-12-01

    The understanding of fluid flow in porous media requires knowledge of the pore system involved. Fluid flow in fine-grained shales falls under a different regime than transport in conventional reservoirs due to the different average pore sizes in the two materials; the average pore diameter of conventional sandstones is on the micrometer scale, while that of shales can be as small as several nanometers. Mercury intrusion porosimetry is normally used to characterize the pores of conventional reservoirs; however, with increasingly small pores, the injection pressure required to intrude the pores becomes prohibitively large due to surface tension. Characterization of pores can be expressed by a pore size distribution (PSD) plot, which reflects the distribution of pore volume or surface area with respect to pore size. For the case of nanoporous materials, the surface area, which serves as the interface between the rock matrix and the fluid, becomes increasingly large and important. Physisorption of gas has been extensively studied as a method of nanoporous solid characterization (particularly for applications in catalysis, metal organic frameworks, etc.). The PSD is obtained by matching the experimental result to the calculated theoretical result (using Density Functional Theory (DFT), a quantum-mechanics-based modelling method for molecular-scale interactions). We present the challenges and experimental results of nitrogen and CO2 gas sorption on shales with various mineralogies and the interpreted PSD obtained by the DFT method. Our results show significant surface area contributed by the nanopores of shales, hence the importance of surface area measurements for the characterization of shales.
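
    The pressure argument can be made concrete with the Washburn equation, P = -4·γ·cos(θ)/d, for mercury intruding a cylindrical pore of diameter d; the surface tension and contact angle below are typical literature values, assumed here purely for illustration:

      import math

      gamma = 0.485                      # mercury surface tension [N/m] (typical value, assumption)
      theta_deg = 140.0                  # mercury contact angle [degrees] (typical value, assumption)

      for d_nm in (1000.0, 100.0, 10.0, 3.0):
          d = d_nm * 1e-9
          p = -4.0 * gamma * math.cos(math.radians(theta_deg)) / d   # Washburn equation [Pa]
          print(f"pore diameter {d_nm:7.1f} nm -> intrusion pressure {p / 1e6:8.1f} MPa")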

  15. A rapid, efficient, and economic device and method for the isolation and purification of mouse islet cells

    PubMed Central

    Zongyi, Yin; Funian, Zou; Hao, Li; Ying, Cheng; Jialin, Zhang

    2017-01-01

    A rapid, efficient, and economic method for the isolation and purification of islets has been pursued by numerous islet-related researchers. In this study, we compared the advantages and disadvantages of our developed patented method with those of commonly used conventional methods (Ficoll-400, 1077, and handpicking methods). Cell viability was assayed using Trypan blue, cell purity and yield were assayed using diphenylthiocarbazone, and islet function was assayed using acridine orange/ethidium bromide staining and enzyme-linked immunosorbent assay-glucose stimulation testing 4 days after cultivation. The results showed that our islet isolation and purification method required 12 ± 3 min, which was significantly shorter than the time required in the Ficoll-400, 1077, and HPU groups (34 ± 3, 41 ± 4, and 30 ± 4 min, respectively; P < 0.05). There was no significant difference in islet viability among the four groups. The islet purity, function, yield, and cost of our method were superior to those of the Ficoll-400 and 1077 methods, but inferior to the handpicking method. However, the handpicking method may cause wrist injury and visual impairment in researchers during large-scale islet isolation (>1000 islets). In summary, the MCT method is a rapid, efficient, and economic method for isolating and purifying murine islet cell clumps. This method overcomes some of the shortcomings of conventional methods, showing a relatively higher quality and yield of islets within a shorter duration at a lower cost. Therefore, the current method provides researchers with an alternative option for islet isolation and should be widely generalized. PMID:28207765

  16. A rapid, efficient, and economic device and method for the isolation and purification of mouse islet cells.

    PubMed

    Zongyi, Yin; Funian, Zou; Hao, Li; Ying, Cheng; Jialin, Zhang; Baifeng, Li

    2017-01-01

    A rapid, efficient, and economic method for the isolation and purification of islets has been pursued by numerous islet-related researchers. In this study, we compared the advantages and disadvantages of our developed patented method with those of commonly used conventional methods (Ficoll-400, 1077, and handpicking methods). Cell viability was assayed using Trypan blue, cell purity and yield were assayed using diphenylthiocarbazone, and islet function was assayed using acridine orange/ethidium bromide staining and enzyme-linked immunosorbent assay-glucose stimulation testing 4 days after cultivation. The results showed that our islet isolation and purification method required 12 ± 3 min, which was significantly shorter than the time required in the Ficoll-400, 1077, and HPU groups (34 ± 3, 41 ± 4, and 30 ± 4 min, respectively; P < 0.05). There was no significant difference in islet viability among the four groups. The islet purity, function, yield, and cost of our method were superior to those of the Ficoll-400 and 1077 methods, but inferior to the handpicking method. However, the handpicking method may cause wrist injury and visual impairment in researchers during large-scale islet isolation (>1000 islets). In summary, the MCT method is a rapid, efficient, and economic method for isolating and purifying murine islet cell clumps. This method overcomes some of the shortcomings of conventional methods, showing a relatively higher quality and yield of islets within a shorter duration at a lower cost. Therefore, the current method provides researchers with an alternative option for islet isolation and should be widely generalized.

  17. The validation study on a three-dimensional burn estimation smart-phone application: accurate, free and fast?

    PubMed

    Cheah, A K W; Kangkorn, T; Tan, E H; Loo, M L; Chong, S J

    2018-01-01

    Accurate total body surface area burned (TBSAB) estimation is a crucial aspect of early burn management. It helps guide resuscitation and is essential in the calculation of fluid requirements. Conventional methods of estimation can often lead to large discrepancies in burn percentage estimation. We aim to compare a new method of TBSAB estimation using a three-dimensional smart-phone application named 3D Burn Resuscitation (3D Burn) against conventional methods of estimation: the Rule of Palm, the Rule of Nines and the Lund and Browder chart. Three volunteer subjects were moulaged with simulated burn injuries of 25%, 30% and 35% total body surface area (TBSA), respectively. Various healthcare workers were invited to use both the 3D Burn application and the conventional methods stated above to estimate the volunteer subjects' burn percentages. Collective relative estimations across the groups showed that the Rule of Palm, the Rule of Nines and the Lund and Browder chart over-estimated burn area by an average of 10.6%, 19.7%, and 8.3% TBSA, respectively, while the 3D Burn application under-estimated burns by an average of 1.9%. There was a statistically significant difference between the 3D Burn application estimations and all three other modalities (p < 0.05). The time taken to use the application was significantly longer than that for the traditional methods of estimation. The 3D Burn application, although slower, allowed more accurate TBSAB measurements when compared to conventional methods. The validation study has shown that the 3D Burn application is useful in improving the accuracy of TBSAB measurement. Further studies are warranted, and there are plans to repeat the above study in a different centre overseas as part of a multi-centre study, with a view to progressing to a prospective study that compares the accuracy of the 3D Burn application against conventional methods on actual burn patients.
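
    For readers unfamiliar with the conventional charts referenced above, the sketch below illustrates how a Rule of Nines estimate is assembled from the standard adult body-region percentages; the region names, function name, and example inputs are illustrative assumptions and are not taken from the study.

      # Minimal sketch of an adult Rule of Nines TBSA estimate (paediatric charts differ).
      RULE_OF_NINES = {
          "head_and_neck": 9.0, "anterior_trunk": 18.0, "posterior_trunk": 18.0,
          "right_arm": 9.0, "left_arm": 9.0, "right_leg": 18.0, "left_leg": 18.0,
          "perineum": 1.0,
      }

      def estimate_tbsa(burned_fractions):
          """burned_fractions: dict mapping region -> fraction of that region burned (0..1)."""
          return sum(RULE_OF_NINES[region] * f for region, f in burned_fractions.items())

      # Example: whole right arm plus half of the anterior trunk burned -> 18% TBSA
      print(estimate_tbsa({"right_arm": 1.0, "anterior_trunk": 0.5}))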

  18. Improved dewpoint-probe calibration

    NASA Technical Reports Server (NTRS)

    Stephenson, J. G.; Theodore, E. A.

    1978-01-01

    Relatively-simple pressure-control apparatus calibrates dewpoint probes considerably faster than conventional methods, with no loss of accuracy. Technique requires only pressure measurement at each calibration point and single absolute-humidity measurement at beginning of run. Several probes can be calibrated simultaneously and points can be checked above room temperature.

  19. Wirelessly powering miniature implants for optogenetic stimulation

    NASA Astrophysics Data System (ADS)

    Yeh, Alexander J.; Ho, John S.; Tanabe, Yuji; Neofytou, Evgenios; Beygui, Ramin E.; Poon, Ada S. Y.

    2013-10-01

    Conventional methods for in vivo optogenetic stimulation require optical fibers or a mounted prosthesis. We present an approach for wirelessly powering implantable stimulators using the electromagnetic midfield. By exploiting the properties of the midfield, we demonstrate the ability to generate high-intensity light pulses in a freely moving animal.

  20. RECOVERY OF SEMI-VOLATILE ORGANIC COMPOUNDS DURING SAMPLE PREPARATION: IMPLICATIONS FOR CHARACTERIZATION OF AIRBORNE PARTICULATE MATTER

    EPA Science Inventory

    Semi-volatile compounds present special analytical challenges not met by conventional methods for analysis of ambient particulate matter (PM). Accurate quantification of PM-associated organic compounds requires validation of the laboratory procedures for recovery over a wide v...

  1. Detection of foodborne pathogens using microarray technology

    USDA-ARS?s Scientific Manuscript database

    Assays based on the polymerase chain reaction (PCR) are now accepted methods for rapidly confirming the presence or absence of specific pathogens in foods and other types of samples. Conventional PCR requires the use of agarose gel electrophoresis to detect the PCR product; whereas, real-time PCR c...

  2. Limitations of the Conventional Phase Advance Method for Constant Power Operation of the Brushless DC Motor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawler, J.S.

    2001-10-29

    The brushless dc motor (BDCM) has high-power density and efficiency relative to other motor types. These properties make the BDCM well suited for applications in electric vehicles provided a method can be developed for driving the motor over the 4 to 6:1 constant power speed range (CPSR) required by such applications. The present state of the art for constant power operation of the BDCM is conventional phase advance (CPA) [1]. In this paper, we identify key limitations of CPA. It is shown that the CPA has effective control over the developed power but that the current magnitude is relatively insensitive to power output and is inversely proportional to motor inductance. If the motor inductance is low, then the rms current at rated power and high speed may be several times larger than the current rating. The inductance required to maintain rms current within rating is derived analytically and is found to be large relative to that of BDCM designs using high-strength rare earth magnets. Thus, the CPA requires a BDCM with a large equivalent inductance.

  3. Investigation into photostability of soybean oils by thermal lens spectroscopy

    NASA Astrophysics Data System (ADS)

    Savi, E. L.; Malacarne, L. C.; Baesso, M. L.; Pintro, P. T. M.; Croge, C.; Shen, J.; Astrath, N. G. C.

    2015-06-01

    Assessment of photochemical stability is essential for evaluating the quality and shelf life of vegetable oils, which are very important aspects of marketing and human health. Most conventional methods used to investigate oxidative stability require lengthy experimental procedures and a high consumption of chemical inputs for the preparation or extraction of sample compounds. In this work we propose a time-resolved thermal lens method to analyze the photostability of edible oils by quantitative measurement of the photoreaction cross-section. An all-numerical routine is employed to solve a complex theoretical problem involving photochemical reaction, thermal lens effect, and mass diffusion during local laser excitation. The photostability of pure oil and oils with natural and synthetic antioxidants is investigated. The thermal lens results are compared with those obtained by conventional methods, and a complete set of physical properties of the samples is presented.

  4. Voltage Drop Compensation Method for Active Matrix Organic Light Emitting Diode Displays

    NASA Astrophysics Data System (ADS)

    Choi, Sang-moo; Ryu, Do-hyung; Kim, Keum-nam; Choi, Jae-beom; Kim, Byung-hee; Berkeley, Brian

    2011-03-01

    In this paper, conventional voltage drop compensation methods are reviewed and a novel design and driving scheme, the advanced power de-coupled (aPDC) driving method, is proposed to effectively compensate the IR voltage drop of active matrix organic light emitting diode (AMOLED) displays. The aPDC driving scheme can be applied to existing AMOLED pixel circuits with only minor modification, or with no modification to the pixel circuit at all. A 14-in. AMOLED panel with the aPDC driving scheme was fabricated. Long range uniformity (LRU) of the 14-in. panel was improved from 43% without the aPDC driving scheme to over 87% at the same brightness when the scheme was used, and the layout complexity of the panel with the new design scheme is less than that of the panel with the conventional design scheme.

  5. Attitude determination and parameter estimation using vector observations - Theory

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis

    1989-01-01

    Procedures for attitude determination based on Wahba's loss function are generalized to include the estimation of parameters other than the attitude, such as sensor biases. Optimization with respect to the attitude is carried out using the q-method, which does not require an a priori estimate of the attitude. Optimization with respect to the other parameters employs an iterative approach, which does require an a priori estimate of these parameters. Conventional state estimation methods require a priori estimates of both the parameters and the attitude, while the algorithm presented in this paper always computes the exact optimal attitude for given values of the parameters. Expressions for the covariance of the attitude and parameter estimates are derived.
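
    As a brief illustration of the q-method step referenced above, the sketch below builds Davenport's K matrix from weighted vector observations and takes its dominant eigenvector as the optimal quaternion. The function name is hypothetical, and sign and ordering conventions for the quaternion vary between references, so this is a minimal sketch rather than the paper's algorithm.

      import numpy as np

      def q_method(b_obs, r_ref, weights):
          """Davenport q-method sketch: quaternion (vector part first, scalar last)
          minimizing Wahba's loss for body-frame observations b_obs of
          reference-frame vectors r_ref with the given weights."""
          B = sum(a * np.outer(b, r) for a, b, r in zip(weights, b_obs, r_ref))
          z = sum(a * np.cross(b, r) for a, b, r in zip(weights, b_obs, r_ref))
          K = np.zeros((4, 4))
          K[:3, :3] = B + B.T - np.trace(B) * np.eye(3)
          K[:3, 3] = K[3, :3] = z
          K[3, 3] = np.trace(B)
          w, v = np.linalg.eigh(K)          # K is symmetric
          return v[:, np.argmax(w)]         # eigenvector of the largest eigenvalue

      # toy check: identical body and reference vectors give the identity rotation
      b = r = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
      print(q_method(b, r, [0.5, 0.5]))     # ~ [0, 0, 0, +/-1]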

  6. Sensitive determination of total particulate phosphorus and particulate inorganic phosphorus in seawater using liquid waveguide spectrophotometry.

    PubMed

    Ehama, Makoto; Hashihama, Fuminori; Kinouchi, Shinko; Kanda, Jota; Saito, Hiroaki

    2016-06-01

    Determining the total particulate phosphorus (TPP) and particulate inorganic phosphorus (PIP) in oligotrophic oceanic water generally requires the filtration of a large amount of water sample. This paper describes methods that require small filtration volumes for determining the TPP and PIP concentrations. The methods were devised by validating or improving conventional sample processing and by applying highly sensitive liquid waveguide spectrophotometry to the measurements of oxidized or acid-extracted phosphate from TPP and PIP, respectively. The oxidation of TPP was performed by a chemical wet oxidation method using 3% potassium persulfate. The acid extraction of PIP was initially carried out based on the conventional extraction methodology, which requires 1 M HCl, followed by the procedure for decreasing acidity. While the conventional procedure for acid removal requires a ten-fold dilution of the 1 M HCl extract with purified water, the improved procedure proposed in this study uses 8 M NaOH solution for neutralizing the 1 M HCl extract in order to reduce the dilution effect. An experiment comparing the absorbances of the phosphate standard dissolved in 0.1 M HCl and of that dissolved in a neutralized solution [1 M HCl : 8 M NaOH = 8:1 (v:v)] exhibited a higher absorbance in the neutralized solution. This indicated that the improved procedure completely removed the acid effect, which reduces the sensitivity of the phosphate measurement. Application to an ultraoligotrophic water sample showed that the TPP concentration in a 1075 mL-filtered sample was 8.4 nM with a coefficient of variation (CV) of 4.3% and the PIP concentration in a 2300 mL-filtered sample was 1.3 nM with a CV of 6.1%. Based on the detection limit (3 nM) of the sensitive phosphate measurement and the ambient TPP and PIP concentrations of the ultraoligotrophic water, the minimum filtration volumes required for the detection of TPP and PIP were estimated to be 15 and 52 mL, respectively. Copyright © 2016 Elsevier B.V. All rights reserved.
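
    The benefit of the neutralization step can be seen with a little arithmetic, sketched below; the volumes are illustrative, not the paper's, and the snippet simply contrasts the ten-fold dilution of the conventional procedure with the roughly 1.13-fold dilution obtained when 8 M NaOH is added at an 8:1 (v:v) ratio.

      # Illustrative comparison of the two acid-removal procedures (volumes assumed).
      v_extract_ml = 8.0                          # 1 M HCl extract
      dilution_conventional = 10.0                # ten-fold dilution with purified water
      v_naoh_ml = v_extract_ml / 8.0              # 8 M NaOH added at 8:1 (v:v)
      dilution_improved = (v_extract_ml + v_naoh_ml) / v_extract_ml
      # sanity check: acid and base amounts cancel (8 mL x 1 M == 1 mL x 8 M)
      assert abs(v_extract_ml * 1.0 - v_naoh_ml * 8.0) < 1e-9
      print(dilution_conventional, dilution_improved)   # 10.0 vs 1.125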

  7. Method of forming crystalline silicon devices on glass

    DOEpatents

    McCarthy, A.M.

    1995-03-21

    A method is disclosed for fabricating single-crystal silicon microelectronic components on a silicon substrate and transferring same to a glass substrate. This is achieved by utilizing conventional silicon processing techniques for fabricating components of electronic circuits and devices on bulk silicon, wherein a bulk silicon surface is prepared with epitaxial layers prior to the conventional processing. The silicon substrate is bonded to a glass substrate and the bulk silicon is removed leaving the components intact on the glass substrate surface. Subsequent standard processing completes the device and circuit manufacturing. This invention is useful in applications requiring a transparent or insulating substrate, particularly for display manufacturing. Other applications include sensors, actuators, optoelectronics, radiation hard electronics, and high temperature electronics. 7 figures.

  8. IMPROVEMENT OF EFFICIENCY OF CUT AND OVERLAY ASPHALT WORKS BY USING MOBILE MAPPING SYSTEM

    NASA Astrophysics Data System (ADS)

    Yabuki, Nobuyoshi; Nakaniwa, Kazuhide; Kidera, Hiroki; Nishi, Daisuke

    When cut-and-overlay asphalt work is done to improve road pavement, the conventional road surface elevation survey with levels often requires traffic regulation and takes much time and effort. Although new surveying methods using non-prismatic total stations or fixed 3D laser scanners have recently been proposed in industry, they have not been widely adopted due to their high cost. In this research, we propose a new method using Mobile Mapping Systems (MMS) in order to increase efficiency and reduce cost. In this method, small white marks are painted at intervals of 10 m along the road to identify cross sections, and the elevations of the white marks are corrected with accurate survey data. To verify the proposed method, we conducted an experiment on a road at Osaka University and compared the method with the conventional level survey and the fixed 3D laser scanning method. The result showed that the proposed method had accuracy similar to the other methods while being more efficient.

  9. Quantification of protein interaction kinetics in a micro droplet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yin, L. L.; College of Chemistry and Chemical Engineering, Chongqing University, Chongqing 400044; Wang, S. P., E-mail: shaopeng.wang@asu.edu, E-mail: njtao@asu.edu

    Characterization of protein interactions is essential to the discovery of disease biomarkers, the development of diagnostic assays, and the screening for therapeutic drugs. Conventional flow-through kinetic measurements require relatively large amounts of sample, which is not feasible for precious protein samples. We report a novel method to measure protein interaction kinetics in a single droplet with a sub-microliter or smaller volume. A droplet in a humidity-controlled environmental chamber replaces the microfluidic channels as the reactor for the protein interaction. The binding process is monitored by a surface plasmon resonance imaging (SPRi) system. Association curves are obtained from the average SPR image intensity in the center area of the droplet. The washing step required by the conventional flow-through SPR method is eliminated in the droplet method. The association and dissociation rate constants and binding affinity of an antigen-antibody interaction are obtained by global fitting of association curves at different concentrations. The results obtained by this method are accurate, as validated against a conventional flow-through SPR system. This droplet-based method not only allows kinetic studies for proteins with limited supply but also opens the door for high-throughput protein interaction studies in a droplet-based microarray format that enables measurement of many-to-many interactions on a single chip.

  10. Quantification of protein interaction kinetics in a micro droplet

    NASA Astrophysics Data System (ADS)

    Yin, L. L.; Wang, S. P.; Shan, X. N.; Zhang, S. T.; Tao, N. J.

    2015-11-01

    Characterization of protein interactions is essential to the discovery of disease biomarkers, the development of diagnostic assays, and the screening for therapeutic drugs. Conventional flow-through kinetic measurements require relatively large amounts of sample, which is not feasible for precious protein samples. We report a novel method to measure protein interaction kinetics in a single droplet with a sub-microliter or smaller volume. A droplet in a humidity-controlled environmental chamber replaces the microfluidic channels as the reactor for the protein interaction. The binding process is monitored by a surface plasmon resonance imaging (SPRi) system. Association curves are obtained from the average SPR image intensity in the center area of the droplet. The washing step required by the conventional flow-through SPR method is eliminated in the droplet method. The association and dissociation rate constants and binding affinity of an antigen-antibody interaction are obtained by global fitting of association curves at different concentrations. The results obtained by this method are accurate, as validated against a conventional flow-through SPR system. This droplet-based method not only allows kinetic studies for proteins with limited supply but also opens the door for high-throughput protein interaction studies in a droplet-based microarray format that enables measurement of many-to-many interactions on a single chip.

  11. Chemical-free inactivated whole influenza virus vaccine prepared by ultrashort pulsed laser treatment

    NASA Astrophysics Data System (ADS)

    Tsen, Shaw-Wei David; Donthi, Nisha; La, Victor; Hsieh, Wen-Han; Li, Yen-Der; Knoff, Jayne; Chen, Alexander; Wu, Tzyy-Choou; Hung, Chien-Fu; Achilefu, Samuel; Tsen, Kong-Thon

    2015-05-01

    There is an urgent need for rapid methods to develop vaccines in response to emerging viral pathogens. Whole inactivated virus (WIV) vaccines represent an ideal strategy for this purpose; however, a universal method for producing safe and immunogenic inactivated vaccines is lacking. Conventional pathogen inactivation methods such as formalin, heat, ultraviolet light, and gamma rays cause structural alterations in vaccines that lead to reduced neutralizing antibody specificity, and in some cases, disastrous T helper type 2-mediated immune pathology. We have evaluated the potential of a visible ultrashort pulsed (USP) laser method to generate safe and immunogenic WIV vaccines without adjuvants. Specifically, we demonstrate that vaccination of mice with laser-inactivated H1N1 influenza virus at about a 10-fold lower dose than that required using conventional formalin-inactivated influenza vaccines results in protection against lethal H1N1 challenge in mice. The virus, inactivated by the USP laser irradiation, has been shown to retain its surface protein structure through hemagglutination assay. Unlike conventional inactivation methods, laser treatment did not generate carbonyl groups in protein, thereby reducing the risk of adverse vaccine-elicited T helper type 2 responses. Therefore, USP laser treatment is an attractive potential strategy to generate WIV vaccines with greater potency and safety than vaccines produced by current inactivation techniques.

  12. Cryopreservation: Vitrification and Controlled Rate Cooling.

    PubMed

    Hunt, Charles J

    2017-01-01

    Cryopreservation is the application of low temperatures to preserve the structural and functional integrity of cells and tissues. Conventional cooling protocols allow ice to form and solute concentrations to rise during the cryopreservation process. The damage caused by the rise in solute concentration can be mitigated by the use of compounds known as cryoprotectants. Such compounds protect cells from the consequences of slow cooling injury, allowing them to be cooled at cooling rates which avoid the lethal effects of intracellular ice. An alternative to conventional cooling is vitrification. Vitrification methods incorporate cryoprotectants at sufficiently high concentrations to prevent ice crystallization so that the system forms an amorphous glass thus avoiding the damaging effects caused by conventional slow cooling. However, vitrification too can impose damaging consequences on cells as the cryoprotectant concentrations required to vitrify cells at lower cooling rates are potentially, and often, harmful. While these concentrations can be lowered to nontoxic levels, if the cells are ultra-rapidly cooled, the resulting metastable system can lead to damage through devitrification and growth of ice during subsequent storage and rewarming if not appropriately handled.The commercial and clinical application of stem cells requires robust and reproducible cryopreservation protocols and appropriate long-term, low-temperature storage conditions to provide reliable master and working cell banks. Though current Good Manufacturing Practice (cGMP) compliant methods for the derivation and banking of clinical grade pluripotent stem cells exist and stem cell lines suitable for clinical applications are available, current cryopreservation protocols, whether for vitrification or conventional slow freezing, remain suboptimal. Apart from the resultant loss of valuable product that suboptimal cryopreservation engenders, there is a danger that such processes will impose a selective pressure on the cells selecting out a nonrepresentative, freeze-resistant subpopulation. Optimizing this process requires knowledge of the fundamental processes that occur during the freezing of cellular systems, the mechanisms of damage and methods for avoiding them. This chapter draws together the knowledge of cryopreservation gained in other systems with the current state-of-the-art for embryonic and induced pluripotent stem cell preservation in an attempt to provide the background for future attempts to optimize cryopreservation protocols.

  13. Three-Signal Method for Accurate Measurements of Depolarization Ratio with Lidar

    NASA Technical Reports Server (NTRS)

    Reichardt, Jens; Baumgart, Rudolf; McGee, Thomas J.

    2003-01-01

    A method is presented that permits the determination of atmospheric depolarization-ratio profiles from three elastic-backscatter lidar signals with different sensitivity to the state of polarization of the backscattered light. The three-signal method is insensitive to experimental errors and does not require calibration of the measurement, which could cause large systematic uncertainties of the results, as is the case in the lidar technique conventionally used for the observation of depolarization ratios.

  14. Furfural Synthesis from d-Xylose in the Presence of Sodium Chloride: Microwave versus Conventional Heating.

    PubMed

    Xiouras, Christos; Radacsi, Norbert; Sturm, Guido; Stefanidis, Georgios D

    2016-08-23

    We investigate the existence of specific/nonthermal microwave effects for the dehydration reaction of xylose to furfural in the presence of NaCl. Such effects have been reported for sugar dehydration reactions in several literature reports. To this end, we adopted three approaches that compare microwave-assisted experiments with a) conventional heating experiments from the literature; b) simulated conventional heating experiments using microwave-irradiated silicon carbide (SiC) vials; and c) experiments at different power levels but the same temperature by using forced cooling. No significant differences in the reaction kinetics are observed using any of these methods. However, microwave heating still proves advantageous as it requires 30% less forward power compared to conventional heating (SiC vial) to achieve the same furfural yield at a laboratory scale. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. [Rapid 3-Dimensional Models of Cerebral Aneurysm for Emergency Surgical Clipping].

    PubMed

    Konno, Takehiko; Mashiko, Toshihiro; Oguma, Hirofumi; Kaneko, Naoki; Otani, Keisuke; Watanabe, Eiju

    2016-08-01

    We developed a method for manufacturing solid models of cerebral aneurysms, with a shorter printing time than that involved in conventional methods, using a compact 3D printer with acrylonitrile-butadiene-styrene (ABS) resin. We further investigated the application and utility of this printing system in emergency clipping surgery. A total of 16 patients diagnosed with acute subarachnoid hemorrhage resulting from cerebral aneurysm rupture were enrolled in the present study. Emergency clipping was performed on the day of hospitalization. Digital Imaging and Communication in Medicine (DICOM) data obtained from computed tomography angiography (CTA) scans were edited and converted to stereolithography (STL) file formats, followed by the production of 3D models of the cerebral aneurysm by using the 3D printer. The mean time from hospitalization to the commencement of surgery was 242 min, whereas the mean time required for manufacturing the 3D model was 67 min. The average cost of each 3D model was 194 Japanese Yen. The time required for manufacturing the 3D models decreased to approximately 1 hour as experience with producing the models increased. Favorable impressions of the use of the 3D models in clipping were reported by almost all neurosurgeons included in this study. Although 3D printing is often considered to involve high costs and long manufacturing times, the method used in the present study requires less time and lower costs than conventional methods for manufacturing 3D cerebral aneurysm models, making it suitable for use in emergency clipping.

  16. [Spatial domain display for interference image dataset].

    PubMed

    Wang, Cai-Ling; Li, Yu-Shan; Liu, Xue-Bin; Hu, Bing-Liang; Jing, Juan-Juan; Wen, Jia

    2011-11-01

    Visualization of imaging interferometer data is an urgent requirement for users engaged in image interpretation and information extraction. However, conventional research on visualization focuses only on spectral image datasets in the spectral domain. Hence, quick display of interference spectral image datasets is a bottleneck in interference image processing. Conventional visualization of interference datasets applies classical spectral image display methods after a Fourier transformation. In the present paper, the problem of quickly viewing interferometer images in the spatial domain is addressed and an algorithm that simplifies the task is proposed. The Fourier transformation is an obstacle because its computation time is large, and the situation deteriorates further as the dataset size increases. The proposed algorithm, named interference weighted envelopes, frees the dataset from the transformation. The authors construct three interference weighted envelopes based on the Fourier transformation, the features of the interference data, and the human visual system, respectively. A comparison of the proposed and conventional methods shows a large difference in display time.

  17. Conventional and Accelerated-Solvent Extractions of Green Tea (Camellia sinensis) for Metabolomics-based Chemometrics

    PubMed Central

    Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.

    2018-01-01

    Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. PMID:28787673

  18. Student Publications Enhance Teaching: Experimental Psychology and Research Methods Courses.

    ERIC Educational Resources Information Center

    Ware, Mark E.; Davis, Stephen F.

    Recent years have witnessed an increased emphasis on the professional development of undergraduate psychology students. One major thrust of this professional development has been on research that results in a convention presentation or journal publication. Research leading to journal publication is becoming a requirement for admission to many…

  19. Access to opioids: a global pain management crisis.

    PubMed

    Buitrago, Rosa

    2013-03-01

    The lack of availability of opioids in many countries has created a pain management crisis. Because the Single Convention on Narcotic Drugs requires governments to report annual opioid statistics, there is a need for methods to calculate individual nations' opioid needs. Ways to address this need are discussed.

  20. Using Visualization and Computation in the Analysis of Separation Processes

    ERIC Educational Resources Information Center

    Joo, Yong Lak; Choudhary, Devashish

    2006-01-01

    For decades, every chemical engineer has been asked to have a background in separations. The required separations course can, however, be uninspiring and superficial because understanding many separation processes involves conventional graphical methods and commercial process simulators. We utilize simple, user-­friendly mathematical software,…

  1. Passive Sampling in Regulatory Chemical Monitoring of Nonpolar Organic Compounds in the Aquatic Environment

    EPA Science Inventory

    We reviewed compliance monitoring requirements in the European Union (EU), the United States(USA), and the Oslo-Paris Convention for the protection of the marine environment of the North-East Atlantic (OSPAR), and evaluated if these are met by passive sampling methods for nonpola...

  2. 50 CFR 23.45 - What are the requirements for a pre-Convention specimen?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... What are the requirements for a pre-Convention specimen? (a) Purpose. Article VII(2) of the Treaty exempts a pre-Convention specimen from standard permitting requirements in Articles III, IV, and V of the... (including a manufactured item) or derivative made from such specimen. (2) The scientific name of the species...

  3. 50 CFR 23.45 - What are the requirements for a pre-Convention specimen?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... What are the requirements for a pre-Convention specimen? (a) Purpose. Article VII(2) of the Treaty exempts a pre-Convention specimen from standard permitting requirements in Articles III, IV, and V of the... (including a manufactured item) or derivative made from such specimen. (2) The scientific name of the species...

  4. Noncontact Measurement of Humidity and Temperature Using Airborne Ultrasound

    NASA Astrophysics Data System (ADS)

    Kon, Akihiko; Mizutani, Koichi; Wakatsuki, Naoto

    2010-04-01

    We describe a noncontact method for measuring humidity and dry-bulb temperature. Conventional humidity sensors are single-point measurement devices, so that a noncontact method for measuring the relative humidity is required. Ultrasonic temperature sensors are noncontact measurement sensors. Because water vapor in the air increases sound velocity, conventional ultrasonic temperature sensors measure virtual temperature, which is higher than dry-bulb temperature. We performed experiments using an ultrasonic delay line, an atmospheric pressure sensor, and either a thermometer or a relative humidity sensor to confirm the validity of our measurement method at relative humidities of 30, 50, 75, and 100% and at temperatures of 283.15, 293.15, 308.15, and 323.15 K. The results show that the proposed method measures relative humidity with an error rate of less than 16.4% and dry-bulb temperature with an error of less than 0.7 K. Adaptations of the measurement method for use in air-conditioning control systems are discussed.

  5. Design of a practical model-observer-based image quality assessment method for x-ray computed tomography imaging systems

    PubMed Central

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.

    2016-01-01

    The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of areas under ROC/EROC curve were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. PMID:27493982
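
    For orientation, the sketch below shows the core of a channelized Hotelling observer computation: channelize the images, form the Hotelling template from the class means and the average channel covariance, and summarize detectability as an SNR. The function name, array shapes, and toy data are assumptions, and the shuffle-based estimation and channel design used in the paper are not reproduced here.

      import numpy as np

      def cho_snr(imgs_signal, imgs_noise, channels):
          """Channelized Hotelling observer sketch.
          imgs_*: (n_images, n_pixels) signal-present / signal-absent images;
          channels: (n_pixels, n_channels) channel matrix (e.g. Gabor or Laguerre-Gauss)."""
          v1 = imgs_signal @ channels                   # channel outputs, signal present
          v0 = imgs_noise @ channels                    # channel outputs, signal absent
          dmean = v1.mean(axis=0) - v0.mean(axis=0)
          S = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
          w = np.linalg.solve(S, dmean)                 # Hotelling template in channel space
          t1, t0 = v1 @ w, v0 @ w                       # observer decision variables
          return (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))

      # toy usage with random data: 100 images of 32x32 pixels and 10 random channels
      rng = np.random.default_rng(0)
      chans = rng.standard_normal((32 * 32, 10))
      noise = rng.standard_normal((100, 32 * 32))
      signal = rng.standard_normal((100, 32 * 32)) + 0.2   # weak uniform "signal"
      print(cho_snr(signal, noise, chans))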

  6. Comparison of variational real-space representations of the kinetic energy operator

    NASA Astrophysics Data System (ADS)

    Skylaris, Chris-Kriton; Diéguez, Oswaldo; Haynes, Peter D.; Payne, Mike C.

    2002-08-01

    We present a comparison of real-space methods based on regular grids for electronic structure calculations that are designed to have basis set variational properties, using as a reference the conventional method of finite differences (a real-space method that is not variational) and the reciprocal-space plane-wave method which is fully variational. We find that a definition of the finite-difference method [P. Maragakis, J. Soler, and E. Kaxiras, Phys. Rev. B 64, 193101 (2001)] satisfies one of the two properties of variational behavior at the cost of larger errors than the conventional finite-difference method. On the other hand, a technique which represents functions in a number of plane waves which is independent of system size closely follows the plane-wave method and therefore also the criteria for variational behavior. Its application is only limited by the requirement of having functions strictly localized in regions of real space, but this is a characteristic of an increasing number of modern real-space methods, as they are designed to have a computational cost that scales linearly with system size.
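
    As a concrete illustration of the difference between the two representations discussed above, the short sketch below evaluates the kinetic energy of a 1D Gaussian wavefunction on a periodic grid with a 3-point finite-difference Laplacian and with the plane-wave (spectral) operator. The grid parameters are arbitrary, and the example is only meant to show where the O(h^2) finite-difference error enters, not to reproduce the paper's calculations.

      import numpy as np

      n, L = 256, 20.0                                  # grid points and box length (atomic units)
      x = np.linspace(0.0, L, n, endpoint=False)
      h = L / n
      psi = np.exp(-0.5 * (x - L / 2) ** 2)             # Gaussian test wavefunction
      psi /= np.sqrt(np.sum(psi ** 2) * h)              # normalize on the grid

      # finite differences: T = -1/2 d^2/dx^2 with a periodic 3-point stencil
      lap = (np.roll(psi, -1) - 2.0 * psi + np.roll(psi, 1)) / h ** 2
      T_fd = -0.5 * np.sum(psi * lap) * h

      # plane waves: multiply by k^2/2 in reciprocal space
      k = 2.0 * np.pi * np.fft.fftfreq(n, d=h)
      psi_k = np.fft.fft(psi)
      T_pw = np.sum(0.5 * k ** 2 * np.abs(psi_k) ** 2) / np.sum(np.abs(psi_k) ** 2)

      print(T_fd, T_pw)    # both close to 0.25; the FD value carries an O(h^2) error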

  7. Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.

    PubMed

    Wieczerzak, M; Namieśnik, J; Kudłak, B

    2016-09-01

    For centuries, mankind has contributed to irreversible environmental changes, but due to the modern science of recent decades, scientists are able to assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad spectrum of Green Chemistry principle applications should also include all of the techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analyses do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry, but they could also complement and sometimes provide an alternative to conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Evaluation of magnetic nanoparticle samples made from biocompatible ferucarbotran by time-correlation magnetic particle imaging reconstruction method

    PubMed Central

    2013-01-01

    Background: Molecular imaging using magnetic nanoparticles (MNPs)—magnetic particle imaging (MPI)—has attracted interest for the early diagnosis of cancer and cardiovascular disease. However, because a steep local magnetic field distribution is required to obtain a defined image, sophisticated hardware is required. Therefore, it is desirable to realize excellent image quality even with low-performance hardware. In this study, the spatial resolution of MPI was evaluated using an image reconstruction method based on the correlation information of the magnetization signal in a time domain and by applying MNP samples made from biocompatible ferucarbotran that have adjusted particle diameters. Methods: The magnetization characteristics and particle diameters of four types of MNP samples made from ferucarbotran were evaluated. A numerical analysis based on our proposed method that calculates the image intensity from correlation information between the magnetization signal generated from MNPs and the system function was attempted, and the obtained image quality was compared with that using the prototype in terms of image resolution and image artifacts. Results: MNP samples obtained by adjusting ferucarbotran showed superior properties to conventional ferucarbotran samples, and numerical analysis showed that the same image quality could be obtained using a gradient magnetic field generator with 0.6 times the performance. However, because image blurring was included theoretically by the proposed method, an algorithm will be required to improve performance. Conclusions: MNP samples obtained by adjusting ferucarbotran showed magnetizing properties superior to conventional ferucarbotran samples, and by using such samples, comparable image quality (spatial resolution) could be obtained with a lower gradient magnetic field intensity. PMID:23734917

  9. Comparison of air space measurement imaged by CT, small-animal CT, and hyperpolarized Xe MRI

    NASA Astrophysics Data System (ADS)

    Madani, Aniseh; White, Steven; Santyr, Giles; Cunningham, Ian

    2005-04-01

    Lung disease is the third leading cause of death in the western world. Lung air volume measurements are thought to be early indicators of lung disease and markers in pharmaceutical research. The purpose of this work is to develop a lung phantom for assessing and comparing the quantitative accuracy of hyperpolarized xenon-129 magnetic resonance imaging (HP 129Xe MRI), conventional computed tomography (HRCT), and high-resolution small-animal CT (μCT) in measuring lung gas volumes. We developed a lung phantom consisting of solid cellulose acetate spheres (1, 2, 3, 4 and 5 mm diameter) uniformly packed in circulated air or HP 129Xe gas. Air volume is estimated with a simple thresholding algorithm. Truth is calculated from the sphere diameters and validated using μCT. While this phantom is not anthropomorphic, it enables us to directly measure air space volume and compare these imaging methods as a function of sphere diameter for the first time. HP 129Xe MRI requires partial volume analysis to distinguish regions with and without 129Xe gas, and results are within 5% of truth, but settling of the heavy 129Xe gas complicates this analysis. Conventional CT demonstrated partial-volume artifacts for the 1 mm spheres. μCT gives the most accurate air-volume results. Conventional CT and HP 129Xe MRI give similar results, although non-uniform densities of 129Xe require more sophisticated algorithms than simple thresholding. The threshold required to give the true air volume in both HRCT and μCT varies with sphere diameter, calling into question the validity of the thresholding method.
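
    The two volume estimates compared in the phantom study can be written down in a few lines; the sketch below is illustrative only, with assumed container dimensions, voxel size, and threshold, and a synthetic binary volume standing in for real image data.

      import numpy as np

      def true_air_volume(container_volume, sphere_diameter, n_spheres):
          """Ground truth: container volume minus the total volume of the solid spheres."""
          return container_volume - n_spheres * (np.pi / 6.0) * sphere_diameter ** 3

      def thresholded_air_volume(ct_volume, voxel_volume, threshold):
          """Image-based estimate: voxels below the threshold are counted as air."""
          return np.count_nonzero(ct_volume < threshold) * voxel_volume

      # toy example: synthetic 64^3 volume with ~40% of voxels at air-like values
      rng = np.random.default_rng(1)
      toy = np.where(rng.random((64, 64, 64)) < 0.4, -1000.0, 0.0)
      print(thresholded_air_volume(toy, voxel_volume=0.125, threshold=-500.0))   # mm^3
      print(true_air_volume(50.0 ** 3, sphere_diameter=3.0, n_spheres=500))      # mm^3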

  10. Slide-free histology via MUSE: UV surface excitation microscopy for imaging unsectioned tissue (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Levenson, Richard M.; Harmany, Zachary; Demos, Stavros G.; Fereidouni, Farzad

    2016-03-01

    Widely used methods for preparing and viewing tissue specimens at microscopic resolution have not changed for over a century. They provide high-quality images but can involve time-frames of hours or even weeks, depending on logistics. There is increasing interest in slide-free methods for rapid tissue analysis that can both decrease turn-around times and reduce costs. One new approach is MUSE (microscopy with UV surface excitation), which exploits the shallow penetration of UV light to excite fluorescent signals from only the most superficial tissue elements. The method is non-destructive and eliminates the requirement for conventional histology processing, formalin fixation, paraffin embedding, or thin sectioning. It requires no lasers and no confocal, multiphoton, or optical coherence tomography optics. MUSE generates diagnostic-quality histological images that can be rendered to resemble conventional hematoxylin- and eosin-stained samples, with enhanced topographical information, from fresh or fixed but unsectioned tissue, rapidly, with high resolution, simply and inexpensively. We anticipate that there could be widespread adoption in research facilities, hospital-based and stand-alone clinical settings, in local or regional pathology labs, as well as in low-resource environments.

  11. Driving bubbles out of glass

    NASA Technical Reports Server (NTRS)

    Mattox, D. M.

    1981-01-01

    Surface tension gradient in melt forces gas bubbles to surface, increasing glass strength and transparency. Conventional chemical and buoyant fining are extremely slow in viscous glasses, but tension gradient method moves 250 um bubbles as rapidly as 30 um/s. Heat required for high temperature part of melt is furnished by stationary electrical or natural-gas heater; induction and laser heating are also possible. Method has many applications in industry processes.

  12. Teaching Dental Students to Understand the Temporomandibular Joint Using MRI: Comparison of Conventional and Digital Learning Methods.

    PubMed

    Arús, Nádia A; da Silva, Átila M; Duarte, Rogério; da Silveira, Priscila F; Vizzotto, Mariana B; da Silveira, Heraldo L D; da Silveira, Heloisa E D

    2017-06-01

    The aims of this study were to evaluate and compare the performance of dental students in interpreting the temporomandibular joint (TMJ) with magnetic resonance imaging (MRI) scans using two learning methods (conventional and digital interactive learning) and to examine the usability of the digital learning object (DLO). The DLO consisted of tutorials about MRI and anatomic and functional aspects of the TMJ. In 2014, dental students in their final year of study who were enrolled in the elective "MRI Interpretation of the TMJ" course comprised the study sample. After exclusions for nonattendance and other reasons, 29 of the initial 37 students participated in the study, for a participation rate of 78%. The participants were divided into two groups: a digital interactive learning group (n=14) and a conventional learning group (n=15). Both methods were assessed by an objective test applied before and after training and classes. Aspects such as support and training requirements, complexity, and consistency of the DLO were also evaluated using the System Usability Scale (SUS). A significant between-group difference in the posttest results was found, with the conventional learning group scoring better than the DLO group, indicated by mean scores of 9.20 and 8.11, respectively, out of 10. However, when the pretest and posttest results were compared, both groups showed significantly improved performance. The SUS score was 89, which represented a high acceptance of the DLO by the users. The students who used the conventional method of learning showed superior performance in interpreting the TMJ using MRI compared to the group that used digital interactive learning.
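
    The SUS score of 89 reported above comes from the standard System Usability Scale scoring rule, sketched below; the function is a generic illustration of that rule (ten 1-5 Likert items, odd items positively worded), not code from the study.

      def sus_score(responses):
          """Standard System Usability Scale score from ten 1-5 Likert responses."""
          assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
          total = sum((r - 1) if i % 2 == 1 else (5 - r)
                      for i, r in enumerate(responses, start=1))
          return total * 2.5                      # scales the 0-40 raw sum to 0-100

      # example: very positive answers to odd items, very negative to even items -> 100.0
      print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))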

  13. Embryonic development in human oocytes fertilized by split insemination

    PubMed Central

    Kim, Myo Sun; Kim, Jayeon; Youm, Hye Won; Park, Jung Yeon; Choi, Hwa Young

    2015-01-01

    Objective: To compare the laboratory outcomes of intracytoplasmic sperm injection (ICSI) and conventional insemination using sibling oocytes in poor-prognosis IVF cycles where ICSI is not indicated. Methods: Couples undergoing IVF with the following conditions were enrolled: a history of more than 3 years of unexplained infertility, a history of ≥3 failed intrauterine inseminations, leukocytospermia or wide variation in semen analysis, poor oocyte quality, or ≥50% of embryos of poor quality in previous IVF cycle(s). Couples with a severe male factor requiring ICSI were excluded. Oocytes were randomly assigned to conventional insemination (conventional group) or ICSI (ICSI group). Fertilization rate (FR), total fertilization failure, and embryonic development at day 3 and day 5 were assessed. Results: A total of 309 mature oocytes from 37 IVF cycles (32 couples) were obtained: 161 were assigned to the conventional group and 148 to the ICSI group. FR was significantly higher in the ICSI group than in the conventional group (90.5% vs. 72.7%, P<0.001). Total fertilization failure occurred in only one cycle, in the conventional group. On day 3, the percentage of cleavage-stage embryos was higher in the ICSI group, although the difference was only marginally significant (P=0.055). In 11 cycles in which day 5 culture was attempted, the percentage of blastocysts (per cleaved embryo) was significantly higher in the ICSI group than in the conventional group (55.9% vs. 25.9%, P=0.029). Conclusion: A higher FR and more blastocysts could be achieved by ICSI in specific circumstances. The fertilization method can be tailored accordingly to improve IVF outcomes. PMID:26023671

  14. A modify ant colony optimization for the grid jobs scheduling problem with QoS requirements

    NASA Astrophysics Data System (ADS)

    Pu, Xun; Lu, XianLiang

    2011-10-01

    Job scheduling with customers' quality of service (QoS) requirements is challenging in a grid environment. In this paper, we present a modified ant colony optimization (MACO) algorithm for the job scheduling problem in grids. Instead of using the conventional construction approach to build feasible schedules, the proposed algorithm employs a decomposition method to satisfy the customer's deadline and cost requirements. In addition, a new mechanism for updating the state of service instances is embedded to improve the convergence of MACO. Experiments demonstrate the effectiveness of the proposed algorithm.

  15. Fracture control for the Oman India Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruno, T.V.

    1996-12-31

    This paper describes the evaluation of the resistance to fracture initiation and propagation for the high-strength, heavy-wall pipe required for the Oman India Pipeline (OIP). It discusses the unique aspects of this pipeline and their influence on fracture control, reviews conventional fracture control design methods, their limitations with regard to the pipe in question, the extent to which they can be utilized for this project, and other approaches being explored. Test pipe of the size and grade required for the OIP show fracture toughness well in excess of the minimum requirements.

  16. Measurement of Creep Properties of Ultra-High-Temperature Materials by a Novel Non-Contact Technique

    NASA Technical Reports Server (NTRS)

    Hyers, Robert W.; Lee, Jonghyun; Rogers, Jan R.; Liaw, Peter K.

    2007-01-01

    A non-contact technique for measuring the creep properties of materials has been developed and validated as part of a collaboration among the University of Massachusetts, NASA Marshall Space Flight Center Electrostatic Levitation Facility (ESL), and the University of Tennessee. This novel method has several advantages over conventional creep testing. The sample is deformed by the centripetal acceleration from the rapid rotation, and the deformed shapes are analyzed to determine the strain. Since there is no contact with grips, there is no theoretical maximum temperature and no concern about chemical compatibility. Materials may be tested at the service temperature even for extreme environments such as rocket nozzles, or above the service temperature for accelerated testing of materials for applications such as jet engines or turbopumps for liquid-fueled engines. The creep measurements have been demonstrated to 2400 C with niobium, while the test facility, the NASA MSFC ESL, has processed materials up to 3400 C. Furthermore, the ESL creep method employs a distribution of stress to determine the stress exponent from a single test, versus the many tests required by conventional methods. Determination of the stress exponent from the ESL creep tests requires very precise measurement of the surface shape of the deformed sample for comparison to deformations predicted by finite element models for different stress exponents. An error analysis shows that the stress exponent can be determined to about 1% accuracy with the current methods and apparatus. The creep properties of single-crystal niobium at 1985 C showed excellent agreement with conventional tests performed according to ASTM Standard E-139. Tests on other metals, ceramics, and composites relevant to rocket propulsion and turbine engines are underway.

  17. Functional Mobility Testing: A Novel Method to Create Suit Design Requirements

    NASA Technical Reports Server (NTRS)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.

    2008-01-01

    This study was performed to aide in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember through the use of an innovative methodology utilizing functional mobility. A novel method was utilized involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected based on relevance and criticality from a larger list of tasks that may be carried out by the crew. Kinematic data was processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.

  18. A novel implementation of homodyne time interval analysis method for primary vibration calibration

    NASA Astrophysics Data System (ADS)

    Sun, Qiao; Zhou, Ling; Cai, Chenguang; Hu, Hongbo

    2011-12-01

    In this paper, the shortcomings of the conventional homodyne time interval analysis (TIA) method, and their causes, are described with respect to its software algorithm and hardware implementation, and on this basis a simplified TIA method is proposed with the help of virtual instrument technology. Equipped with an ordinary Michelson interferometer and a dual-channel synchronous data acquisition card, a primary vibration calibration system using the simplified method can accurately measure the complex sensitivity of accelerometers, meeting the uncertainty requirements laid down in the pertinent ISO standard. The validity and accuracy of the simplified TIA method are verified by simulation and comparison experiments, and its performance is analyzed. Owing to its simplified algorithm and low hardware requirements, this method is recommended for national metrology institutes of developing countries and for industrial primary vibration calibration labs.

  19. A Robust Cooperated Control Method with Reinforcement Learning and Adaptive H∞ Control

    NASA Astrophysics Data System (ADS)

    Obayashi, Masanao; Uchiyama, Shogo; Kuremoto, Takashi; Kobayashi, Kunikazu

    This study proposes a robust cooperative control method that combines reinforcement learning with robust control. A remarkable characteristic of reinforcement learning is that it does not require a model formula; however, it does not guarantee the stability of the system. On the other hand, a robust control system guarantees stability and robustness, but it requires a model formula. We employ both the actor-critic method, a kind of reinforcement learning that controls continuous-valued actions with a minimal amount of computation, and traditional robust control, that is, H∞ control. The proposed method was compared with the conventional control method (the actor-critic alone) through computer simulation of controlling the angle and position of a crane system, and the simulation results showed the effectiveness of the proposed method.

  20. Numerical simulation of immiscible viscous fingering using adaptive unstructured meshes

    NASA Astrophysics Data System (ADS)

    Adam, A.; Salinas, P.; Percival, J. R.; Pavlidis, D.; Pain, C.; Muggeridge, A. H.; Jackson, M.

    2015-12-01

    Displacement of one fluid by another in porous media occurs in various settings including hydrocarbon recovery, CO2 storage and water purification. When the invading fluid is of lower viscosity than the resident fluid, the displacement front is subject to a Saffman-Taylor instability and is unstable to transverse perturbations. These instabilities can grow, leading to fingering of the invading fluid. Numerical simulation of viscous fingering is challenging. The physics is controlled by a complex interplay of viscous and diffusive forces and it is necessary to ensure physical diffusion dominates numerical diffusion to obtain converged solutions. This typically requires the use of high mesh resolution and high order numerical methods. This is computationally expensive. We demonstrate here the use of a novel control volume - finite element (CVFE) method along with dynamic unstructured mesh adaptivity to simulate viscous fingering with higher accuracy and lower computational cost than conventional methods. Our CVFE method employs a discontinuous representation for both pressure and velocity, allowing the use of smaller control volumes (CVs). This yields higher resolution of the saturation field which is represented CV-wise. Moreover, dynamic mesh adaptivity allows high mesh resolution to be employed where it is required to resolve the fingers and lower resolution elsewhere. We use our results to re-examine the existing criteria that have been proposed to govern the onset of instability. Mesh adaptivity requires the mapping of data from one mesh to another. Conventional methods such as consistent interpolation do not readily generalise to discontinuous fields and are non-conservative. We further contribute a general framework for interpolation of CV fields by Galerkin projection. The method is conservative, higher order and yields improved results, particularly with higher order or discontinuous elements where existing approaches are often excessively diffusive.
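
    To make the Galerkin-projection idea concrete, the sketch below projects a piecewise-constant (control-volume-like) field between two 1D meshes by overlap integrals; this toy version is conservative by construction, but it is only an illustration under 1D, piecewise-constant assumptions, not the paper's CVFE implementation.

      import numpy as np

      def project_piecewise_constant(src_edges, src_vals, tgt_edges):
          """L2 (Galerkin) projection of a piecewise-constant 1D field onto another
          mesh with piecewise-constant basis functions; the field integral is
          preserved when both meshes cover the same domain."""
          tgt_vals = np.zeros(len(tgt_edges) - 1)
          for j in range(len(tgt_edges) - 1):
              a, b = tgt_edges[j], tgt_edges[j + 1]
              acc = 0.0
              for i in range(len(src_edges) - 1):
                  overlap = min(b, src_edges[i + 1]) - max(a, src_edges[i])
                  if overlap > 0.0:
                      acc += overlap * src_vals[i]
              tgt_vals[j] = acc / (b - a)
          return tgt_vals

      src_edges = np.array([0.0, 0.3, 0.7, 1.0])
      src_vals = np.array([1.0, 2.0, 0.5])
      tgt_edges = np.linspace(0.0, 1.0, 6)
      v = project_piecewise_constant(src_edges, src_vals, tgt_edges)
      # conservation check: integrals over [0, 1] agree (both 1.25)
      print(np.dot(np.diff(src_edges), src_vals), np.dot(np.diff(tgt_edges), v))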

  1. Dosimetric comparison between conventional and conformal radiotherapy for carcinoma cervix: Are we treating the right volumes?

    PubMed Central

    Goswami, Jyotirup; Patra, Niladri B.; Sarkar, Biplab; Basu, Ayan; Pal, Santanu

    2013-01-01

    Background and Purpose: Conventional portals for external beam radiotherapy of cervical cancer, based on bony anatomy, have repeatedly been demonstrated to be inadequate. Conversely, with image-based conformal radiotherapy, better target coverage may be offset by the greater toxicities and poorer compliance associated with treating larger volumes. This study was meant to dosimetrically compare conformal and conventional radiotherapy. Materials and Methods: Five patients with carcinoma of the cervix underwent planning CT scans with IV contrast, and the targets and organs at risk (OAR) were contoured. Two sets of plans (conventional and conformal) were generated for each patient. Field sizes were recorded, and dose volume histograms of both sets of plans were generated and compared on the basis of target coverage and OAR sparing. Results: Target coverage was significantly improved with conformal plans, though the field sizes required were significantly larger. On the other hand, dose homogeneity was not significantly improved. Doses to the OARs (rectum, urinary bladder, and small bowel) were not significantly different across the two arms. Conclusion: Three-dimensional conformal radiotherapy gives significantly better target coverage, which may translate into better local control and survival. On the other hand, it also requires significantly larger field sizes, though doses to the OARs are not significantly increased. PMID:24455584

  2. Simultaneous Microwave Extraction and Separation of Volatile and Non-Volatile Organic Compounds of Boldo Leaves. From Lab to Industrial Scale

    PubMed Central

    Petigny, Loïc; Périno, Sandrine; Minuti, Matteo; Visinoni, Francesco; Wajsman, Joël; Chemat, Farid

    2014-01-01

    Microwave extraction and separation was used to increase the concentration of the extract compared to the conventional method at the same solid/liquid ratio, reducing extraction time while simultaneously separating Volatile Organic Compounds (VOC) from non-Volatile Organic Compounds (NVOC) of boldo leaves. As a preliminary study, a response surface method was used to optimize the extraction of soluble material and the separation of VOC from the plant at laboratory scale. The statistical analysis revealed the optimized conditions to be: microwave power 200 W, extraction time 56 min and a solid/liquid ratio of 7.5% of plant material in water. The lab-scale optimized microwave method was compared to conventional distillation, and requires a power/mass ratio of 0.4 W/g of water engaged. This power/mass ratio is kept in order to upscale from lab to pilot plant. PMID:24776762

  3. A fast and accurate frequency estimation algorithm for sinusoidal signal with harmonic components

    NASA Astrophysics Data System (ADS)

    Hu, Jinghua; Pan, Mengchun; Zeng, Zhidun; Hu, Jiafei; Chen, Dixiang; Tian, Wugang; Zhao, Jianqiang; Du, Qingfa

    2016-10-01

    Frequency estimation is a fundamental problem in many applications, such as traditional vibration measurement, power system supervision, and microelectromechanical system sensor control. In this paper, a fast and accurate frequency estimation algorithm is proposed to deal with the low efficiency of traditional methods. The proposed algorithm consists of coarse and fine frequency estimation steps, and we demonstrate that applying a modified zero-crossing technique is more efficient than conventional searching methods for achieving the coarse frequency estimate (locating the peak of the FFT amplitude spectrum). Thus, the proposed estimation algorithm requires fewer hardware and software resources and achieves even higher efficiency as the amount of experimental data increases. Experimental results with a modulated magnetic signal show that the root mean square error of frequency estimation is below 0.032 Hz with the proposed algorithm, which has lower computational complexity and better global performance than conventional frequency estimation methods.
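
    A rough sketch of the coarse-then-fine idea follows: a zero-crossing count gives a cheap coarse estimate, which a dense local search of the DTFT magnitude then refines. The plain zero-crossing rule, the local DTFT search and the function names are illustrative assumptions rather than the algorithm proposed in the paper, and the coarse step presumes the fundamental dominates the harmonics.

      import numpy as np

      def coarse_estimate(signal, fs):
          # Coarse frequency from zero crossings of the (assumed dominant) fundamental
          signs = np.sign(signal - np.mean(signal))
          crossings = np.where(np.diff(signs) != 0)[0]
          half_period = np.mean(np.diff(crossings)) / fs   # average half-period in seconds
          return 1.0 / (2.0 * half_period)

      def fine_estimate(signal, fs, f_coarse, half_width=2.0, n_grid=2001):
          # Fine step: dense evaluation of the DTFT magnitude around the coarse estimate
          n = np.arange(len(signal))
          freqs = np.linspace(max(f_coarse - half_width, 0.1), f_coarse + half_width, n_grid)
          mags = [abs(np.dot(signal, np.exp(-2j * np.pi * f * n / fs))) for f in freqs]
          return freqs[int(np.argmax(mags))]

      fs = 1000.0
      t = np.arange(2048) / fs
      x = np.sin(2 * np.pi * 50.3 * t) + 0.3 * np.sin(2 * np.pi * 150.9 * t)  # fundamental + 3rd harmonic
      print(fine_estimate(x, fs, coarse_estimate(x, fs)))                     # ~50.3 Hz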

  4. Accurate Modeling Method for Cu Interconnect

    NASA Astrophysics Data System (ADS)

    Yamada, Kenta; Kitahara, Hiroshi; Asai, Yoshihiko; Sakamoto, Hideo; Okada, Norio; Yasuda, Makoto; Oda, Noriaki; Sakurai, Michio; Hiroi, Masayuki; Takewaki, Toshiyuki; Ohnishi, Sadayuki; Iguchi, Manabu; Minda, Hiroyasu; Suzuki, Mieko

    This paper proposes an accurate modeling method for the copper interconnect cross-section in which the width and thickness dependence on layout patterns and density caused by processes (CMP, etching, sputtering, lithography, and so on) is fully incorporated and universally expressed. In addition, we have developed specific test patterns for extraction of the model parameters, and an efficient extraction flow. We have extracted the model parameters for 0.15 μm CMOS using this method and confirmed that the 10% τpd error normally observed with conventional LPE (Layout Parameters Extraction) was completely eliminated. Moreover, it is verified that the model can be applied to more advanced technologies (90 nm, 65 nm and 55 nm CMOS). Since the interconnect delay variations due to these processes constitute a significant part of what has conventionally been treated as random variation, use of the proposed model could enable one to greatly narrow the guardbands required to guarantee a desired yield, thereby facilitating design closure.

  5. 46 CFR 2.01-25 - International Convention for Safety of Life at Sea, 1974.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    § 2.01-25 International Convention for Safety of Life at Sea, 1974. (a) Certificates required. (1) The International Convention for Safety of Life at Sea, 1974, requires one or more of the following certificates to be carried on board...

  6. 46 CFR 2.01-25 - International Convention for Safety of Life at Sea, 1974.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 2.01-25 International Convention for Safety of Life at Sea, 1974. (a) Certificates required. (1) The International Convention for Safety of Life at Sea, 1974, requires one or more of the following certificates to be carried on board...

  7. Conventional and Advanced Separations in Mass Spectrometry-Based Metabolomics: Methodologies and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heyman, Heino M.; Zhang, Xing; Tang, Keqi

    2016-02-16

    Metabolomics is the quantitative analysis of all metabolites in a given sample. Due to the chemical complexity of the metabolome, optimal separations are required for comprehensive identification and quantification of sample constituents. This chapter provides an overview of both conventional and advanced separation methods in practice for reducing the complexity of metabolite extracts delivered to the mass spectrometer, and covers gas chromatography (GC), liquid chromatography (LC), capillary electrophoresis (CE), supercritical fluid chromatography (SFC) and ion mobility spectrometry (IMS) separation techniques coupled with mass spectrometry (MS), in both uni-dimensional and multi-dimensional approaches.

  8. High Aspect-Ratio Neural Probes using Conventional Blade Dicing

    NASA Astrophysics Data System (ADS)

    Goncalves, S. B.; Ribeiro, J. F.; Silva, A. F.; Correia, J. H.

    2016-10-01

    Exploring deep neural circuits has triggered the development of long penetrating neural probes. Moreover, driven by concerns about brain displacement, long neural probes also require a high aspect-ratio shaft design. In this paper, a simple and reproducible method of manufacturing long-shaft neural probes using blade dicing technology is presented. Results show shafts up to 8 mm long and 200 µm wide, dimensions competitive with the current state of the art, with the shaft outline accomplished by a single blade dicing program. Therefore, conventional blade dicing presents itself as a viable option for manufacturing long neural probes.

  9. Sensory descriptors, hedonic perception and consumer’s attitudes to Sangiovese red wine deriving from organically and conventionally grown grapes

    PubMed Central

    Pagliarini, Ella; Laureati, Monica; Gaeta, Davide

    2013-01-01

    In recent years, produce obtained from organic farming methods (i.e., a system that minimizes pollution and avoids the use of synthetic fertilizers and pesticides) has rapidly increased in developed countries. This may be explained by the fact that organic food meets the standard requirements for quality and healthiness. Among organic products, wine has greatly attracted the interest of the consumers. In the present study, trained assessors and regular wine consumers were respectively required to identify the sensory properties (e.g., odor, taste, flavor, and mouthfeel sensations) and to evaluate the hedonic dimension of red wines deriving from organically and conventionally grown grapes. Results showed differences related mainly to taste (sour and bitter) and mouthfeel (astringent) sensations, with odor and flavor playing a minor role. However, these differences did not influence liking, as organic and conventional wines were hedonically comparable. Interestingly, 61% of respondents would be willing to pay more for organically produced wines, which suggests that environmentally sustainable practices related to wine quality have good market prospects. PMID:24348447

  10. Sensory descriptors, hedonic perception and consumer's attitudes to Sangiovese red wine deriving from organically and conventionally grown grapes.

    PubMed

    Pagliarini, Ella; Laureati, Monica; Gaeta, Davide

    2013-01-01

    In recent years, produce obtained from organic farming methods (i.e., a system that minimizes pollution and avoids the use of synthetic fertilizers and pesticides) has rapidly increased in developed countries. This may be explained by the fact that organic food meets the standard requirements for quality and healthiness. Among organic products, wine has greatly attracted the interest of the consumers. In the present study, trained assessors and regular wine consumers were respectively required to identify the sensory properties (e.g., odor, taste, flavor, and mouthfeel sensations) and to evaluate the hedonic dimension of red wines deriving from organically and conventionally grown grapes. Results showed differences related mainly to taste (sour and bitter) and mouthfeel (astringent) sensations, with odor and flavor playing a minor role. However, these differences did not influence liking, as organic and conventional wines were hedonically comparable. Interestingly, 61% of respondents would be willing to pay more for organically produced wines, which suggests that environmentally sustainable practices related to wine quality have good market prospects.

  11. Theory and implementation of summarization: Improving sensor interpretation for spacecraft operations

    NASA Astrophysics Data System (ADS)

    Swartwout, Michael Alden

    New paradigms in space missions require radical changes in spacecraft operations. In the past, operations were insulated from competitive pressures of cost, quality and time by system infrastructures, technological limitations and historical precedent. However, modern demands now require that operations meet competitive performance goals. One target for improvement is the telemetry downlink, where significant resources are invested to acquire thousands of measurements for human interpretation. This cost-intensive method is used because conventional operations are not based on formal methodologies but on experiential reasoning and incrementally adapted procedures. Therefore, to improve the telemetry downlink it is first necessary to invent a rational framework for discussing operations. This research explores operations as a feedback control problem, develops the conceptual basis for the use of spacecraft telemetry, and presents a method to improve performance. The method is called summarization, a process to make vehicle data more useful to operators. Summarization enables rational trades for telemetry downlink by defining and quantitatively ranking these elements: all operational decisions, the knowledge needed to inform each decision, and all possible sensor mappings to acquire that knowledge. Summarization methods were implemented for the Sapphire microsatellite; conceptual health management and system models were developed and a degree-of-observability metric was defined. An automated tool was created to generate summarization methods from these models. Methods generated using a Sapphire model were compared against the conventional operations plan. Summarization was shown to identify the key decisions and isolate the most appropriate sensors. Secondly, a form of summarization called beacon monitoring was experimentally verified. Beacon monitoring automates the anomaly detection and notification tasks and migrates these responsibilities to the space segment. A set of experiments using Sapphire demonstrated significant cost and time savings compared to conventional operations. Summarization is based on rational concepts for defining and understanding operations. Therefore, it enables additional trade studies that were formerly not possible and also can form the basis for future detailed research into spacecraft operations.

  12. Reinforcing the role of the conventional C-arm - a novel method for simplified distal interlocking

    PubMed Central

    2012-01-01

    Background The common practice for insertion of distal locking screws of intramedullary nails is a freehand technique under fluoroscopic control. The process is technically demanding, time-consuming and afflicted to considerable radiation exposure of the patient and the surgical personnel. A new concept is introduced utilizing information from within conventional radiographic images to help accurately guide the surgeon to place the interlocking bolt into the interlocking hole. The newly developed technique was compared to conventional freehand in an operating room (OR) like setting on human cadaveric lower legs in terms of operating time and radiation exposure. Methods The proposed concept (guided freehand), generally based on the freehand gold standard, additionally guides the surgeon by means of visible landmarks projected into the C-arm image. A computer program plans the correct drilling trajectory by processing the lens-shaped hole projections of the interlocking holes from a single image. Holes can be drilled by visually aligning the drill to the planned trajectory. Besides a conventional C-arm, no additional tracking or navigation equipment is required. Ten fresh frozen human below-knee specimens were instrumented with an Expert Tibial Nail (Synthes GmbH, Switzerland). The implants were distally locked by performing the newly proposed technique as well as the conventional freehand technique on each specimen. An orthopedic resident surgeon inserted four distal screws per procedure. Operating time, number of images and radiation time were recorded and statistically compared between interlocking techniques using non-parametric tests. Results A 58% reduction in number of taken images per screw was found for the guided freehand technique (7.4 ± 3.4) (mean ± SD) compared to the freehand technique (17.6 ± 10.3) (p < 0.001). Total radiation time (all 4 screws) was 55% lower for the guided freehand technique compared to conventional freehand (p = 0.001). Operating time per screw (from first shot to screw tightened) was on average 22% reduced by guided freehand (p = 0.018). Conclusions In an experimental setting, the newly developed guided freehand technique for distal interlocking has proven to markedly reduce radiation exposure when compared to the conventional freehand technique. The method utilizes established clinical workflows and does not require cost intensive add-on devices or extensive training. The underlying principle carries potential to assist implant positioning in numerous other applications within orthopedics and trauma from screw insertions to placement of plates, nails or prostheses. PMID:22276698

  13. Catalyzed Reporter Deposition-Fluorescence In Situ Hybridization Allows for Enrichment-Independent Detection of Microcolony-Forming Soil Bacteria

    PubMed Central

    Ferrari, Belinda C.; Tujula, Niina; Stoner, Kate; Kjelleberg, Staffan

    2006-01-01

    Advances in the growth of hitherto unculturable soil bacteria have emphasized the requirement for rapid bacterial identification methods. Due to the slow-growing strategy of microcolony-forming soil bacteria, successful fluorescence in situ hybridization (FISH) requires an rRNA enrichment step for visualization. In this study, catalyzed reporter deposition (CARD)-FISH was employed as an alternative method to rRNA enhancement and was found to be superior to conventional FISH for the detection of microcolonies that are cultivated by using the soil substrate membrane system. CARD-FISH enabled real-time identification of oligophilic microcolony-forming soil bacteria without the requirement for enrichment on complex media and the associated shifts in community composition. PMID:16391135

  14. A novel test cage with an air ventilation system as an alternative to conventional cages for the efficacy testing of mosquito repellents.

    PubMed

    Obermayr, U; Rose, A; Geier, M

    2010-11-01

    We have developed a novel test cage and improved method for the evaluation of mosquito repellents. The method is compatible with the United States Environmental Protection Agency, 2000 draft OPPTS 810.3700 Product Performance Test Guidelines for Testing of Insect Repellents. The Biogents cages (BG-cages) require fewer test mosquitoes than conventional cages and are more comfortable for the human volunteers. The novel cage allows a section of treated forearm from a volunteer to be exposed to mosquito probing through a window. This design minimizes residual contamination of cage surfaces with repellent. In addition, an air ventilation system supplies conditioned air to the cages after each single test, to flush out and prevent any accumulation of test substances. During biting activity tests, the untreated skin surface does not receive bites because of a screen placed 150 mm above the skin. Compared with the OPPTS 810.3700 method, the BG-cage is smaller (27 liters, compared with 56 liters) and contains 30 rather than hundreds of blood-hungry female mosquitoes. We compared the performance of a proprietary repellent formulation containing 20% KBR3023 with four volunteers on Aedes aegypti (L.) (Diptera: Culicidae) in BG- and conventional cages. Repellent protection time was shorter in tests conducted with conventional cages. The average 95% protection time was 4.5 +/- 0.4 h in conventional cages and 7.5 +/- 0.6 h in the novel BG-cages. The protection times measured in BG-cages were more similar to the protection times determined with these repellents in field tests.

  15. Oppugning the assumptions of spatial averaging of segment and joint orientations.

    PubMed

    Pierrynowski, Michael Raymond; Ball, Kevin Arthur

    2009-02-09

    Movement scientists frequently calculate "arithmetic averages" when examining body segment or joint orientations. Such calculations appear routinely, yet are fundamentally flawed. Three-dimensional orientation data are computed as matrices, yet three-ordered Euler/Cardan/Bryant angle parameters are frequently used for interpretation. These parameters are not geometrically independent; thus, the conventional process of averaging each parameter is incorrect. The process of arithmetic averaging also assumes that the distances between data are linear (Euclidean); however, for orientation data these distances are geodesically curved (Riemannian). Therefore we question (oppugn) whether use of the conventional averaging approach is an appropriate statistic. Fortunately, exact methods of averaging orientation data have been developed which both circumvent the parameterization issue and explicitly acknowledge the Euclidean or Riemannian distance measures. The details of these matrix-based averaging methods are presented and their theoretical advantages discussed. The Euclidean and Riemannian approaches offer appealing advantages over the conventional technique. With respect to practical biomechanical relevancy, examinations of simulated data suggest that for sets of orientation data possessing characteristics of low dispersion, an isotropic distribution, and second and third angle parameters below 30 degrees, discrepancies with the conventional approach are less than 1.1 degrees. However, beyond these limits, arithmetic averaging can have substantive non-linear inaccuracies in all three parameterized angles. The biomechanics community is encouraged to recognize that limitations exist with the use of the conventional method of averaging orientations. Investigations requiring more robust spatial averaging over a broader range of orientations may benefit from the use of matrix-based Euclidean or Riemannian calculations.
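
    To give a flavour of the matrix-based alternative, the sketch below computes the Euclidean (chordal) mean of a set of rotation matrices by projecting their arithmetic mean back onto SO(3) with an SVD, next to the naive Euler-angle average it is meant to replace. This is only one of the matrix-based estimators; the Riemannian (geodesic) mean would instead be computed iteratively with matrix logarithms and exponentials, and the helper names here are illustrative.

      import numpy as np
      from scipy.spatial.transform import Rotation as R

      def chordal_mean(rotations):
          # Euclidean (chordal) mean: arithmetic mean of the matrices projected back
          # onto SO(3) via the SVD (closest rotation in the Frobenius sense)
          M = np.mean(np.stack(rotations), axis=0)
          U, _, Vt = np.linalg.svd(M)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
          return U @ D @ Vt

      def euler_mean(rotations, seq="xyz"):
          # The conventional (flawed) approach: average each Euler angle independently
          angles = np.array([R.from_matrix(Q).as_euler(seq, degrees=True) for Q in rotations])
          return R.from_euler(seq, angles.mean(axis=0), degrees=True).as_matrix()

      rots = [R.from_euler("xyz", [10, 20, 30 + d], degrees=True).as_matrix() for d in (-8, 0, 8)]
      print(R.from_matrix(chordal_mean(rots)).as_euler("xyz", degrees=True))
      print(R.from_matrix(euler_mean(rots)).as_euler("xyz", degrees=True))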

  16. Characterization of Organic and Conventional Coffee Using Neutron Activation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. A. De Nadai Fernandes; P. Bode; F. S. Tagliaferro

    2000-11-12

    Countries importing organic coffee are facing the difficulty of assessing the quality of the product to distinguish original organic coffee from other coffees, thereby eliminating possible fraud. Many analytical methods are matrix sensitive and require matrix-matching reference materials for validation, which are currently nonexistent. This work aims to establish the trace element characterization of organic and conventional Brazilian coffees and to establish correlations with the related soil and the type of fertilizer and agrochemicals applied. It was observed that the variability in element concentrations between the various types of coffee is not so large, which emphasizes the need for analytical methods of high accuracy, reproducibility, and a well-known uncertainty. Moreover, the analyses indicate that sometimes the coffee packages may contain some soil remnants.

  17. Applicability of a Conservative Margin Approach for Assessing NDE Flaw Detectability

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2007-01-01

    Nondestructive Evaluation (NDE) procedures are required to detect flaws in structures with a high percentage detectability and high confidence. Conventional Probability of Detection (POD) methods are statistical in nature and require detection data from a relatively large number of flaw specimens. In many circumstances, due to the high cost and long lead time, it is impractical to build the large set of flaw specimens that is required by the conventional POD methodology. Therefore, in such situations it is desirable to have a flaw detectability estimation approach that allows for a reduced number of flaw specimens but provides a high degree of confidence in establishing the flaw detectability size. This paper presents an alternative approach called the conservative margin approach (CMA). To investigate the applicability of the CMA approach, flaw detectability sizes determined by the CMA and POD approaches have been compared on actual datasets. The results of these comparisons are presented and the applicability of the CMA approach is discussed.

  18. Monitoring Poisson's Ratio Degradation of FRP Composites under Fatigue Loading Using Biaxially Embedded FBG Sensors.

    PubMed

    Akay, Erdem; Yilmaz, Cagatay; Kocaman, Esat S; Turkmen, Halit S; Yildiz, Mehmet

    2016-09-19

    The significance of strain measurement is obvious for the analysis of Fiber-Reinforced Polymer (FRP) composites. Conventional strain measurement methods are sufficient for static testing in general. Nevertheless, if the requirements exceed the capabilities of these conventional methods, more sophisticated techniques are necessary to obtain strain data. Fiber Bragg Grating (FBG) sensors have many advantages for strain measurement over conventional ones. Thus, the present paper suggests a novel method for biaxial strain measurement using embedded FBG sensors during the fatigue testing of FRP composites. Poisson's ratio and its reduction were monitored for each cyclic loading by using embedded FBG sensors for a given specimen and correlated with the fatigue stages determined based on the variations of the applied fatigue loading and temperature due to the autogenous heating to predict an oncoming failure of the continuous fiber-reinforced epoxy matrix composite specimens under fatigue loading. The results show that FBG sensor technology has a remarkable potential for monitoring the evolution of Poisson's ratio on a cycle-by-cycle basis, which can reliably be used towards tracking the fatigue stages of composite for structural health monitoring purposes.
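
    A minimal sketch of how a cycle-by-cycle Poisson's ratio can be read from two orthogonally embedded gratings is given below: Bragg wavelength shifts are converted to strains with the usual (1 - p_e) factor and their negative ratio is taken. The photo-elastic coefficient, the example wavelengths and the neglect of temperature compensation are assumptions for illustration; the processing of the actual fatigue data in the study is more involved.

      P_E = 0.22  # effective photo-elastic coefficient of silica fibre (typical value; an assumption here)

      def strain_from_fbg(wl, wl0):
          # Axial strain seen by a grating from its Bragg wavelength shift
          # (temperature cross-sensitivity is ignored in this sketch)
          return (wl - wl0) / (wl0 * (1.0 - P_E))

      def poisson_ratio(wl_axial, wl0_axial, wl_trans, wl0_trans):
          eps_a = strain_from_fbg(wl_axial, wl0_axial)    # grating aligned with the load
          eps_t = strain_from_fbg(wl_trans, wl0_trans)    # grating transverse to the load
          return -eps_t / eps_a

      # illustrative per-cycle peak wavelengths (nm) from the two embedded gratings
      print(poisson_ratio(1552.10, 1550.00, 1547.37, 1548.00))   # ~0.30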

  19. Monitoring Poisson’s Ratio Degradation of FRP Composites under Fatigue Loading Using Biaxially Embedded FBG Sensors

    PubMed Central

    Akay, Erdem; Yilmaz, Cagatay; Kocaman, Esat S.; Turkmen, Halit S.; Yildiz, Mehmet

    2016-01-01

    The significance of strain measurement is obvious for the analysis of Fiber-Reinforced Polymer (FRP) composites. Conventional strain measurement methods are sufficient for static testing in general. Nevertheless, if the requirements exceed the capabilities of these conventional methods, more sophisticated techniques are necessary to obtain strain data. Fiber Bragg Grating (FBG) sensors have many advantages for strain measurement over conventional ones. Thus, the present paper suggests a novel method for biaxial strain measurement using embedded FBG sensors during the fatigue testing of FRP composites. Poisson’s ratio and its reduction were monitored for each cyclic loading by using embedded FBG sensors for a given specimen and correlated with the fatigue stages determined based on the variations of the applied fatigue loading and temperature due to the autogenous heating to predict an oncoming failure of the continuous fiber-reinforced epoxy matrix composite specimens under fatigue loading. The results show that FBG sensor technology has a remarkable potential for monitoring the evolution of Poisson’s ratio on a cycle-by-cycle basis, which can reliably be used towards tracking the fatigue stages of composite for structural health monitoring purposes. PMID:28773901

  20. Massively parallel whole genome amplification for single-cell sequencing using droplet microfluidics.

    PubMed

    Hosokawa, Masahito; Nishikawa, Yohei; Kogawa, Masato; Takeyama, Haruko

    2017-07-12

    Massively parallel single-cell genome sequencing is required to further understand genetic diversities in complex biological systems. Whole genome amplification (WGA) is the first step for single-cell sequencing, but its throughput and accuracy are insufficient in conventional reaction platforms. Here, we introduce single droplet multiple displacement amplification (sd-MDA), a method that enables massively parallel amplification of single cell genomes while maintaining sequence accuracy and specificity. Tens of thousands of single cells are compartmentalized in millions of picoliter droplets and then subjected to lysis and WGA by passive droplet fusion in microfluidic channels. Because single cells are isolated in compartments, their genomes are amplified to saturation without contamination. This enables the high-throughput acquisition of contamination-free and cell-specific sequence reads from single cells (21,000 single cells/h), resulting in enhancement of the sequence data quality compared to conventional methods. This method allowed WGA of both single bacterial cells and human cancer cells. The obtained sequencing coverage rivals that of conventional techniques with superior sequence quality. In addition, we also demonstrate de novo assembly of uncultured soil bacteria and obtain draft genomes from single cell sequencing. This sd-MDA is promising for flexible and scalable use in single-cell sequencing.

  1. Greener Techniques for the Synthesis of Silver Nanoparticles Using Plant Extracts, Enzymes, Bacteria, Biodegradable Polymers, and Microwaves

    EPA Science Inventory

    The use of silver nanoparticles (AgNPs) is gaining in popularity due to silver’s antibacterial properties. Conventional methods for AgNP synthesis require dangerous chemicals and large quantities of energy (heat) and can result in formation of hazardous by-products. This article ...

  2. 50 CFR 300.104 - Scientific research.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to be used. (vi) Survey design and methods of data analyses. (vii) Data to be collected. (3) A... for Finfish Surveys in the Convention Area when the Total Catch is Expected to be More Than 50 Tons to... Administrator. (2) The format requires: (i) The name of the CCAMLR Member. (ii) Survey details. (iii...

  3. 50 CFR 300.104 - Scientific research.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to be used. (vi) Survey design and methods of data analyses. (vii) Data to be collected. (3) A... for Finfish Surveys in the Convention Area when the Total Catch is Expected to be More Than 50 Tons to... Administrator. (2) The format requires: (i) The name of the CCAMLR Member. (ii) Survey details. (iii...

  4. 50 CFR 300.104 - Scientific research.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to be used. (vi) Survey design and methods of data analyses. (vii) Data to be collected. (3) A... for Finfish Surveys in the Convention Area when the Total Catch is Expected to be More Than 50 Tons to... Administrator. (2) The format requires: (i) The name of the CCAMLR Member. (ii) Survey details. (iii...

  5. 50 CFR 300.104 - Scientific research.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to be used. (vi) Survey design and methods of data analyses. (vii) Data to be collected. (3) A... for Finfish Surveys in the Convention Area when the Total Catch is Expected to be More Than 50 Tons to... Administrator. (2) The format requires: (i) The name of the CCAMLR Member. (ii) Survey details. (iii...

  6. 50 CFR 300.104 - Scientific research.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to be used. (vi) Survey design and methods of data analyses. (vii) Data to be collected. (3) A... for Finfish Surveys in the Convention Area when the Total Catch is Expected to be More Than 50 Tons to... Administrator. (2) The format requires: (i) The name of the CCAMLR Member. (ii) Survey details. (iii...

  7. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part I. Model development

    USDA-ARS?s Scientific Manuscript database

    Infrared (IR) dry-peeling has emerged as an effective non-chemical alternative to conventional lye and steam methods of peeling tomatoes. Successful peel separation induced by IR radiation requires the delivery of a sufficient amount of thermal energy onto tomato surface in a very short duration. Th...

  8. CACTUS: Command and Control Training Using Knowledge-Based Simulations

    ERIC Educational Resources Information Center

    Hartley, Roger; Ravenscroft, Andrew; Williams, R. J.

    2008-01-01

    The CACTUS project was concerned with command and control training of large incidents where public order may be at risk, such as large demonstrations and marches. The training requirements and objectives of the project are first summarized justifying the use of knowledge-based computer methods to support and extend conventional training…

  9. Detection limits and cost comparisons of human- and gull-associated conventional and quantitative PCR assays in artificial and environmental waters

    EPA Science Inventory

    Modern techniques for tracking fecal pollution in environmental waters require investing in DNA-based methods to determine the presence of specific fecal sources. To help water quality managers decide whether to employ routine polymerase chain reaction (PCR) or quantitative PC...

  10. An Alternative Educational Method in Early Childhood: Museum Education

    ERIC Educational Resources Information Center

    Akamca, Güzin Özyilmaz; Yildirim, R. Gunseli; Ellez, A. Murat

    2017-01-01

    According to the preschool education program that was brought into effect by the Turkish Ministry of Education in Turkey in 2013, teaching should be offered not only in classrooms but also in places outside classrooms likely to boost learning. The program required utilizing learning techniques and environments different from conventional ones. The aim of…

  11. Rapid protein concentration, efficient fluorescence labeling and purification on a micro/nanofluidics chip.

    PubMed

    Wang, Chen; Ouyang, Jun; Ye, De-Kai; Xu, Jing-Juan; Chen, Hong-Yuan; Xia, Xing-Hua

    2012-08-07

    Fluorescence analysis has proved to be a powerful detection technique for achieving single molecule analysis. However, it usually requires the labeling of targets with bright fluorescent tags since most chemicals and biomolecules lack fluorescence. Conventional fluorescence labeling methods require a considerable quantity of biomolecule samples, long reaction times and extensive chromatographic purification procedures. Herein, a micro/nanofluidics device integrating a nanochannel in a microfluidics chip has been designed and fabricated, which achieves rapid protein concentration, fluorescence labeling, and efficient purification of the product in a miniaturized and continuous manner. As a demonstration, labeling of the proteins bovine serum albumin (BSA) and IgG with fluorescein isothiocyanate (FITC) is presented. Compared to conventional methods, the present micro/nanofluidics device performs BSA labeling about 10^4-10^6 times faster with 1.6 times higher yields due to the efficient nanoconfinement effect and improved mass and heat transfer in the chip device. The results demonstrate that the present micro/nanofluidics device promises rapid and facile fluorescence labeling of small amounts of reagents such as proteins, nucleic acids and other biomolecules with high efficiency.

  12. A Non-Intrusive Algorithm for Sensitivity Analysis of Chaotic Flow Simulations

    NASA Technical Reports Server (NTRS)

    Blonigan, Patrick J.; Wang, Qiqi; Nielsen, Eric J.; Diskin, Boris

    2017-01-01

    We demonstrate a novel algorithm for computing the sensitivity of statistics in chaotic flow simulations to parameter perturbations. The algorithm is non-intrusive but requires exposing an interface. Based on the principle of shadowing in dynamical systems, this algorithm is designed to reduce the effect of the sampling error in computing sensitivity of statistics in chaotic simulations. We compare the effectiveness of this method to that of the conventional finite difference method.

  13. Continuous Dynamic Simulation of Nonlinear Aerodynamics/Nonlinear Structure Interaction (NANSI) for Morphing Vehicles

    DTIC Science & Technology

    2010-03-31

    Presented at the AFRL-organized Aeroelastic Workshop (Sedona, October 2008) and at the AVT-168 Symposium on Morphing Vehicles (Lisbon, Portugal, April 2009). Conventional deforming-grid methods will fail at some point when the geometry change becomes large, no matter how good the method. The ballute aeroelastic problem of Martian entry, characterized by the Knudsen number as a gas-kinetic parameter, requires ...

  14. Visualizing Similarity of Appearance by Arrangement of Cards

    PubMed Central

    Nakatsuji, Nao; Ihara, Hisayasu; Seno, Takeharu; Ito, Hiroshi

    2016-01-01

    This study proposes a novel method to extract the configuration of the psychological space by directly measuring subjects' similarity ratings without computational work. Although multidimensional scaling (MDS) is well known as a conventional method for extracting the psychological space, that method requires many pairwise evaluations, and in MDS the time taken for the evaluations increases in proportion to the square of the number of objects. The proposed method asks subjects to arrange cards on a poster sheet according to the degree of similarity of the objects. To compare the performance of the proposed method with the conventional one, we developed similarity maps of typefaces through the proposed method and through non-metric MDS. We calculated the trace correlation coefficient among all combinations of the configurations for both methods to evaluate the degree of similarity in the obtained configurations. The threshold value of the trace correlation coefficient for statistically discriminating similar configurations was decided based on random data. The proportion of trace correlation coefficients exceeding the threshold value was 62.0%, indicating that the configurations of the typefaces obtained by the proposed method closely resembled those obtained by non-metric MDS. The required duration for the proposed method was approximately one third of that of non-metric MDS. In addition, all distances between objects in all the data for both methods were calculated. The frequency of short distances in the proposed method was lower than that of non-metric MDS, suggesting that relatively small differences among objects tend to be emphasized in the configurations produced by the proposed method. The card arrangement method we propose here thus serves as an easier and time-saving tool to obtain psychological structures in fields related to similarity of appearance. PMID:27242611
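
    To make the comparison concrete, the sketch below builds one configuration directly from hypothetical card coordinates, recovers a second from synthetic dissimilarity ratings with non-metric MDS, and compares the two. The data are invented, and Procrustes disparity is used as a simpler stand-in for the trace correlation coefficient employed in the study.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from scipy.spatial import procrustes
      from sklearn.manifold import MDS

      rng = np.random.default_rng(0)
      card_xy = rng.uniform(0, 100, size=(8, 2))                 # hypothetical card positions on the sheet
      # synthetic pairwise dissimilarity ratings, loosely consistent with the arrangement
      rated = squareform(np.abs(pdist(card_xy) + rng.normal(0, 5, size=28)))

      # configuration implied directly by the card arrangement
      config_cards = card_xy

      # configuration recovered by non-metric MDS from the dissimilarity ratings
      nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
      config_mds = nmds.fit_transform(rated)

      # Procrustes disparity as a simple stand-in for the trace correlation used in the study:
      # 0 would mean identical configurations up to translation, rotation and scaling
      _, _, disparity = procrustes(config_cards, config_mds)
      print(f"Procrustes disparity: {disparity:.3f}")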

  15. Comparing conventional Descriptive Analysis and Napping®-UFP against physiochemical measurements: a case study using apples.

    PubMed

    Pickup, William; Bremer, Phil; Peng, Mei

    2018-03-01

    The extensive time and cost associated with conventional sensory profiling methods have spurred sensory researchers to develop rapid alternatives, such as Napping® with Ultra-Flash Profiling (UFP). Napping®-UFP generates sensory maps by requiring untrained panellists to separate samples based on perceived sensory similarities. Evaluations of this method have been restricted to manufactured/formulated food models, and predominantly structured on comparisons against the conventional descriptive method. The present study aims to extend the validation of Napping®-UFP (N = 72) to natural biological products, and to evaluate this method against Descriptive Analysis (DA; N = 8) with physiochemical measurements as an additional evaluative criterion. The results revealed that sample configurations generated by DA and Napping®-UFP were not significantly correlated (RV = 0.425, P = 0.077); however, both were correlated with the product map generated from the instrumental measures (P < 0.05). The findings also indicated that sample characterisations from DA and Napping®-UFP were driven by different sensory attributes, suggesting potential structural differences between the two methods in configuring samples. Overall, these findings lend support for the extended use of Napping®-UFP for evaluations of natural biological products. Although DA was shown to be a better method for establishing sensory-instrumental relationships, Napping®-UFP exhibited strengths in generating informative sample configurations based on holistic perception of products. © 2017 Society of Chemical Industry.

  16. DNA-conjugated gold nanoparticles based colorimetric assay to assess helicase activity: a novel route to screen potential helicase inhibitors

    NASA Astrophysics Data System (ADS)

    Deka, Jashmini; Mojumdar, Aditya; Parisse, Pietro; Onesti, Silvia; Casalis, Loredana

    2017-03-01

    Helicases are essential enzymes that are widespread in all life forms. Due to their central role in nucleic acid metabolism, they are emerging as important targets for anti-viral, antibacterial and anti-cancer drugs. The development of easy, cheap, fast and robust biochemical assays to measure helicase activity, overcoming the limitations of the current methods, is a prerequisite for the discovery of helicase inhibitors through high-throughput screenings. We have developed a method which exploits the optical properties of DNA-conjugated gold nanoparticles (AuNP) and meets the required criteria. The method was tested with the catalytic domain of the human RecQ4 helicase and compared with a conventional FRET-based assay. The AuNP-based assay produced similar results but is simpler, more robust and cheaper than FRET. Therefore, our nanotechnology-based platform shows the potential to provide a useful alternative to the existing conventional methods for following helicase activity and to screen small-molecule libraries as potential helicase inhibitors.

  17. Combination of ray-tracing and the method of moments for electromagnetic radiation analysis using reduced meshes

    NASA Astrophysics Data System (ADS)

    Delgado, Carlos; Cátedra, Manuel Felipe

    2018-05-01

    This work presents a technique that allows a very noticeable relaxation of the computational requirements for full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results, requiring a fraction of the resources that the conventional analysis would utilize.
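
    A toy version of the mesh-reduction step is sketched below: given critical points identified by a ray-tracing pass, only elements whose centroids lie within a chosen radius of some critical point are retained for the Method of Moments solve. The radius-based selection and the function signature are simplifying assumptions; the technique described in the paper also accounts for the electrical path followed from the source.

      import numpy as np

      def reduce_mesh(centroids, critical_points, radius):
          # centroids: (N, 3) element centroids of the full mesh
          # critical_points: (M, 3) points flagged by the ray-tracing pass
          # keep an element if it lies within `radius` of any critical point
          d = np.linalg.norm(centroids[:, None, :] - critical_points[None, :, :], axis=-1)
          return np.nonzero(d.min(axis=1) <= radius)[0]

      full = np.random.rand(1000, 3)          # stand-in element centroids
      hits = np.random.rand(5, 3)             # stand-in critical points
      kept = reduce_mesh(full, hits, radius=0.1)
      print(f"reduced mesh keeps {kept.size} of {full.shape[0]} elements")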

  18. Development of an image operation system with a motion sensor in dental radiology.

    PubMed

    Sato, Mitsuru; Ogura, Toshihiro; Yasumoto, Yoshiaki; Kadowaki, Yuta; Hayashi, Norio; Doi, Kunio

    2015-07-01

    During examinations and/or treatment, a dentist in the examination room needs to view images on a proper display system. However, dentists cannot operate the image display system by hand, because they always wear gloves to keep their hands away from unsanitized materials. Therefore, we developed a new image operating system that uses a motion sensor. We used the Leap Motion sensor to read the hand movements of a dentist. We programmed the system using C++ to enable various operations of the display system, i.e., click, double click, drag, and drop. Thus, dentists with their gloves on in the examination room can control dental and panoramic images on the image display system intuitively and quickly with movements of their hands only. We investigated the time required with the conventional method using a mouse and with the new method using finger operation. The average operation time with the finger method was significantly shorter than that with the mouse method. This motion sensor method, with appropriate training for finger movements, can provide better operating performance than the conventional mouse method.

  19. [Ultrahigh dose-rate, "flash" irradiation minimizes the side-effects of radiotherapy].

    PubMed

    Favaudon, V; Fouillade, C; Vozenin, M-C

    2015-10-01

    Pencil beam scanning and filter free techniques may involve dose-rates considerably higher than those used in conventional external-beam radiotherapy. Our purpose was to investigate normal tissue and tumour responses in vivo to short pulses of radiation. C57BL/6J mice were exposed to bilateral thorax irradiation using pulsed (at least 40 Gy/s, flash) or conventional dose-rate irradiation (0.03 Gy/s or less) in a single dose. Immunohistochemical and histological methods were used to compare early radio-induced apoptosis and the development of lung fibrosis in the two situations. The response of two human (HBCx-12A, HEp-2) tumour xenografts in nude mice and one syngeneic, orthotopic lung carcinoma in C57BL/6J mice (TC-1 Luc+) was monitored in both radiation modes. A 17 Gy conventional irradiation induced pulmonary fibrosis and activation of the TGF-beta cascade in 100% of the animals 24-36 weeks post-treatment, as expected, whereas no animal developed complications below 23 Gy flash irradiation, and a 30 Gy flash irradiation was required to induce the same extent of fibrosis as 17 Gy conventional irradiation. Cutaneous lesions were also reduced in severity. Flash irradiation protected vascular and bronchial smooth muscle cells as well as epithelial cells of bronchi against acute apoptosis as shown by analysis of caspase-3 activation and TUNEL staining. In contrast, the antitumour effectiveness of flash irradiation was maintained and not different from that of conventional irradiation. Flash irradiation shifted the threshold dose required to initiate lung fibrosis by a large factor without loss of antitumour efficiency, suggesting that the method might be used to advantage to minimize the complications of radiotherapy. Copyright © 2015 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  20. Development of Al2O3 electrospun fibers prepared by conventional sintering method or plasma assisted surface calcination

    NASA Astrophysics Data System (ADS)

    Mudra, E.; Streckova, M.; Pavlinak, D.; Medvecka, V.; Kovacik, D.; Kovalcikova, A.; Zubko, P.; Girman, V.; Dankova, Z.; Koval, V.; Duzsa, J.

    2017-09-01

    In this paper, the electrospinning method was used for preparation of α-Al2O3 microfibers from a PAN/Al(NO3)3 precursor solution. The precursor fibers were thermally treated either by a conventional method in a furnace or by a low-temperature plasma-induced surface sintering method in ambient air. Four different temperatures for the PAN/Al(NO3)3 precursors were chosen for formation of the α-Al2O3 phase by the conventional sintering route, according to the transition features observed in the TG/DSC analysis. For comparison, low-temperature plasma treatment at atmospheric pressure was used as an alternative sintering method at exposure times of 5, 10 and 30 min. FTIR analysis was used to evaluate the residual polymer after plasma-induced calcination and to study the mechanism of polymer degradation. In the conventionally sintered samples, polycrystalline alumina fibers built up of arranged nanoparticles were created continuously throughout the whole volume of the sample. On the other hand, the low-temperature approach, with its high density of reactive species and the high power density of the plasma generated at atmospheric pressure by the plasma source used, allowed rapid removal of the polymer preferentially from the surface of the fibers, leading to the formation of composite ceramic/polymer fibers. This plasma-induced sintering of PAN/Al(NO3)3 can be of obvious importance in industrial applications where a ceramic surface character combined with higher fiber toughness is required.

  1. Cold Pad-Batch dyeing method for cotton fabric dyeing with reactive dyes using ultrasonic energy.

    PubMed

    Khatri, Zeeshan; Memon, Muhammad Hanif; Khatri, Awais; Tanwari, Anwaruddin

    2011-11-01

    Reactive dyes are vastly used in the dyeing and printing of cotton fibre. These dyes have a distinctive reactive nature due to active groups which form covalent bonds with the -OH groups of cotton through substitution and/or addition mechanisms. Among the many methods used for dyeing cotton with reactive dyes, the Cold Pad Batch (CPB) method is relatively more environmentally friendly due to high dye fixation and no requirement for thermal energy. However, the dyed fabric production rate is low due to the requirement of at least twelve hours of batching time for dye fixation. The proposed CPB method for dyeing cotton involves ultrasonic energy, resulting in a one-third decrease in batching time. The dyeing of cotton fibre was carried out with CI Reactive Red 195 and CI Reactive Black 5 by the conventional and the ultrasonic (US) method. The study showed that the use of ultrasonic energy not only shortens the batching time but also allows the alkali concentrations to be considerably reduced. In this case, the colour strength (K/S) and dye fixation (%F) are also enhanced without any adverse effect on the colour fastness of the dyed fabric. Examination of the dyed fibre surface using scanning electron microscopy (SEM) showed relative straightening of fibre convolutions and significant swelling of the fibre upon ultrasonic application. The total colour difference values ΔE (CMC) for the proposed method were found to be in close proximity to those of the conventionally dyed sample. Copyright © 2011 Elsevier B.V. All rights reserved.
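
    The colour strength quoted above is conventionally obtained from reflectance via the Kubelka-Munk function; a small sketch follows, with purely illustrative reflectance values rather than measurements from the study.

      def color_strength(R):
          # Kubelka-Munk function: K/S computed from the reflectance R (0 < R <= 1)
          # of the dyed fabric at the wavelength of maximum absorption
          return (1.0 - R) ** 2 / (2.0 * R)

      # purely illustrative reflectance values, not measurements from the study
      for label, R in [("ultrasonic", 0.052), ("conventional", 0.061)]:
          print(f"{label}: K/S = {color_strength(R):.2f}")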

  2. Real-time fluorescence loop mediated isothermal amplification for the diagnosis of malaria.

    PubMed

    Lucchi, Naomi W; Demas, Allison; Narayanan, Jothikumar; Sumari, Deborah; Kabanywanyi, Abdunoor; Kachur, S Patrick; Barnwell, John W; Udhayakumar, Venkatachalam

    2010-10-29

    Molecular diagnostic methods can complement existing tools to improve the diagnosis of malaria. However, they require good laboratory infrastructure thereby restricting their use to reference laboratories and research studies. Therefore, adopting molecular tools for routine use in malaria endemic countries will require simpler molecular platforms. The recently developed loop-mediated isothermal amplification (LAMP) method is relatively simple and can be improved for better use in endemic countries. In this study, we attempted to improve this method for malaria diagnosis by using a simple and portable device capable of performing both the amplification and detection (by fluorescence) of LAMP in one platform. We refer to this as the RealAmp method. Published genus-specific primers were used to test the utility of this method. DNA derived from different species of malaria parasites was used for the initial characterization. Clinical samples of P. falciparum were used to determine the sensitivity and specificity of this system compared to microscopy and a nested PCR method. Additionally, directly boiled parasite preparations were compared with a conventional DNA isolation method. The RealAmp method was found to be simple and allowed real-time detection of DNA amplification. The time to amplification varied but was generally less than 60 minutes. All human-infecting Plasmodium species were detected. The sensitivity and specificity of RealAmp in detecting P. falciparum was 96.7% and 91.7% respectively, compared to microscopy and 98.9% and 100% respectively, compared to a standard nested PCR method. In addition, this method consistently detected P. falciparum from directly boiled blood samples. This RealAmp method has great potential as a field usable molecular tool for diagnosis of malaria. This tool can provide an alternative to conventional PCR based diagnostic methods for field use in clinical and operational programs.

  3. A simple and rapid microwave-assisted hematoxylin and eosin staining method using 1,1,1 trichloroethane as a dewaxing and a clearing agent.

    PubMed

    Temel, S G; Noyan, S; Cavusoglu, I; Kahveci, Z

    2005-01-01

    The use and practicability of microwave-assisted staining procedures in routine histopathology has been well established for more than 17 years. In the study reported here, we aimed to examine an alternative approach that would shorten the duration of dewaxing and clearing steps of hematoxylin and eosin (H & E) staining of paraffin sections by using a microwave oven. Although xylene is one of the most popular dewaxing and clearing agents, its flammability restricts its use in a microwave oven; thus we preferred 1,1,1 trichloroethane, which is not flammable, as the dewaxing and clearing agent in the present study. In Group I and Group II (control groups), intestine was processed with xylene and 1,1,1 trichloroethane, respectively. The sections were then stained with H & E according to the conventional staining protocol at room temperature and subdivided into two groups according to the duration of dewaxing and clearing in xylene. In Groups III and IV (experimental groups) similar tissues were processed with xylene and 1,1,1 trichloroethane, respectively; however, sections from these groups were divided into four subgroups to study the period required for dewaxing and clearing in 1,1,1 trichloroethane, then stained with H & E in the microwave oven at 360 W for 30 sec. Our conventional H & E staining procedure, which includes dewaxing, staining and clearing of sections, requires approximately 90 min, while our method using 1,1,1 trichloroethane and microwave heating required only 2 min. Our alternative method for H & E staining not only reduced the procedure time significantly, but also yielded staining quality equal or superior to those stained the conventional way. Our results suggest that 1,1,1 trichloroethane can be used effectively and safely as a dewaxing and clearing agent for H & E staining in a microwave oven.

  4. In vitro starch digestibility and expected glycemic index of pound cakes baked in two-cycle microwave-toaster and conventional oven.

    PubMed

    García-zaragoza, Francisco J; Sánchez-Pardo, María E; Ortiz-Moreno, Alicia; Bello-Pérez, Luis A

    2010-11-01

    Bread baking technology has an important effect on starch digestibility measured as its predicted glycemic index tested in vitro. The aim of this work was to evaluate the changes in predicted glycemic index of pound cake baked in a two-cycle microwave toaster and a conventional oven. The glycemic index was calculated from hydrolysis index values by the Granfeldt method. Non-significant differences (P > 0.05) were found in hydrolysis index (60.67 ± 3.96 for the product baked in microwave oven and 65.94 ± 4.09 for the product baked in conventional oven) and predicted glycemic index content (60.5 for product baked in microwave oven and 65 for the product baked in conventional oven) in freshly-baked samples. Results clearly demonstrate that the baking pound cake conventional process could be replicated using a two-cycle multifunction microwave oven, reducing the traditional baking time. Further research is required in order to achieve pound cake crumb uniformity.

  5. Food powders flowability characterization: theory, methods, and applications.

    PubMed

    Juliano, Pablo; Barbosa-Cánovas, Gustavo V

    2010-01-01

    Characterization of food powders flowability is required for predicting powder flow from hoppers in small-scale systems such as vending machines or at the industrial scale from storage silos or bins dispensing into powder mixing systems or packaging machines. This review covers conventional and new methods used to measure flowability in food powders. The method developed by Jenike (1964) for determining hopper outlet diameter and hopper angle has become a standard for the design of bins and is regarded as a standard method to characterize flowability. Moreover, there are a number of shear cells that can be used to determine failure properties defined by Jenike's theory. Other classic methods (compression, angle of repose) and nonconventional methods (Hall flowmeter, Johanson Indicizer, Hosokawa powder tester, tensile strength tester, powder rheometer), used mainly for the characterization of food powder cohesiveness, are described. The effect of some factors preventing flow, such as water content, temperature, time consolidation, particle composition and size distribution, is summarized for the characterization of specific food powders with conventional and other methods. Whereas time-consuming standard methods established for hopper design provide flow properties, there is yet little comparative evidence demonstrating that other rapid methods may provide similar flow prediction.
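
    As one concrete instance of the compression-based characterization listed among the classic methods, the snippet below computes the Hausner ratio and Carr (compressibility) index from bulk and tapped densities; the density values and the interpretation thresholds in the comments are illustrative assumptions, not figures from the review.

      def hausner_ratio(bulk_density, tapped_density):
          # compression-based flowability index; values near 1.0 suggest free flow,
          # values above roughly 1.35 suggest cohesive, poorly flowing powders
          return tapped_density / bulk_density

      def carr_index(bulk_density, tapped_density):
          # percent compressibility; below roughly 15% is commonly read as good flowability
          return 100.0 * (tapped_density - bulk_density) / tapped_density

      rho_bulk, rho_tapped = 0.42, 0.55        # illustrative bulk/tapped densities (g/mL)
      print(hausner_ratio(rho_bulk, rho_tapped), carr_index(rho_bulk, rho_tapped))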

  6. A coupling method for a cardiovascular simulation model which includes the Kalman filter.

    PubMed

    Hasegawa, Yuki; Shimayoshi, Takao; Amano, Akira; Matsuda, Tetsuya

    2012-01-01

    Multi-scale models of the cardiovascular system provide new insight that was unavailable with in vivo and in vitro experiments. For the cardiovascular system, multi-scale simulations provide a valuable perspective in analyzing the interaction of three phenomena occurring at different spatial scales: circulatory hemodynamics, ventricular structural dynamics, and myocardial excitation-contraction. In order to simulate these interactions, multi-scale cardiovascular simulation systems couple models that simulate the different phenomena. However, coupling methods require a significant amount of calculation, since a system of non-linear equations must be solved at each timestep. Therefore, we propose a coupling method which decreases the amount of calculation by using the Kalman filter. In our method, the Kalman filter calculates approximations of the solution to the system of non-linear equations at each timestep. These approximations are then used as initial values for solving the system of non-linear equations. The proposed method decreases the number of iterations required by 94.0% compared to the conventional strong coupling method. When compared with a smoothing spline predictor, the proposed method required 49.4% fewer iterations.
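
    The idea of letting a Kalman filter supply the initial guess for each timestep's non-linear solve can be illustrated on a toy scalar problem: a constant-velocity filter tracks the converged solution and its trend, and its one-step prediction warm-starts Newton's method. The model equation, the filter tuning and the iteration counting below are invented for illustration and are far simpler than the coupled cardiovascular system addressed in the paper.

      import numpy as np

      # Toy model: at each timestep solve g(x; t) = 0 with Newton's method, where the
      # solution drifts smoothly in time (a stand-in for the coupled non-linear system).
      def g(x, t):
          return x**3 + x - (2.0 + np.sin(0.1 * t))

      def dg(x):
          return 3.0 * x**2 + 1.0

      def newton(x0, t, tol=1e-10, max_iter=50):
          x, it = x0, 0
          while abs(g(x, t)) > tol and it < max_iter:
              x -= g(x, t) / dg(x)
              it += 1
          return x, it

      # Constant-velocity Kalman filter on the scalar solution: state = [x, dx/dt]
      F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (dt = 1)
      H = np.array([[1.0, 0.0]])               # we "observe" the converged solution
      Q = 1e-6 * np.eye(2)                     # process noise
      Rm = 1e-8                                # observation noise
      state, P = np.zeros(2), np.eye(2)

      iters_warm, iters_cold = 0, 0
      for t in range(200):
          # predict, then use the predicted x as the Newton initial guess
          state = F @ state
          P = F @ P @ F.T + Q
          x_sol, it = newton(state[0], t)
          iters_warm += it
          iters_cold += newton(0.0, t)[1]      # naive fixed initial guess, for comparison

          # update the filter with the converged solution as the measurement
          y = x_sol - (H @ state)[0]
          S = (H @ P @ H.T)[0, 0] + Rm
          K = (P @ H.T)[:, 0] / S
          state = state + K * y
          P = (np.eye(2) - np.outer(K, H)) @ P

      print(iters_warm, iters_cold)            # warm-started Newton needs far fewer iterations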

  7. Comparisons of fully automated syphilis tests with conventional VDRL and FTA-ABS tests.

    PubMed

    Choi, Seung Jun; Park, Yongjung; Lee, Eun Young; Kim, Sinyoung; Kim, Hyon-Suk

    2013-06-01

    Serologic tests are widely used for the diagnosis of syphilis. However, conventional methods require well-trained technicians to produce reliable results. We compared automated nontreponemal and treponemal tests with conventional methods. The HiSens Auto Rapid Plasma Reagin (AutoRPR) and Treponema Pallidum particle agglutination (AutoTPPA) tests, which utilize latex turbidimetric immunoassay, were assessed. A total of 504 sera were assayed by AutoRPR, AutoTPPA, conventional VDRL and FTA-ABS. Among them, 250 samples were also tested by conventional TPPA. The concordance rate between the results of VDRL and AutoRPR was 67.5%, and 164 discrepant cases were all VDRL reactive but AutoRPR negative. In the 164 cases, 133 showed FTA-ABS reactivity. Medical records of 106 among the 133 cases were reviewed, and 82 among 106 specimens were found to be collected from patients already treated for syphilis. The concordance rate between the results of AutoTPPA and FTA-ABS was 97.8%. The results of conventional TPPA and AutoTPPA for 250 samples were concordant in 241 cases (96.4%). AutoRPR showed higher specificity than that of VDRL, while VDRL demonstrated higher sensitivity than that of AutoRPR regardless of whether the patients had been already treated for syphilis or not. Both FTA-ABS and AutoTPPA showed high sensitivities and specificities greater than 98.0%. Automated RPR and TPPA tests could be alternatives to conventional syphilis tests, and AutoRPR would be particularly suitable in treatment monitoring, since results by AutoRPR in cases after treatment became negative more rapidly than by VDRL. Copyright © 2013. Published by Elsevier Inc.

  8. Plants with genetically modified events combined by conventional breeding: an assessment of the need for additional regulatory data.

    PubMed

    Pilacinski, W; Crawford, A; Downey, R; Harvey, B; Huber, S; Hunst, P; Lahman, L K; MacIntosh, S; Pohl, M; Rickard, C; Tagliani, L; Weber, N

    2011-01-01

    Crop varieties with multiple GM events combined by conventional breeding have become important in global agriculture. The regulatory requirements in different countries for such products vary considerably, placing an additional burden on regulatory agencies in countries where the submission of additional data is required and delaying the introduction of innovative products to meet agricultural needs. The process of conventional plant breeding has predictably provided safe food and feed products both historically and in the modern era of plant breeding. Thus, previously approved GM events that have been combined by conventional plant breeding and contain GM traits that are not likely to interact in a manner affecting safety should be considered to be as safe as their conventional counterparts. Such combined GM event crop varieties should require little, if any, additional regulatory data to meet regulatory requirements. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Optical sectioning microscopy using two-frame structured illumination and Hilbert-Huang data processing

    NASA Astrophysics Data System (ADS)

    Trusiak, M.; Patorski, K.; Tkaczyk, T.

    2014-12-01

    We propose a fast, simple and experimentally robust method for reconstructing background-rejected optically-sectioned microscopic images using a two-shot structured illumination approach. The data demodulation technique requires two grid-illumination images mutually phase shifted by π (half a grid period), but the precise phase displacement value is not critical. Upon subtraction of the two frames, the input pattern with increased grid modulation is computed. The proposed demodulation procedure comprises: (1) two-dimensional data processing based on the enhanced, fast empirical mode decomposition (EFEMD) method for the object spatial frequency selection (noise reduction and bias term removal), and (2) calculating a high-contrast optically-sectioned image using the two-dimensional spiral Hilbert transform (HS). The effectiveness of the proposed algorithm is compared with the results obtained for the same input data using conventional structured-illumination (SIM) and HiLo microscopy methods. The input data were collected for studying highly scattering tissue samples in reflectance mode. In comparison with the conventional three-frame SIM technique, one frame fewer is needed and no stringent requirement on the exact phase-shift between recorded frames is imposed. The HiLo algorithm outcome is strongly dependent on the set of parameters chosen manually by the operator (cut-off frequencies for low-pass and high-pass filtering and the η parameter value for optically-sectioned image reconstruction), whereas the proposed method is parameter-free. Moreover, the very short processing time required to demodulate the input pattern makes the proposed method well suited to real-time in-vivo studies. The current implementation completes full processing in 0.25 s on a medium-class PC (Intel i7 2.1 GHz processor and 8 GB RAM). A simple modification that extracts only the first two BIMFs with a fixed filter window size reduces the computing time to 0.11 s (8 frames/s).
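
    A minimal sketch of the two-frame demodulation idea is given below, under simplifying assumptions: the EFEMD noise-filtering step is omitted and only the frame subtraction plus a two-dimensional spiral-phase (vortex) Hilbert transform are shown, so this is an illustration of the principle rather than the authors' full pipeline. Here i1 and i2 stand for the two π-shifted grid-illumination frames as 2-D arrays.

    ```python
    import numpy as np

    def two_frame_section(i1, i2):
        """Background-rejected (optically sectioned) image from two pi-shifted frames."""
        diff = i1.astype(float) - i2.astype(float)      # subtraction removes the uniform background
        ny, nx = diff.shape
        v, u = np.meshgrid(np.fft.fftfreq(ny), np.fft.fftfreq(nx), indexing="ij")
        spiral = np.exp(1j * np.arctan2(v, u))          # spiral-phase (vortex) filter
        spiral[0, 0] = 0.0                               # suppress the DC term
        quadrature = np.fft.ifft2(spiral * np.fft.fft2(diff))
        # Combine the fringe pattern with its quadrature to recover the envelope.
        return np.sqrt(diff ** 2 + np.abs(quadrature) ** 2)
    ```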

  10. Rearranging the lenslet array of the compact passive interference imaging system with high resolution

    NASA Astrophysics Data System (ADS)

    Liu, Gang; Wen, Desheng; Song, Zongxi

    2017-10-01

    With the development of aeronautics and astronautics, higher telescope resolution has become necessary. However, increasing the resolution of a conventional telescope requires a larger aperture, whose size, weight and power consumption can be prohibitively expensive. This has limited the further development of such telescopes. This paper introduces a new interference-based imaging technology—Compact Passive Interference Imaging Technology with High Resolution—and proposes a rearranging method for the arrangement of the lenslet array to obtain continuous coverage of the object spatial frequencies.

  11. A simplified flight-test method for determining aircraft takeoff performance that includes effects of pilot technique

    NASA Technical Reports Server (NTRS)

    Larson, T. J.; Schweikhard, W. G.

    1974-01-01

    A method for evaluating aircraft takeoff performance from brake release to air-phase height that requires fewer tests than conventionally required is evaluated with data for the XB-70 airplane. The method defines the effects of pilot technique on takeoff performance quantitatively, including the decrease in acceleration from drag due to lift. For a given takeoff weight and throttle setting, a single takeoff provides enough data to establish a standardizing relationship for the distance from brake release to any point where velocity is appropriate to rotation. The lower rotation rates penalized takeoff performance in terms of ground roll distance; the lowest observed rotation rate required a ground roll distance that was 19 percent longer than the highest. Rotations at the minimum rate also resulted in lift-off velocities that were approximately 5 knots lower than the highest rotation rate at any given lift-off distance.

  12. Solar cell efficiency and high temperature processing of n-type silicon grown by the noncontact crucible method

    DOE PAGES

    Jensen, Mallory A.; LaSalvia, Vincenzo; Morishige, Ashley E.; ...

    2016-08-01

    The capital expense (capex) of conventional crystal growth methods is a barrier to sustainable growth of the photovoltaic industry. It is challenging for innovative techniques to displace conventional growth methods due to the low dislocation density and high lifetime required for high efficiency devices. One promising innovation in crystal growth is the noncontact crucible method (NOC-Si), which combines aspects of Czochralski (Cz) and conventional casting. This material has the potential to satisfy the dual requirements, with capex likely between that of Cz (high capex) and multicrystalline silicon (mc-Si, low capex). In this contribution, we observe a strong dependence of solar cell efficiency on ingot height, correlated with the evolution of swirl-like defects, for single crystalline n-type silicon grown by the NOC-Si method. We posit that these defects are similar to those observed in Cz, and we explore the response of NOC-Si to high temperature treatments including phosphorus diffusion gettering (PDG) and Tabula Rasa (TR). The highest lifetimes (2033 µs for the top of the ingot and 342 µs for the bottom of the ingot) are achieved for TR followed by a PDG process comprising a standard plateau and a low temperature anneal. Further improvements can be gained by tailoring the time-temperature profiles of each process. Lifetime analysis after the PDG process indicates the presence of a getterable impurity in the as-grown material, while analysis after TR points to the presence of oxide precipitates especially at the bottom of the ingot. Uniform lifetime degradation is observed after TR, which we assign to a presently unknown defect. Lastly, future work includes additional TR processing to uncover the nature of this defect, microstructural characterization of suspected oxide precipitates, and optimization of the TR process to achieve the dual goals of high lifetime and spatial homogenization.

  13. Applying Standard Independent Verification and Validation (IVV) Techniques Within an Agile Framework: Is There a Compatibility Issue?

    NASA Technical Reports Server (NTRS)

    Dabney, James B.; Arthur, James Douglas

    2017-01-01

    Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the necessary process rigor and coordination needs of these projects. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices. This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large scale, mission critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early lifecycle IVV techniques that are fully compatible with the hybrid lifecycles, (2) IVV techniques that focus on tracing requirements, test objectives, etc., which are somewhat incompatible but can be tailored with a modest effort, and (3) IVV techniques involving an assessment requiring artifact completeness, which are simply not compatible with hybrid Agile processes, e.g., those that assume complete requirement specification early in the development lifecycle.

  14. The Detection Method of Escherichia coli in Water Resources: A Review

    NASA Astrophysics Data System (ADS)

    Nurliyana, M. R.; Sahdan, M. Z.; Wibowo, K. M.; Muslihati, A.; Saim, H.; Ahmad, S. A.; Sari, Y.; Mansor, Z.

    2018-04-01

    This article reviews several approaches for Escherichia coli (E. coli) detection, from conventional methods, through emerging methods, to biosensor-based techniques. Detection and enumeration of E. coli bacteria usually take a long time to yield a result, since a laboratory-based approach is normally used: culturing the samples requires 24 to 72 hours after sampling before results are available. Although faster techniques for detecting E. coli in water, such as the Polymerase Chain Reaction (PCR) and the Enzyme-Linked Immunosorbent Assay (ELISA), have been developed, they still require transporting samples from the water resource to the laboratory, costly and complicated equipment, complex procedures, and skilled specialists, which limits their widespread use in water quality monitoring. Recently, the development of biosensor devices that are easy to operate, portable, highly sensitive and selective has become indispensable for detecting extremely low concentrations of pathogenic E. coli bacteria in water samples.

  15. Reinforcing the role of the conventional C-arm--a novel method for simplified distal interlocking.

    PubMed

    Windolf, Markus; Schroeder, Josh; Fliri, Ladina; Dicht, Benno; Liebergall, Meir; Richards, R Geoff

    2012-01-25

    The common practice for insertion of distal locking screws of intramedullary nails is a freehand technique under fluoroscopic control. The process is technically demanding, time-consuming and associated with considerable radiation exposure of the patient and the surgical personnel. A new concept is introduced utilizing information from within conventional radiographic images to help accurately guide the surgeon to place the interlocking bolt into the interlocking hole. The newly developed technique was compared to conventional freehand in an operating room (OR)-like setting on human cadaveric lower legs in terms of operating time and radiation exposure. The proposed concept (guided freehand), generally based on the freehand gold standard, additionally guides the surgeon by means of visible landmarks projected into the C-arm image. A computer program plans the correct drilling trajectory by processing the lens-shaped hole projections of the interlocking holes from a single image. Holes can be drilled by visually aligning the drill to the planned trajectory. Besides a conventional C-arm, no additional tracking or navigation equipment is required. Ten fresh frozen human below-knee specimens were instrumented with an Expert Tibial Nail (Synthes GmbH, Switzerland). The implants were distally locked by performing the newly proposed technique as well as the conventional freehand technique on each specimen. An orthopedic resident surgeon inserted four distal screws per procedure. Operating time, number of images and radiation time were recorded and statistically compared between interlocking techniques using non-parametric tests. A 58% reduction in the number of images taken per screw was found for the guided freehand technique (7.4 ± 3.4) (mean ± SD) compared to the freehand technique (17.6 ± 10.3) (p < 0.001). Total radiation time (all 4 screws) was 55% lower for the guided freehand technique compared to conventional freehand (p = 0.001). Operating time per screw (from first shot to screw tightened) was reduced on average by 22% with guided freehand (p = 0.018). In an experimental setting, the newly developed guided freehand technique for distal interlocking has proven to markedly reduce radiation exposure when compared to the conventional freehand technique. The method utilizes established clinical workflows and does not require cost-intensive add-on devices or extensive training. The underlying principle carries potential to assist implant positioning in numerous other applications within orthopedics and trauma, from screw insertions to placement of plates, nails or prostheses.

  16. Objective structured clinical examination "Death Certificate" station - Computer-based versus conventional exam format.

    PubMed

    Biolik, A; Heide, S; Lessig, R; Hachmann, V; Stoevesandt, D; Kellner, J; Jäschke, C; Watzke, S

    2018-04-01

    One option for improving the quality of medical post mortem examinations is through intensified training of medical students, especially in countries where such a requirement exists regardless of the area of specialisation. For this reason, new teaching and learning methods on this topic have recently been introduced. These new approaches include e-learning modules or SkillsLab stations; one way to objectify the resultant learning outcomes is by means of the OSCE process. However, despite offering several advantages, this examination format also requires considerable resources, in particular with regard to medical examiners. For this reason, many clinical disciplines have already implemented computer-based OSCE examination formats. This study investigates whether the conventional exam format for the OSCE forensic "Death Certificate" station could be replaced with a computer-based approach in the future. For this study, 123 students completed the OSCE "Death Certificate" station using both a computer-based and a conventional format, half starting with the computer-based format and the other half starting with the conventional approach in their OSCE rotation. Assignment of examination cases was random. The examination results for the two stations were compared, and both overall results and the individual items of the exam checklist were analysed by means of inferential statistics. Following statistical analysis of examination cases of varying difficulty levels and correction of the repeated measures effect, the results of both examination formats appear to be comparable. Thus, in the descriptive item analysis, while there were some significant differences between the computer-based and conventional OSCE stations, these differences were not reflected in the overall results after a correction factor was applied (e.g. point deductions for assistance from the medical examiner were possible only at the conventional station). Thus, we demonstrate that the computer-based OSCE "Death Certificate" station is a cost-efficient and standardised format for examination that yields results comparable to those from a conventional format exam. Moreover, the examination results also indicate the need to optimize both the test itself (adjusting the degree of difficulty of the case vignettes) and the corresponding instructional and learning methods (including, for example, the use of computer programmes to complete the death certificate in small group formats in the SkillsLab). Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  17. Beyond the conventional: meeting the challenges of landscape governance within the European Landscape Convention?

    PubMed

    Scott, Alister

    2011-10-01

    Academics and policy makers seeking to deconstruct landscape face major challenges conceptually, methodologically and institutionally. The meaning(s), identity(ies) and management of landscape are controversial and contested. The European Landscape Convention provides an opportunity for action and change set within new governance agendas addressing interdisciplinarity and spatial planning. This paper critically reviews the complex web of conceptual and methodological frameworks that characterise landscape planning and management and then focuses on emerging landscape governance in Scotland within a mixed method approach involving policy analyses, semi-structured interviews and best practice case studies. Using Dower's (2008) criteria from the Articles of the European Landscape Convention, the results show that whilst some progress has been made in landscape policy and practice, largely through the actions of key individuals and champions, there are significant institutional hurdles and resource limitations to overcome. The need to mainstream positive landscape outcomes requires a significant culture change where a one-size-fits-all approach does not work. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Denoising embolic Doppler ultrasound signals using Dual Tree Complex Discrete Wavelet Transform.

    PubMed

    Serbes, Gorkem; Aydin, Nizamettin

    2010-01-01

    Early and accurate detection of asymptomatic emboli is important for monitoring of preventive therapy in stroke-prone patients. One of the problems in detection of emboli is the identification of an embolic signal caused by very small emboli. The amplitude of the embolic signal may be so small that advanced processing methods are required to distinguish these signals from Doppler signals arising from red blood cells. In this study, instead of the conventional discrete wavelet transform, the Dual Tree Complex Discrete Wavelet Transform was used for denoising embolic signals. The performances of both approaches were compared. Unlike the conventional discrete wavelet transform, the dual tree complex discrete wavelet transform is a shift-invariant transform with limited redundancy. The results demonstrate that denoising based on the Dual Tree Complex Discrete Wavelet Transform outperforms conventional discrete wavelet denoising: approximately 8 dB improvement is obtained using the Dual Tree Complex Discrete Wavelet Transform, compared with the improvement provided by the conventional Discrete Wavelet Transform (less than 5 dB).
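
    For readers unfamiliar with the baseline being improved upon, the sketch below shows conventional soft-threshold wavelet denoising using PyWavelets; it is illustrative only and is not the authors' code. The paper's contribution is to replace this decimated DWT with the dual tree complex discrete wavelet transform (available, for example, in the Python dtcwt package), whose near shift-invariance is what yields the larger improvement quoted above.

    ```python
    import numpy as np
    import pywt

    def dwt_denoise(signal, wavelet="db8", level=5):
        """Soft-threshold denoising of a 1-D Doppler trace with the conventional DWT."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        # Universal threshold, with sigma estimated from the finest detail coefficients.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
        denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(denoised, wavelet)[: len(signal)]
    ```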

  19. Development of a new method for the noninvasive measurement of deep body temperature without a heater.

    PubMed

    Kitamura, Kei-Ichiro; Zhu, Xin; Chen, Wenxi; Nemoto, Tetsu

    2010-01-01

    The conventional zero-heat-flow thermometer, which measures the deep body temperature from the skin surface, is widely used at present. However, this thermometer requires considerable electricity to power the electric heater that compensates for heat loss from the probe; thus, AC power is indispensable for its use. Therefore, this conventional thermometer is inconvenient for unconstrained monitoring. We have developed a new dual-heat-flux method that can measure the deep body temperature from the skin surface without a heater. Our method is convenient for unconstrained and long-term measurement because the instrument is driven by a battery and its design promotes energy conservation. Its probe consists of dual-heat-flow channels with different thermal resistances, and each heat-flow-channel has a pair of IC sensors attached on its top and bottom. The average deep body temperature measurements taken using both the dual-heat-flux and then the zero-heat-flow thermometers from the foreheads of 17 healthy subjects were 37.08 degrees C and 37.02 degrees C, respectively. In addition, the correlation coefficient between the values obtained by the 2 methods was 0.970 (p<0.001). These results show that our method can be used for monitoring the deep body temperature as accurately as the conventional method, and it overcomes the disadvantage of the necessity of AC power supply. (c) 2009 IPEM. Published by Elsevier Ltd. All rights reserved.
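
    The heater-free principle can be illustrated with a simple series-resistance model; the formula below is derived from that assumed model with generic example values, not from the authors' published calibration. Each channel i has a skin-side sensor Ts_i and a top sensor Tt_i; equating the heat flux through the channel, (Ts_i - Tt_i)/R_i, with the flux through the underlying tissue, (Tb - Ts_i)/Rt, for both channels and using the known resistance ratio K = R2/R1 eliminates the unknown tissue resistance Rt and yields the deep body temperature Tb.

    ```python
    def deep_body_temperature(ts1, tt1, ts2, tt2, k):
        """Deep body temperature from a two-channel (dual-heat-flux) probe.

        ts1, tt1: skin-side and top temperatures of channel 1
        ts2, tt2: skin-side and top temperatures of channel 2
        k:        thermal-resistance ratio R2 / R1 of the two channels (probe constant)
        """
        x = (ts2 - ts1) / ((ts1 - tt1) - (ts2 - tt2) / k)   # x = Rt / R1
        return ts1 + x * (ts1 - tt1)

    # Synthetic example (values constructed so that the true deep temperature is 37.0):
    print(deep_body_temperature(36.2, 33.0, 36.5, 32.5, 2.0))   # -> 37.0
    ```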

  20. Discharge measurements using a broad-band acoustic Doppler current profiler

    USGS Publications Warehouse

    Simpson, Michael R.

    2002-01-01

    The measurement of unsteady or tidally affected flow has been a problem faced by hydrologists for many years. Dynamic discharge conditions impose an unreasonably short time constraint on conventional current-meter discharge-measurement methods, which typically last a minimum of 1 hour. Tidally affected discharge can change more than 100 percent during a 10-minute period. Over the years, the U.S. Geological Survey (USGS) has developed moving-boat discharge-measurement techniques that are much faster but less accurate than conventional methods. For a bibliography of conventional moving-boat publications, see Simpson and Oltmann (1993, page 17). The advent of the acoustic Doppler current profiler (ADCP) made possible the development of a discharge-measurement system capable of more accurately measuring unsteady or tidally affected flow. In most cases, an ADCP discharge-measurement system is dramatically faster than conventional discharge-measurement systems, and has comparable or better accuracy. In many cases, an ADCP discharge-measurement system is the only choice for use at a particular measurement site. ADCP systems are not yet "turnkey"; they are still under development, and for proper operation, require a significant amount of operator training. Not only must the operator have a rudimentary knowledge of acoustic physics, but also a working knowledge of ADCP operation, the manufacturer's discharge-measurement software, and boating techniques and safety.

  1. Non-invasive peripheral nerve stimulation via focused ultrasound in vivo

    NASA Astrophysics Data System (ADS)

    Downs, Matthew E.; Lee, Stephen A.; Yang, Georgiana; Kim, Seaok; Wang, Qi; Konofagou, Elisa E.

    2018-02-01

    Focused ultrasound (FUS) has been employed on a wide range of clinical applications to safely and non-invasively achieve desired effects that have previously required invasive and lengthy procedures with conventional methods. Conventional electrical neuromodulation therapies that are applied to the peripheral nervous system (PNS) are invasive and/or non-specific. Recently, focused ultrasound has demonstrated the ability to modulate the central nervous system and ex vivo peripheral neurons. Here, for the first time, noninvasive stimulation of the sciatic nerve eliciting a physiological response in vivo is demonstrated with FUS. FUS was applied on the sciatic nerve in mice with simultaneous electromyography (EMG) on the tibialis anterior muscle. EMG signals were detected during or directly after ultrasound stimulation along with observable muscle contraction of the hind limb. Transecting the sciatic nerve downstream of FUS stimulation eliminated EMG activity during FUS stimulation. Peak-to-peak EMG response amplitudes and latency were found to be comparable to conventional electrical stimulation methods. Histology along with behavioral and thermal testing did not indicate damage to the nerve or surrounding regions. The findings presented herein demonstrate that FUS can serve as a targeted, safe and non-invasive alternative to conventional peripheral nervous system stimulation to treat peripheral neuropathic diseases in the clinic.

  2. Use of multispectral Ikonos imagery for discriminating between conventional and conservation agricultural tillage practices

    USGS Publications Warehouse

    Vina, Andres; Peters, Albert J.; Ji, Lei

    2003-01-01

    There is a global concern about the increase in atmospheric concentrations of greenhouse gases. One method being discussed to encourage greenhouse gas mitigation efforts is based on a trading system whereby carbon emitters can buy effective mitigation efforts from farmers implementing conservation tillage practices. These practices sequester carbon from the atmosphere, and such a trading system would require a low-cost and accurate method of verification. Remote sensing technology can offer such a verification technique. This paper is focused on the use of standard image processing procedures applied to a multispectral Ikonos image, to determine whether it is possible to validate that farmers have complied with agreements to implement conservation tillage practices. A principal component analysis (PCA) was performed in order to isolate image variance in cropped fields. Analyses of variance (ANOVA) statistical procedures were used to evaluate the capability of each Ikonos band and each principal component to discriminate between conventional and conservation tillage practices. A logistic regression model was implemented on the principal component most effective in discriminating between conventional and conservation tillage, in order to produce a map of the probability of conventional tillage. The Ikonos imagery, in combination with ground-reference information, proved to be a useful tool for verification of conservation tillage practices.
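
    A compact sketch of that workflow (PCA on the band values, then a logistic regression on the most discriminating component to map the probability of conventional tillage) is given below; the field spectra, labels and choice of component are placeholders, not the study's data.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    def tillage_probability(pixels, labels, component=1):
        """pixels: (n_pixels, 4) Ikonos band values; labels: 1 = conventional tillage."""
        scores = PCA(n_components=4).fit_transform(pixels)
        pc = scores[:, component].reshape(-1, 1)       # most discriminating component
        model = LogisticRegression().fit(pc, labels)
        return model.predict_proba(pc)[:, 1]           # probability of conventional tillage
    ```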

  3. Conventional and accelerated-solvent extractions of green tea (camellia sinensis) for metabolomics-based chemometrics.

    PubMed

    Kellogg, Joshua J; Wallace, Emily D; Graf, Tyler N; Oberlies, Nicholas H; Cech, Nadja B

    2017-10-25

    Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. Copyright © 2017. Published by Elsevier B.V.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. We specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
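
    The binning step can be sketched in a few lines. The code below is an illustration under stated assumptions (a common retention-time axis across samples, and scikit-learn's non-negative matrix factorization standing in for the PMF solver actually used), not the authors' implementation: each chromatogram is cut into fixed-width retention-time bins, the mass spectra within a bin are summed, and the resulting sample-by-(bin × m/z) matrix is factorized into source contributions and profiles.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    def bin_chromatogram(spectra, rt, bin_width):
        """spectra: (n_scans, n_mz) mass spectra; rt: retention time of each scan."""
        edges = np.arange(rt.min(), rt.max() + bin_width, bin_width)
        which_bin = np.digitize(rt, edges) - 1
        return np.vstack([spectra[which_bin == b].sum(axis=0) for b in range(len(edges) - 1)])

    def factor_sources(sample_spectra, sample_rts, bin_width, n_factors=6):
        """Assumes every sample shares the same retention-time axis (one row per sample)."""
        rows = [bin_chromatogram(s, t, bin_width).ravel()
                for s, t in zip(sample_spectra, sample_rts)]
        X = np.clip(np.vstack(rows), 0.0, None)             # factorization needs non-negative data
        model = NMF(n_components=n_factors, init="nndsvda", max_iter=500)
        contributions = model.fit_transform(X)               # per-sample factor strengths
        profiles = model.components_                          # bin-resolved factor profiles
        return contributions, profiles
    ```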

  5. A simple and novel method for RNA-seq library preparation of single cell cDNA analysis by hyperactive Tn5 transposase.

    PubMed

    Brouilette, Scott; Kuersten, Scott; Mein, Charles; Bozek, Monika; Terry, Anna; Dias, Kerith-Rae; Bhaw-Rosun, Leena; Shintani, Yasunori; Coppen, Steven; Ikebe, Chiho; Sawhney, Vinit; Campbell, Niall; Kaneko, Masahiro; Tano, Nobuko; Ishida, Hidekazu; Suzuki, Ken; Yashiro, Kenta

    2012-10-01

    Deep sequencing of single cell-derived cDNAs offers novel insights into oncogenesis and embryogenesis. However, traditional library preparation for RNA-seq analysis requires multiple steps with consequent sample loss and stochastic variation at each step significantly affecting output. Thus, a simpler and better protocol is desirable. The recently developed hyperactive Tn5-mediated library preparation, which brings high quality libraries, is likely one of the solutions. Here, we tested the applicability of hyperactive Tn5-mediated library preparation to deep sequencing of single cell cDNA, optimized the protocol, and compared it with the conventional method based on sonication. This new technique does not require any expensive or special equipment, which secures wider availability. A library was constructed from only 100 ng of cDNA, which enables the saving of precious specimens. Only a few steps of robust enzymatic reaction resulted in saved time, enabling more specimens to be prepared at once, and with a more reproducible size distribution among the different specimens. The obtained RNA-seq results were comparable to the conventional method. Thus, this Tn5-mediated preparation is applicable for anyone who aims to carry out deep sequencing for single cell cDNAs. Copyright © 2012 Wiley Periodicals, Inc.

  6. Evaluation of an improved technique for lumen path definition and lumen segmentation of atherosclerotic vessels in CT angiography.

    PubMed

    van Velsen, Evert F S; Niessen, Wiro J; de Weert, Thomas T; de Monyé, Cécile; van der Lugt, Aad; Meijering, Erik; Stokking, Rik

    2007-07-01

    Vessel image analysis is crucial when considering therapeutic options for (cardio-)vascular diseases. Our method, VAMPIRE (Vascular Analysis using Multiscale Paths Inferred from Ridges and Edges), involves two parts: first, the user defines a start- and endpoint from which a lumen path is automatically derived and used for initialization; second, the vessel lumen is automatically segmented on computed tomographic angiography (CTA) images. Both parts are based on the detection of vessel-like structures by analyzing intensity, edge, and ridge information. A multi-observer evaluation study was performed to compare VAMPIRE with a conventional method on the CTA data of 15 patients with carotid artery stenosis. In addition to the start- and endpoint, the two radiologists required on average 2.5 (SD: 1.9) additional points to define a lumen path when using the conventional method, and 0.1 (SD: 0.3) when using VAMPIRE. The segmentation results were quantitatively evaluated using Similarity Indices, which were slightly lower between VAMPIRE and the two radiologists (respectively 0.90 and 0.88) compared with the Similarity Index between the radiologists (0.92). The evaluation shows that the improved definition of a lumen path requires minimal user interaction, and that using this path as initialization leads to good automatic lumen segmentation results.
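
    The Similarity Index used above is, under the usual convention, a Dice-type overlap between two binary segmentations; a minimal sketch (assuming that convention, not taken from the paper) is:

    ```python
    import numpy as np

    def similarity_index(seg_a, seg_b):
        """Dice-type similarity between two binary lumen segmentations."""
        a = np.asarray(seg_a, dtype=bool)
        b = np.asarray(seg_b, dtype=bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
    ```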

  7. Traceability in hardness measurements: from the definition to industry

    NASA Astrophysics Data System (ADS)

    Germak, Alessandro; Herrmann, Konrad; Low, Samuel

    2010-04-01

    The measurement of hardness has been and continues to be of significant importance to many of the world's manufacturing industries. Conventional hardness testing is the most commonly used method for acceptance testing and production quality control of metals and metallic products. Instrumented indentation is one of the few techniques available for obtaining various property values for coatings and electronic products in the micrometre and nanometre dimensional scales. For these industries to be successful, it is critical that measurements made by suppliers and customers agree within some practical limits. To help assure this measurement agreement, a traceability chain for hardness measurement, from the hardness definition to industry, has developed and evolved over the past 100 years, but its development has been complicated. A hardness measurement value not only requires traceability of force, length and time measurements but also requires traceability of the hardness values measured by the hardness machine. These multiple traceability paths are needed because a hardness measurement is affected by other influence parameters that are often difficult to identify, quantify and correct. This paper describes the current situation of hardness measurement traceability that exists for the conventional hardness methods (i.e. Rockwell, Brinell, Vickers and Knoop hardness) and for special-application hardness and indentation methods (i.e. elastomer, dynamic, portable and instrumented indentation).

  8. Fuzzy-C-Means Clustering Based Segmentation and CNN-Classification for Accurate Segmentation of Lung Nodules

    PubMed

    K, Jalal Deen; R, Ganesan; A, Merline

    2017-07-27

    Objective: Accurate segmentation of abnormal and healthy lungs is crucial for reliable computer-aided disease diagnostics. Methods: For this purpose, a stack of chest CT scans is processed. In this paper, novel methods are proposed for segmentation of the multimodal grayscale lung CT scan. In the conventional approach, the required regions of interest (ROI) are identified using a Markov–Gibbs Random Field (MGRF) model. Result: The results of the proposed FCM- and CNN-based process are compared with the results obtained from the conventional method using the MGRF model. The results illustrate that the proposed method is able to segment various kinds of complex multimodal medical images precisely. Conclusion: In this paper, to obtain an exact boundary of the regions, the empirical dispersion of the image is computed by Fuzzy C-Means Clustering segmentation. A classification process based on the Convolutional Neural Network (CNN) classifier is then used to distinguish normal tissue from abnormal tissue. The experimental evaluation is done using the Interstitial Lung Disease (ILD) database. Creative Commons Attribution License
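
    To make the clustering step concrete, the sketch below implements plain fuzzy c-means on flattened voxel features; it is an illustration of the general algorithm only (the paper's combination with a CNN classifier and the ILD data are not reproduced), and the fuzzifier m and cluster count are generic defaults.

    ```python
    import numpy as np

    def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100, eps=1e-9, seed=0):
        """x: (n_samples, n_features), e.g. CT intensities as a single-column array."""
        rng = np.random.default_rng(seed)
        u = rng.random((x.shape[0], n_clusters))
        u /= u.sum(axis=1, keepdims=True)                   # memberships sum to 1 per sample
        for _ in range(n_iter):
            w = u ** m
            centers = (w.T @ x) / w.sum(axis=0)[:, None]    # fuzzily weighted centroids
            d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + eps
            inv = (1.0 / d) ** (2.0 / (m - 1))
            u = inv / inv.sum(axis=1, keepdims=True)        # updated membership matrix
        return u, centers
    ```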

  9. Fuzzy-C-Means Clustering Based Segmentation and CNN-Classification for Accurate Segmentation of Lung Nodules

    PubMed Central

    K, Jalal Deen; R, Ganesan; A, Merline

    2017-01-01

    Objective: Accurate segmentation of abnormal and healthy lungs is crucial for reliable computer-aided disease diagnostics. Methods: For this purpose, a stack of chest CT scans is processed. In this paper, novel methods are proposed for segmentation of the multimodal grayscale lung CT scan. In the conventional approach, the required regions of interest (ROI) are identified using a Markov–Gibbs Random Field (MGRF) model. Result: The results of the proposed FCM- and CNN-based process are compared with the results obtained from the conventional method using the MGRF model. The results illustrate that the proposed method is able to segment various kinds of complex multimodal medical images precisely. Conclusion: In this paper, to obtain an exact boundary of the regions, the empirical dispersion of the image is computed by Fuzzy C-Means Clustering segmentation. A classification process based on the Convolutional Neural Network (CNN) classifier is then used to distinguish normal tissue from abnormal tissue. The experimental evaluation is done using the Interstitial Lung Disease (ILD) database. PMID:28749127

  10. Nonlinear Unsteady Aerodynamic Modeling Using Wind Tunnel and Computational Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.

    2016-01-01

    Extensions to conventional aircraft aerodynamic models are required to adequately predict responses when nonlinear unsteady flight regimes are encountered, especially at high incidence angles and under maneuvering conditions. For a number of reasons, such as loss of control, both military and civilian aircraft may extend beyond normal and benign aerodynamic flight conditions. In addition, military applications may require controlled flight beyond the normal envelope, and civilian flight may require adequate recovery or prevention methods from these adverse conditions. These requirements have led to the development of more general aerodynamic modeling methods and provided impetus for researchers to improve both techniques and the degree of collaboration between analytical and experimental research efforts. In addition to more general mathematical model structures, dynamic test methods have been designed to provide sufficient information to allow model identification. This paper summarizes research to develop a modeling methodology appropriate for modeling aircraft aerodynamics that include nonlinear unsteady behaviors using both experimental and computational test methods. This work was done at Langley Research Center, primarily under the NASA Aviation Safety Program, to address aircraft loss of control, prevention, and recovery aerodynamics.

  11. Battlefield MRI

    DOE PAGES

    Espy, Michelle

    2015-06-01

    Magnetic Resonance Imaging is the best method for non-invasive imaging of soft tissue anatomy, saving countless lives each year. It is regarded as the gold standard for diagnosis of mild to moderate traumatic brain injuries. However, conventional MRI relies on very high, fixed-strength magnetic fields (> 1.5 T) with parts-per-million homogeneity, which require very large and expensive magnets.

  12. Array-Based Discovery of Aptamer Pairs

    DTIC Science & Technology

    2014-12-11

    affinities greatly exceeding either monovalent component. DNA aptamers are especially well-suited for such constructs, because they can be linked via...standard synthesis techniques without requiring chemical conjugation. Unfortunately, aptamer pairs are difficult to generate, primarily because...conventional selection methods preferentially yield aptamers that recognize a dominant “hot spot” epitope.

  13. Self Diagnostic Adhesive for Bonded Joints in Aircraft Structures

    DTIC Science & Technology

    2016-10-04

    validated under the fatigue/dynamic loading condition. 3) Both SEM (Spectral Element Modeling) and FEM (Finite Element Modeling) simulation of the sensors... Parametric Study of Sensor Performance via Finite Element Simulation... The frequency range that we are interested in is around 800 kHz. The conventional linear finite element method (FEM) requires a very fine spatial

  14. Imaging of high-velocity very small subjects

    NASA Astrophysics Data System (ADS)

    Haddleton, Graham P.

    1993-01-01

    The imaging of high velocity (> 2000 m/s), 7 mm cuboids impacting on various targets is discussed. The reasons why conventional H.S. cine techniques, even framing at 40,000 pps, are inadequate to record the detail required are outlined. Four different methods of image capture are illustrated, giving a direct comparison between state-of-the-art technologies.

  15. Battlefield MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Espy, Michelle

    Magnetic Resonance Imaging is the best method for non-invasive imaging of soft tissue anatomy, saving countless lives each year. It is regarded as the gold standard for diagnosis of mild to moderate traumatic brain injuries. However, conventional MRI relies on very high, fixed-strength magnetic fields (> 1.5 T) with parts-per-million homogeneity, which require very large and expensive magnets.

  16. Backscatter particle image velocimetry via optical time-of-flight sectioning

    DOE PAGES

    Paciaroni, Megan E.; Chen, Yi; Lynch, Kyle Patrick; ...

    2018-01-11

    Conventional particle image velocimetry (PIV) configurations require a minimum of two optical access ports, inherently restricting the technique to a limited class of flows. Here, the development and application of a novel method of backscattered time-gated PIV requiring a single-optical-access port is described along with preliminary results. The light backscattered from a seeded flow is imaged over a narrow optical depth selected by an optical Kerr effect (OKE) time gate. The picosecond duration of the OKE time gate essentially replicates the width of the laser sheet of conventional PIV by limiting detected photons to a narrow time-of-flight within the flow. Thus, scattering noise from outside the measurement volume is eliminated. In conclusion, this PIV via the optical time-of-flight sectioning technique can be useful in systems with limited optical access and in flows near walls or other scattering surfaces.
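
    The link between gate duration and the thickness of the interrogated slab follows from a simple time-of-flight relation, dz ≈ c·dt/2; the numbers below are a back-of-envelope illustration, not values reported by the authors.

    ```python
    SPEED_OF_LIGHT = 2.998e8   # m/s

    def gated_depth_mm(gate_duration_ps):
        """Approximate thickness of the optically gated measurement slab in mm."""
        return SPEED_OF_LIGHT * gate_duration_ps * 1e-12 / 2.0 * 1e3

    print(gated_depth_mm(7.0))   # a ~7 ps Kerr gate selects roughly a 1 mm slab
    ```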

  17. Backscatter particle image velocimetry via optical time-of-flight sectioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paciaroni, Megan E.; Chen, Yi; Lynch, Kyle Patrick

    Conventional particle image velocimetry (PIV) configurations require a minimum of two optical access ports, inherently restricting the technique to a limited class of flows. Here, the development and application of a novel method of backscattered time-gated PIV requiring a single-optical-access port is described along with preliminary results. The light backscattered from a seeded flow is imaged over a narrow optical depth selected by an optical Kerr effect (OKE) time gate. The picosecond duration of the OKE time gate essentially replicates the width of the laser sheet of conventional PIV by limiting detected photons to a narrow time-of-flight within the flow. Thus, scattering noise from outside the measurement volume is eliminated. In conclusion, this PIV via the optical time-of-flight sectioning technique can be useful in systems with limited optical access and in flows near walls or other scattering surfaces.

  18. Water quality assessment of the Li Canal using a functional fuzzy synthetic evaluation model.

    PubMed

    Feng, Yan; Ling, Liu

    2014-07-01

    Through introducing functional data analysis (FDA) theory into the conventional fuzzy synthetic evaluation (FSE) method, the functional fuzzy synthetic evaluation (FFSE) model is established. FFSE keeps the property of the conventional FSE that the fuzziness in the water quality condition can be suitably measured. Furthermore, compared with FSE, FFSE has the following advantages: (1) FFSE requires fewer conditions for observation, for example, pollutants can be monitored at different times, and missing data is accepted; (2) the dynamic variation of the water quality condition can be represented more comprehensively and intuitively. The procedure of FFSE is discussed and the water quality of the Li Canal in 2012 is evaluated as an illustration. The synthetic classification of the Li Canal is "II" in January, February and July, and "I" in other months, which can satisfy the requirement of the Chinese South-to-North Water Diversion Project.
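
    The conventional FSE kernel that FFSE builds on can be sketched as follows; the triangular membership functions, the concentration-based weighting and the class boundaries are generic textbook choices used here for illustration, not the Li Canal criteria, and the functional-data smoothing step of FFSE is not reproduced.

    ```python
    import numpy as np

    def memberships(c, s):
        """Triangular membership of concentration c in classes with boundaries s[0] < s[1] < ..."""
        r = []
        for j in range(len(s)):
            if j == 0:
                r.append(np.interp(c, [s[0], s[1]], [1.0, 0.0]))
            elif j == len(s) - 1:
                r.append(np.interp(c, [s[j - 1], s[j]], [0.0, 1.0]))
            else:
                r.append(np.interp(c, [s[j - 1], s[j], s[j + 1]], [0.0, 1.0, 0.0]))
        return np.array(r)

    def fse_class(concentrations, class_limits):
        """concentrations: one value per pollutant; class_limits: boundary array per pollutant."""
        conc = np.asarray(concentrations, dtype=float)
        R = np.vstack([memberships(c, s) for c, s in zip(conc, class_limits)])
        w = conc / np.array([np.mean(s) for s in class_limits])
        w = w / w.sum()                                  # normalized pollutant weights
        b = w @ R                                        # synthetic membership of each class
        return int(np.argmax(b)) + 1                     # 1 = class I, 2 = class II, ...
    ```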

  19. Current status and biotechnological advances in genetic engineering of ornamental plants.

    PubMed

    Azadi, Pejman; Bagheri, Hedayat; Nalousi, Ayoub Molaahmad; Nazari, Farzad; Chandler, Stephen F

    2016-11-01

    Cut flower markets are developing in many countries as the international demand for cut flowers is rapidly growing. Developing new varieties with modified characteristics is an important aim in floriculture. Production of transgenic ornamental plants can shorten the time required in the conventional breeding of a cultivar. Biotechnology tools in combination with conventional breeding methods have been used by cut flower breeders to change flower color, plant architecture, post-harvest traits, and disease resistance. In this review, we describe advances in genetic engineering that have led to the development of new cut flower varieties. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Lightweight Structures

    NASA Technical Reports Server (NTRS)

    Whittenberger, J. Daniel

    2001-01-01

    Present structural concepts for hot static structures are conventional "sheet & stringer" or truss core construction. More weight-efficient concepts such as honeycomb and lattice block are being investigated, in combination with both conventional superalloys and TiAl. Development efforts for components made from TiAl sheet are centered on lower cost methods for sheet and foil production, plus alloy development for higher temperature capability. A low-cost casting technology recently developed for aluminum and steel lattice blocks has demonstrated the required higher strength and stiffness, with weight efficiency approaching honeycombs. The current effort is based on extending the temperature capability by developing lattice block materials made from IN-718 and Mar-M247.

  1. Management system to a photovoltaic panel based on the measurement of short-circuit currents

    NASA Astrophysics Data System (ADS)

    Dordescu, M.

    2016-12-01

    This article is devoted to fundamental issues arising from the operation of a photovoltaic (PV) panel with increased energy efficiency. By measuring the short-circuit current, the prescribed current value corresponding to the maximum power point is determined; loading the panel according to this method extracts the maximum energy possible, justifying the usefulness of this process, which is very simple and inexpensive to implement in practice. The proposed adjustment method is much simpler and more economical than conventional methods that rely on measuring the output power.
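
    The general idea resembles the textbook fractional short-circuit-current technique, sketched below under assumed values (the proportionality constant and the callback functions are illustrative placeholders, not the article's implementation): the panel is briefly short-circuited, the measured current is scaled by an empirical factor, and the converter is driven at that current so the panel operates near its maximum power point.

    ```python
    K_FRACTION = 0.85   # assumed empirical ratio I_mpp / I_sc (typically ~0.78-0.92)

    def mpp_current_setpoint(i_sc, k=K_FRACTION):
        """Operating-current reference approximating the maximum power point."""
        return k * i_sc

    def control_step(measure_short_circuit_current, set_converter_current):
        """One periodic adjustment of the panel operating point (callbacks are placeholders)."""
        i_sc = measure_short_circuit_current()     # momentary short-circuit measurement
        set_converter_current(mpp_current_setpoint(i_sc))
    ```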

  2. Application of COLD-PCR for improved detection of KRAS mutations in clinical samples.

    PubMed

    Zuo, Zhuang; Chen, Su S; Chandra, Pranil K; Galbincea, John M; Soape, Matthew; Doan, Steven; Barkoh, Bedia A; Koeppen, Hartmut; Medeiros, L Jeffrey; Luthra, Rajyalakshmi

    2009-08-01

    KRAS mutations have been detected in approximately 30% of all human tumors, and have been shown to predict response to some targeted therapies. The most common KRAS mutation-detection strategy consists of conventional PCR and direct sequencing. This approach has a 10-20% detection sensitivity depending on whether pyrosequencing or Sanger sequencing is used. To improve detection sensitivity, we compared our conventional method with the recently described co-amplification-at-lower denaturation-temperature PCR (COLD-PCR) method, which selectively amplifies minority alleles. In COLD-PCR, the critical denaturation temperature is lowered to 80 degrees C (vs 94 degrees C in conventional PCR). The sensitivity of COLD-PCR was determined by assessing serial dilutions. Fifty clinical samples were used, including 20 fresh bone-marrow aspirate specimens and the formalin-fixed paraffin-embedded (FFPE) tissue of 30 solid tumors. Implementation of COLD-PCR was straightforward and required no additional cost for reagents or instruments. The method was specific and reproducible. COLD-PCR successfully detected mutations in all samples that were positive by conventional PCR, and enhanced the mutant-to-wild-type ratio by >4.74-fold, increasing the mutation detection sensitivity to 1.5%. The enhancement of mutation detection by COLD-PCR inversely correlated with the tumor-cell percentage in a sample. In conclusion, we validated the utility and superior sensitivity of COLD-PCR for detecting KRAS mutations in a variety of hematopoietic and solid tumors using either fresh or fixed, paraffin-embedded tissue.

  3. Discharge-measurement system using an acoustic Doppler current profiler with applications to large rivers and estuaries

    USGS Publications Warehouse

    Simpson, Michael R.; Oltmann, Richard N.

    1993-01-01

    Discharge measurement of large rivers and estuaries is difficult, time consuming, and sometimes dangerous. Frequently, discharge measurements cannot be made in tide-affected rivers and estuaries using conventional discharge-measurement techniques because of dynamic discharge conditions. The acoustic Doppler discharge-measurement system (ADDMS) was developed by the U.S. Geological Survey using a vessel-mounted acoustic Doppler current profiler coupled with specialized computer software to measure horizontal water velocity at 1-meter vertical intervals in the water column. The system computes discharge from water- and vessel-velocity data supplied by the ADDMS using vector-algebra algorithms included in the discharge-measurement software. With this system, a discharge measurement can be obtained by engaging the computer software and traversing a river or estuary from bank to bank; discharge in parts of the river or estuarine cross sections that cannot be measured because of ADDMS depth limitations is estimated by the system. Comparisons of ADDMS-measured discharges with ultrasonic-velocity-meter-measured discharges, along with error-analysis data, have confirmed that discharges provided by the ADDMS are at least as accurate as those produced using conventional methods. In addition, the advantage of a much shorter measurement time (2 minutes using the ADDMS compared with 1 hour or longer using conventional methods) has enabled use of the ADDMS for several applications where conventional discharge methods could not have been used with the required accuracy because of dynamic discharge conditions.
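
    The vector-algebra kernel of a moving-boat discharge computation can be sketched as follows; this is a simplified illustration (covering only the directly measured part of the cross-section, with the unmeasured top, bottom and near-bank zones that the real system estimates left out), not the ADDMS software.

    ```python
    import numpy as np

    def measured_discharge(water_vel, boat_vel, cell_height, ensemble_dt):
        """
        water_vel:   (n_ensembles, n_cells, 2) east/north water velocity, m/s
        boat_vel:    (n_ensembles, 2) east/north boat velocity, m/s
        cell_height: depth-cell height, m
        ensemble_dt: (n_ensembles,) duration of each ensemble, s
        """
        # Vertical component of (water velocity x boat velocity) in each depth cell:
        cross = (water_vel[:, :, 0] * boat_vel[:, None, 1]
                 - water_vel[:, :, 1] * boat_vel[:, None, 0])
        # Integrate over depth (cell height) and over the bank-to-bank traverse (time):
        return float(np.nansum(cross * cell_height * ensemble_dt[:, None]))
    ```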

  4. Biointervention makes leather processing greener: an integrated cleansing and tanning system.

    PubMed

    Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2003-06-01

    The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing. Included in this is a great deal of solid waste such as lime and chrome sludge. In the approach described here, the hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely, dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, opening up of fiber bundles, and penetration and distribution of chromium are comparable to those produced by traditional methods. This has been substantiated through scanning electron microscopy, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical and hand evaluation. Importantly, softness of the leathers is numerically proven to be comparable with that of the control. The process also demonstrates reduction in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% as compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, thereby reducing the discharge by 90% as compared to the conventional multistep processing. Furthermore, it is also demonstrated that the process is technoeconomically viable.

  5. Air sampling with solid phase microextraction

    NASA Astrophysics Data System (ADS)

    Martos, Perry Anthony

    There is an increasing need for simple yet accurate air sampling methods. The acceptance of new air sampling methods requires compatibility with conventional chromatographic equipment, and the new methods have to be environmentally friendly, simple to use, yet with equal, or better, detection limits, accuracy and precision than standard methods. Solid phase microextraction (SPME) satisfies the conditions for new air sampling methods. Analyte detection limits, accuracy and precision of analysis with SPME are typically better than with any conventional air sampling methods. Yet, air sampling with SPME requires no pumps or solvents, is re-usable, extremely simple to use, completely compatible with current chromatographic equipment, and requires only a small capital investment. The first SPME fiber coating used in this study was poly(dimethylsiloxane) (PDMS), a hydrophobic liquid film, to sample a large range of airborne hydrocarbons such as benzene and octane. Quantification without an external calibration procedure is possible with this coating. Well understood are the physical and chemical properties of this coating, which are quite similar to those of the siloxane stationary phase used in capillary columns. The log of analyte distribution coefficients for PDMS is linearly related to chromatographic retention indices and to the inverse of temperature. Therefore, the actual chromatogram from the analysis of the PDMS air sampler will yield the calibration parameters which are used to quantify unknown airborne analyte concentrations (ppbv to ppmv range). The second fiber coating used in this study was PDMS/divinyl benzene (PDMS/DVB), onto which o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine (PFBHA) was adsorbed for the on-fiber derivatization of gaseous formaldehyde (ppbv range), with and without external calibration. The oxime formed from the reaction can be detected with conventional gas chromatographic detectors. Typical grab sampling times were as small as 5 seconds. With 300 seconds of sampling, the formaldehyde detection limit was 2.1 ppbv, better than any other 5 minute sampling device for formaldehyde. The first-order rate constant for product formation was used to quantify formaldehyde concentrations without a calibration curve. This spot sampler was used to sample the headspace of hair gel, particle board, plant material and coffee grounds for formaldehyde, and other carbonyl compounds, with extremely promising results. The SPME sampling devices were also used for time-weighted average sampling (30 minutes to 16 hours). Finally, the four new SPME air sampling methods were field tested with side-by-side comparisons to standard air sampling methods, demonstrating the tremendous utility of SPME as an air sampler.
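
    The calibration-free quantification rests on the equilibrium relation n = K·Vf·C_air, with log K for PDMS estimated from the analyte's retention index and the sampling temperature. The sketch below illustrates only that final arithmetic; the coating volume and the linear coefficients are assumed placeholder values, not the thesis's calibration.

    ```python
    def air_concentration(extracted_mass_ng, log_k, fiber_volume_ml=6.6e-4):
        """C_air = n / (K * Vf), assuming ~0.66 uL of PDMS coating on a 100-um fiber."""
        return extracted_mass_ng / (10.0 ** log_k * fiber_volume_ml)

    def log_k_from_retention_index(retention_index, a=-1.0, b=0.004):
        """Assumed linear relation log10(K) = a + b * RI at a fixed temperature;
        a and b are placeholders that would come from the chromatogram itself."""
        return a + b * retention_index

    # Example: 10 ng extracted for an analyte with retention index 800.
    print(air_concentration(10.0, log_k_from_retention_index(800)))   # ng per mL of air
    ```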

  6. Malaria control in Tanzania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yhdego, M.; Majura, P.

    A review of the malaria control programs and the problems encountered in the United Republic of Tanzania from 1945 to 1986 is presented. Buguruni, one of the squatter areas in the city of Dar es Salaam, is chosen as a case study in order to evaluate the economic advantage of engineering methods for the control of malaria infection. Although the initial capital cost of engineering methods may be high, their cost effectiveness requires a much lower financial burden of only about Tshs. 3 million, compared with the conventional methods of larviciding and insecticiding, which require more than Tshs. 10 million. Finally, recommendations for the adoption of engineering methods are made concerning the upgrading of existing roads and footpaths in general, with particular emphasis on drainage of large pools of water which serve as breeding sites for mosquitoes.

  7. Extreme-phenotype genome-wide association study (XP-GWAS): a method for identifying trait-associated variants by sequencing pools of individuals selected from a diversity panel.

    PubMed

    Yang, Jinliang; Jiang, Haiying; Yeh, Cheng-Ting; Yu, Jianming; Jeddeloh, Jeffrey A; Nettleton, Dan; Schnable, Patrick S

    2015-11-01

    Although approaches for performing genome-wide association studies (GWAS) are well developed, conventional GWAS requires high-density genotyping of large numbers of individuals from a diversity panel. Here we report a method for performing GWAS that does not require genotyping of large numbers of individuals. Instead XP-GWAS (extreme-phenotype GWAS) relies on genotyping pools of individuals from a diversity panel that have extreme phenotypes. This analysis measures allele frequencies in the extreme pools, enabling discovery of associations between genetic variants and traits of interest. This method was evaluated in maize (Zea mays) using the well-characterized kernel row number trait, which was selected to enable comparisons between the results of XP-GWAS and conventional GWAS. An exome-sequencing strategy was used to focus sequencing resources on genes and their flanking regions. A total of 0.94 million variants were identified and served as evaluation markers; comparisons among pools showed that 145 of these variants were statistically associated with the kernel row number phenotype. These trait-associated variants were significantly enriched in regions identified by conventional GWAS. XP-GWAS was able to resolve several linked QTL and detect trait-associated variants within a single gene under a QTL peak. XP-GWAS is expected to be particularly valuable for detecting genes or alleles responsible for quantitative variation in species for which extensive genotyping resources are not available, such as wild progenitors of crops, orphan crops, and other poorly characterized species such as those of ecological interest. © 2015 The Authors The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
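
    A minimal sketch of the pooled-allele-frequency comparison at the heart of XP-GWAS: for each variant, allele counts in the high- and low-phenotype pools are tested for association. The counts, variant names and the Bonferroni threshold below are illustrative only; the published analysis used its own statistical model.

        from scipy.stats import fisher_exact

        # (ref_count_high, alt_count_high, ref_count_low, alt_count_low) per variant
        variants = {
            "chr1_101": (120, 80, 60, 140),
            "chr1_202": (100, 100, 105, 95),
        }

        alpha = 0.05 / len(variants)   # Bonferroni-corrected threshold
        for name, (rh, ah, rl, al) in variants.items():
            _, p = fisher_exact([[rh, ah], [rl, al]])
            flag = "associated" if p < alpha else "not significant"
            print(f"{name}: p = {p:.3g} ({flag})")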

  8. The influence of sorghum grain decortication on bioethanol production and quality of the distillers' dried grains with solubles using cold and conventional warm starch processing.

    PubMed

    Nkomba, Edouard Y; van Rensburg, Eugéne; Chimphango, Annie F A; Görgens, Johann F

    2016-03-01

    Very high gravity hydrolysis-fermentation of whole and decorticated sorghum grains was compared using conventional and cold hydrolysis methods to assess the extent to which decortication could minimize enzyme dosages and affect the quality of the distillers' dried grains with solubles (DDGS). All processing configurations achieved ethanol concentrations between 126 and 132 g/L (16.0-16.7% v/v), although decortication resulted in a decreased ethanol yield. Decortication decreased the volumetric productivity during warm processing from 1.55 to 1.25 g L(-1) h(-1), whereas the required enzyme dosage for cold processing was decreased from 250 to 221 μL/100 g starch. Cold processing decreased the average acid detergent fibre (ADF) from 35.59% to 29.32% and the neutral detergent fibre (NDF) from 44.04% to 32.28% in the DDGS compared with conventional (warm) processing. Due to lower enzyme requirements, the use of decorticated grains combined with cold processing presents a favourable process configuration and source of DDGS for non-ruminants. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Biomagnetic separation of Salmonella Typhimurium with high affine and specific ligand peptides isolated by phage display technique

    NASA Astrophysics Data System (ADS)

    Steingroewer, Juliane; Bley, Thomas; Bergemann, Christian; Boschke, Elke

    2007-04-01

    Analyses of food-borne pathogens are of great importance in order to minimize the health risk for consumers. Thus, very sensitive and rapid detection methods are required. Current conventional culture techniques are very time consuming. Modern immunoassays and biochemical analyses also require pre-enrichment steps, resulting in a turnaround time of at least 24 h. Biomagnetic separation (BMS) is a promising, more rapid method. In this study we describe the isolation of highly affine and specific peptides from a phage-peptide library, which, combined with BMS, allows the detection of Salmonella spp. with a sensitivity similar to that of immunomagnetic separation using antibodies.

  10. Aerodynamic laser-heated contactless furnace for neutron scattering experiments at elevated temperatures

    NASA Astrophysics Data System (ADS)

    Landron, Claude; Hennet, Louis; Coutures, Jean-Pierre; Jenkins, Tudor; Alétru, Chantal; Greaves, Neville; Soper, Alan; Derbyshire, Gareth

    2000-04-01

    Conventional radiative furnaces require sample containment that encourages contamination at elevated temperatures and generally need windows, which restrict the entrance and exit solid angles required for diffraction and scattering measurements. We describe a contactless, windowless furnace based on aerodynamic levitation and laser heating which has been designed for high-temperature neutron scattering experiments. Data from initial experiments are reported for crystalline and amorphous oxides at temperatures up to 1900 °C, using the spallation neutron source ISIS together with our laser-heated aerodynamic levitator. Accurate reproduction of thermal expansion coefficients and radial distribution functions has been obtained, demonstrating the utility of aerodynamic levitation for neutron scattering methods.

  11. Harmonizing national forest inventories

    Treesearch

    Ronald E. McRoberts; Erkki O. Tomppo; Klemens Schadauer; Göran Ståhl

    2012-01-01

    International agreements increasingly require that countries report estimates of national forest resources. The United Nations Framework Convention on Climate Change requires that countries submit annual reports of greenhouse gas emissions and removals by sources and sinks. The Convention on Biological Diversity requires that countries identify and monitor components...

  12. Rapid fabrication of microfluidic chips based on the simplest LED lithography

    NASA Astrophysics Data System (ADS)

    Li, Yue; Wu, Ping; Luo, Zhaofeng; Ren, Yuxuan; Liao, Meixiang; Feng, Lili; Li, Yuting; He, Liqun

    2015-05-01

    Microfluidic chips are generally fabricated by a soft lithography method employing commercial lithography equipment. These heavy machines require a critical room environment and high lamp power, and their cost remains too high for most ordinary laboratories. Here we present a novel microfluidics fabrication method utilizing a portable ultraviolet (UV) LED as an alternative UV source for photolithography. With this approach, we can reproduce several common microchannels as well as conventional commercial exposure machines can, and both the verticality of the channel sidewall and the lithography resolution are shown to be acceptable. Further microfluidics applications such as mixing, blood typing and microdroplet generation were implemented to validate the practicability of the chips. This simple but innovative method dramatically decreases the cost and facility requirements of chip fabrication and may become more popular with ordinary laboratories.

  13. Levonorgestrel-Releasing Intrauterine System versus Medical Therapy for Menorrhagia: A Systematic Review and Meta-Analysis

    PubMed Central

    Qiu, Jin; Cheng, Jiajing; Wang, Qingying; Hua, Jie

    2014-01-01

    Background The aim of this study was to compare the effects of the levonorgestrel-releasing intrauterine system (LNG-IUS) with conventional medical treatment in reducing heavy menstrual bleeding. Material/Methods Relevant studies were identified by a search of MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, and clinical trials registries (from inception to April 2014). Randomized controlled trials comparing the LNG-IUS with conventional medical treatment (mefenamic acid, tranexamic acid, norethindrone, medroxyprogesterone acetate injection, or combined oral contraceptive pills) in patients with menorrhagia were included. Results Eight randomized controlled trials that included 1170 women (LNG-IUS, n=562; conventional medical treatment, n=608) met inclusion criteria. The LNG-IUS was superior to conventional medical treatment in reducing menstrual blood loss (as measured by the alkaline hematin method or estimated by pictorial bleeding assessment chart scores). More women were satisfied with the LNG-IUS than with the use of conventional medical treatment (odds ratio [OR] 5.19, 95% confidence interval [CI] 2.73–9.86). Compared with conventional medical treatment, the LNG-IUS was associated with a lower rate of discontinuation (14.6% vs. 28.9%, OR 0.39, 95% CI 0.20–0.74) and fewer treatment failures (9.2% vs. 31.0%, OR 0.18, 95% CI 0.10–0.34). Furthermore, quality of life assessment favored LNG-IUS over conventional medical treatment, although use of various measurements limited our ability to pool the data for more powerful evidence. Serious adverse events were statistically comparable between treatments. Conclusions The LNG-IUS was the more effective first choice for management of menorrhagia compared with conventional medical treatment. Long-term, randomized trials are required to further investigate patient-based outcomes and evaluate the cost-effectiveness of the LNG-IUS and other medical treatments. PMID:25245843
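
    A minimal sketch of the kind of fixed-effect (inverse-variance) pooling of odds ratios that yields summary estimates like those quoted above. The 2x2 counts below are illustrative only, not the data of the included trials.

        import math

        # (events_LNG, total_LNG, events_control, total_control) per trial (hypothetical)
        trials = [
            (10, 70, 25, 80),
            (8, 65, 22, 75),
            (12, 90, 30, 95),
        ]

        num, den = 0.0, 0.0
        for a, n1, c, n2 in trials:
            b, d = n1 - a, n2 - c
            log_or = math.log((a * d) / (b * c))
            var = 1 / a + 1 / b + 1 / c + 1 / d       # variance of the log odds ratio
            w = 1 / var                               # inverse-variance weight
            num += w * log_or
            den += w

        pooled = num / den
        se = math.sqrt(1 / den)
        lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
        print(f"pooled OR = {math.exp(pooled):.2f}, 95% CI {lo:.2f}-{hi:.2f}")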

  14. Effects of soldering methods on tensile strength of a gold-palladium metal ceramic alloy.

    PubMed

    Ghadhanfari, Husain A; Khajah, Hasan M; Monaco, Edward A; Kim, Hyeongil

    2014-10-01

    Conventional postceramic soldering and laser postceramic welding may require more energy than microwave postceramic soldering to obtain comparable tensile strength. The purpose of the study was to compare the tensile strength obtained by microwave postceramic soldering, conventional postceramic soldering, and laser postceramic welding. A gold-palladium metal ceramic alloy and gold-based solder were used in this study. Twenty-seven wax specimens were cast in gold-palladium noble metal and divided into 4 groups: laser welding with a specific postfiller noble metal, microwave soldering with a postceramic solder, conventional soldering with the same postceramic solder used in the microwave soldering group, and a nonsectioned control group. All the specimens were heat treated to simulate a normal porcelain sintering sequence. An Instron Universal Testing Machine was used to measure the tensile strength of the 4 groups. The means were analyzed statistically with 1-way ANOVA. The surface and fracture sites of the specimens were subjectively evaluated for fracture type and porosities by using a scanning electron microscope. The mean (standard deviation) ultimate tensile strength values were as follows: nonsectioned control 818 ±30 MPa, microwave 516 ±34 MPa, conventional 454 ±37 MPa, and laser weld 191 ±39 MPa. A 1-way ANOVA showed a significant difference in ultimate tensile strength among the groups (F3,23=334.5; P<.001). Follow-up multiple comparisons showed a significant difference among all the groups. Microwave soldering resulted in a higher tensile strength for the gold-palladium noble metal than either conventional soldering or laser welding. Conventional soldering resulted in a higher tensile strength than laser welding. Under the experimental conditions described, either microwave or conventional postceramic soldering would appear to satisfy clinical requirements related to tensile strength. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
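
    A minimal sketch of the one-way ANOVA comparison of ultimate tensile strength across the four groups. The values are simulated from the reported means and standard deviations (group sizes of 7, 7, 7 and 6 assumed to match the reported degrees of freedom), not the measured data.

        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(0)
        groups = {
            "control":      rng.normal(818, 30, 7),
            "microwave":    rng.normal(516, 34, 7),
            "conventional": rng.normal(454, 37, 7),
            "laser":        rng.normal(191, 39, 6),
        }

        F, p = f_oneway(*groups.values())
        print(f"F = {F:.1f}, p = {p:.2g}")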

  15. Examination of bacteria drug resistance utilizing surface plasmon resonance

    NASA Astrophysics Data System (ADS)

    Chiang, Ya-Ling; Chen, How-Foo; Lin, Chi-Hung; Chen, Shean-Jen

    2007-05-01

    An antimicrobial susceptibility testing method using surface plasmon resonance to improve on present techniques is reported in this paper. In contrast to conventional methods, namely Kirby-Bauer disk diffusion and variations of broth microdilution, the examination time is reduced from 18-24 hours or more to less than one hour after treatment with antibiotics. E. coli strains resistant and susceptible to ampicillin are used in the test to demonstrate this innovative method. A method to examine bacterial resistance rapidly and accurately is very important for patients and for preventing infectious disease from spreading. The method reported here can meet this requirement.

  16. Geometric Integration of Weakly Dissipative Systems

    NASA Astrophysics Data System (ADS)

    Modin, K.; Führer, C.; Söderlind, G.

    2009-09-01

    Some problems in mechanics, e.g. in bearing simulation, contain subsystems that are conservative as well as subsystems that are weakly dissipative. Our experience is that geometric integration methods are often superior for such systems, as long as the dissipation is weak. Here we develop adaptive methods for dissipative perturbations of Hamiltonian systems. The methods are "geometric" in the sense that the form of the dissipative perturbation is preserved. The methods are linearly explicit, i.e., they require the solution of a linear subsystem. We sketch an analysis in terms of backward error analysis, and numerical comparisons with a conventional RK method of the same order are given.
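
    A minimal sketch of the splitting idea behind geometric integration of weakly dissipative systems: a symplectic (Störmer-Verlet) step for the Hamiltonian part composed with an exact step for the weak linear damping, applied to a damped harmonic oscillator. This is not the adaptive, linearly explicit method of the paper, only an illustration of preserving the structure of the dissipative perturbation.

        import math

        def step(q, p, h, omega=1.0, eps=0.01):
            # Stormer-Verlet for H = p^2/2 + omega^2 q^2 / 2
            p -= 0.5 * h * omega**2 * q
            q += h * p
            p -= 0.5 * h * omega**2 * q
            # exact flow of the dissipative perturbation dp/dt = -eps * p
            p *= math.exp(-eps * h)
            return q, p

        q, p, h, eps = 1.0, 0.0, 0.01, 0.01
        for _ in range(int(50 / h)):
            q, p = step(q, p, h, eps=eps)

        H0, H = 0.5, 0.5 * p**2 + 0.5 * q**2
        print(f"energy ratio H(50)/H(0) = {H / H0:.4f}  (theory ~ exp(-eps*t) = {math.exp(-eps * 50):.4f})")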

  17. Utility of Modified Ultrafast Papanicolaou Stain in Cytological Diagnosis

    PubMed Central

    Arakeri, Surekha Ulhas

    2017-01-01

    Introduction: The need for minimal turnaround time in assessing Fine Needle Aspiration Cytology (FNAC) has encouraged innovations in staining techniques that require less staining time with unequivocal cell morphology. The standard protocol for the conventional Papanicolaou (PAP) stain requires about 40 minutes. To overcome this, the Ultrafast Papanicolaou (UFP) stain was introduced, which reduces staining time to 90 seconds and also enhances quality. However, the reagents required for this are not easily available; hence, the Modified Ultrafast Papanicolaou (MUFP) stain was introduced subsequently. Aim: To assess the efficacy of MUFP staining by comparing the quality of the MUFP stain with the conventional PAP stain. Materials and Methods: The FNAC procedure was performed using a 10 ml disposable syringe and a 22-23 G needle. A total of 131 FNAC cases were studied: lymph node (30), thyroid (38), breast (22), skin and soft tissue (24), salivary gland (11) and visceral organs (6). Two smears were prepared and stained by MUFP and conventional PAP stain. Scores were given on four parameters: background of smears, overall staining pattern, cell morphology and nuclear staining. A Quality Index (QI) was calculated as the ratio of the total score achieved to the maximum score possible. Statistical analysis using the chi-square test was applied to each of the four parameters before obtaining the QI for both stains. Student's t-test was applied to evaluate the efficacy of MUFP in comparison with the conventional PAP stain. Results: The QI of MUFP for thyroid, breast, lymph node, skin and soft tissue, salivary gland and visceral organs was 0.89, 0.85, 0.89, 0.83, 0.92, and 0.78 respectively. Compared with the conventional PAP stain, the QI of MUFP smears was better in all except visceral organ cases, and the difference was statistically significant. MUFP showed a clear red blood cell background, transparent cytoplasm and crisp nuclear features. Conclusion: MUFP is fast, reliable and can be done with locally available reagents with unequivocal morphology, which is the need of the hour for a cytopathology set-up. PMID:28511391
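
    A tiny sketch of the Quality Index computation described above: the ratio of the total score achieved on the four parameters to the maximum possible score. The per-parameter scores and the maximum below are hypothetical.

        # Hypothetical per-parameter scores for one smear
        scores = {"background": 2, "overall staining": 3, "cell morphology": 3, "nuclear staining": 2}
        max_per_parameter = 3

        qi = sum(scores.values()) / (max_per_parameter * len(scores))
        print(f"Quality Index = {qi:.2f}")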

  18. Cotton textiles modified with citric acid as efficient anti-bacterial agent for prevention of nosocomial infections

    PubMed Central

    Bischof Vukušić, Sandra; Flinčec Grgac, Sandra; Budimir, Ana; Kalenić, Smilja

    2011-01-01

    Aim To study the antimicrobial activity of citric acid (CA) and sodium hypophosphite monohydrate (SHP) against gram-positive and gram-negative bacteria, and to determine the influence of conventional and microwave thermal treatments on the effectiveness of antimicrobial treatment of cotton textiles. Method Textile material was impregnated with CA and SHP solution and thermally treated by either conventional or microwave drying/curing treatment. Antibacterial effectiveness was tested according to the ISO 20743:2009 standard, using absorption method. The surfaces were morphologically observed by scanning electron microscopy, while physical characteristics were determined by wrinkle recovery angles method (DIN 53 891), tensile strength (DIN 53 837), and whiteness degree method (AATCC 110-2000). Results Cotton fabric treated with CA and SHP showed significant antibacterial activity against MRSA (6.38 log10 treated by conventional drying and 6.46 log10 treated by microwave drying before washing, and 6.90 log10 and 7.86 log10, respectively, after 1 cycle of home domestic laundering washing [HDLW]). Antibacterial activity was also remarkable against S. aureus (4.25 log10 by conventional drying, 4.58 log10 by microwave drying) and against P. aeruginosa (1.93 log10 by conventional and 4.66 log10 by microwave drying). Antibacterial activity against P. aeruginosa was higher in samples subjected to microwave drying/curing than in those subjected to conventional drying/curing. As expected, antibacterial activity was reduced after 10 HDLW cycles but the compound was still effective. The surface of the untreated cotton polymer was smooth, while minor erosion stripes appeared on the surfaces treated with antimicrobial agent, and long and deep stripes were found on the surface of the washed sample. Conclusion CA can be used both for the disposable (non-durable) materials (gowns, masks, and cuffs for blood pressure measurement) and the materials that require durability to laundering. The current protocols and initiatives in infection control could be improved by the use of antimicrobial agents applied on cotton carbohydrate polymer. PMID:21328723

  19. Dissolving and biodegradable microneedle technologies for transdermal sustained delivery of drug and vaccine

    PubMed Central

    Hong, Xiaoyun; Wei, Liangming; Wu, Fei; Wu, Zaozhan; Chen, Lizhu; Liu, Zhenguo; Yuan, Weien

    2013-01-01

    Microneedles were first conceptualized for drug delivery many decades ago, overcoming the shortages and preserving the advantages of hypodermic needle and conventional transdermal drug-delivery systems to some extent. Dissolving and biodegradable microneedle technologies have been used for transdermal sustained deliveries of different drugs and vaccines. This review describes microneedle geometry and the representative dissolving and biodegradable microneedle delivery methods via the skin, followed by the fabricating methods. Finally, this review puts forward some perspectives that require further investigation. PMID:24039404

  20. Production of hydrogen by electron transfer catalysis using conventional and photochemical means

    NASA Technical Reports Server (NTRS)

    Rillema, D. P.

    1981-01-01

    Alternate methods of generating hydrogen from the sulfuric acid thermal or electrochemical cycles are presented. A number of processes requiring chemical, electrochemical or photochemical methods are also presented. These include the design of potential photoelectrodes and photocatalytic membranes using Ru impregnated nafion tubing, and the design of experiments to study the catalyzed electrolytic formation of hydrogen and sulfuric acid from sulfur dioxide and water using quinones as catalysts. Experiments are carried out to determine the value of these approaches to energy conversion.

  1. Evaluation of the Use of Supercritical Fluids for the Extraction of Explosives and Their Degradation Products from Soil

    DTIC Science & Technology

    1994-04-01

    [The abstract of this record is garbled by OCR extraction. Recoverable fragments indicate that the nonflammable and nontoxic nature of supercritical CO2 is a major advantage, that the accepted analytical method for explosives is SW-846 Method (number truncated), that supercritical fluid extraction (SFE) requires only basic equipment and offers a theoretical advantage over conventional solvent extraction, and that SFE was compared with 18-hour sonic extraction in acetonitrile (ACN). The original figures showed the phase diagram of CO2 (temperature 31°C) and the design of a basic SFE apparatus (after Hawthorne 1993).]

  2. Generalized multiscale finite-element method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Kai; Fu, Shubin; Gibson, Richard L.

    It is important to develop fast yet accurate numerical methods for seismic wave propagation to characterize complex geological structures and oil and gas reservoirs. However, the computational cost of conventional numerical modeling methods, such as finite-difference method and finite-element method, becomes prohibitively expensive when applied to very large models. We propose a Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media, where we construct basis functions from multiple local problems for both the boundaries and interior of a coarse node support or coarse element. The application of multiscale basis functions can capture the fine scale medium property variations, and allows us to greatly reduce the degrees of freedom that are required to implement the modeling compared with conventional finite-element method for wave equation, while restricting the error to low values. We formulate the continuous Galerkin and discontinuous Galerkin formulation of the multiscale method, both of which have pros and cons. Applications of the multiscale method to three heterogeneous models show that our multiscale method can effectively model the elastic wave propagation in anisotropic media with a significant reduction in the degrees of freedom in the modeling system.
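
    A drastically simplified 1-D sketch of the multiscale reduction idea: local spectral basis functions are built on coarse neighborhoods, multiplied by a partition of unity, and used for a Galerkin projection onto a much smaller system. This is a diffusion-type toy problem under assumptions of my own choosing, not the authors' elastic-wave GMsFEM.

        import numpy as np
        from scipy.linalg import eigh

        nf, nc = 400, 10                      # fine elements, coarse elements (nf divisible by nc)
        h = 1.0 / nf
        x = np.linspace(0.0, 1.0, nf + 1)
        a = 1.0 + 100.0 * (np.sin(40 * np.pi * x[:-1]) > 0)   # heterogeneous coefficient per element

        # Fine-scale P1 stiffness matrix and load vector for -(a u')' = 1, u(0)=u(1)=0
        A = np.zeros((nf + 1, nf + 1))
        b = np.zeros(nf + 1)
        for e in range(nf):
            A[e:e+2, e:e+2] += a[e] / h * np.array([[1, -1], [-1, 1]])
            b[e:e+2] += h / 2.0

        m = nf // nc
        coarse_nodes = np.arange(0, nf + 1, m)

        def hat(i):
            """Piecewise-linear partition-of-unity function of coarse node i on the fine grid."""
            return np.maximum(0.0, 1.0 - np.abs(x - x[coarse_nodes[i]]) * nc)

        # Multiscale basis: local generalized eigenvectors on each coarse neighborhood,
        # multiplied by the hat function of the central coarse node.
        n_eig, basis = 3, []
        for i in range(1, nc):
            idx = np.arange(coarse_nodes[i - 1], coarse_nodes[i + 1] + 1)
            Aloc = A[np.ix_(idx, idx)]
            Mloc = np.diag(np.full(len(idx), h))          # lumped mass on the neighborhood
            _, v = eigh(Aloc, Mloc)
            chi = hat(i)[idx]
            for k in range(n_eig):
                phi = np.zeros(nf + 1)
                phi[idx] = chi * v[:, k]
                basis.append(phi)

        R = np.array(basis).T                 # fine DOFs x reduced DOFs

        free = np.arange(1, nf)               # homogeneous Dirichlet boundary conditions
        Aff, bf, Rf = A[np.ix_(free, free)], b[free], R[free, :]
        u_fine = np.zeros(nf + 1)
        u_fine[free] = np.linalg.solve(Aff, bf)
        u_ms = np.zeros(nf + 1)
        u_ms[free] = Rf @ np.linalg.solve(Rf.T @ Aff @ Rf, Rf.T @ bf)

        print("fine DOFs:", len(free), " reduced DOFs:", Rf.shape[1])
        print("relative error:", np.linalg.norm(u_ms - u_fine) / np.linalg.norm(u_fine))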

  3. Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Kai, E-mail: kaigao87@gmail.com; Fu, Shubin, E-mail: shubinfu89@gmail.com; Gibson, Richard L., E-mail: gibson@tamu.edu

    It is important to develop fast yet accurate numerical methods for seismic wave propagation to characterize complex geological structures and oil and gas reservoirs. However, the computational cost of conventional numerical modeling methods, such as finite-difference method and finite-element method, becomes prohibitively expensive when applied to very large models. We propose a Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media, where we construct basis functions from multiple local problems for both the boundaries and interior of a coarse node support or coarse element. The application of multiscale basis functions can capture the fine scale medium property variations, and allows us to greatly reduce the degrees of freedom that are required to implement the modeling compared with conventional finite-element method for wave equation, while restricting the error to low values. We formulate the continuous Galerkin and discontinuous Galerkin formulation of the multiscale method, both of which have pros and cons. Applications of the multiscale method to three heterogeneous models show that our multiscale method can effectively model the elastic wave propagation in anisotropic media with a significant reduction in the degrees of freedom in the modeling system.

  4. Generalized multiscale finite-element method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media

    DOE PAGES

    Gao, Kai; Fu, Shubin; Gibson, Richard L.; ...

    2015-04-14

    It is important to develop fast yet accurate numerical methods for seismic wave propagation to characterize complex geological structures and oil and gas reservoirs. However, the computational cost of conventional numerical modeling methods, such as finite-difference method and finite-element method, becomes prohibitively expensive when applied to very large models. We propose a Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media, where we construct basis functions from multiple local problems for both the boundaries and interior of a coarse node support or coarse element. The application of multiscale basis functions can capture the fine scale medium property variations, and allows us to greatly reduce the degrees of freedom that are required to implement the modeling compared with conventional finite-element method for wave equation, while restricting the error to low values. We formulate the continuous Galerkin and discontinuous Galerkin formulation of the multiscale method, both of which have pros and cons. Applications of the multiscale method to three heterogeneous models show that our multiscale method can effectively model the elastic wave propagation in anisotropic media with a significant reduction in the degrees of freedom in the modeling system.

  5. Structural Deterministic Safety Factors Selection Criteria and Verification

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1992-01-01

    Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratio is rooted in the resistive and applied stress probability distributions. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index was derived from the combined safety factors, and the corresponding reliability showed that the deterministic method is not reliability sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
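
    A minimal sketch of the probabilistic concept behind the deterministic factors: for normally distributed resistive stress R and applied stress S, a safety (reliability) index and the corresponding reliability follow directly from the two distributions. The means and standard deviations below are hypothetical, not values from the study.

        from math import sqrt
        from statistics import NormalDist

        mu_R, sigma_R = 1200.0, 60.0     # resistive (material) stress distribution, MPa
        mu_S, sigma_S = 800.0, 80.0      # applied stress distribution, MPa

        beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # safety (reliability) index
        reliability = NormalDist().cdf(beta)                   # P(R > S)
        print(f"safety index = {beta:.2f}, reliability = {reliability:.5f}")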

  6. Beam hardening correction for interior tomography based on exponential formed model and radon inversion transform

    NASA Astrophysics Data System (ADS)

    Chen, Siyu; Zhang, Hanming; Li, Lei; Xi, Xiaoqi; Han, Yu; Yan, Bin

    2016-10-01

    X-ray computed tomography (CT) has been extensively applied in industrial non-destructive testing (NDT). However, in practical applications, the polychromaticity of the X-ray beam often causes beam hardening problems in image reconstruction. Beam hardening artifacts, which manifest as cupping, streaks and flares, not only degrade image quality but also disturb subsequent analyses. Unfortunately, conventional CT scanning requires that the scanned object be completely covered by the field of view (FOV); state-of-the-art beam hardening correction methods consider only this ideal scanning configuration and often fail for interior tomography because of projection truncation. To address this problem, this paper proposes a beam hardening correction method based on an exponential formed model and the Radon inversion transform for interior tomography. Experimental results show that, compared with conventional correction algorithms, the proposed approach achieves excellent performance in both beam hardening artifact reduction and truncation artifact suppression. The presented method is therefore of both theoretical and practical significance for artifact correction in industrial CT.
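
    A minimal sketch of the classical polynomial linearization ("water correction") that the proposed method improves upon: measured polychromatic projections of known thicknesses of a reference material are mapped back to the ideal monochromatic line integrals by a fitted polynomial. The toy response below is synthetic; this is not the exponential-model / Radon-inversion method of the paper.

        import numpy as np

        # Simulated calibration: polychromatic projections flatten with thickness (beam hardening)
        mu_ref = 0.2                                  # effective attenuation of reference material, 1/cm
        t = np.linspace(0.0, 30.0, 31)                # calibration thicknesses, cm
        p_poly = mu_ref * t - 0.002 * t**2            # toy polychromatic response
        p_mono = mu_ref * t                           # ideal monochromatic line integrals

        coeffs = np.polyfit(p_poly, p_mono, deg=3)    # correction polynomial

        def correct(projection):
            """Apply the fitted linearization to measured projection data."""
            return np.polyval(coeffs, projection)

        measured = np.array([0.0, 1.5, 3.2, 4.1])
        print(correct(measured))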

  7. Applications of rapid prototyping technology in maxillofacial prosthetics.

    PubMed

    Sykes, Leanne M; Parrott, Andrew M; Owen, C Peter; Snaddon, Donald R

    2004-01-01

    The purpose of this study was to compare the accuracy, required time, and potential advantages of rapid prototyping technology with traditional methods in the manufacture of wax patterns for two facial prostheses. Two clinical situations were investigated: the production of an auricular prosthesis and the duplication of an existing maxillary prosthesis, using a conventional and a rapid prototyping method for each. Conventional wax patterns were created from impressions taken of a patient's remaining ear and an oral prosthesis. For the rapid prototyping method, a cast of the ear and the original maxillary prosthesis were scanned, and rapid prototyping was used to construct the wax patterns. For the auricular prosthesis, both patterns were refined clinically and then flasked and processed in silicone using routine procedures. Twenty-six independent observers evaluated these patterns by comparing them to the cast of the patient's remaining ear. For the duplication procedure, both wax patterns were scanned and compared to scans of the original prosthesis by generating color error maps to highlight volumetric changes. There was a significant difference in opinions for the two auricular prostheses with regard to shape and esthetic appeal, where the hand-carved prosthesis was found to be of poorer quality. The color error maps showed higher errors with the conventional duplication process compared with the rapid prototyping method. The main advantage of rapid prototyping is the ability to produce physical models using digital methods instead of traditional impression techniques. The disadvantage of equipment costs could be overcome by establishing a centralized service.

  8. SU-E-T-273: Radiation Shielding for a Fixed Horizontal-Beam Linac in a Shipping Container and a Conventional Treatment Vault

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsieh, M; Balter, P; Beadle, B

    Purpose: A fixed horizontal-beam linac, where the patient is treated in a seated position, could lower the overall costs of the treatment unit and room shielding substantially. This design also allows the treatment room and control area to be contained within a reduced space, such as a shipping container. The main application is the introduction of low-cost, high-quality radiation therapy to low- and middle-income regions. Here we consider shielding for upright treatments with a fixed-6MV-beam linac in a shipping container and a conventional treatment vault. Methods: Shielding calculations were done for two treatment room layouts using calculation methods in NCRP Report 151: (1) a shipping container (6m × 2.4m with the remaining space occupied by the console area), and (2) the treatment vault in NCRP 151 (7.8m by 5.4m by 3.4m). The shipping container has a fixed gantry that points in one direction at all times. For the treatment vault, various beam directions were evaluated. Results: The shipping container requires a primary barrier of 168cm concrete (4.5 TVL), surrounded by a secondary barrier of 3.6 TVL. The other walls require between 2.8–3.3 TVL. Multiple shielding calculations were done along the side wall. The results show that patient scatter increases in the forward direction and decreases dramatically in the backward direction. Leakage scatter also varies along the wall, depending largely on the distance between the gantry and the wall. For the treatment room, fixed-beam requires a slightly thicker primary barrier than the conventional linac (0.6 TVL), although this barrier is only needed in the center of one wall. The secondary barrier is different only by 0–0.2 TVL. Conclusion: This work shows that (1) the shipping container option is achievable, using indigenous materials for shielding and (2) upright treatments can be performed in a conventional treatment room with minimal additional shielding. Varian Medical Systems.
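
    A minimal sketch of the NCRP Report 151 primary-barrier calculation assumed in the abstract: required transmission B = P d^2 / (W U T), number of tenth-value layers n = log10(1/B), and thickness = TVL1 + (n - 1) TVLe. The workload, distance and design-goal values below are placeholders, not those of the study.

        import math

        P = 0.02       # shielding design goal, mSv/week (controlled area, assumed)
        d = 4.0        # distance from target to point of interest, m
        W = 500.0      # workload, Gy/week at 1 m
        U = 1.0        # use factor (a fixed horizontal beam always points one way)
        T = 1.0        # occupancy factor

        B = P * d**2 / (W * 1000.0 * U * T)   # factor 1000 converts Gy to mSv for photons
        n_tvl = math.log10(1.0 / B)

        TVL1, TVLe = 0.37, 0.33               # concrete TVLs for a 6 MV primary beam, m
        thickness = TVL1 + (n_tvl - 1.0) * TVLe
        print(f"B = {B:.2e}, TVLs = {n_tvl:.1f}, concrete thickness = {thickness:.2f} m")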

  9. Guanidinium ionic liquid-based surfactants as low cytotoxic extractants: Analytical performance in an in-situ dispersive liquid-liquid microextraction method for determining personal care products.

    PubMed

    Pacheco-Fernández, Idaira; Pino, Verónica; Ayala, Juan H; Afonso, Ana M

    2018-07-20

    The IL-based surfactant octylguanidinium chloride (C8Gu-Cl) was designed and synthesized with the purpose of obtaining a less harmful surfactant, containing guanidinium as the core cation and a relatively short alkyl chain. Its interfacial and aggregation behavior was evaluated through conductivity and fluorescence measurements, presenting critical micelle concentration values of 42.5 and 44.6 mmol L-1, respectively. Cytotoxicity studies were carried out with C8Gu-Cl and other IL-based and conventional surfactants, specifically the analogue 1-octyl-3-methylimidazolium chloride (C8MIm-Cl), other imidazolium- (C16MIm-Br) and pyridinium- (C16Py-Cl) based surfactants, the conventional cationic CTAB, and the conventional anionic SDS. From these studies, C8Gu-Cl was the only one to achieve the classification of low cytotoxicity. An in situ dispersive liquid-liquid microextraction (DLLME) method, based on transforming the water-soluble C8Gu-Cl IL-based surfactant into a water-insoluble IL microdroplet via a simple metathesis reaction, was then selected as the extraction/preconcentration method for a group of 6 personal care products (PCPs) present in cosmetic samples. The method was carried out in combination with high-performance liquid chromatography (HPLC) and diode array detection (DAD). The method was optimized, requiring the use of only 30 μL of C8Gu-Cl for 10 mL of aqueous sample with a NaCl content of 8% (w/v) to adjust the ionic strength, at a pH value of 5. The metathesis reaction required the addition of the anion exchange reagent (bis[(trifluoromethyl)sulfonyl]imide, 1:1 molar ratio), followed by vortexing and centrifugation, and dilution of the final microdroplet up to 60 μL with acetonitrile before injection into the HPLC-DAD system. The optimum in situ DLLME-HPLC-DAD method takes ∼10 min for the extraction step and ∼22 min for the chromatographic separation, with analytical features of low detection limits (down to 0.4 μg L-1), high reproducibility (RSD values lower than 10% (intra-day) and 16% (inter-day) for a spiked level of 15 μg L-1), and an average enrichment factor of 89. The requirement of low volumes (30 μL) of a low cytotoxic IL-based surfactant allows the method to be considered less harmful than other common analytical microextraction approaches. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    ERIC Educational Resources Information Center

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

    Clustering on Social Learning Networks has still not been explored widely, especially when the network focuses on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…

  11. Managed forest carbon estimates for the US greenhouse gas inventory, 1990-2008

    Treesearch

    Linda S. Heath; James E. Smith; Kenneth E. Skog; David J. Nowak; Christopher W. Woodall

    2011-01-01

    Land-use change and forestry is the major category featuring carbon sequestration in the annual US Greenhouse Gas Inventory, required by the United Nations Framework Convention on Climate Change. We describe the National Greenhouse Gas Inventory and present the sources of our data and methods and the most recent results. Forests and forest products in the United States...

  12. Use of Slow-Scan Television Systems in Telemedicine, Distance Education, Government, and Industrial Applications.

    ERIC Educational Resources Information Center

    Southworth, Glen

    Reducing the costs of teaching by television through slow-scan methods is discussed. Conventional television is costly to use, largely because the wide-band communications circuits required are in limited supply. One technical answer is bandwidth compression to fit an image into less spectrum space. A simpler and far less costly answer is to…

  13. Mistaken identity of an open reading frame proposed for PCR-based identification of Mycoplasma bovis and the effect of polymorphisms and insertions on assay performance

    USDA-ARS?s Scientific Manuscript database

    Mycoplasma bovis is an important cause of disease in cattle and bison. Because the bacterium requires specialized growth conditions many diagnostic laboratories routinely use PCR to replace or complement conventional isolation and identification methods. A frequently used target of such assays is th...

  14. Controlled sound field with a dual layer loudspeaker array

    NASA Astrophysics Data System (ADS)

    Shin, Mincheol; Fazi, Filippo M.; Nelson, Philip A.; Hirono, Fabio C.

    2014-08-01

    Controlled sound interference has been extensively investigated using a prototype dual layer loudspeaker array comprising 16 loudspeakers. Results are presented for measures of array performance such as input signal power, directivity of sound radiation and accuracy of sound reproduction resulting from the application of conventional control methods such as minimization of error in mean squared pressure, maximization of energy difference and minimization of weighted pressure error and energy. Procedures for selecting the tuning parameters are also introduced. With these conventional concepts, aimed at the production of acoustically bright and dark zones, all the control methods used require a trade-off between radiation directivity and reproduction accuracy in the bright zone. An alternative solution is proposed which can achieve better performance on the measures presented simultaneously, by inserting a low-priority zone named the "gray" zone. This involves the weighted minimization of mean-squared errors in the bright and dark zones together with the gray zone, in which the minimization error is given less importance. This results in the production of a directional bright zone in which the accuracy of sound reproduction is maintained with less required input power. The results of simulations and experiments are shown to be in excellent agreement.
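
    A minimal sketch of the weighted, regularized pressure-matching formulation that underlies bright/dark/gray zone control: loudspeaker weights q = (H^H W H + λI)^{-1} H^H W p_target, where rows of H are transfer functions to control points and W assigns zone priorities. The transfer matrix here is random for illustration; a real array would use measured or modelled transfer functions.

        import numpy as np

        rng = np.random.default_rng(1)
        n_src, n_bright, n_dark, n_gray = 16, 12, 12, 6
        n_pts = n_bright + n_dark + n_gray

        H = rng.standard_normal((n_pts, n_src)) + 1j * rng.standard_normal((n_pts, n_src))

        # Target: unit pressure in the bright zone, silence in the dark zone,
        # and a low-priority ("gray") zone whose error is weighted weakly.
        p_target = np.concatenate([np.ones(n_bright), np.zeros(n_dark), np.zeros(n_gray)])
        W = np.diag(np.concatenate([np.full(n_bright, 1.0),
                                    np.full(n_dark, 1.0),
                                    np.full(n_gray, 0.05)]))
        lam = 1e-2                                  # regularization (limits input power)

        q = np.linalg.solve(H.conj().T @ W @ H + lam * np.eye(n_src),
                            H.conj().T @ W @ p_target)

        p = H @ q
        bright = np.mean(np.abs(p[:n_bright])**2)
        dark = np.mean(np.abs(p[n_bright:n_bright + n_dark])**2)
        print(f"acoustic contrast = {10*np.log10(bright/dark):.1f} dB, "
              f"input power = {np.real(q.conj().T @ q):.3f}")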

  15. Alternative method to trace sediment sources in a subtropical rural catchment of southern Brazil by using near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tiecher, Tales; Caner, Laurent; Gomes Minella, Jean Paolo; Henrique Ciotti, Lucas; Antônio Bender, Marcos; dos Santos Rheinheimer, Danilo

    2014-05-01

    Conventional fingerprinting methods based on geochemical composition still require time-consuming and critical preliminary sample preparation. Thus, fingerprinting characteristics that can be measured rapidly and cheaply with minimal sample preparation, such as spectroscopic methods, should be used. The present study aimed to evaluate the sediment source contributions in a rural catchment by using a conventional method based on geochemical composition and an alternative method based on near-infrared spectroscopy. The study was carried out in a rural catchment with an area of 1.19 km2 located in southern Brazil. The sediment sources evaluated were crop fields (n=20), unpaved roads (n=10) and stream channels (n=10). Thirty suspended sediment samples were collected from eight significant storm runoff events between 2009 and 2011. Source and sediment samples were dried at 50 °C and sieved at 63 µm. The total concentrations of Ag, As, B, Ba, Be, Ca, Cd, Co, Cr, Cu, Fe, K, La, Li, Mg, Mn, Mo, Na, Ni, P, Pb, Sb, Se, Sr, Ti, Tl, V and Zn were determined by ICP-OES after microwave-assisted digestion with concentrated HNO3 and HCl. Total organic carbon (TOC) was estimated by wet oxidation with K2Cr2O7 and H2SO4. The near-infrared spectra were scanned from 4000 to 10000 cm-1 at a resolution of 2 cm-1, with 100 co-added scans per spectrum. The steps used in the conventional method were: i) tracer selection based on the Kruskal-Wallis test, ii) selection of the best set of tracers using discriminant analysis, and finally iii) the use of a mixed linear model to calculate the sediment source contributions. The steps used in the alternative method were: i) principal component analysis to reduce the number of variables, ii) discriminant analysis to determine the tracer potential of near-infrared spectroscopy, and finally iii) the use of partial least squares regression, based on 48 mixtures of the sediment sources in various weight proportions, to calculate the sediment source contributions. Both the conventional and the alternative method were able to discriminate 100% of the sediment sources. The conventional fingerprinting method provided sediment source contributions of 33±19% from crop fields, 25±13% from unpaved roads and 42±19% from stream channels. The contributions obtained by the alternative fingerprinting method using near-infrared spectroscopy were 71±22% from crop fields, 21±12% from unpaved roads and 14±19% from stream channels. No correlation was observed between the source contributions assessed by the two methods. Nevertheless, the average contribution of the unpaved roads was similar for both methods. The large difference in the average contributions of crop fields and stream channels estimated by the two methods was due to the similar organic matter content of these two sediment sources, which hampers their discrimination from the near-infrared spectra, where many of the bands are highly correlated with TOC levels. Efforts should be made to combine the geochemical composition and near-infrared spectroscopy information into a single estimate of the sediment source contributions.
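
    A minimal sketch of the alternative (NIR-based) workflow: a partial least squares regression is trained on spectra of laboratory mixtures of the three sources with known proportions and then used to predict the source composition of sediment samples. The spectra below are synthetic stand-ins, not measured data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(2)
        n_wavenumbers = 300
        end_members = rng.random((3, n_wavenumbers))       # crop field, road, channel "spectra"

        # 48 calibration mixtures with known proportions (rows sum to 1)
        props = rng.dirichlet(np.ones(3), size=48)
        X_cal = props @ end_members + 0.01 * rng.standard_normal((48, n_wavenumbers))

        pls = PLSRegression(n_components=5)
        pls.fit(X_cal, props)

        # Predict the composition of an "unknown" sediment sample
        true_mix = np.array([[0.5, 0.3, 0.2]])
        x_sed = true_mix @ end_members + 0.01 * rng.standard_normal((1, n_wavenumbers))
        print(pls.predict(x_sed))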

  16. Quantitative T1 and T2* carotid atherosclerotic plaque imaging using a three-dimensional multi-echo phase-sensitive inversion recovery sequence: a feasibility study.

    PubMed

    Fujiwara, Yasuhiro; Maruyama, Hirotoshi; Toyomaru, Kanako; Nishizaka, Yuri; Fukamatsu, Masahiro

    2018-06-01

    Magnetic resonance imaging (MRI) is widely used to detect carotid atherosclerotic plaques. Although it is important to evaluate vulnerable carotid plaques containing lipids and intra-plaque hemorrhages (IPHs) using T1-weighted images, the image contrast changes depending on the imaging settings. Moreover, to distinguish between a thrombus and a hemorrhage, it is useful to evaluate the iron content of the plaque using both T1-weighted and T2*-weighted images. Therefore, a quantitative evaluation of carotid atherosclerotic plaques using T1 and T2* values may be necessary for the accurate evaluation of plaque components. The purpose of this study was to determine whether the multi-echo phase-sensitive inversion recovery (mPSIR) sequence can improve T1 contrast while simultaneously providing accurate T1 and T2* values of an IPH. T1 and T2* values measured using mPSIR were compared to values from conventional methods in phantom and in vivo studies. In the phantom study, the T1 and T2* values estimated using mPSIR were linearly correlated with those of conventional methods. In the in vivo study, mPSIR demonstrated higher T1 contrast between the IPH phantom and sternocleidomastoid muscle than the conventional method. Moreover, the T1 and T2* values of the blood vessel wall and sternocleidomastoid muscle estimated using mPSIR were correlated with values measured by conventional methods and with values reported previously. The mPSIR sequence improved T1 contrast while simultaneously providing accurate T1 and T2* values of the neck region. Although further study is required to evaluate the clinical utility, mPSIR may improve carotid atherosclerotic plaque detection and provide detailed information about plaque components.
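
    A minimal sketch of the signal-model fit behind inversion-recovery-based T1 estimation: fitting the magnitude model |S0 (1 - 2 exp(-TI/T1))| over several inversion times. This only illustrates the quantitative idea; it is not the mPSIR reconstruction described in the abstract, and the data are simulated.

        import numpy as np
        from scipy.optimize import curve_fit

        def ir_signal(ti, s0, t1):
            """Magnitude inversion-recovery signal model."""
            return np.abs(s0 * (1.0 - 2.0 * np.exp(-ti / t1)))

        ti = np.array([50, 150, 300, 600, 1200, 2400], dtype=float)   # inversion times, ms
        rng = np.random.default_rng(3)
        data = ir_signal(ti, 1000.0, 800.0) + 5.0 * rng.standard_normal(ti.size)

        popt, _ = curve_fit(ir_signal, ti, data, p0=(np.max(data), 500.0))
        print(f"estimated S0 = {popt[0]:.0f}, T1 = {popt[1]:.0f} ms")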

  17. Conventional Morphology Versus PCR Sequencing, rep-PCR, and MALDI-TOF-MS for Identification of Clinical Aspergillus Isolates Collected Over a 2-Year Period in a University Hospital at Kayseri, Turkey.

    PubMed

    Atalay, Altay; Koc, Ayse Nedret; Suel, Ahmet; Sav, Hafize; Demir, Gonca; Elmali, Ferhan; Cakir, Nuri; Seyedmousavi, Seyedmojtaba

    2016-09-01

    Aspergillus species cause a wide range of diseases in humans, including allergies, localized infections, and fatal disseminated disease. Rapid detection and identification of Aspergillus spp. facilitate effective patient management. In the current study we compared conventional morphological methods with PCR sequencing, rep-PCR, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for the identification of Aspergillus strains. A total of 24 consecutive clinical isolates of Aspergillus were collected during 2012-2014. Conventional morphology and rep-PCR were performed in our Mycology Laboratory. The identification, evaluation, and reporting of strains using MALDI-TOF-MS were performed by BioMérieux Diagnostic, Inc. in Istanbul. DNA sequence analysis of the clinical isolates was performed by the BMLabosis laboratory in Ankara. Samples consisted of 18 (75%) lower respiratory tract specimens, 3 (12.5%) otomycosis ear tissues, 1 sample from keratitis, and 1 sample from a cutaneous wound. According to DNA sequence analysis, 12 (50%) specimens were identified as A. fumigatus, 8 (33.3%) as A. flavus, 3 (12.5%) as A. niger, and 1 (4.2%) as A. terreus. Statistically, there was good agreement of conventional morphology, rep-PCR, and MALDI-TOF-MS with DNA sequencing; the kappa values were κ = 0.869, 0.871, and 0.916, respectively (P < 0.001). The good level of agreement between the methods included in the present study and the sequencing method could be due to the fact that the Aspergillus strains identified were commonly encountered species. Therefore, it was concluded that studies with a larger number of isolates, including other Aspergillus strains, are required. © 2016 Wiley Periodicals, Inc.
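
    A minimal sketch of the agreement statistic quoted above: Cohen's kappa between species calls from two identification methods. The label lists are illustrative, not the study's 24 isolates.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical species calls for six isolates from two methods
        sequencing = ["fumigatus", "fumigatus", "flavus", "niger", "flavus", "terreus"]
        maldi_tof  = ["fumigatus", "fumigatus", "flavus", "niger", "fumigatus", "terreus"]

        print(f"kappa = {cohen_kappa_score(sequencing, maldi_tof):.3f}")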

  18. From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

    PubMed

    Scandurra, I; Hägglund, M; Koch, S

    2008-08-01

    This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.

  19. Timing Recovery Strategies in Magnetic Recording Systems

    NASA Astrophysics Data System (ADS)

    Kovintavewat, Piya

    At some point in a digital communications receiver, the received analog signal must be sampled. Good performance requires that these samples be taken at the right times. The process of synchronizing the sampler with the received analog waveform is known as timing recovery. Conventional timing recovery techniques perform well only when operating at high signal-to-noise ratio (SNR). Nonetheless, iterative error-control codes allow reliable communication at very low SNR, where conventional techniques fail. This paper provides a detailed review on the timing recovery strategies based on per-survivor processing (PSP) that are capable of working at low SNR. We also investigate their performance in magnetic recording systems because magnetic recording is a primary method of storage for a variety of applications, including desktop, mobile, and server systems. Results indicate that the timing recovery strategies based on PSP perform better than the conventional ones and are thus worth being employed in magnetic recording systems.

  20. Control-structure interaction study for the Space Station solar dynamic power module

    NASA Technical Reports Server (NTRS)

    Cheng, J.; Ianculescu, G.; Ly, J.; Kim, M.

    1991-01-01

    The authors investigate the feasibility of using a conventional PID (proportional plus integral plus derivative) controller design to perform the pointing and tracking functions for the Space Station Freedom solar dynamic power module. Using this simple controller design, the control/structure interaction effects were also studied without assuming frequency bandwidth separation. From the results, the feasibility of a simple solar dynamic control solution with a reduced-order model, which satisfies the basic system pointing and stability requirements, is suggested. However, the conventional control design approach is shown to be very much influenced by the order of reduction of the plant model, i.e., the number of the retained elastic modes from the full-order model. This suggests that, for complex large space structures, such as the Space Station Freedom solar dynamic, the conventional control system design methods may not be adequate.
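
    A minimal sketch of a discrete PID control law of the kind evaluated for pointing and tracking. The gains, sample time and first-order toy plant below are placeholders of my own choosing; the flexible structural modes whose interaction is studied in the paper are not modelled here.

        # Discrete PID controller driving a toy rigid-body pointing plant
        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, error):
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        pid = PID(kp=2.0, ki=0.5, kd=0.3, dt=0.01)
        angle, rate, target = 0.0, 0.0, 1.0           # pointing angle (rad), rate (rad/s), command
        for _ in range(2000):
            u = pid.update(target - angle)
            rate += (u - 0.1 * rate) * 0.01           # toy dynamics with weak damping
            angle += rate * 0.01

        print(f"pointing angle after 20 s: {angle:.3f} rad")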

  1. Differentiating organically and conventionally grown oregano using ultraperformance liquid chromatography mass spectrometry (UPLC-MS), headspace gas chromatography with flame ionization detection (headspace-GC-FID), and flow injection mass spectrum (FIMS) fingerprints combined with multivariate data analysis.

    PubMed

    Gao, Boyan; Qin, Fang; Ding, Tingting; Chen, Yineng; Lu, Weiying; Yu, Liangli Lucy

    2014-08-13

    Ultraperformance liquid chromatography mass spectrometry (UPLC-MS), flow injection mass spectrometry (FIMS), and headspace gas chromatography (headspace-GC) combined with multivariate data analysis techniques were examined and compared for differentiating organically grown oregano from that grown conventionally. This is the first time that headspace-GC fingerprinting technology has been reported for differentiating organically and conventionally grown spice samples. The results indicated that UPLC-MS, FIMS, and headspace-GC-FID fingerprints with OPLS-DA were able to effectively distinguish oregano grown under different conditions, whereas with PCA only the FIMS fingerprints could differentiate the organically and conventionally grown oregano samples. UPLC fingerprinting provided detailed information about the chemical composition of oregano with a longer analysis time, whereas FIMS finished a sample analysis within 1 min. On the other hand, headspace-GC-FID fingerprinting required no sample pretreatment, suggesting its potential as a high-throughput method for distinguishing organically and conventionally grown oregano samples. In addition, chemical components in oregano were identified by their molecular weights using QTOF-MS and headspace-GC-MS.
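
    A minimal sketch of the unsupervised step of the chemometric workflow: PCA of a fingerprint intensity matrix followed by inspection of the scores for separation between organically and conventionally grown samples. The intensity matrix here is synthetic, not measured FIMS or UPLC-MS data, and OPLS-DA (a supervised method) is not reproduced.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        n_features = 200
        organic = rng.normal(0.0, 1.0, (15, n_features)) + 0.8   # small systematic offset
        conventional = rng.normal(0.0, 1.0, (15, n_features))
        X = np.vstack([organic, conventional])
        labels = ["organic"] * 15 + ["conventional"] * 15

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for label in ("organic", "conventional"):
            idx = [i for i, l in enumerate(labels) if l == label]
            print(label, "mean PC1 score:", round(float(np.mean(scores[idx, 0])), 2))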

  2. Colon Cancer Survival With Herbal Medicine and Vitamins Combined With Standard Therapy in a Whole-Systems Approach: Ten-Year Follow-up Data Analyzed With Marginal Structural Models and Propensity Score Methods

    PubMed Central

    McCulloch, Michael; Broffman, Michael; van der Laan, Mark; Hubbard, Alan; Kushi, Lawrence; Abrams, Donald I.; Gao, Jin; Colford, John M.

    2014-01-01

    Although localized colon cancer is often successfully treated with surgery, advanced disease requires aggressive systemic therapy that has lower effectiveness. Approximately 30% to 75% of patients with colon cancer use complementary and alternative medicine (CAM), but there is limited formal evidence of survival efficacy. In a consecutive case series with 10-year follow-up of all colon cancer patients (n = 193) presenting at a San Francisco Bay-Area center for Chinese medicine (Pine Street Clinic, San Anselmo, CA), the authors compared survival in patients choosing short-term treatment lasting the duration of chemotherapy/radiotherapy with those continuing long-term. To put these data into the context of treatment responses seen in conventional medical practice, they also compared survival with Pan-Asian medicine + vitamins (PAM+V) with that of concurrent external controls from Kaiser Permanente Northern California and California Cancer Registries. Kaplan-Meier, traditional Cox regression, and more modern methods were used for causal inference—namely, propensity score and marginal structural models (MSMs), which have not been used before in studies of cancer survival and Chinese herbal medicine. PAM+V combined with conventional therapy, compared with conventional therapy alone, reduced the risk of death in stage I by 95%, stage II by 64%, stage III by 29%, and stage IV by 75%. There was no significant difference between short-term and long-term PAM+V. Combining PAM+V with conventional therapy improved survival, compared with conventional therapy alone, suggesting that prospective trials combining PAM+V with conventional therapy are justified. PMID:21964510
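
    A minimal sketch of the propensity-score / inverse-probability-of-treatment weighting (IPTW) step that underlies a marginal structural model: fit a propensity model for receiving the combined therapy, build stabilized weights, and estimate a weighted treatment effect. The data are synthetic, the covariates and coefficients are hypothetical, and the paper's full MSM survival analysis is not reproduced here.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        n = 500
        age = rng.normal(60, 10, n)
        stage = rng.integers(1, 5, n)
        X = np.column_stack([age, stage])

        # Treatment assignment depends on covariates (confounding)
        p_treat = 1 / (1 + np.exp(-(-3.0 + 0.03 * age + 0.3 * stage)))
        treated = rng.binomial(1, p_treat)

        # Outcome: death within follow-up, with a protective treatment effect
        p_death = 1 / (1 + np.exp(-(-4.0 + 0.04 * age + 0.6 * stage - 0.7 * treated)))
        died = rng.binomial(1, p_death)

        ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
        pt = treated.mean()
        weights = np.where(treated == 1, pt / ps, (1 - pt) / (1 - ps))   # stabilized IPTW

        # Weighted risk difference between treated and untreated pseudo-populations
        risk_treated = np.average(died[treated == 1], weights=weights[treated == 1])
        risk_control = np.average(died[treated == 0], weights=weights[treated == 0])
        print(f"IPTW risk difference (treated - control): {risk_treated - risk_control:.3f}")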

  3. 40 CFR 80.1503 - What are the product transfer document requirements for gasoline-ethanol blends, gasolines, and...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... requirements for gasoline-ethanol blends, gasolines, and conventional blendstocks for oxygenate blending... Gasoline-Ethanol Blends § 80.1503 What are the product transfer document requirements for gasoline-ethanol blends, gasolines, and conventional blendstocks for oxygenate blending subject to this subpart? (a...

  4. 40 CFR 80.1503 - What are the product transfer document requirements for gasoline-ethanol blends, gasolines, and...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirements for gasoline-ethanol blends, gasolines, and conventional blendstocks for oxygenate blending... Gasoline-Ethanol Blends § 80.1503 What are the product transfer document requirements for gasoline-ethanol blends, gasolines, and conventional blendstocks for oxygenate blending subject to this subpart? (a...

  5. 40 CFR 80.1503 - What are the product transfer document requirements for gasoline-ethanol blends, gasolines, and...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirements for gasoline-ethanol blends, gasolines, and conventional blendstocks for oxygenate blending... Gasoline-Ethanol Blends § 80.1503 What are the product transfer document requirements for gasoline-ethanol blends, gasolines, and conventional blendstocks for oxygenate blending subject to this subpart? (a...

  6. [The "gentle caesarean section" - an alternative to the classical way of sectio. A prospective comparison between the classical technique and the method of Misgav Ladach].

    PubMed

    Redlich, A; Köppe, I

    2001-11-01

    A new technical variant of caesarean section, characterised by blunt surgical preparation and a simplified suturing technique, was described a few years ago. A prospective investigation compared the differences in the surgical and postoperative course as well as the rate of complications between this Misgav Ladach method and the conventional technique of caesarean section. The individual postoperative well-being of the women was recorded by visual analogue scales. - Women who met the inclusion criteria (first caesarean section, >/= 32nd week of pregnancy, singleton pregnancy) were examined in this study over one year: 105 patients operated on with the Misgav Ladach method and 67 conventionally operated patients. The patients were randomized as a function of the first letter of the surname (A-K: Misgav Ladach method; L-Z: classical technique). - The incision-to-closure time was significantly shorter in the Misgav Ladach group (29.8 vs. 49.3 min; p < 0.001). There were no differences between the two methods in the rate of postoperative complications. The febrile morbidity was equivalent in both groups (7.6% vs. 9%), as was the frequency of postoperative haematomas (3.8% vs. 3%). The postoperative period requiring analgesics was significantly longer in the group of conventionally operated patients (1.9 d vs. 2.4 d; p < 0.01). Postoperative well-being was rated significantly better (p < 0.01) by the patients of the Misgav Ladach group, probably because of the significantly earlier mobilization (p < 0.05). - The Misgav Ladach surgical technique allows safe execution of the caesarean section and represents an alternative to the conventional method. The duration of operation (incision-to-closure time) was significantly shorter. The less traumatizing surgical technique resulted in significantly earlier mobilization and a significantly shorter requirement for analgesics. The women rated their postoperative physical condition as better.

  7. Distribution of different yeasts isolates among trauma patients and comparison of accuracy in identification of yeasts by automated method versus conventional methods for better use in low resource countries.

    PubMed

    Rajkumari, N; Mathur, P; Xess, I; Misra, M C

    2014-01-01

    As most trauma patients require a long hospital stay and long-term antibiotic therapy, the risk of fungal infections in such patients is steadily increasing. Early diagnosis and rapid treatment are life saving in such critically ill trauma patients. The aim was to determine the distribution of various Candida species among trauma patients and to compare the accuracy, speed of identification and cost-effectiveness of VITEK 2, CHROMagar and conventional methods. A retrospective laboratory-based surveillance study was performed over a period of 52 months (January 2009 to April 2013) at a level I trauma centre in New Delhi, India. All microbiological samples positive for Candida were processed for microbial identification using standard methods. Identification of Candida was done using chromogenic medium and the automated VITEK 2 Compact system and later confirmed using the conventional method. Time to identification was noted for both and accuracy was compared with the conventional method. Statistical analysis was performed using SPSS software for Windows (SPSS Inc., Chicago, IL, version 15.0); P values were calculated using the χ2 test for categorical variables, and P<0.05 was considered significant. Out of 445 yeast isolates, Candida tropicalis (217, 49%) was the most frequently isolated species. VITEK 2 correctly identified 354 (79.5%) isolates, could not identify 48 (10.7%) and wrongly identified or showed low discrimination for 43 (9.6%), whereas CHROMagar correctly identified 381 (85.6%) isolates with 64 (14.4%) misidentified. The highest rate of misidentification was seen for C. tropicalis and C. glabrata (13, 27.1% each) by VITEK 2 and for C. albicans (9, 14%) by CHROMagar. Although CHROMagar provides identification at a lower cost than VITEK 2 and is more accurate, which is useful in low-resource countries, its main drawback is the long duration required for complete identification.
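
    The kind of chi-square comparison mentioned above can be illustrated in a few lines of Python; the contingency table below is reconstructed from the counts quoted in the abstract, and the snippet is only a sketch, not the authors' original SPSS analysis.

      # Hedged sketch: chi-square comparison of identification outcomes, using
      # the counts reported in the abstract (not the original SPSS analysis).
      from scipy.stats import chi2_contingency

      # Rows: method; columns: correctly identified vs. not correctly identified
      # (VITEK 2: 354 correct, 48 unidentified + 43 misidentified/low discrimination;
      #  CHROMagar: 381 correct, 64 misidentified), out of 445 isolates each.
      table = [
          [354, 48 + 43],   # VITEK 2
          [381, 64],        # CHROMagar
      ]

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
      print("significant at alpha = 0.05" if p < 0.05 else "not significant at alpha = 0.05")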

  8. Metal‐Catalysed Azidation of Organic Molecules

    PubMed Central

    Goswami, Monalisa

    2016-01-01

    The azide moiety is a desirable functionality in organic molecules, useful in a variety of transformations such as olefin aziridination, C–H bond amination, isocyanate synthesis, the Staudinger reaction and the formation of azo compounds. To harness the versatility of the azide functionality fully it is important that these compounds be easy to prepare, in a clean and cost‐effective manner. Conventional (non‐catalysed) methods to synthesise azides generally require quite harsh reaction conditions that are often not tolerant of functional groups. In the last decade, several metal‐catalysed azidations have been developed in attempts to circumvent this problem. These methods are generally faster, cleaner and more functional‐group‐tolerant than conventional methods to prepare azides, and can sometimes even be conveniently combined with one‐pot follow‐up transformations of the installed azide moiety. This review highlights metal‐catalysed approaches to azide synthesis, with a focus on the substrate scopes and mechanisms, as well as on advantages and disadvantages of the methods. Overall, metal‐catalysed azidation reactions provide shorter routes to a variety of potentially useful organic molecules containing the azide moiety. PMID:28344503

  9. Radiological reporting that combine continuous speech recognition with error correction by transcriptionists.

    PubMed

    Ichikawa, Tamaki; Kitanosono, Takashi; Koizumi, Jun; Ogushi, Yoichi; Tanaka, Osamu; Endo, Jun; Hashimoto, Takeshi; Kawada, Shuichi; Saito, Midori; Kobayashi, Makiko; Imai, Yutaka

    2007-12-20

    We evaluated the usefulness of radiological reporting that combines continuous speech recognition (CSR) and error correction by transcriptionists. Four transcriptionists (two with more than 10 years' and two with less than 3 months' transcription experience) listened to the same 100 dictation files and created radiological reports using conventional transcription and a method that combined CSR with manual error correction by the transcriptionists. We compared the 2 groups using the 2 methods for accuracy and report creation time and evaluated the transcriptionists' inter-personal dependence on accuracy rate and report creation time. We used a CSR system that did not require the training of the system to recognize the user's voice. We observed no significant difference in accuracy between the 2 groups and 2 methods that we tested, though transcriptionists with greater experience transcribed faster than those with less experience using conventional transcription. Using the combined method, error correction speed was not significantly different between two groups of transcriptionists with different levels of experience. Combining CSR and manual error correction by transcriptionists enabled convenient and accurate radiological reporting.

  10. Modified ADALINE algorithm for harmonic estimation and selective harmonic elimination in inverters

    NASA Astrophysics Data System (ADS)

    Vasumathi, B.; Moorthi, S.

    2011-11-01

    In digital signal processing, algorithms are very well developed for the estimation of harmonic components. In power electronic applications, an objective like fast response of a system is of primary importance. An effective method for the estimation of instantaneous harmonic components, along with a conventional harmonic elimination technique, is presented in this article. The primary function is to eliminate undesirable higher harmonic components from the selected signal (current or voltage), and it requires only knowledge of the frequency of the component to be eliminated. A signal processing technique using a modified ADALINE algorithm has been proposed for harmonic estimation. The proposed method remains effective, converging to a minimum error and yielding a finer estimate. A conventional control based on pulse width modulation for selective harmonic elimination is used to eliminate harmonic components after their estimation. This method can be applied to a wide range of equipment. The validity of the proposed method for estimating and eliminating voltage harmonics is demonstrated with a dc/ac inverter as a simulation example. The results are then compared with the existing ADALINE algorithm to illustrate its effectiveness.
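
    For readers unfamiliar with ADALINE-based harmonic estimation, the sketch below shows the conventional ADALINE idea in Python: a least-mean-squares (LMS) update of the sine/cosine coefficients of selected harmonics. It is an illustrative toy under an assumed sampling rate, fundamental frequency and step size, not the authors' modified algorithm.

      # Minimal ADALINE-style harmonic estimator (LMS update of Fourier coefficients).
      # Illustrative only; the signal, frequencies and step size are assumed.
      import numpy as np

      fs = 10_000.0            # sampling frequency (Hz), assumed
      f0 = 50.0                # fundamental frequency (Hz), assumed
      harmonics = [1, 3, 5, 7] # harmonic orders to track
      mu = 0.05                # LMS step size, assumed

      t = np.arange(0, 0.2, 1 / fs)
      # Synthetic distorted signal: fundamental plus 3rd and 5th harmonics.
      y = (1.0 * np.sin(2 * np.pi * f0 * t)
           + 0.30 * np.sin(2 * np.pi * 3 * f0 * t + 0.5)
           + 0.15 * np.sin(2 * np.pi * 5 * f0 * t))

      w = np.zeros(2 * len(harmonics))   # [sin, cos] weight pair per harmonic
      for n, yn in enumerate(y):
          # Regressor: sin and cos of each tracked harmonic at this instant.
          x = np.concatenate([[np.sin(2 * np.pi * k * f0 * t[n]),
                               np.cos(2 * np.pi * k * f0 * t[n])] for k in harmonics])
          e = yn - w @ x                 # estimation error
          w += mu * e * x                # LMS weight update

      for i, k in enumerate(harmonics):
          amp = np.hypot(w[2 * i], w[2 * i + 1])
          print(f"harmonic {k}: estimated amplitude {amp:.3f}")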

  11. Rapid Detection of Escherichia coli O157:H7 in Fresh Lettuce Based on Localized Surface Plasmon Resonance Combined with Immunomagnetic Separation.

    PubMed

    Lee, Nari; Choi, Sung-Wook; Chang, Hyun-Joo; Chun, Hyang Sook

    2018-05-01

    This study presents a method for rapid detection of Escherichia coli O157:H7 in fresh lettuce based on the properties of target separation and localized surface plasmon resonance of immunomagnetic nanoparticles. The multifunctional immunomagnetic nanoparticles enabling simultaneous separation and detection were prepared by synthesizing magnetic nanoparticles (ca. 10 nm in diameter) composed of an iron oxide (Fe3O4) core and gold shell and then conjugating these nanoparticles with the anti-E. coli O157:H7 antibodies. The application of multifunctional immunomagnetic nanoparticles for detecting E. coli O157:H7 in a lettuce matrix allowed detection of the presence of <1 log CFU mL-1 without prior enrichment. In contrast, the detection limit of the conventional plating method was 2.74 log CFU mL-1. The method, which requires no preenrichment, provides an alternative to conventional microbiological detection methods and can be used as a rapid screening tool for a large number of food samples.

  12. Detecting peroxiredoxin hyperoxidation by one-dimensional isoelectric focusing.

    PubMed

    Cao, Zhenbo; Bulleid, Neil J

    The activity of typical 2-Cys peroxiredoxins (Prxs) can be regulated by hyperoxidation, with a consequent loss of redox activity. Here we developed a simple assay to monitor the level of hyperoxidation of different typical 2-Cys Prxs simultaneously. The assay requires only standard equipment and can compare different samples on the same gel. It requires much less time than conventional 2D gels and gives more information than Western blotting with an antibody specific for hyperoxidized peroxiredoxin. This method could also be used to monitor protein modifications that introduce a charge difference, such as phosphorylation.

  13. Characterization of quantum well structures using a photocathode electron microscope

    NASA Technical Reports Server (NTRS)

    Spencer, Michael G.; Scott, Craig J.

    1989-01-01

    Present day integrated circuits pose a challenge to conventional electronic and mechanical test methods. Feature sizes in the submicron and nanometric regime require radical approaches in order to facilitate electrical contact to circuits and devices being tested. In addition, microwave operating frequencies require careful attention to distributed effects when considering the electrical signal paths within and external to the device under test. An alternative testing approach which combines the best of electrical and optical time domain testing is presented, namely photocathode electron microscope quantitative voltage contrast (PEMQVC).

  14. The role of artificial intelligence techniques in scheduling systems

    NASA Technical Reports Server (NTRS)

    Geoffroy, Amy L.; Britt, Daniel L.; Gohring, John R.

    1990-01-01

    Artificial Intelligence (AI) techniques provide good solutions for many of the problems which are characteristic of scheduling applications. However, scheduling is a large, complex heterogeneous problem. Different applications will require different solutions. Any individual application will require the use of a variety of techniques, including both AI and conventional software methods. The operational context of the scheduling system will also play a large role in design considerations. The key is to identify those places where a specific AI technique is in fact the preferable solution, and to integrate that technique into the overall architecture.

  15. A time-dependent neutron transport method of characteristics formulation with time derivative propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Adam J., E-mail: adamhoff@umich.edu; Lee, John C., E-mail: jcl@umich.edu

    2016-02-15

    A new time-dependent Method of Characteristics (MOC) formulation for nuclear reactor kinetics was developed utilizing angular flux time-derivative propagation. This method avoids the requirement of storing the angular flux at previous points in time to represent a discretized time derivative; instead, an equation for the angular flux time derivative along 1D spatial characteristics is derived and solved concurrently with the 1D transport characteristic equation. This approach allows the angular flux time derivative to be recast principally in terms of the neutron source time derivatives, which are approximated to high-order accuracy using the backward differentiation formula (BDF). This approach, called Source Derivative Propagation (SDP), drastically reduces the memory requirements of time-dependent MOC relative to methods that require storing the angular flux. An SDP method was developed for 2D and 3D applications and implemented in the computer code DeCART in 2D. DeCART was used to model two reactor transient benchmarks: a modified TWIGL problem and a C5G7 transient. The SDP method accurately and efficiently replicated the solution of the conventional time-dependent MOC method using two orders of magnitude less memory.
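
    The source-derivative treatment described above rests on standard backward differentiation formulas (BDF); the short Python sketch below shows how a BDF of a given order approximates a time derivative from values stored at previous time steps. It is a generic illustration with an assumed step size, not the DeCART/SDP implementation.

      # Standard BDF coefficients for dy/dt at the newest point on a uniform grid;
      # y_hist[0] is the newest value. Illustrative only.
      import numpy as np

      BDF = {
          1: [1.0, -1.0],
          2: [3 / 2, -2.0, 1 / 2],
          3: [11 / 6, -3.0, 3 / 2, -1 / 3],
      }

      def bdf_derivative(y_hist, dt, order):
          c = BDF[order]
          return sum(ci * yi for ci, yi in zip(c, y_hist)) / dt

      # Quick check against y = exp(t), whose derivative is exp(t).
      dt, t_n = 1e-3, 1.0
      hist = [np.exp(t_n - k * dt) for k in range(4)]   # y_n, y_{n-1}, ...
      for p in (1, 2, 3):
          print(f"BDF{p}: {bdf_derivative(hist[:p + 1], dt, p):.6f} "
                f"(exact {np.exp(t_n):.6f})")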

  16. On advanced configuration enhance adaptive system optimization

    NASA Astrophysics Data System (ADS)

    Liu, Hua; Ding, Quanxin; Wang, Helong; Guo, Chunjie; Chen, Hongliang; Zhou, Liwei

    2017-10-01

    The aim is to find an effective way to structure and enhance adaptive systems with complex functions, and to establish a universally applicable solution for prototyping and optimization. As the most attractive component of an adaptive system, the wavefront corrector is constrained by conventional techniques and components, for example by polarization dependence and a narrow working waveband. An advanced configuration based on a polarizing beam splitter with an optimized energy-splitting method is used to overcome these problems effectively. With the global algorithm, the bandwidth is amplified by more than five times compared with that of traditional designs. Simulation results show that the system can meet the application requirements for MTF and other related criteria. Compared with the conventional design, the system is significantly reduced in volume and weight. The determining factors are therefore the prototype selection and the system configuration; the results demonstrate their effectiveness.

  17. Improvement of spatial resolution in a Timepix based CdTe photon counting detector using ToT method

    NASA Astrophysics Data System (ADS)

    Park, Kyeongjin; Lee, Daehee; Lim, Kyung Taek; Kim, Giyoon; Chang, Hojong; Yi, Yun; Cho, Gyuseong

    2018-05-01

    Photon counting detectors (PCDs) have been recognized as potential candidates in X-ray radiography and computed tomography due to their many advantages over conventional energy-integrating detectors. In particular, a PCD-based X-ray system shows an improved contrast-to-noise ratio, reduced radiation exposure dose, and more importantly, exhibits a capability for material decomposition with energy binning. For some applications, a very high resolution is required, which translates into smaller pixel size. Unfortunately, small pixels may suffer from energy spectral distortions (distortion in energy resolution) due to charge sharing effects (CSEs). In this work, we propose a method for correcting CSEs by measuring the point of interaction of an incident X-ray photon by the time-of-threshold (ToT) method. Moreover, we also show that it is possible to obtain an X-ray image with a reduced pixel size by using the concept of virtual pixels at a given pixel size. To verify the proposed method, modulation transfer function (MTF) and signal-to-noise ratio (SNR) measurements were carried out with the Timepix chip combined with the CdTe pixel sensor. The X-ray test condition was set at 80 kVp with 5 μA, and a tungsten edge phantom and a lead line phantom were used for the measurements. Enhanced spatial resolution was achieved by applying the proposed method when compared to that of the conventional photon counting method. From experiment results, MTF increased from 6.3 (conventional counting method) to 8.3 lp/mm (proposed method) at 0.3 MTF. On the other hand, the SNR decreased from 33.08 to 26.85 dB due to four virtual pixels.
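
    The quoted lp/mm figures come from an edge-phantom MTF measurement; the Python sketch below illustrates the standard edge method (edge spread function, differentiated to a line spread function, then Fourier transformed) using a synthetic edge and an assumed pixel pitch. It is not the authors' analysis code.

      # Edge-method MTF sketch with a synthetic edge; pixel pitch and edge shape
      # are assumed for illustration only.
      import numpy as np

      pixel_pitch_mm = 0.055                    # assumed pixel pitch
      x = np.arange(-32, 32)
      esf = 1.0 / (1.0 + np.exp(-x / 1.5))      # synthetic edge spread function

      lsf = np.gradient(esf)                    # line spread function
      lsf /= lsf.sum()

      mtf = np.abs(np.fft.rfft(lsf))
      mtf /= mtf[0]                             # normalize to 1 at zero frequency
      freq = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)   # spatial frequency in lp/mm

      # Spatial frequency where the MTF falls to 0.3 (linear interpolation).
      i1 = int(np.argmax(mtf < 0.3))
      i0 = i1 - 1
      f_03 = freq[i0] + (freq[i1] - freq[i0]) * (mtf[i0] - 0.3) / (mtf[i0] - mtf[i1])
      print(f"MTF = 0.3 at about {f_03:.1f} lp/mm")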

  18. Detection and enumeration of Salmonella enteritidis in homemade ice cream associated with an outbreak: comparison of conventional and real-time PCR methods.

    PubMed

    Seo, K H; Valentin-Bon, I E; Brackett, R E

    2006-03-01

    Salmonellosis caused by Salmonella Enteritidis (SE) is a significant cause of foodborne illnesses in the United States. Consumption of undercooked eggs and egg-containing products has been the primary risk factor for the disease. The importance of the bacterial enumeration technique has been enormously stressed because of the quantitative risk analysis of SE in shell eggs. Traditional enumeration methods mainly depend on slow and tedious most-probable-number (MPN) methods. Therefore, specific, sensitive, and rapid methods for SE quantitation are needed to collect sufficient data for risk assessment and food safety policy development. We previously developed a real-time quantitative PCR assay for the direct detection and enumeration of SE and, in this study, applied it to naturally contaminated ice cream samples with and without enrichment. The detection limit of the real-time PCR assay was determined with artificially inoculated ice cream. When applied to the direct detection and quantification of SE in ice cream, the real-time PCR assay was as sensitive as the conventional plate count method in frequency of detection. However, populations of SE derived from real-time quantitative PCR were approximately 1 log higher than provided by MPN and CFU values obtained by conventional culture methods. The detection and enumeration of SE in naturally contaminated ice cream can be completed in 3 h by this real-time PCR method, whereas the cultural enrichment method requires 5 to 7 days. A commercial immunoassay for the specific detection of SE was also included in the study. The real-time PCR assay proved to be a valuable tool that may be useful to the food industry in monitoring its processes to improve product quality and safety.

  19. Cost-Benefit Analysis for the Advanced Near Net Shape Technology (ANNST) Method for Fabricating Stiffened Cylinders

    NASA Technical Reports Server (NTRS)

    Stoner, Mary Cecilia; Hehir, Austin R.; Ivanco, Marie L.; Domack, Marcia S.

    2016-01-01

    This cost-benefit analysis assesses the benefits of the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. These preliminary, rough order-of-magnitude results report a 46 to 58 percent reduction in production costs and a 7-percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Production cost savings of 35 to 58 percent were reported over the composite manufacturing technique used in this study for comparison; however, the ANNST concept was heavier. In this study, the predicted return on investment of equipment required for the ANNST method was ten cryogenic tank barrels when compared with conventional metallic manufacturing. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. A case study compared these three alternatives for manufacturing a cylinder of specified geometry, with particular focus placed on production costs and process complexity, with cost analyses performed by the analogy and parametric methods. Furthermore, a scalability study was conducted for three tank diameters to assess the highest potential payoff of the ANNST process for manufacture of large-diameter cryogenic tanks. The analytical hierarchy process (AHP) was subsequently used with a group of selected subject matter experts to assess the value of the various benefits achieved by the ANNST method for potential stakeholders. The AHP study results revealed that decreased final cylinder mass and quality assurance were the most valued benefits of cylinder manufacturing methods, therefore emphasizing the relevance of the benefits achieved with the ANNST process for future projects.
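
    As a brief illustration of the AHP step, the Python sketch below computes priority weights as the normalized principal eigenvector of a pairwise comparison matrix and checks the consistency ratio. The comparison matrix and criteria are hypothetical placeholders, not the study's expert judgments.

      # Analytic hierarchy process (AHP) weight calculation; the pairwise
      # comparison matrix below is hypothetical (Saaty 1-9 scale).
      import numpy as np

      criteria = ["final cylinder mass", "quality assurance", "production cost", "schedule"]
      A = np.array([
          [1.0, 2.0, 3.0, 5.0],
          [1 / 2, 1.0, 2.0, 4.0],
          [1 / 3, 1 / 2, 1.0, 3.0],
          [1 / 5, 1 / 4, 1 / 3, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = int(np.argmax(eigvals.real))          # principal eigenvalue index
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                              # priority weights

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
      cr = ci / 0.90                            # random index RI = 0.90 for n = 4
      for name, weight in zip(criteria, w):
          print(f"{name}: {weight:.3f}")
      print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")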

  20. Sensors vs. experts - A performance comparison of sensor-based fall risk assessment vs. conventional assessment in a sample of geriatric patients

    PubMed Central

    2011-01-01

    Background Fall events contribute significantly to mortality, morbidity and costs in our ageing population. In order to identify persons at risk and to target preventive measures, many scores and assessment tools have been developed. These often require expertise and are costly to implement. Recent research investigates the use of wearable inertial sensors to provide objective data on motion features which can be used to assess individual fall risk automatically. So far it is unknown how well this new method performs in comparison with conventional fall risk assessment tools. The aim of our research is to compare the predictive performance of our new sensor-based method with conventional and established methods, based on prospective data. Methods In a first study phase, 119 inpatients of a geriatric clinic took part in motion measurements using a wireless triaxial accelerometer during a Timed Up&Go (TUG) test and a 20 m walk. Furthermore, the St. Thomas Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY) was performed, and the multidisciplinary geriatric care team estimated the patients' fall risk. In a second follow-up phase of the study, 46 of the participants were interviewed after one year, including a fall and activity assessment. The predictive performances of the TUG, the STRATIFY and team scores are compared. Furthermore, two automatically induced logistic regression models based on conventional clinical and assessment data (CONV) as well as sensor data (SENSOR) are matched. Results Among the risk assessment scores, the geriatric team score (sensitivity 56%, specificity 80%) outperforms STRATIFY and TUG. The induced logistic regression models CONV and SENSOR achieve similar performance values (sensitivity 68%/58%, specificity 74%/78%, AUC 0.74/0.72, +LR 2.64/2.61). Both models are able to identify more persons at risk than the simple scores. Conclusions Sensor-based objective measurements of motion parameters in geriatric patients can be used to assess individual fall risk, and our prediction model's performance matches that of a model based on conventional clinical and assessment data. Sensor-based measurements using a small wearable device may contribute significant information to conventional methods and are feasible in an unsupervised setting. More prospective research is needed to assess the cost-benefit relation of our approach. PMID:21711504
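
    For readers who want to reproduce the style of metrics reported above (sensitivity, specificity, positive likelihood ratio, AUC) for a logistic regression model, a minimal Python sketch follows. The features and outcomes are synthetic placeholders, not the study's clinical or sensor data.

      # Logistic-regression fall-risk sketch on synthetic data; illustrative only.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import confusion_matrix, roc_auc_score

      rng = np.random.default_rng(0)
      n = 119
      X = rng.normal(size=(n, 3))               # stand-ins for TUG time, gait speed, ...
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n) > 0.5).astype(int)

      model = LogisticRegression().fit(X, y)
      prob = model.predict_proba(X)[:, 1]
      pred = (prob >= 0.5).astype(int)

      tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      pos_lr = sensitivity / max(1 - specificity, 1e-9)
      auc = roc_auc_score(y, prob)
      print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
            f"+LR={pos_lr:.2f}, AUC={auc:.2f}")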

  1. 22 CFR 98.2 - Preservation of Convention records.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Preservation of Convention records. 98.2...-CONVENTION RECORD PRESERVATION § 98.2 Preservation of Convention records. Once the Convention has entered into force for the United States, the Secretary and DHS will preserve, or require the preservation of...

  2. New configuration for efficient and durable copper coating on the outer surface of a tube

    DOE PAGES

    Ahmad, Irfan; Chapman, Steven F.; Velas, Katherine M.; ...

    2017-03-27

    A well-adhered copper coating on stainless steel power coupler parts is required in superconducting radio frequency (SRF) accelerators. Radio frequency power coupler parts are complex, tubelike stainless steel structures, which require copper coating on their outer and inner surfaces. Conventional copper electroplating sometimes produces films with inadequate adhesion strength for SRF applications. Electroplating also requires a thin nickel strike layer under the copper coating, whose magnetic properties can be detrimental to SRF applications. Coaxial energetic deposition (CED) and sputtering methods have demonstrated efficient conformal coating on the inner surfaces of tubes but coating the outer surface of a tube is challenging because these coating methods are line of sight. When the substrate is off axis and the plasma source is on axis, only a small section of the substrate’s outer surface is exposed to the source cathode. The conventional approach is to rotate the tube to achieve uniformity across the outer surface. This method results in poor film thickness uniformity and wastes most of the source plasma. Alameda Applied Sciences Corporation (AASC) has developed a novel configuration called hollow external cathode CED (HEC-CED) to overcome these issues. HEC-CED produces a film with uniform thickness and efficiently uses all eroded source material. Furthermore, the Cu film deposited on the outside of a stainless steel tube using the new HEC-CED configuration survived a high pressure water rinse adhesion test. HEC-CED can be used to coat the outside of any cylindrical structure.

  3. New configuration for efficient and durable copper coating on the outer surface of a tube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmad, Irfan; Chapman, Steven F.; Velas, Katherine M.

    A well-adhered copper coating on stainless steel power coupler parts is required in superconducting radio frequency (SRF) accelerators. Radio frequency power coupler parts are complex, tubelike stainless steel structures, which require copper coating on their outer and inner surfaces. Conventional copper electroplating sometimes produces films with inadequate adhesion strength for SRF applications. Electroplating also requires a thin nickel strike layer under the copper coating, whose magnetic properties can be detrimental to SRF applications. Coaxial energetic deposition (CED) and sputtering methods have demonstrated efficient conformal coating on the inner surfaces of tubes but coating the outer surface of a tube is challenging because these coating methods are line of sight. When the substrate is off axis and the plasma source is on axis, only a small section of the substrate’s outer surface is exposed to the source cathode. The conventional approach is to rotate the tube to achieve uniformity across the outer surface. This method results in poor film thickness uniformity and wastes most of the source plasma. Alameda Applied Sciences Corporation (AASC) has developed a novel configuration called hollow external cathode CED (HEC-CED) to overcome these issues. HEC-CED produces a film with uniform thickness and efficiently uses all eroded source material. Furthermore, the Cu film deposited on the outside of a stainless steel tube using the new HEC-CED configuration survived a high pressure water rinse adhesion test. HEC-CED can be used to coat the outside of any cylindrical structure.

  4. Determination of molybdenum in soils and rocks: A geochemical semimicro field method

    USGS Publications Warehouse

    Ward, F.N.

    1951-01-01

    Reconnaissance work in geochemical prospecting requires a simple, rapid, and moderately accurate method for the determination of small amounts of molybdenum in soils and rocks. The useful range of the suggested procedure is from 1 to 32 p.p.m. of molybdenum, but the upper limit can be extended. Duplicate determinations on eight soil samples containing less than 10 p.p.m. of molybdenum agree within 1 p.p.m., and a comparison of field results with those obtained by a conventional laboratory procedure shows that the method is sufficiently accurate for use in geochemical prospecting. The time required for analysis and the quantities of reagents needed have been decreased to provide essentially a "test tube" method for the determination of molybdenum in soils and rocks. With a minimum amount of skill, one analyst can make 30 molybdenum determinations in an 8-hour day.

  5. A machine learning model with human cognitive biases capable of learning from small and biased datasets.

    PubMed

    Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro

    2018-05-09

    Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.
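
    A minimal sketch of this kind of small-sample comparison is given below, fitting several of the named baseline classifiers on a deliberately small synthetic training set; the authors' cognitive-bias model itself is not reproduced, and the dataset is an assumed stand-in for the spam task.

      # Baseline comparison on a small training sample; synthetic data, illustrative only.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split
      from sklearn.naive_bayes import GaussianNB
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=500, n_features=20, random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(
          X, y, train_size=30, stratify=y, random_state=0)   # only 30 training samples

      models = {
          "naive Bayes": GaussianNB(),
          "logistic regression": LogisticRegression(max_iter=1000),
          "SVM": SVC(),
          "random forest": RandomForestClassifier(random_state=0),
      }
      for name, model in models.items():
          model.fit(X_train, y_train)
          print(f"{name}: test accuracy = {accuracy_score(y_test, model.predict(X_test)):.3f}")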

  6. Microwave Landing System signal requirements for conventional aircraft

    DOT National Transportation Integrated Search

    1972-07-01

    The results of analysis directed towards determining Microwave Landing System (MLS) signal requirements for conventional aircraft are discussed. The phases of flight considered include straight-in final approach, flareout, and rollout. A limited numb...

  7. 8 CFR 204.305 - State preadoption requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... preadoption requirements must be complied with when a child is coming into the State as a Convention adoptee to be adopted in the United States. A qualified Convention adoptee is deemed to be coming to be...

  8. 8 CFR 204.305 - State preadoption requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... preadoption requirements must be complied with when a child is coming into the State as a Convention adoptee to be adopted in the United States. A qualified Convention adoptee is deemed to be coming to be...

  9. 8 CFR 204.305 - State preadoption requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... preadoption requirements must be complied with when a child is coming into the State as a Convention adoptee to be adopted in the United States. A qualified Convention adoptee is deemed to be coming to be...

  10. 8 CFR 204.305 - State preadoption requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... preadoption requirements must be complied with when a child is coming into the State as a Convention adoptee to be adopted in the United States. A qualified Convention adoptee is deemed to be coming to be...

  11. Ultrasonic and radiographic evaluation of advanced aerospace materials: Ceramic composites

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    1990-01-01

    Two conventional nondestructive evaluation techniques were used to evaluate advanced ceramic composite materials. It was shown that neither ultrasonic C-scan nor radiographic imaging can individually provide sufficient data for an accurate nondestructive evaluation. Both ultrasonic C-scan and conventional radiographic imaging are required for preliminary evaluation of these complex systems. The material variations that were identified by these two techniques are porosity, delaminations, bond quality between laminae, fiber alignment, fiber registration, fiber parallelism, and processing density flaws. The degree of bonding between fiber and matrix cannot be determined by either of these methods. An alternative ultrasonic technique, angular power spectrum scanning (APSS) is recommended for quantification of this interfacial bond.

  12. Next Generation Non-Vacuum, Maskless, Low Temperature Nanoparticle Ink Laser Digital Direct Metal Patterning for a Large Area Flexible Electronics

    PubMed Central

    Yeo, Junyeob; Hong, Sukjoon; Lee, Daehoo; Hotz, Nico; Lee, Ming-Tsang; Grigoropoulos, Costas P.; Ko, Seung Hwan

    2012-01-01

    Flexible electronics opened a new class of future electronics. The foldable, light and durable nature of flexible electronics allows vast flexibility in applications such as display, energy devices and mobile electronics. Even though conventional electronics fabrication methods are well developed for rigid substrates, direct application or slight modification of conventional processes for flexible electronics fabrication cannot work. The future flexible electronics fabrication requires totally new low-temperature process development optimized for flexible substrate and it should be based on new material too. Here we present a simple approach to developing a flexible electronics fabrication without using conventional vacuum deposition and photolithography. We found that direct metal patterning based on laser-induced local melting of metal nanoparticle ink is a promising low-temperature alternative to vacuum deposition– and photolithography-based conventional metal patterning processes. The “digital” nature of the proposed direct metal patterning process removes the need for expensive photomask and allows easy design modification and short turnaround time. This new process can be extremely useful for current small-volume, large-variety manufacturing paradigms. Besides, simple, scalable, fast and low-temperature processes can lead to cost-effective fabrication methods on a large-area polymer substrate. The developed process was successfully applied to demonstrate high-quality Ag patterning (2.1 µΩ·cm) and high-performance flexible organic field effect transistor arrays. PMID:22900011

  13. Next generation non-vacuum, maskless, low temperature nanoparticle ink laser digital direct metal patterning for a large area flexible electronics.

    PubMed

    Yeo, Junyeob; Hong, Sukjoon; Lee, Daehoo; Hotz, Nico; Lee, Ming-Tsang; Grigoropoulos, Costas P; Ko, Seung Hwan

    2012-01-01

    Flexible electronics opened a new class of future electronics. The foldable, light and durable nature of flexible electronics allows vast flexibility in applications such as display, energy devices and mobile electronics. Even though conventional electronics fabrication methods are well developed for rigid substrates, direct application or slight modification of conventional processes for flexible electronics fabrication cannot work. The future flexible electronics fabrication requires totally new low-temperature process development optimized for flexible substrate and it should be based on new material too. Here we present a simple approach to developing a flexible electronics fabrication without using conventional vacuum deposition and photolithography. We found that direct metal patterning based on laser-induced local melting of metal nanoparticle ink is a promising low-temperature alternative to vacuum deposition- and photolithography-based conventional metal patterning processes. The "digital" nature of the proposed direct metal patterning process removes the need for expensive photomask and allows easy design modification and short turnaround time. This new process can be extremely useful for current small-volume, large-variety manufacturing paradigms. Besides, simple, scalable, fast and low-temperature processes can lead to cost-effective fabrication methods on a large-area polymer substrate. The developed process was successfully applied to demonstrate high-quality Ag patterning (2.1 µΩ·cm) and high-performance flexible organic field effect transistor arrays.

  14. Simultaneous 3D MR elastography of the in vivo mouse brain

    NASA Astrophysics Data System (ADS)

    Kearney, Steven P.; Majumdar, Shreyan; Royston, Thomas J.; Klatt, Dieter

    2017-10-01

    The feasibility of sample interval modulation (SLIM) magnetic resonance elastography (MRE) for the in vivo mouse brain is assessed, and an alternative SLIM-MRE encoding method is introduced. In SLIM-MRE, the phase accumulation for each motion direction is encoded simultaneously by varying either the start time of the motion encoding gradient (MEG), SLIM-phase constant (SLIM-PC), or the initial phase of the MEG, SLIM-phase varying (SLIM-PV). SLIM-PC provides gradient moment nulling, but the mutual gradient shift necessitates increased echo time (TE). SLIM-PV requires no increased TE, but exhibits non-uniform flow compensation. Comparison was to conventional MRE using six C57BL/6 mice. For SLIM-PC, the Spearman’s rank correlation to conventional MRE for the shear storage and loss modulus images were 80% and 76%, respectively, and likewise for SLIM-PV, 73% and 69%, respectively. The results of the Wilcoxon rank sum test showed that there were no statistically significant differences between the spatially averaged shear moduli derived from conventional-MRE, SLIM-PC, and SLIM-PV acquisitions. Both SLIM approaches were comparable to conventional MRE scans with Spearman’s rank correlation of 69%-80% and with 3 times reduction in scan time. The SLIM-PC method had the best correlation, and SLIM-PV may be a useful tool in experimental conditions, where both measurement time and T2 relaxation is critical.

  15. Maxillary incisors changes during space closure with conventional and skeletal anchorage methods: a systematic review.

    PubMed

    Jayaratne, Yasas Shri Nalaka; Uribe, Flavio; Janakiraman, Nandakumar

    2017-01-01

    The objective of this systematic review was to compare the antero-posterior, vertical and angular changes of maxillary incisors with conventional anchorage control techniques and mini-implant based space closure methods. The electronic databases Pubmed, Scopus, ISI Web of knowledge, Cochrane Library and Open Grey were searched for potentially eligible studies using a set of predetermined keywords. Full texts meeting the inclusion criteria as well as their references were manually searched. The primary outcome data (linear, angular, and vertical maxillary incisor changes) and secondary outcome data (overbite changes, soft tissue changes, biomechanical factors, root resorption and treatment duration) were extracted from the selected articles and entered into spreadsheets based on the type of anchorage used. The methodological quality of each study was assessed. Six studies met the inclusion criteria. The amount of incisor retraction was greater with buccally placed mini-implants than conventional anchorage techniques. The incisor retraction with indirect anchorage from palatal mini-implants was less when compared with buccally placed mini-implants. Incisor intrusion occurred with buccal mini-implants, whereas extrusion was seen with conventional anchorage. Limited data on the biomechanical variables or adverse effects such as root resorption were reported in these studies. More RCTs that take into account relevant biomechanical variables and employ three-dimensional quantification of tooth movements are required to provide information on incisor changes during space closure.

  16. Simultaneous 3D MR elastography of the in vivo mouse brain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kearney, Steven P.; Majumdar, Shreyan; Royston, Thomas J.

    The feasibility of sample interval modulation (SLIM) magnetic resonance elastography (MRE) for the in vivo mouse brain is assessed, and an alternative SLIM-MRE encoding method is introduced. In SLIM-MRE, the phase accumulation for each motion direction is encoded simultaneously by varying either the start time of the motion encoding gradient (MEG), SLIM-phase constant (SLIM-PC), or the initial phase of the MEG, SLIM-phase varying (SLIM-PV). SLIM-PC provides gradient moment nulling, but the mutual gradient shift necessitates increased echo time (TE). SLIM-PV requires no increased TE, but exhibits nonuniform flow compensation. Comparison was to conventional MRE using six C57BL/6 mice. For SLIM-PC, the Spearman’s rank correlation to conventional MRE for the shear storage and loss modulus images were 80% and 76%, respectively, and likewise for SLIM-PV, 73% and 69%, respectively. The results of the Wilcoxon rank sum test showed that there were no statistically significant differences between the spatially averaged shear moduli derived from conventional-MRE, SLIM-PC, and SLIM-PV acquisitions. Both SLIM approaches were comparable to conventional MRE scans with Spearman’s rank correlation of 69%-80% and with 3 times reduction in scan time. As a result, the SLIM-PC method had the best correlation, and SLIM-PV may be a useful tool in experimental conditions, where both measurement time and T2 relaxation is critical.

  17. Simultaneous 3D MR elastography of the in vivo mouse brain

    DOE PAGES

    Kearney, Steven P.; Majumdar, Shreyan; Royston, Thomas J.; ...

    2017-09-15

    The feasibility of sample interval modulation (SLIM) magnetic resonance elastography (MRE) for the in vivo mouse brain is assessed, and an alternative SLIM-MRE encoding method is introduced. In SLIM-MRE, the phase accumulation for each motion direction is encoded simultaneously by varying either the start time of the motion encoding gradient (MEG), SLIM-phase constant (SLIM-PC), or the initial phase of the MEG, SLIM-phase varying (SLIM-PV). SLIM-PC provides gradient moment nulling, but the mutual gradient shift necessitates increased echo time (TE). SLIM-PV requires no increased TE, but exhibits nonuniform flow compensation. Comparison was to conventional MRE using six C57BL/6 mice. For SLIM-PC, the Spearman’s rank correlation to conventional MRE for the shear storage and loss modulus images were 80% and 76%, respectively, and likewise for SLIM-PV, 73% and 69%, respectively. The results of the Wilcoxon rank sum test showed that there were no statistically significant differences between the spatially averaged shear moduli derived from conventional-MRE, SLIM-PC, and SLIM-PV acquisitions. Both SLIM approaches were comparable to conventional MRE scans with Spearman’s rank correlation of 69%-80% and with 3 times reduction in scan time. As a result, the SLIM-PC method had the best correlation, and SLIM-PV may be a useful tool in experimental conditions, where both measurement time and T2 relaxation is critical.

  18. Ultrasonography with color Doppler and power Doppler in the diagnosis of periapical lesions

    PubMed Central

    Goel, Sumit; Nagendrareddy, Suma Gundareddy; Raju, Manthena Srinivasa; Krishnojirao, Dayashankara Rao Jingade; Rastogi, Rajul; Mohan, Ravi Prakash Sasankoti; Gupta, Swati

    2011-01-01

    Aim: To evaluate the efficacy of ultrasonography (USG) with color Doppler and power Doppler applications over conventional radiography in the diagnosis of periapical lesions. Materials and Methods: Thirty patients having inflammatory periapical lesions of the maxillary or mandibular anterior teeth and requiring endodontic surgery were selected for inclusion in this study. All patients consented to participate in the study. We used conventional periapical radiographs as well as USG with color Doppler and power Doppler for the diagnosis of these lesions. Their diagnostic performances were compared against histopathologic examination. All data were compared and statistically analyzed. Results: USG examination with color Doppler and power Doppler identified 29 (19 cysts and 10 granulomas) of 30 periapical lesions accurately, with a sensitivity of 100% for cysts and 90.91% for granulomas and a specificity of 90.91% for cysts and 100% for granulomas. In comparison, conventional intraoral radiography identified only 21 lesions (sensitivity of 78.9% for cysts and 45.4% for granulomas and specificity of 45.4% for cysts and 78.9% for granulomas). There was definite correlation between the echotexture of the lesions and the histopathological features except in one case. Conclusions: USG imaging with color Doppler and power Doppler is superior to conventional intraoral radiographic methods for diagnosing the nature of periapical lesions in the anterior jaws. This study reveals the potential of USG examination in the study of other jaw lesions. PMID:22223940

  19. Modified coaxial wire method for measurement of transfer impedance of beam position monitors

    NASA Astrophysics Data System (ADS)

    Kumar, Mukesh; Babbar, L. K.; Deo, R. K.; Puntambekar, T. A.; Senecha, V. K.

    2018-05-01

    The transfer impedance is a very important parameter of a beam position monitor (BPM) which relates its output signal with the beam current. The coaxial wire method is a standard technique to measure transfer impedance of the BPM. The conventional coaxial wire method requires impedance matching between coaxial wire and external circuits (vector network analyzer and associated cables). This paper presents a modified coaxial wire method for bench measurement of the transfer impedance of capacitive pickups like button electrodes and shoe box BPMs. Unlike the conventional coaxial wire method, in the modified coaxial wire method no impedance matching elements have been used between the device under test and the external circuit. The effect of impedance mismatch has been solved mathematically and a new expression of transfer impedance has been derived. The proposed method is verified through simulation of a button electrode BPM using CST Studio Suite. The new method is also applied to measure transfer impedance of a button electrode BPM developed for insertion devices of Indus-2 and the results are also compared with its simulations. Close agreement between measured and simulation results suggests that the modified coaxial wire setup can be exploited for the measurement of transfer impedance of capacitive BPMs like button electrodes and shoe box BPM.

  20. XFEM-based modeling of successive resections for preoperative image updating

    NASA Astrophysics Data System (ADS)

    Vigneron, Lara M.; Robe, Pierre A.; Warfield, Simon K.; Verly, Jacques G.

    2006-03-01

    We present a new method for modeling organ deformations due to successive resections. We use a biomechanical model of the organ and compute its volume-displacement solution based on the eXtended Finite Element Method (XFEM). The key feature of XFEM is that material discontinuities induced by every new resection can be handled without remeshing or mesh adaptation, as would be required by the conventional Finite Element Method (FEM). We focus on the application of preoperative image updating for image-guided surgery. Proof-of-concept demonstrations are shown for synthetic and real data in the context of neurosurgery.

  1. Application of finite-element methods to dynamic analysis of flexible spatial and co-planar linkage systems, part 2

    NASA Technical Reports Server (NTRS)

    Dubowsky, Steven

    1989-01-01

    An approach to modeling the flexibility effects in spatial mechanisms and manipulator systems is described. The method is based on finite element representations of the individual links in the system. However, it should be noted that conventional finite element methods and software packages will not handle the highly nonlinear dynamic behavior of these systems, which results from their changing geometry. In order to design high-performance lightweight systems and their control systems, good models of their dynamic behavior which include the effects of flexibility are required.

  2. Detection of Salmonella sp in chicken cuts using immunomagnetic separation

    PubMed Central

    de Cássia dos Santos da Conceição, Rita; Moreira, Ângela Nunes; Ramos, Roberta Juliano; Goularte, Fabiana Lemos; Carvalhal, José Beiro; Aleixo, José Antonio Guimarães

    2008-01-01

    The immunomagnetic separation (IMS) is a technique that has been used to increase sensitivity and specificity and to decrease the time required for detection of Salmonella in foods through different methodologies. In this work we report on the development of a method for detection of Salmonella in chicken cuts using in-house antibody-sensitized microspheres associated with conventional plating on selective agar (IMS-plating). First, protein A-coated microspheres were sensitized with polyclonal antibodies against lipopolysaccharide and flagella from salmonellae and used to standardize a procedure for capturing Salmonella Enteritidis from pure cultures and detection on selective agar. Subsequently, samples of chicken meat experimentally contaminated with S. Enteritidis were analyzed immediately after contamination and after 24h of refrigeration using three enrichment protocols. The detection limit of the IMS-plating procedure after standardization with pure culture was about 2x10 CFU/mL. The protocol using non-selective enrichment for 6-8h, selective enrichment for 16-18h and a post-enrichment for 4h gave the best results for S. Enteritidis detection by IMS-plating in experimentally contaminated meat. IMS-plating using this protocol was compared to the standard culture method for salmonellae detection in naturally contaminated chicken cuts and yielded 100% sensitivity and 94% specificity. The method developed, using in-house prepared magnetic microspheres for IMS and plating on selective agar, was able to shorten by at least one day the time required for detection of Salmonella in chicken products relative to the conventional culture method. PMID:24031199

  3. High strength air-dried aerogels

    DOEpatents

    Coronado, Paul R.; Satcher, Jr., Joe H.

    2012-11-06

    A method for the preparation of high strength air-dried organic aerogels. The method involves the sol-gel polymerization of organic gel precursors, such as resorcinol with formaldehyde (RF) in aqueous solvents with R/C ratios greater than about 1000 and R/F ratios less than about 1:2.1. Using a procedure analogous to the preparation of resorcinol-formaldehyde (RF) aerogels, this approach generates wet gels that can be air dried at ambient temperatures and pressures. The method significantly reduces the time and/or energy required to produce a dried aerogel compared to conventional methods using either supercritical solvent extraction. The air dried gel exhibits typically less than 5% shrinkage.

  4. Calorimetric Measurement for Internal Conversion Efficiency of Photovoltaic Cells/Modules Based on Electrical Substitution Method

    NASA Astrophysics Data System (ADS)

    Saito, Terubumi; Tatsuta, Muneaki; Abe, Yamato; Takesawa, Minato

    2018-02-01

    We have succeeded in the direct measurement of solar cell/module internal conversion efficiency based on a calorimetric, or electrical substitution, method, by which the absorbed radiant power is determined by replacing the heat absorbed in the cell/module with electrical power. The technique is advantageous in that the reflectance and transmittance measurements required in the conventional methods are not necessary. Also, the internal quantum efficiency can be derived from conversion efficiencies by using the average photon energy. Agreement of the measured data with values estimated from the nominal specifications supports the validity of this technique.

  5. The effect of two fixation methods (TAF and DESS) on morphometric parameters of Aphelenchoides ritzemabosi.

    PubMed

    Chałańska, Aneta; Bogumił, Aleksandra; Malewski, Tadeusz; Kowalewska, Katarzyna

    2016-02-19

    Identification of nematode species by using conventional methods requires fixation of the isolated material and a suitable preparation for further analyses. Tentative identification using microscopic methods should also be performed prior to initiating molecular studies. In the literature, various methods are described for the preparation of nematodes from the genus Aphelenchoides for identification and microscopic studies. The most commonly used fixatives are formalin (Timm 1969; Szczygieł & Cid del Prado Vera 1981, Crozzoli et al. 2008, Khan et al. 2008), FAA (Wasilewska 1969; Vovlas et al. 2005, Khan et al. 2007) and TAF (Hooper 1958, Chizhov et al. 2006, Jagdale & Grewal 2006).

  6. Detection of influenza antigenic variants directly from clinical samples using polyclonal antibody based proximity ligation assays

    PubMed Central

    Martin, Brigitte E.; Jia, Kun; Sun, Hailiang; Ye, Jianqiang; Hall, Crystal; Ware, Daphne; Wan, Xiu-Feng

    2016-01-01

    Identification of antigenic variants is the key to a successful influenza vaccination program. The empirical serological methods to determine influenza antigenic properties require viral propagation. Here a novel quantitative PCR-based antigenic characterization method using polyclonal antibody and proximity ligation assays, or so-called polyPLA, was developed and validated. This method can detect a viral titer that is less than 1000 TCID50/mL. Not only can this method differentiate between different HA subtypes of influenza viruses but also effectively identify antigenic drift events within the same HA subtype of influenza viruses. Applications in H3N2 seasonal influenza data showed that the results from this novel method are consistent with those from the conventional serological assays. This method is not limited to the detection of antigenic variants in influenza but also other pathogens. It has the potential to be applied through a large-scale platform in disease surveillance requiring minimal biosafety and directly using clinical samples. PMID:25546251

  7. Kinematic Determination of an Unmodeled Serial Manipulator by Means of an IMU

    NASA Astrophysics Data System (ADS)

    Ciarleglio, Constance A.

    Kinematic determination for an unmodeled manipulator is usually done through a priori knowledge of the manipulator's physical characteristics or external sensor information. The mathematics of the kinematic estimation, often based on the Denavit-Hartenberg convention, are complex and have high computation requirements, in addition to being unique to the manipulator for which the method is developed. Analytical methods that can compute kinematics on the fly have the potential to be highly beneficial in dynamic environments where different configurations and variable manipulator types are often required. This thesis derives a new screw-theory-based method of kinematic determination, using a single inertial measurement unit (IMU), for use with any serial, revolute manipulator. The method allows the expansion of reconfigurable manipulator design and simplifies the kinematic process for existing manipulators. A simulation is presented in which the theory of the method is verified and characterized with respect to error. The method is then implemented on an existing manipulator as a verification of functionality.

  8. Advances in Laser Microprobe (U-Th)/He Geochronology

    NASA Astrophysics Data System (ADS)

    van Soest, M. C.; Monteleone, B. D.; Boyce, J. W.; Hodges, K. V.

    2008-12-01

    The development of the laser microprobe (U-Th)/He dating method has the potential to overcome many of the limitations that affect conventional (U-Th)/He geochronology. Conventional single- or multi-crystal (U- Th)/He geochronology requires the use of pristine, inclusion-free, euhedral crystals. Furthermore, the ages that are obtained require corrections for the effects of zoning and alpha ejection based on an ensemble of assumptions before interpretation of their geological relevance is possible. With the utilization of microbeam techniques many of the limitations of conventional (U-Th)/He geochronology can either be eliminated by careful spot selection or accounted for by detailed depth profiling analyses of He, U and Th on the same crystal. Combined He, Th, and U depth profiling on the same crystal potentially even offers the ability to extract thermal histories from the analyzed grains. Boyce et al. (2006) first demonstrated the laser microprobe (U-Th)/He dating technique by successfully dating monazite crystals using UV laser ablation to liberate He and determined U and Th concentrations using a Cameca SX-Ultrachron microprobe. At Arizona State University, further development of the microprobe (U-Th)/He dating technique continues using an ArF Excimer laser connected to a GVI Helix Split Flight Tube noble gas mass spectrometer for He analysis and SIMS techniques for U and Th. The Durango apatite age standard has been successfully dated at 30.7 +/- 1.7 Ma (2SD). Work on dating zircons by laser ablation is currently underway, with initial results from Sri Lanka zircon at 437 +/- 14 Ma (2SD) confirmed by conventional (U-Th)/He analysis and in agreement with the published (U-Th)/He age of 443 +/- 9 Ma (2SD) for zircons from this region in Sri Lanka (Nasdala et al., 2004). The results presented here demonstrate the laser microprobe (U-Th)/He method as a powerful tool that allows application of (U- Th)/He dating to areas of research such as detrital apatite and zircon dating, where conventional (U-Th)/He geochronology has limited applicability. Boyce et al. (2006) GCA 70 (3031-3039), Nasdala et al. (2004) Am. Min. 89 (219-231)
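
    For context, conventional (U-Th)/He ages are obtained by solving the standard helium ingrowth equation; the Python sketch below does this numerically using the usual decay constants. The parent abundances are placeholders, and the alpha-ejection and zoning corrections discussed above are deliberately omitted.

      # Standard (U-Th)/He ingrowth equation solved for age; placeholder abundances,
      # no alpha-ejection correction. Illustrative only.
      from math import exp
      from scipy.optimize import brentq

      L238, L235, L232 = 1.55125e-10, 9.8485e-10, 4.9475e-11   # decay constants (1/yr)

      def he_produced(t, u238, u235, th232):
          return (8 * u238 * (exp(L238 * t) - 1)
                  + 7 * u235 * (exp(L235 * t) - 1)
                  + 6 * th232 * (exp(L232 * t) - 1))

      def he_age(he, u238, u235, th232):
          """Solve the ingrowth equation for age t (years)."""
          return brentq(lambda t: he_produced(t, u238, u235, th232) - he, 1.0, 4.6e9)

      u238, th232 = 100.0, 50.0          # placeholder parent abundances (arbitrary units)
      u235 = u238 / 137.88               # natural 238U/235U ratio
      he = he_produced(31.0e6, u238, u235, th232)   # synthesize He for a ~31 Ma grain
      print(f"recovered age: {he_age(he, u238, u235, th232) / 1e6:.1f} Ma")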

  9. A special ionisation chamber for quality control of diagnostic and mammography X ray equipment.

    PubMed

    Costa, A M; Caldas, L V E

    2003-01-01

    A quality control program for X ray equipment used for conventional radiography and mammography requires the constancy check of the beam qualities in terms of the half-value layers. In this work, a special double-faced parallel-plate ionisation chamber was developed with inner electrodes of different materials, in a tandem system. Its application will be in quality control programs of diagnostic and mammography X ray equipment for confirmation of half-value layers previously determined by the conventional method. Moreover, the chamber also may be utilised for measurements of air kerma values (and air kerma rates) in X radiation fields used for conventional radiography and mammography. The chamber was studied in relation to the characteristics of saturation, ion collection efficiency, polarity effects, leakage current, and short-term stability. The energy dependence in response of each of the two faces of the chamber was determined over the conventional radiography and mammography X ray ranges (unattenuated beams). The different energy response of the two faces of the chamber allowed the formation of a tandem system useful for the constancy check of beam qualities.
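
    The half-value layers that such a tandem chamber is meant to cross-check are conventionally obtained by interpolating transmission measurements to the 50% point; the short Python sketch below shows that log-linear interpolation with assumed, not measured, filter thicknesses and readings.

      # Conventional HVL determination by log-linear interpolation of transmission;
      # thicknesses and readings are illustrative placeholders.
      import numpy as np

      thickness_mm_al = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # added Al filtration
      reading = np.array([100.0, 71.0, 52.0, 39.0, 30.0])     # chamber readings

      transmission = reading / reading[0]
      hvl = np.interp(np.log(0.5), np.log(transmission)[::-1], thickness_mm_al[::-1])
      print(f"HVL ~ {hvl:.2f} mm Al")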

  10. A work study of the CAD/CAM method and conventional manual method in the fabrication of spinal orthoses for patients with adolescent idiopathic scoliosis.

    PubMed

    Wong, M S; Cheng, J C Y; Wong, M W; So, S F

    2005-04-01

    A study was conducted to compare the CAD/CAM method with the conventional manual method in the fabrication of spinal orthoses for patients with adolescent idiopathic scoliosis. Ten subjects were recruited for this study. Efficiency analyses of the two methods were performed from the cast filling/digitization process to the completion of cast/image rectification. The dimensional changes of the casts/models rectified by the two cast rectification methods were also investigated. The results demonstrated that the CAD/CAM method was faster than the conventional manual method in the studied processes. The mean rectification time of the CAD/CAM method was shorter than that of the conventional manual method by 108.3 min (63.5%), indicating that the CAD/CAM method took about one third of the time of the conventional manual method to finish cast rectification. In the comparison of cast/image dimensional differences between the conventional manual method and the CAD/CAM method, five major dimensions in each of the five rectified regions, namely the axilla, thoracic, lumbar, abdominal and pelvic regions, were involved. There were no significant dimensional differences (at the 0.05 level) in 19 out of the 25 studied dimensions. This study demonstrated that the CAD/CAM system could save time in the rectification process and offer a relatively high resemblance in cast rectification as compared with the conventional manual method.

  11. Solving large scale traveling salesman problems by chaotic neurodynamics.

    PubMed

    Hasegawa, Mikio; Ikeguchi, Tohru; Aihara, Kazuyuki

    2002-03-01

    We propose a novel approach for solving large scale traveling salesman problems (TSPs) by chaotic dynamics. First, we realize the tabu search on a neural network, by utilizing the refractory effects as the tabu effects. Then, we extend it to a chaotic neural network version. We propose two types of chaotic searching methods, which are based on two different tabu searches. While the first one requires neurons of the order of n² for an n-city TSP, the second one requires only n neurons. Moreover, an automatic parameter tuning method of our chaotic neural network is presented for easy application to various problems. Last, we show that our method with n neurons is applicable to large TSPs such as an 85,900-city problem and exhibits better performance than the conventional stochastic searches and the tabu searches.
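
    The chaotic search itself is specific to the paper, but the underlying move structure is the classical tabu-style 2-opt exchange. The sketch below (plain Python, not the authors' neural formulation) shows such a tabu 2-opt search on random cities; the tenure, acceptance rule, and problem size are illustrative assumptions.

```python
# Minimal tabu-style 2-opt local search for the TSP -- an illustrative baseline
# only; the paper's chaotic neural network dynamics are not reproduced here.
import numpy as np

def tour_length(tour, dist):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_two_opt(dist, n_iter=2000, tenure=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(dist)
    tour = list(rng.permutation(n))
    best, best_len = tour[:], tour_length(tour, dist)
    tabu = {}                                 # move -> iteration until which it is "refractory"
    for it in range(n_iter):
        i, j = sorted(rng.choice(n, 2, replace=False))
        if j - i < 2:
            continue
        move = (tour[i], tour[j])
        if tabu.get(move, -1) > it:           # refractory (tabu) move: skip it
            continue
        cand = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
        cand_len = tour_length(cand, dist)
        if cand_len < best_len or rng.random() < 0.05:   # improvements plus rare uphill moves
            tour = cand
            tabu[move] = it + tenure          # forbid undoing this move for `tenure` steps
            if cand_len < best_len:
                best, best_len = cand[:], cand_len
    return best, best_len

# toy usage with 30 random cities
pts = np.random.default_rng(1).random((30, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
print(tabu_two_opt(dist)[1])
```

    In the paper, the role of the explicit tabu list and random acceptance used here is played by the refractory and chaotic dynamics of the neurons.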

  12. Novel Hybrid Operating Table for Neuroendovascular Treatment.

    PubMed

    Jong-Hyun, Park; Jonghyeon, Mun; Dong-Seung, Shin; Bum-Tae, Kim

    2017-03-25

    The integration of interventional and surgical techniques requires the development of a new working environment equipped for the needs of an interdisciplinary neurovascular team. However, conventional surgical and interventional tables have only a limited ability to provide for these needs. We have developed a concept mobile hybrid operating table that provides the ability for such a team to conduct both endovascular and surgical procedures in a single session. We developed methods that provide surgeons with angiography-guided surgery techniques for use in a conventional operating room environment. In order to design a convenient device ideal for practical use, we consulted with mechanical engineers. The mobile hybrid operating table consists of two modules: a floating tabletop and a mobile module. In brief, the basic principle of the mobile hybrid operating table is as follows: firstly, the length of the mobile hybrid operating table is longer than that of a conventional surgical table and yet shorter than that of a conventional interventional table. It was designed with the goal of exhaustively meeting the intensive requirements of both endovascular and surgical procedures. Its mobile module allows the floating tabletop to be moved quickly and precisely. It is important that during a procedure, a patient can be moved without being repositioned, particularly with a catheter in situ. Secondly, a slim-profile headrest facilitates the mounting of a radiolucent head clamp system for cranial stabilization and fixation. We have introduced a novel invention, a mobile hybrid operating table for use in an operating suite.

  13. Computer-aided classification of lung nodules on computed tomography images via deep learning technique

    PubMed Central

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. PMID:26346558
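
    As a rough illustration of the second model class mentioned above, the following PyTorch sketch defines a small convolutional classifier for single-channel nodule patches. The patch size (64×64), layer sizes, and two-class output are assumptions for illustration, not the authors' architecture or hyperparameters.

```python
# A minimal CNN patch classifier in PyTorch -- a sketch of the kind of model
# the paper evaluates, not the authors' exact network.
import torch
import torch.nn as nn

class NoduleCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                      # x: (batch, 1, 64, 64) CT patches
        return self.classifier(self.features(x))

model = NoduleCNN()
dummy = torch.randn(4, 1, 64, 64)              # four hypothetical 64x64 nodule patches
print(model(dummy).shape)                      # torch.Size([4, 2])
```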

  14. Fast Microwave-assisted Pretreatment for Bioconversion of Sawdust Lignocellulose to Glucose

    NASA Astrophysics Data System (ADS)

    Nyoman Sudiana, I.; Mitsudo, Seitaro; Endang Susilowati, Prima; Ketut Sutiari, Desak; Widana Arsana, Made; Zamrun Firihu, Muhammad; Ode Ngkoimani, La; Aba, La; Sahaluddin Hasan, Erzam; Cahyono, Edi; Sabchevski, Svilen; Aripin, Haji; Gde Suastika, Komang

    2017-05-01

    A preliminary study of the application of microwave energy for the bioconversion of cellulosic sawdust to glucose was performed. The effects of the microwave treatment were compared with those of the conventional method for each solvent, with the expectation that the broader mechanism responsible for microwave effects on the chemical processes, especially the pretreatment prior to hydrolysis of cellulose, could be explained. The reagents used were an acid (HCl), an alkali (NaOH), and distilled water (H2O). The experimental results showed that microwave-assisted pretreatment of the lignocellulosic sawdust was faster than conventional heating (hotplate). Moreover, microwave treatment yielded a higher glucose content than the conventional method. With microwave heating during hydrolysis, high temperatures and high reagent concentrations were not required. Pretreatment with microwaves at 800 W with 22.50 mg/mL NaOH at a temperature of 120°C appeared to be the most efficient condition found in this experiment. These results indicate that microwave heating is effective for the bioconversion of cellulosic sawdust to glucose. Microstructure evaluation by SEM and XRD should be performed to understand the effect in more detail, especially regarding the evolution of the cellulosic structure.

  15. Impressions of functional food consumers.

    PubMed

    Saher, Marieke; Arvola, Anne; Lindeman, Marjaana; Lähteenmäki, Liisa

    2004-02-01

    Functional foods provide a new way of expressing healthiness in food choices. The objective of this study was to apply an indirect measure to explore what kind of impressions people form of users of functional foods. Respondents (n=350) received one of eight versions of a shopping list and rated the buyer of the foods on 66 bipolar attributes on 7-point scales. The shopping lists had either healthy or neutral background items, conventional or functional target items, and the buyer was described either as a 40-year-old woman or man. The attribute ratings revealed three factors: disciplined, innovative and gentle. Buyers with healthy background items were perceived as more disciplined than those having neutral items on the list; users of functional foods were rated as more disciplined than users of conventional target items only when the background list consisted of neutral items. Buyers of functional foods were regarded as more innovative and less gentle, but gender affected the ratings on the gentle dimension. The impressions of functional food users clearly differ from those formed of users of conventional foods with a healthy image. The shopping list method performed well as an indirect method, but further studies are required to test its feasibility in measuring other food-related impressions.

  16. Computer-aided classification of lung nodules on computed tomography images via deep learning technique.

    PubMed

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain.

  17. How to find non-dependent opiate users: a comparison of sampling methods in a field study of opium and heroin users.

    PubMed

    Korf, Dirk J; van Ginkel, Patrick; Benschop, Annemieke

    2010-05-01

    The first aim is to better understand the potentials and limitations of different sampling methods for reaching a specific, rarely studied population of drug users and for persuading them to take part in a multidisciplinary study. The second is to determine the extent to which these different methods reach similar or dissimilar segments of the non-dependent opiate-using population. Using ethnographic fieldwork (EFW) and targeted canvassing (TARC; small newspaper advertisements and website announcements), supplemented by snowball referrals, we recruited and interviewed 127 non-dependent opiate users (lifetime prevalence of use 5-100 times; 86.6% had used heroin and 56.7% opium). Average age was 39.0; 66.1% were male and 33.9% female. In addition to opiates, many respondents had wide experience with other illicit drugs. The majority had non-conventional lifestyles. Both EFW and TARC yielded only limited numbers of snowball referrals. EFW requires specific skills, is labour-intensive, thus expensive, but allows unsuitable candidates to be excluded faster. Respondents recruited through EFW were significantly more likely to have experience with opium and various drugs other than opiates. TARC resulted in larger percentages of women and respondents with conventional lifestyles. TARC is less labour-intensive but requires more time for screening candidates; its cost-effectiveness depends on the price of advertising for the recruitment. Different methods reach different segments of the population of non-dependent opiate users. It is useful to employ a multi-method approach to reduce selectivity. Copyright 2009 Elsevier B.V. All rights reserved.

  18. Reverse-time migration for subsurface imaging using single- and multi- frequency components

    NASA Astrophysics Data System (ADS)

    Ha, J.; Kim, Y.; Kim, S.; Chung, W.; Shin, S.; Lee, D.

    2017-12-01

    Reverse-time migration is a seismic data processing method for obtaining accurate subsurface structure images from seismic data. This method has been applied to obtain more precise information on complex geological structures, including steep dips, by considering wave propagation characteristics based on two-way traveltime. Recently, various studies have reported the characteristics of datasets acquired from different types of media. In particular, because real subsurface media are composed of various types of structures, seismic data represent a variety of responses. Among them, frequency characteristics can be used as an important indicator for analyzing wave propagation in subsurface structures. All frequency components are utilized in conventional reverse-time migration, but analyzing each component is required because each contains inherent seismic response characteristics. In this study, we propose a reverse-time migration method that utilizes single- and multi-frequency components for subsurface imaging. We performed a spectral decomposition to utilize the characteristics of non-stationary seismic data. We propose two types of imaging conditions, in which decomposed signals are applied in complex and envelope traces. The SEG/EAGE Overthrust model was used to demonstrate the proposed method, and the first-derivative Gaussian function with a 10 Hz cutoff was used as the source signature. The results were more accurate and stable when relatively lower frequency components in the effective frequency range were used. By combining the gradients obtained from various frequency components, we confirmed that the results are clearer than with the conventional method using all frequency components. Further study is required to effectively combine the multi-frequency components.
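
    A minimal sketch of the band-by-band imaging idea is given below: source and receiver wavefield snapshots are band-limited along time, cross-correlated at zero lag, and the partial images are summed. The wavefields here are random stand-ins and the band edges are assumptions; the paper's complex-trace and envelope imaging conditions are not reproduced.

```python
# Sketch of a per-frequency-band cross-correlation imaging condition.
import numpy as np

def bandlimit(wavefield, dt, f_lo, f_hi):
    """Zero-phase band-pass along the time axis (axis 0) via FFT masking."""
    freqs = np.fft.rfftfreq(wavefield.shape[0], d=dt)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    spec = np.fft.rfft(wavefield, axis=0)
    spec[~mask] = 0.0
    return np.fft.irfft(spec, n=wavefield.shape[0], axis=0)

def banded_image(src_wf, rcv_wf, dt, bands):
    """Sum of zero-lag cross-correlation images, one per frequency band."""
    image = np.zeros(src_wf.shape[1:])
    for f_lo, f_hi in bands:
        s = bandlimit(src_wf, dt, f_lo, f_hi)
        r = bandlimit(rcv_wf, dt, f_lo, f_hi)
        image += np.sum(s * r, axis=0)          # zero-lag correlation over time
    return image

# toy usage: nt x nz x nx wavefield snapshots
nt, nz, nx, dt = 500, 60, 80, 0.004
rng = np.random.default_rng(0)
src = rng.standard_normal((nt, nz, nx))
rcv = rng.standard_normal((nt, nz, nx))
img = banded_image(src, rcv, dt, bands=[(2, 6), (6, 12), (12, 25)])
print(img.shape)
```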

  19. Selective mixed-bed solid phase extraction of atrazine herbicide from environmental water samples using molecularly imprinted polymer.

    PubMed

    Zarejousheghani, Mashaalah; Fiedler, Petra; Möder, Monika; Borsdorf, Helko

    2014-11-01

    A novel approach for the selective extraction of organic target compounds from water samples has been developed using a mixed-bed solid phase extraction (mixed-bed SPE) technique. The molecularly imprinted polymer (MIP) particles are embedded in a network of silica gel to form a stable uniform porous bed. The capabilities of this method are demonstrated using atrazine as a model compound. In comparison to conventional molecularly imprinted-solid phase extraction (MISPE), the proposed mixed-bed MISPE method in combination with gas chromatography-mass spectrometry (GC-MS) analysis enables more reproducible and efficient extraction performance. After optimization of operational parameters (polymerization conditions, bed matrix ingredients, polymer to silica gel ratio, pH of the sample solution, breakthrough volume plus washing and elution conditions), improved LODs (1.34 µg L(-1) in comparison to 2.25 µg L(-1) obtained using MISPE) and limits of quantification (4.5 µg L(-1) for mixed-bed MISPE and 7.5 µg L(-1) for MISPE) were observed for the analysis of atrazine. Furthermore, the relative standard deviations (RSDs) for atrazine at concentrations between 5 and 200 µg L(-1) ranged between 1.8% and 6.3% compared to MISPE (3.5-12.1%). Additionally, the column-to-column reproducibility for the mixed-bed MISPE was significantly improved to 16.1%, compared with 53% that was observed for MISPE. Due to the reduced bed-mass sorbent and at optimized conditions, the total amount of organic solvents required for conditioning, washing and elution steps was reduced from more than 25 mL for conventional MISPE to less than 2 mL for mixed-bed MISPE. Besides reduced organic solvent consumption, total sample preparation time of the mixed-bed MISPE method relative to the conventional MISPE was reduced from more than 20 min to less than 10 min. The amount of organic solvent required for complete elution diminished from 3 mL (conventional MISPE) to less than 0.4 mL with the mixed-bed technique, which shows its inherent potential for online operation with an analytical instrument. In order to evaluate the selectivity and matrix effects of the developed mixed-bed MISPE method, it was applied as an extraction technique for atrazine from environmental wastewater and river water samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Computer classification of remotely sensed multispectral image data by extraction and classification of homogeneous objects

    NASA Technical Reports Server (NTRS)

    Kettig, R. L.

    1975-01-01

    A method of classification of digitized multispectral images is developed and experimentally evaluated on actual earth resources data collected by aircraft and satellite. The method is designed to exploit the characteristic dependence between adjacent states of nature that is neglected by the more conventional simple-symmetric decision rule. Thus contextual information is incorporated into the classification scheme. The principal reason for doing this is to improve the accuracy of the classification. For general types of dependence this would generally require more computation per resolution element than the simple-symmetric classifier. But when the dependence occurs in the form of redundancy, the elements can be classified collectively, in groups, thereby reducing the number of classifications required.

  1. X-ray phase contrast tomography by tracking near field speckle

    PubMed Central

    Wang, Hongchang; Berujon, Sebastien; Herzen, Julia; Atwood, Robert; Laundy, David; Hipp, Alexander; Sawhney, Kawal

    2015-01-01

    X-ray imaging techniques that capture variations in the x-ray phase can yield higher contrast images with lower x-ray dose than is possible with conventional absorption radiography. However, the extraction of phase information is often more difficult than the extraction of absorption information and requires a more sophisticated experimental arrangement. We here report a method for three-dimensional (3D) X-ray phase contrast computed tomography (CT) which gives quantitative volumetric information on the real part of the refractive index. The method is based on the recently developed X-ray speckle tracking technique in which the displacement of near field speckle is tracked using a digital image correlation algorithm. In addition to differential phase contrast projection images, the method allows the dark-field images to be simultaneously extracted. After reconstruction, compared to conventional absorption CT images, the 3D phase CT images show greatly enhanced contrast. This new imaging method has advantages compared to other X-ray imaging methods in simplicity of experimental arrangement, speed of measurement and relative insensitivity to beam movements. These features make the technique an attractive candidate for material imaging such as in-vivo imaging of biological systems containing soft tissue. PMID:25735237

  2. A simplified method for power-law modelling of metabolic pathways from time-course data and steady-state flux profiles.

    PubMed

    Kitayama, Tomoya; Kinoshita, Ayako; Sugimoto, Masahiro; Nakayama, Yoichi; Tomita, Masaru

    2006-07-17

    In order to improve understanding of metabolic systems there have been attempts to construct S-system models from time courses. Conventionally, non-linear curve-fitting algorithms have been used for modelling, because of the non-linear properties of parameter estimation from time series. However, the huge iterative calculations required have hindered the development of large-scale metabolic pathway models. To solve this problem we propose a novel method involving power-law modelling of metabolic pathways from the Jacobian of the targeted system and the steady-state flux profiles by linearization of S-systems. The results of two case studies modelling a straight and a branched pathway, respectively, showed that our method reduced the number of unknown parameters needing to be estimated. The time-courses simulated by conventional kinetic models and those described by our method behaved similarly under a wide range of perturbations of metabolite concentrations. The proposed method reduces calculation complexity and facilitates the construction of large-scale S-system models of metabolic pathways, realizing a practical application of reverse engineering of dynamic simulation models from the Jacobian of the targeted system and steady-state flux profiles.
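
    For reference, the power-law (S-system) form the authors linearize is, in standard biochemical systems theory notation (the symbols here are generic, not necessarily the paper's),

```latex
% Standard S-system (power-law) form of a metabolic pathway model.
\frac{dX_i}{dt} \;=\; \alpha_i \prod_{j=1}^{n} X_j^{g_{ij}}
\;-\; \beta_i \prod_{j=1}^{n} X_j^{h_{ij}},
\qquad i = 1,\dots,n.
```

    Linearizing this system about a steady state relates the kinetic orders g_ij and h_ij to the Jacobian and the steady-state fluxes, which is what allows the parameters to be estimated by linear algebra rather than by iterative nonlinear curve fitting.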

  3. [A rapid dialysis method for analysis of artificial sweeteners in food].

    PubMed

    Tahara, Shoichi; Fujiwara, Takushi; Yasui, Akiko; Hayafuji, Chieko; Kobayashi, Chigusa; Uematsu, Yoko

    2014-01-01

    A simple and rapid dialysis method was developed for the extraction and purification of four artificial sweeteners, namely, sodium saccharin (Sa), acesulfame potassium (AK), aspartame (APM), and dulcin (Du), which are present in various foods. Conventional dialysis uses a membrane dialysis tube approximately 15 cm in length and is carried out over many hours owing to the small membrane area and inefficient mixing. In particular, processed cereal products such as cookies required treatment for 48 hours to obtain satisfactory recovery of the compounds. By increasing the tube length to 55 cm and introducing efficient mixing by inversion at half-hour intervals, the dialysis times of the four artificial sweeteners, spiked at 0.1 g/kg in cookies, were shortened to 4 hours. Recovery yields of 88.9-103.2% were obtained by using the improved method, whereas recovery yields were low (65.5-82.0%) by the conventional method. Recovery yields (%) of Sa, AK, APM, and Du, spiked at 0.1 g/kg in various foods, were 91.6-100.1, 93.9-100.1, 86.7-100.0 and 88.7-104.7, respectively, using the improved method.

  4. Label-Free, Flow-Imaging Methods for Determination of Cell Concentration and Viability.

    PubMed

    Sediq, A S; Klem, R; Nejadnik, M R; Meij, P; Jiskoot, Wim

    2018-05-30

    The aim was to investigate the potential of two flow imaging microscopy (FIM) techniques (Micro-Flow Imaging (MFI) and FlowCAM) to determine total cell concentration and cell viability. B-lineage acute lymphoblastic leukemia (B-ALL) cells of 2 different donors were exposed to ambient conditions. Samples were taken at different days and measured with MFI, FlowCAM, hemocytometry and automated cell counting. Dead and live cells from a fresh B-ALL cell suspension were fractionated by flow cytometry in order to derive software filters based on morphological parameters of separate cell populations with MFI and FlowCAM. The filter sets were used to assess cell viability in the measured samples. All techniques gave fairly similar cell concentration values over the whole incubation period. MFI proved superior with respect to precision, whereas FlowCAM provided particle images with a higher resolution. Moreover, both FIM methods were able to provide similar results for cell viability as the conventional methods (hemocytometry and automated cell counting). FIM-based methods may be advantageous over conventional cell methods for determining total cell concentration and cell viability, as FIM measures much larger sample volumes, does not require labeling, is less laborious and provides images of individual cells.

  5. Improved scatter correction using adaptive scatter kernel superposition

    NASA Astrophysics Data System (ADS)

    Sun, M.; Star-Lack, J. M.

    2010-11-01

    Accurate scatter correction is required to produce high-quality reconstructions of x-ray cone-beam computed tomography (CBCT) scans. This paper describes new scatter kernel superposition (SKS) algorithms for deconvolving scatter from projection data. The algorithms are designed to improve upon the conventional approach whose accuracy is limited by the use of symmetric kernels that characterize the scatter properties of uniform slabs. To model scatter transport in more realistic objects, nonstationary kernels, whose shapes adapt to local thickness variations in the projection data, are proposed. Two methods are introduced: (1) adaptive scatter kernel superposition (ASKS) requiring spatial domain convolutions and (2) fast adaptive scatter kernel superposition (fASKS) where, through a linearity approximation, convolution is efficiently performed in Fourier space. The conventional SKS algorithm, ASKS, and fASKS, were tested with Monte Carlo simulations and with phantom data acquired on a table-top CBCT system matching the Varian On-Board Imager (OBI). All three models accounted for scatter point-spread broadening due to object thickening, object edge effects, detector scatter properties and an anti-scatter grid. Hounsfield unit (HU) errors in reconstructions of a large pelvis phantom with a measured maximum scatter-to-primary ratio over 200% were reduced from -90 ± 58 HU (mean ± standard deviation) with no scatter correction to 53 ± 82 HU with SKS, to 19 ± 25 HU with fASKS and to 13 ± 21 HU with ASKS. HU accuracies and measured contrast were similarly improved in reconstructions of a body-sized elliptical Catphan phantom. The results show that the adaptive SKS methods offer significant advantages over the conventional scatter deconvolution technique.
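
    The sketch below shows the basic (stationary) scatter kernel superposition step that the adaptive variants build on: scatter is estimated as an FFT convolution of an amplitude-weighted projection with a broad kernel and subtracted iteratively. The Gaussian kernel, amplitude model, and iteration count are assumptions; the thickness-adaptive kernels of ASKS/fASKS are not reproduced here.

```python
# Sketch of a stationary scatter-kernel-superposition (SKS) correction.
import numpy as np

def gaussian_kernel(shape, sigma_px):
    y, x = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    k = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma_px ** 2))
    return k / k.sum()

def sks_correct(projection, scatter_fraction=0.3, sigma_px=40, n_iter=3):
    """Iteratively estimate scatter by FFT convolution and subtract it."""
    kernel = gaussian_kernel(projection.shape, sigma_px)
    K = np.fft.rfft2(np.fft.ifftshift(kernel))        # kernel centered at the origin
    primary = projection.copy()
    for _ in range(n_iter):
        amplitude = scatter_fraction * primary        # crude scatter amplitude model
        scatter = np.fft.irfft2(np.fft.rfft2(amplitude) * K, s=projection.shape)
        primary = np.clip(projection - scatter, 0, None)
    return primary, scatter

proj = np.random.default_rng(0).random((256, 384)) + 1.0   # toy projection
corrected, scatter_est = sks_correct(proj)
print(corrected.shape, scatter_est.mean())
```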

  6. EpHLA software: a timesaving and accurate tool for improving identification of acceptable mismatches for clinical purposes.

    PubMed

    Filho, Herton Luiz Alves Sales; da Mata Sousa, Luiz Claudio Demes; von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; dos Santos Neto, Pedro de Alcântara; do Nascimento, Ferraz; de Castro, Adail Fonseca; do Nascimento, Liliane Machado; Kneib, Carolina; Bianchi Cazarote, Helena; Mayumi Kitamura, Daniele; Torres, Juliane Roberta Dias; da Cruz Lopes, Laiane; Barros, Aryela Loureiro; da Silva Edlin, Evelin Nildiane; de Moura, Fernanda Sá Leal; Watanabe, Janine Midori Figueiredo; do Monte, Semiramis Jamil Hadad

    2012-06-01

    The HLAMatchmaker algorithm, which allows the identification of “safe” acceptable mismatches (AMMs) for recipients of solid organ and cell allografts, is rarely used in part due to the difficulty in using it in the current Excel format. The automation of this algorithm may universalize its use to benefit the allocation of allografts. Recently, we have developed new software called EpHLA, which is the first computer program automating the use of the HLAMatchmaker algorithm. Herein, we present the experimental validation of the EpHLA program by showing its time efficiency and quality of operation. The same results, obtained by a single antigen bead assay with sera from 10 sensitized patients waiting for kidney transplants, were analyzed either by the conventional HLAMatchmaker method or by the automated EpHLA method. Users testing these two methods were asked to record: (i) time required for completion of the analysis (in minutes); (ii) number of eplets obtained for class I and class II HLA molecules; (iii) categorization of eplets as reactive or non-reactive based on the MFI cutoff value; and (iv) determination of AMMs based on eplets' reactivities. We showed that although both methods had similar accuracy, the automated EpHLA method was over 8 times faster in comparison to the conventional HLAMatchmaker method. In particular, the EpHLA software was faster and more reliable than, and as accurate as, the conventional method for defining AMMs for allografts. The EpHLA software is an accurate and quick method for the identification of AMMs and thus it may be a very useful tool in the decision-making process of organ allocation for highly sensitized patients as well as in many other applications.

  7. 46 CFR 15.701 - Officers Competency Certificates Convention, 1936.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Officers Competency Certificates Convention, 1936. 15... SEAMEN MANNING REQUIREMENTS Limitations and Qualifying Factors § 15.701 Officers Competency Certificates Convention, 1936. (a) This section implements the Officers Competency Certificates Convention, 1936, and...

  8. Thermal discharges and their role in pending power plant regulatory decisions

    NASA Technical Reports Server (NTRS)

    Miller, M. H.

    1978-01-01

    Federal and state laws require the imminent retrofit of offstream condenser cooling to the newer steam electric stations. A waiver can be granted based on sound experimental data demonstrating that existing once-through cooling will not adversely affect aquatic ecosystems. Conventional methods for monitoring thermal plumes, and some remote sensing alternatives, are reviewed, using ongoing work at one Maryland power plant for illustration.

  9. A fast technique for computing syndromes of BCH and RS codes. [deep space network

    NASA Technical Reports Server (NTRS)

    Reed, I. S.; Truong, T. K.; Miller, R. L.

    1979-01-01

    A combination of the Chinese Remainder Theorem and Winograd's algorithm is used to compute transforms of odd length over GF(2^m). Such transforms are used to compute the syndromes needed for decoding BCH and RS codes. The present scheme requires substantially fewer multiplications and additions than the conventional method of computing the syndromes directly.
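
    For context, the conventional computation being accelerated is the direct evaluation of syndromes S_j = r(α^j) over GF(2^m). The sketch below does this over GF(2^4) with exp/log tables; the primitive polynomial and the toy received word are assumptions for illustration.

```python
# Direct syndrome evaluation S_j = r(alpha^j) over GF(2^4) -- the "conventional"
# approach the paper improves on, shown for a toy length-15 received word.
M, PRIM = 4, 0b10011          # GF(2^4), primitive polynomial x^4 + x + 1
N = (1 << M) - 1              # multiplicative group order (15)

EXP, LOG = [0] * N, [0] * (N + 1)   # exp/log tables for GF(2^4)
x = 1
for i in range(N):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & (1 << M):
        x ^= PRIM

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % N]

def syndromes(received, n_syndromes):
    """S_j = r(alpha^j), j = 1..n_syndromes, by Horner's rule (addition is XOR)."""
    out = []
    for j in range(1, n_syndromes + 1):
        aj = EXP[j % N]                 # alpha^j
        s = 0
        for coeff in received:          # highest-degree coefficient first
            s = gf_mul(s, aj) ^ coeff
        out.append(s)
    return out

rx = [1, 0, 3, 7, 0, 2, 9, 0, 0, 1, 4, 0, 0, 0, 5]   # toy received word
print(syndromes(rx, 6))
```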

  10. The Efforts to Improve Mathematics Learning Achievement Results of High School Students as Required by Competency-Based Curriculum and Lesson Level-Based Curriculum

    ERIC Educational Resources Information Center

    Sidabutar, Ropinus

    2016-01-01

    The research aimed to investigate the effect of various innovative teaching models on improving students' achievement in various topics in mathematics. The study conducted an experiment using innovative teaching with contextual, media-based and web-based approaches, which were compared with the conventional teaching method. The result showed the innovation in the…

  11. Monitoring nitrogen deposition in throughfall using ion exchange resin columns: a field test in the San Bernardino Mountains

    Treesearch

    Mark E. Fenn; Mark A. Poth

    2004-01-01

    Conventional throughfall collection methods are labor intensive and analytically expensive to implement at broad scales. This study was conducted to test an alternative approach requiring infrequent sample collection and a greatly reduced number of chemical analyses. The major objective of the study was to determine the feasibility of using ion exchange resin (IER) to...

  12. Handshake with the Dragon: Engaging China in the Biological Weapons Convention

    DTIC Science & Technology

    1998-06-01

    modest pharmaceutical or fermentation industry could easily and cheaply produce BTW. Mass-production methods for growing bacterial cultures that are...widely used in the commercial production of yogurt , yeast, and beer are the same used to make pathogens and toxins.45 These technical developments have...Production Although biological agents can be grown in ordinary laboratory flasks, efficient production requires specialized fermenters . Until

  13. A unique method of retention for gum stripper- a case report.

    PubMed

    Doddamani, Santosh S; T S, Priyanka

    2014-12-01

    Successful restoration of partially edentulous situations, especially Kennedy's Class I, II and IV, requires a range of contemporary and conventional treatment approaches. Semi-precision attachments play a major role in the retention of clinically challenging partially edentulous situations. Attachment-retained partial dentures can be one of the successful treatment options in prosthodontics. This article presents a unique technique for retaining a gum stripper using semi-precision attachments.

  14. Fault Detection of Rotating Machinery using the Spectral Distribution Function

    NASA Technical Reports Server (NTRS)

    Davis, Sanford S.

    1997-01-01

    The spectral distribution function is introduced to characterize the process leading to faults in rotating machinery. It is shown to be a more robust indicator than conventional power spectral density estimates, but requires only slightly more computational effort. The method is illustrated with examples from seeded gearbox transmission faults and an analytical model of a defective bearing. Procedures are suggested for implementation in realistic environments.
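
    A common way to form a spectral distribution function is as the normalized cumulative integral of the power spectral density; the sketch below uses a Welch PSD estimate on a synthetic gear-mesh signal. The signal, sampling rate, and Welch parameters are assumptions, and the paper's exact estimator may differ.

```python
# Sketch of a spectral distribution function: normalized cumulative PSD.
import numpy as np
from scipy.signal import welch

def spectral_distribution(signal, fs, nperseg=1024):
    freqs, psd = welch(signal, fs=fs, nperseg=nperseg)
    sdf = np.cumsum(psd)
    sdf /= sdf[-1]                      # normalize so the distribution ends at 1
    return freqs, sdf

# toy vibration signal: gear-mesh tone plus a weak fault sideband and noise
fs = 10_000
t = np.arange(0, 2.0, 1 / fs)
sig = np.sin(2 * np.pi * 1200 * t) + 0.1 * np.sin(2 * np.pi * 1260 * t)
sig += 0.5 * np.random.default_rng(0).standard_normal(t.size)

freqs, sdf = spectral_distribution(sig, fs)
print(freqs[np.searchsorted(sdf, 0.5)])   # median frequency of the spectrum
```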

  15. Design and Implementation of Viterbi Decoder Using VHDL

    NASA Astrophysics Data System (ADS)

    Thakur, Akash; Chattopadhyay, Manju K.

    2018-03-01

    A digital design of a Viterbi decoder for a rate-1/2 convolutional encoder with constraint length k = 3 is presented in this paper. The design is coded in VHDL, simulated and synthesized using XILINX ISE 14.7. Synthesis results show a maximum operating frequency of 100.725 MHz for the design. The memory requirement is lower than that of the conventional method.
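
    As an algorithmic companion to the hardware design, the sketch below implements hard-decision Viterbi decoding for a rate-1/2, K = 3 code in Python. The generator polynomials (7, 5) in octal are an assumption, since the abstract does not state them.

```python
# Hard-decision Viterbi decoder for a rate-1/2, K=3 convolutional code -- an
# algorithmic sketch of what the paper implements in VHDL.
def encode(bits, g1=0b111, g2=0b101):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                       # register = [input, m1, m2]
        out += [bin(reg & g1).count("1") % 2, bin(reg & g2).count("1") % 2]
        state = (reg >> 1) & 0b11                    # next state = (input, m1)
    return out

def viterbi_decode(rx, n_bits, g1=0b111, g2=0b101):
    INF = 10 ** 9
    metrics = [0] + [INF] * 3                        # path metrics, start in state 0
    paths = [[] for _ in range(4)]
    for k in range(n_bits):
        r = rx[2 * k: 2 * k + 2]
        new_metrics, new_paths = [INF] * 4, [None] * 4
        for s in range(4):
            if metrics[s] == INF:
                continue
            for u in (0, 1):                         # try both input bits
                reg = (u << 2) | s
                c = [bin(reg & g1).count("1") % 2, bin(reg & g2).count("1") % 2]
                ns = (reg >> 1) & 0b11
                m = metrics[s] + (c[0] != r[0]) + (c[1] != r[1])
                if m < new_metrics[ns]:              # add-compare-select
                    new_metrics[ns], new_paths[ns] = m, paths[s] + [u]
        metrics, paths = new_metrics, new_paths
    return paths[metrics.index(min(metrics))]        # survivor with best metric

msg = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(msg)
coded[3] ^= 1                                        # inject one channel bit error
print(viterbi_decode(coded, len(msg)) == msg)        # True: the error is corrected
```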

  16. Slump sitting X-ray of the lumbar spine is superior to the conventional flexion view in assessing lumbar spine instability.

    PubMed

    Hey, Hwee Weng Dennis; Lau, Eugene Tze-Chun; Lim, Joel-Louis; Choong, Denise Ai-Wen; Tan, Chuen-Seng; Liu, Gabriel Ka-Po; Wong, Hee-Kit

    2017-03-01

    Flexion radiographs have been used to identify cases of spinal instability. However, current methods are not standardized and are not sufficiently sensitive or specific to identify instability. This study aimed to introduce a new slump sitting method for performing lumbar spine flexion radiographs and to compare the angular ranges of motion (ROMs) and displacements between the conventional method and this new method. This was a prospective study on the radiological evaluation of lumbar spine flexion ROMs and displacements using dynamic radiographs. Sixty patients were recruited from a single tertiary spine center. Angular and displacement measurements of lumbar spine flexion were carried out. Participants were randomly allocated into two groups: those who did the new method first, followed by the conventional method, versus those who did the conventional method first, followed by the new method. A comparison of the angular and displacement measurements of lumbar spine flexion between the conventional method and the new method was performed and tested for superiority and non-inferiority. The measurements of global lumbar angular ROM were, on average, 17.3° larger (p<.0001) using the new slump sitting method compared with the conventional method. The differences were most significant at the levels of L3-L4, L4-L5, and L5-S1 (p<.0001, p<.0001 and p=.001, respectively). There was no significant difference between the methods when measuring lumbar displacements (p=.814). The new method of slump sitting dynamic radiography was shown to be superior to the conventional method in measuring the angular ROM and non-inferior to the conventional method in the measurement of displacement. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. 77 FR 71501 - International Fisheries; Western and Central Pacific Fisheries for Highly Migratory Species...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ...NMFS issues regulations under the authority of the Western and Central Pacific Fisheries Convention Implementation Act (WCPFC Implementation Act) to implement requirements for U.S. fishing vessels used for commercial fishing that offload or receive transshipments of highly migratory species (HMS), U.S. fishing vessels used for commercial fishing that provide bunkering or other support services to fishing vessels, and U.S. fishing vessels used for commercial fishing that receive bunkering or engage in other support services, in the area of application of the Convention on the Conservation and Management of Highly Migratory Fish Stocks in the Western and Central Pacific Ocean (Convention). Some of the requirements also apply to transshipments of fish caught in the area of application of the Convention (Convention Area) and transshipped elsewhere. NMFS also issues requirements regarding notification of entry into and exit from the ``Eastern High Seas Special Management Area'' (Eastern SMA) and requirements relating to discards from purse seine fishing vessels. This action is necessary for the United States to implement decisions of the Commission for the Conservation and Management of Highly Migratory Fish Stocks in the Western and Central Pacific Ocean (Commission or WCPFC) and to satisfy its obligations under the Convention, to which it is a Contracting Party.

  18. The extraction of essential oil from patchouli leaves (Pogostemon cablin Benth) using microwave hydrodistillation and solvent-free microwave extraction methods

    NASA Astrophysics Data System (ADS)

    Putri, D. K. Y.; Kusuma, H. S.; Syahputra, M. E.; Parasandi, D.; Mahfud, M.

    2017-12-01

    Patchouli (Pogostemon cablin Benth) is one of the important essential oil-producing plants, contributing more than 50% of Indonesia's total essential oil exports. However, the extraction of patchouli oil in Indonesia still generally relies on conventional methods that require an enormous amount of energy, high solvent usage, and long extraction times. Therefore, in this study, patchouli oil extraction was carried out using microwave hydrodistillation and solvent-free microwave extraction methods. Based on this research, extraction of patchouli oil using the microwave hydrodistillation method with a longer extraction time (240 min) produced a patchouli oil yield only 1.2 times greater than that of the solvent-free microwave extraction method, which requires a shorter extraction time (120 min). Moreover, analyses of electricity consumption and environmental impact showed smaller values for the solvent-free microwave extraction method than for microwave hydrodistillation. It is concluded that solvent-free microwave extraction is a suitable new green technique for patchouli oil extraction.

  19. 22 CFR 103.5 - Violations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... WEAPONS CONVENTION AND THE CHEMICAL WEAPONS CONVENTION IMPLEMENTATION ACT OF 1998 ON THE TAKING OF SAMPLES...: (1) To establish or maintain any record required by the CWCIA or the Chemical Weapons Convention...

  20. 22 CFR 103.5 - Violations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... WEAPONS CONVENTION AND THE CHEMICAL WEAPONS CONVENTION IMPLEMENTATION ACT OF 1998 ON THE TAKING OF SAMPLES...: (1) To establish or maintain any record required by the CWCIA or the Chemical Weapons Convention...

  1. 46 CFR 109.103 - Requirements of the International Convention for Safety of Life at Sea, 1974.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Requirements of the International Convention for Safety of Life at Sea, 1974. 109.103 Section 109.103 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS General § 109.103 Requirements of the International...

  2. 46 CFR 109.103 - Requirements of the International Convention for Safety of Life at Sea, 1974.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Requirements of the International Convention for Safety of Life at Sea, 1974. 109.103 Section 109.103 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS General § 109.103 Requirements of the International...

  3. 46 CFR 109.103 - Requirements of the International Convention for Safety of Life at Sea, 1974.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Requirements of the International Convention for Safety of Life at Sea, 1974. 109.103 Section 109.103 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS General § 109.103 Requirements of the International...

  4. 46 CFR 109.103 - Requirements of the International Convention for Safety of Life at Sea, 1974.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Requirements of the International Convention for Safety of Life at Sea, 1974. 109.103 Section 109.103 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS General § 109.103 Requirements of the International...

  5. 46 CFR 109.103 - Requirements of the International Convention for Safety of Life at Sea, 1974.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Requirements of the International Convention for Safety of Life at Sea, 1974. 109.103 Section 109.103 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS General § 109.103 Requirements of the International...

  6. Feasibility and safety of modified inverted T-shaped method using linear stapler with movable cartridge fork for esophagojejunostomy following laparoscopic total gastrectomy

    PubMed Central

    Ohuchida, Kenoki; Moriyama, Taiki; Shindo, Koji; Manabe, Tatsuya; Ohtsuka, Takao; Shimizu, Shuji; Nakamura, Masafumi

    2017-01-01

    Background We previously reported the use of an inverted T-shaped method to obtain a suitable view for hand sewing to close the common entry hole when the linear stapler was fired for esophagojejunostomy after laparoscopic total gastrectomy (LTG). This conventional method involved insertion of the fixed cartridge fork to the Roux limb and the fine movable anvil fork to the esophagus to avoid perforation of the jejunum. However, insertion of the movable anvil fork to the esophagus during this procedure often requires us to strongly push down the main body of the stapler with the fixed cartridge fork to bring the direction of the anvil fork in line with the direction of the long axis of the esophagus while controlling the opening of the movable anvil fork. We therefore modified this complicated inverted T-shaped method using a linear stapler with a movable cartridge fork. This modified method involved insertion of the movable cartridge fork into the Roux limb followed by natural, easy insertion of the fixed anvil fork into the esophagus without controlling the opening of the movable cartridge fork. Methods We performed LTG in a total of 155 consecutive patients with gastric cancer from November 2007 to December 2015 in Kyushu University Hospital. After LTG, we performed the conventional inverted T-shaped method using a linear stapler with a fixed cartridge fork in 61 patients from November 2007 to July 2011 (fixed cartridge group). From August 2011, we used a linear stapler with a movable cartridge fork and performed the modified inverted T-shaped method in 94 patients (movable cartridge group). We herein compare the short-term outcomes in 94 cases of LTG using the modified method (movable cartridge fork) with those in 61 cases using the conventional method (fixed cartridge fork). Results We found no significant differences in the perioperative or postoperative events between the movable and fixed cartridge groups. One case of anastomotic leakage occurred in the fixed cartridge group, but no anastomotic leakage occurred in the movable cartridge group. Conclusions Although there were no remarkable differences in the short-term outcomes between the movable and fixed cartridge groups, we believe that the modified inverted T-shaped method is technically more feasible and reliable than the conventional method and will contribute to the improved safety of LTG. PMID:28616606

  7. Conventional engine technology. Volume 3: Comparisons and future potential

    NASA Technical Reports Server (NTRS)

    Dowdy, M. W.

    1981-01-01

    The status of five conventional automobile engine technologies was assessed and the future potential for increasing fuel economy and reducing exhaust emissions was discussed, using the 1980 EPA California emissions standards as a comparative basis. By 1986, the fuel economy of a uniform charge Otto engine with a three-way catalyst is expected to increase 10%, while vehicles with lean burn (fast burn) engines should show a 20% fuel economy increase. Although vehicles with stratified-charge engines and rotary engines are expected to improve, their fuel economy will remain inferior to the other engine types. When adequate NO emissions control methods are implemented to meet the EPA requirements, vehicles with prechamber diesel engines are expected to yield a fuel economy advantage of about 15%. While successful introduction of direct injection diesel engine technology will provide a fuel savings of 30 to 35%, the planned regulation of exhaust particulates could seriously hinder this technology, because it is expected that only the smallest diesel engine vehicles could meet the proposed particulate requirements.

  8. Historical perspective: The pros and cons of conventional outcome measures in Parkinson's disease.

    PubMed

    Lim, Shen-Yang; Tan, Ai Huey

    2018-01-01

    Conventional outcome measures (COMs) in Parkinson's disease (PD) refer to rating scales, questionnaires, patient diaries and clinically-based tests that do not require specialized equipment. It is timely at this juncture - as clinicians and researchers begin to grapple with the "invasion" of digital technologies - to review the strengths and weaknesses of these outcome measures. This paper discusses advances (including an enhanced understanding of PD itself, and the development of clinimetrics as a field) that have led to improvements in the COMs used in PD; their strengths and limitations; and factors to consider when selecting and using a measuring instrument. It is envisaged that in the future, a combination of COMs and technology-based objective measures will be utilized, with different methods having their own strengths and weaknesses. Judgement is required on the part of the clinician and researcher in terms of which instrument(s) are appropriate to use, depending on the particular clinical or research setting or question. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. On the path to fusion energy

    NASA Astrophysics Data System (ADS)

    Tabak, M.

    2016-10-01

    There is a need to develop alternative energy sources in the coming century because fossil fuels will become depleted and their use may lead to global climate change. Inertial fusion can become such an energy source, but significant progress must be made before its promise is realized. The high-density approach to inertial fusion suggested by Nuckolls et al. leads to reaction chambers compatible with civilian power production. Methods to achieve the good control of hydrodynamic stability and implosion symmetry required to achieve these high fuel densities will be discussed. Fast Ignition, a technique that achieves fusion ignition by igniting fusion fuel after it is assembled, will be described along with its gain curves. Fusion costs of energy for conventional hotspot ignition will be compared with those of Fast Ignition, and their capital costs compared with advanced fission plants. Finally, techniques that may improve possible Fast Ignition gains by an order of magnitude and reduce driver scales by an order of magnitude below conventional ignition requirements are described.

  10. V/STOL aircraft and method

    DOEpatents

    Owens, Phillip R.

    1997-01-01

    Aircraft apparatus and method capable of V/STOL (vertical, short takeoff and landing) in addition to conventional flight. For V/STOL operation, induced lift is provided by blowing air over the upper surface of each wing through a duct installed near the leading edge. Intake air is supplied to the blowing fan through a duct installed near the trailing edge, thus providing suction as well as blowing. Two fans in series are required. The engine provides power not only to the propeller but also to a transmission which provides power to the pulleys driving the belt-driven fans.

  11. Advances in biological dosimetry

    NASA Astrophysics Data System (ADS)

    Ivashkevich, A.; Ohnesorg, T.; Sparbier, C. E.; Elsaleh, H.

    2017-01-01

    Rapid retrospective biodosimetry methods are essential for the fast triage of persons occupationally or accidentally exposed to ionizing radiation. Identification and detection of a radiation specific molecular ‘footprint’ should provide a sensitive and reliable measurement of radiation exposure. Here we discuss conventional (cytogenetic) methods of detection and assessment of radiation exposure in comparison to emerging approaches such as gene expression signatures and DNA damage markers. Furthermore, we provide an overview of technical and logistic details such as type of sample required, time for sample preparation and analysis, ease of use and potential for a high throughput analysis.

  12. Necessity of purification during bacterial DNA extraction with environmental soils

    PubMed Central

    Choi, Jung-Hyun

    2017-01-01

    Complexity and heterogeneity of soil samples have often implied the inclusion of purification steps in conventional DNA extraction for polymerase chain reaction (PCR) assays. Unfortunately the purification steps are also time and labor intensive. Therefore the necessity of DNA purification was re-visited and investigated for a variety of environmental soil samples that contained various amounts of PCR inhibitors. Bead beating and centrifugation was used as the baseline (without purification) method for DNA extraction. Its performance was compared with that of a conventional DNA extraction kit (with purification). The necessity criteria for DNA purification were established with environmental soil samples. Using lysis conditions at 3000 rpm for 3 minutes with 0.1 mm glass beads, a centrifugation time of 10 minutes and a 1:10 dilution ratio, the baseline method outperformed conventional DNA extraction on cell-seeded sand samples. Further investigation with PCR inhibitors (i.e., humic acids, clay, and magnesium [Mg]) showed that sand samples containing less than 10 μg/g humic acids and 70% clay may not require purification. Interestingly, the inhibition pattern of Mg ion was different from other inhibitors due to the complexation interaction of Mg ion with DNA fragments. It was concluded that the DNA extraction method without purification is suitable for soil samples that have less than 10 μg/g of humic acids, less than 70% clay content and less than 0.01% Mg ion content. PMID:28793754

  13. PubMed Central

    Schulz-Wendtland, Rüdiger; Jud, Sebastian M.; Fasching, Peter A.; Hartmann, Arndt; Radicke, Marcus; Rauh, Claudia; Uder, Michael; Wunderle, Marius; Gass, Paul; Langemann, Hanna; Beckmann, Matthias W.; Emons, Julius

    2017-01-01

    Aim The combination of different imaging modalities through the use of fusion devices promises significant diagnostic improvement for breast pathology. The aim of this study was to evaluate image quality and clinical feasibility of a prototype fusion device (fusion prototype) constructed from a standard tomosynthesis mammography unit and a standard 3D ultrasound probe using a new method of breast compression. Materials and Methods Imaging was performed on 5 mastectomy specimens from patients with confirmed DCIS or invasive carcinoma (BI-RADS ™ 6). For the preclinical fusion prototype an ABVS system ultrasound probe from an Acuson S2000 was integrated into a MAMMOMAT Inspiration (both Siemens Healthcare Ltd) and, with the aid of a newly developed compression plate, digital mammogram and automated 3D ultrasound images were obtained. Results The quality of digital mammogram images produced by the fusion prototype was comparable to those produced using conventional compression. The newly developed compression plate did not influence the applied x-ray dose. The method was not more labour intensive or time-consuming than conventional mammography. From the technical perspective, fusion of the two modalities was achievable. Conclusion In this study, using only a few mastectomy specimens, the fusion of an automated 3D ultrasound machine with a standard mammography unit delivered images of comparable quality to conventional mammography. The device allows simultaneous ultrasound – the second important imaging modality in complementary breast diagnostics – without increasing examination time or requiring additional staff. PMID:28713173

  14. Simple Carotid-Sparing Intensity-Modulated Radiotherapy Technique and Preliminary Experience for T1-2 Glottic Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenthal, David I., E-mail: dirosenthal@mdanderson.or; Fuller, Clifton D.; Barker, Jerry L.

    2010-06-01

    Purpose: To investigate the dosimetry and feasibility of carotid-sparing intensity-modulated radiotherapy (IMRT) for early glottic cancer and to report preliminary clinical experience. Methods and Materials: Digital Imaging and Communications in Medicine radiotherapy (DICOM-RT) datasets from 6 T1-2 conventionally treated glottic cancer patients were used to create both conventional and IMRT plans. We developed a simplified IMRT planning algorithm with three fields and limited segments. Conventional and IMRT plans were compared using generalized equivalent uniform dose and dose-volume parameters for in-field carotid arteries, target volumes, and organs at risk. We have treated 11 patients with this simplified IMRT technique. Results: Intensity-modulated radiotherapy consistently reduced radiation dose to the carotid arteries (p < 0.05) while maintaining the clinical target volume coverage. With conventional planning, median carotid V35, V50, and V63 were 100%, 100%, and 69.0%, respectively. With IMRT planning these decreased to 2%, 0%, and 0%, respectively (p < 0.01). Radiation planning and treatment times were similar for conventional radiotherapy and IMRT. Treatment results have been excellent thus far. Conclusions: Intensity-modulated radiotherapy significantly reduced unnecessary radiation dose to the carotid arteries compared with conventional lateral fields while maintaining clinical target volume coverage. Further experience and longer follow-up will be required to demonstrate outcomes for cancer control and carotid artery effects.
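
    For reference, the generalized equivalent uniform dose used to compare the plans is commonly computed in the Niemierko form shown below; the volume-effect parameter a is tissue specific and is not given in the abstract.

```latex
% Generalized equivalent uniform dose (Niemierko form); v_i is the fractional
% volume receiving dose D_i and a is a tissue-specific parameter.
\mathrm{gEUD} \;=\; \left( \sum_i v_i \, D_i^{\,a} \right)^{1/a}
```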

  15. 22 CFR 103.3 - Requirement to provide a sample.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... IMPLEMENTATION OF THE CHEMICAL WEAPONS CONVENTION AND THE CHEMICAL WEAPONS CONVENTION IMPLEMENTATION ACT OF 1998... accordance with the applicable provisions contained in the Chemical Weapons Convention and the CWCIA. (d...

  16. 22 CFR 103.3 - Requirement to provide a sample.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... IMPLEMENTATION OF THE CHEMICAL WEAPONS CONVENTION AND THE CHEMICAL WEAPONS CONVENTION IMPLEMENTATION ACT OF 1998... accordance with the applicable provisions contained in the Chemical Weapons Convention and the CWCIA. (d...

  17. Migration without migraines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, L.; Burton, A.; Lu, H.X.

    Accurate velocity models are a necessity for reliable migration results. Velocity analysis generally involves the use of methods such as normal moveout analysis (NMO), seismic traveltime tomography, or iterative prestack migration. These techniques can be effective, and each has its own advantage or disadvantage. Conventional NMO methods are relatively inexpensive but basically require simplifying assumptions about geology. Tomography is a more general method but requires traveltime interpretation of prestack data. Iterative prestack depth migration is very general but is computationally expensive. In some cases, there is the opportunity to estimate vertical velocities by use of well information. The well information can be used to optimize poststack migrations, thereby eliminating some of the time and expense of iterative prestack migration. The optimized poststack migration procedure defined here computes the velocity model which minimizes the depth differences between seismic images and formation depths at the well by using a least squares inversion method. The optimization methods described in this paper will hopefully produce "migrations without migraines."
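
    One way to write the well-tie update described above is as a linearized least-squares problem in the velocity perturbation; the notation below is illustrative rather than the authors'.

```latex
% Linearized least-squares fit of migrated depths to formation tops at the well;
% z^mig are migrated depths, z^well are well markers, and J is the sensitivity.
\min_{\delta v}\; \sum_{k}\Bigl( z^{\mathrm{mig}}_{k}(v)
  + \frac{\partial z^{\mathrm{mig}}_{k}}{\partial v}\,\delta v
  - z^{\mathrm{well}}_{k} \Bigr)^{2}
\;\;\Longrightarrow\;\;
\delta v = \bigl(J^{\mathsf T} J\bigr)^{-1} J^{\mathsf T}
           \bigl( z^{\mathrm{well}} - z^{\mathrm{mig}}(v) \bigr),
\qquad J_{k} = \frac{\partial z^{\mathrm{mig}}_{k}}{\partial v}.
```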

  18. Evaluation of a transfinite element numerical solution method for nonlinear heat transfer problems

    NASA Technical Reports Server (NTRS)

    Cerro, J. A.; Scotti, S. J.

    1991-01-01

    Laplace transform techniques have been widely used to solve linear, transient field problems. A transform-based algorithm enables calculation of the response at selected times of interest without the need for stepping in time as required by conventional time integration schemes. The elimination of time stepping can substantially reduce computer time when transform techniques are implemented in a numerical finite element program. The coupling of transform techniques with spatial discretization techniques such as the finite element method has resulted in what are known as transfinite element methods. Recently attempts have been made to extend the transfinite element method to solve nonlinear, transient field problems. This paper examines the theoretical basis and numerical implementation of one such algorithm, applied to nonlinear heat transfer problems. The problem is linearized and solved by requiring a numerical iteration at selected times of interest. While shown to be acceptable for weakly nonlinear problems, this algorithm is ineffective as a general nonlinear solution method.

  19. Advanced Testing Method for Ground Thermal Conductivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xiaobing; Clemenzi, Rick; Liu, Su

    A new method is developed that can quickly and more accurately determine the effective ground thermal conductivity (GTC) based on thermal response test (TRT) results. Ground thermal conductivity is an important parameter for sizing ground heat exchangers (GHEXs) used by geothermal heat pump systems. The conventional GTC test method usually requires a TRT for 48 hours with a very stable electric power supply throughout the entire test. In contrast, the new method reduces the required test time by 40%–60% or more, and it can determine GTC even with an unstable or intermittent power supply. Consequently, it can significantly reduce the cost of GTC testing and increase its use, which will enable optimal design of geothermal heat pump systems. Further, this new method provides more information about the thermal properties of the GHEX and the ground than previous techniques. It can verify the installation quality of GHEXs and has the potential, if developed, to characterize the heterogeneous thermal properties of the ground formation surrounding the GHEXs.
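
    For contrast with the new method, the conventional 48-hour analysis mentioned above is typically an infinite line-source fit: the mean fluid temperature grows linearly in ln(t) at late times, and the slope gives the conductivity. The sketch below (synthetic data, assumed heat rate and borehole length) shows that calculation; it is not the accelerated method of the record.

    ```python
    import numpy as np

    Q = 6000.0    # constant heat injection rate, W (assumed)
    L = 150.0     # ground heat exchanger length, m (assumed)

    # Synthetic late-time TRT data: mean fluid temperature vs. time.
    t_s = np.linspace(10, 48, 200) * 3600.0
    T_mean = 18.0 + 1.9 * np.log(t_s)

    # Infinite line-source model: T_mean ~ m*ln(t) + b, and k = Q / (4*pi*L*m).
    m, b = np.polyfit(np.log(t_s), T_mean, 1)
    k = Q / (4.0 * np.pi * L * m)
    print(f"slope = {m:.2f} K per ln(s), effective conductivity k = {k:.2f} W/(m*K)")
    ```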

  20. Linear chirp phase perturbing approach for finding binary phased codes

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2017-05-01

    Binary phased codes have many applications in communication and radar systems. These applications require binary phased codes to have low sidelobes in order to reduce interference and false detection. Barker codes satisfy these requirements and have the lowest maximum sidelobes. However, Barker codes have very limited code lengths (equal to or less than 13), while many applications, including low probability of intercept radar and spread spectrum communication, require much longer code lengths. The conventional techniques for finding binary phased codes in the literature include exhaustive search, neural networks, and evolutionary methods, and they all require very expensive computation for large code lengths. Therefore these techniques are limited to finding binary phased codes with small code lengths (less than 100). In this paper, by analyzing Barker code, linear chirp, and P3 phases, we propose a new approach to find binary codes. Experiments show that the proposed method is able to find long low-sidelobe binary phased codes (code length >500) with reasonable computational cost.
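
    The sidelobe criterion that such searches optimize can be checked directly from the aperiodic autocorrelation; the short sketch below verifies the length-13 Barker code's unit sidelobes with NumPy. It illustrates the figure of merit only, not the chirp-perturbation search proposed in the record.

    ```python
    import numpy as np

    # Length-13 Barker code: every aperiodic autocorrelation sidelobe has
    # magnitude 1, giving a peak-to-sidelobe ratio of 13.
    barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

    def mainlobe_and_peak_sidelobe(code):
        acf = np.correlate(code, code, mode="full")
        main = acf[len(code) - 1]                 # zero-lag value = code length
        side = np.max(np.abs(np.delete(acf, len(code) - 1)))
        return main, side

    main, side = mainlobe_and_peak_sidelobe(barker13)
    print(f"mainlobe = {main}, peak sidelobe = {side}")   # 13 and 1
    ```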

  1. Feasibility and safety of modified inverted T-shaped method using linear stapler with movable cartridge fork for esophagojejunostomy following laparoscopic total gastrectomy.

    PubMed

    Ohuchida, Kenoki; Nagai, Eishi; Moriyama, Taiki; Shindo, Koji; Manabe, Tatsuya; Ohtsuka, Takao; Shimizu, Shuji; Nakamura, Masafumi

    2017-01-01

    We previously reported the use of an inverted T-shaped method to obtain a suitable view for hand sewing to close the common entry hole when the linear stapler was fired for esophagojejunostomy after laparoscopic total gastrectomy (LTG). This conventional method involved insertion of the fixed cartridge fork to the Roux limb and the fine movable anvil fork to the esophagus to avoid perforation of the jejunum. However, insertion of the movable anvil fork to the esophagus during this procedure often requires us to strongly push down the main body of the stapler with the fixed cartridge fork to bring the direction of the anvil fork in line with the direction of the long axis of the esophagus while controlling the opening of the movable anvil fork. We therefore modified this complicated inverted T-shaped method using a linear stapler with a movable cartridge fork. This modified method involved insertion of the movable cartridge fork into the Roux limb followed by natural, easy insertion of the fixed anvil fork into the esophagus without controlling the opening of the movable cartridge fork. We performed LTG in a total of 155 consecutive patients with gastric cancer from November 2007 to December 2015 in Kyushu University Hospital. After LTG, we performed the conventional inverted T-shaped method using a linear stapler with a fixed cartridge fork in 61 patients from November 2007 to July 2011 (fixed cartridge group). From August 2011, we used a linear stapler with a movable cartridge fork and performed the modified inverted T-shaped method in 94 patients (movable cartridge group). We herein compare the short-term outcomes in 94 cases of LTG using the modified method (movable cartridge fork) with those in 61 cases using the conventional method (fixed cartridge fork). We found no significant differences in the perioperative or postoperative events between the movable and fixed cartridge groups. One case of anastomotic leakage occurred in the fixed cartridge group, but no anastomotic leakage occurred in the movable cartridge group. Although there were no remarkable differences in the short-term outcomes between the movable and fixed cartridge groups, we believe that the modified inverted T-shaped method is technically more feasible and reliable than the conventional method and will contribute to the improved safety of LTG.

  2. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    DOE PAGES

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; ...

    2016-11-25

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
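
    A minimal sketch of the binning step described above, assuming hypothetical array sizes: each chromatogram's mass spectra are summed within evenly spaced retention-time bins and the (bin, m/z) axes are unfolded into columns, so that an ordinary two-dimensional factorization (PMF, or NMF as a stand-in) can operate on the inherently three-dimensional TAG data.

    ```python
    import numpy as np

    # Hypothetical dimensions: samples x scans x m/z channels.
    n_samples, n_scans, n_mz, bin_width = 30, 1200, 250, 60
    chromatograms = np.random.rand(n_samples, n_scans, n_mz)  # stand-in data

    # Sum mass spectra within evenly spaced retention-time bins, then unfold
    # (bin, m/z) into a single column dimension for two-dimensional PMF.
    n_bins = n_scans // bin_width
    binned = chromatograms[:, :n_bins * bin_width, :].reshape(
        n_samples, n_bins, bin_width, n_mz).sum(axis=2)
    pmf_input = binned.reshape(n_samples, n_bins * n_mz)

    print(pmf_input.shape)   # (30, 5000): rows = samples, columns = bin x m/z
    ```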

  3. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    NASA Astrophysics Data System (ADS)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; Docherty, Kenneth S.; Jimenez, Jose L.

    2016-11-01

    We present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography-mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.

  4. High-precision Non-Contact Measurement of Creep of Ultra-High Temperature Materials for Aerospace

    NASA Technical Reports Server (NTRS)

    Rogers, Jan R.; Hyers, Robert

    2008-01-01

    For high-temperature applications (greater than 2,000 C) such as solid rocket motors, hypersonic aircraft, nuclear electric/thermal propulsion for spacecraft, and more efficient jet engines, creep becomes one of the most important design factors to be considered. Conventional creep-testing methods, where the specimen and test apparatus are in contact with each other, are limited to temperatures approximately 1,700 C. Development of alloys for higher-temperature applications is limited by the availability of testing methods at temperatures above 2000 C. Development of alloys for applications requiring a long service life at temperatures as low as 1500 C, such as the next generation of jet turbine superalloys, is limited by the difficulty of accelerated testing at temperatures above 1700 C. For these reasons, a new, non-contact creep-measurement technique is needed for higher temperature applications. A new non-contact method for creep measurements of ultra-high-temperature metals and ceramics has been developed and validated. Using the electrostatic levitation (ESL) facility at NASA Marshall Space Flight Center, a spherical sample is rotated quickly enough to cause creep deformation due to centrifugal acceleration. Very accurate measurement of the deformed shape through digital image analysis allows the stress exponent n to be determined very precisely from a single test, rather than from numerous conventional tests. Validation tests on single-crystal niobium spheres showed excellent agreement with conventional tests at 1985 C; however the non-contact method provides much greater precision while using only about 40 milligrams of material. This method is being applied to materials including metals and ceramics for non-eroding throats in solid rockets and next-generation superalloys for turbine engines. Recent advances in the method and the current state of these new measurements will be presented.

  5. Analysis of International Space Station Materials on MISSE-3 and MISSE-4

    NASA Technical Reports Server (NTRS)

    Finckenor, Miria M.; Golden, Johnny L.; O'Rourke, Mary Jane

    2008-01-01

    For high-temperature applications (> 2,000 C) such as solid rocket motors, hypersonic aircraft, nuclear electric/thermal propulsion for spacecraft, and more efficient jet engines, creep becomes one of the most important design factors to be considered. Conventional creep-testing methods, where the specimen and test apparatus are in contact with each other, are limited to temperatures of approximately 1,700 C. Development of alloys for higher-temperature applications is limited by the availability of testing methods at temperatures above 2000 C. Development of alloys for applications requiring a long service life at temperatures as low as 1500 C, such as the next generation of jet turbine superalloys, is limited by the difficulty of accelerated testing at temperatures above 1700 C. For these reasons, a new, non-contact creep-measurement technique is needed for higher temperature applications. A new non-contact method for creep measurements of ultra-high-temperature metals and ceramics has been developed and validated. Using the electrostatic levitation (ESL) facility at NASA Marshall Space Flight Center, a spherical sample is rotated quickly enough to cause creep deformation due to centrifugal acceleration. Very accurate measurement of the deformed shape through digital image analysis allows the stress exponent n to be determined very precisely from a single test, rather than from numerous conventional tests. Validation tests on single-crystal niobium spheres showed excellent agreement with conventional tests at 1985 C; however the non-contact method provides much greater precision while using only about 40 milligrams of material. This method is being applied to materials including metals and ceramics for non-eroding throats in solid rockets and next-generation superalloys for turbine engines. Recent advances in the method and the current state of these new measurements will be presented.

  6. Super-resolution Doppler beam sharpening method using fast iterative adaptive approach-based spectral estimation

    NASA Astrophysics Data System (ADS)

    Mao, Deqing; Zhang, Yin; Zhang, Yongchao; Huang, Yulin; Yang, Jianyu

    2018-01-01

    Doppler beam sharpening (DBS) is a critical technology for airborne radar ground mapping in the forward-squint region. In conventional DBS technology, the narrow-band Doppler filter groups formed by the fast Fourier transform (FFT) method suffer from low spectral resolution and high side lobe levels. The iterative adaptive approach (IAA), based on weighted least squares (WLS), is applied to DBS imaging applications, forming narrower Doppler filter groups than the FFT with lower side lobe levels. Regrettably, the IAA is iterative and requires matrix multiplication and inversion when forming the covariance matrix and its inverse and when traversing the WLS estimate for each sampling point, resulting in notably high computational complexity (cubic time). We propose a fast IAA (FIAA)-based super-resolution DBS imaging method, taking advantage of the rich matrix structures of classical narrow-band filtering. First, we formulate the covariance matrix via the FFT instead of the conventional matrix multiplication operation, based on the typical Fourier structure of the steering matrix. Then, by exploiting the Gohberg-Semencul representation, the inverse of the Toeplitz covariance matrix is computed by the celebrated Levinson-Durbin (LD) and Toeplitz-vector algorithms. Finally, the FFT and the fast Toeplitz-vector algorithm are further used to traverse the WLS estimates based on the data-dependent trigonometric polynomials. The method uses the Hermitian structure of the echo autocorrelation matrix R to achieve a fast solution and uses the Toeplitz structure of R to realize its fast inversion. The proposed method enjoys a lower computational complexity without performance loss compared with the conventional IAA-based super-resolution DBS imaging method. The results based on simulations and measured data verify the imaging performance and the operational efficiency.
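
    The record's key computational point is that a Hermitian Toeplitz covariance can be solved against without ever forming its dense inverse. The sketch below shows only that ingredient, using SciPy's Levinson-recursion solver on a toy complex Toeplitz system; it is not the full FIAA with FFT-based covariance formation and the Gohberg-Semencul representation.

    ```python
    import numpy as np
    from scipy.linalg import solve_toeplitz, toeplitz

    # Toy Hermitian Toeplitz "covariance": only its first column is needed.
    n = 64
    first_col = 0.95 ** np.arange(n) + 0j     # assumed correlation sequence
    first_col[0] += 1e-2                      # diagonal loading
    rhs = np.random.randn(n) + 1j * np.random.randn(n)

    # Levinson-type solve in O(n^2) instead of dense inversion in O(n^3).
    x_fast = solve_toeplitz((first_col, first_col.conj()), rhs)

    # Dense reference solution for comparison.
    x_ref = np.linalg.solve(toeplitz(first_col, first_col.conj()), rhs)
    print(np.allclose(x_fast, x_ref))   # True
    ```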

  7. Radon-domain interferometric interpolation for reconstruction of the near-offset gap in marine seismic data

    NASA Astrophysics Data System (ADS)

    Xu, Zhuo; Sopher, Daniel; Juhlin, Christopher; Han, Liguo; Gong, Xiangbo

    2018-04-01

    In towed marine seismic data acquisition, a gap between the source and the nearest recording channel is typical. Therefore, extrapolation of the missing near-offset traces is often required to avoid unwanted effects in subsequent data processing steps. However, most existing interpolation methods perform poorly when extrapolating traces. Interferometric interpolation is one particular method that has been developed for filling in trace gaps in shot gathers. Interferometry-type interpolation methods differ from conventional interpolation methods as they utilize information from several adjacent shot records to fill in the missing traces. In this study, we aim to improve upon the results generated by conventional time-space domain interferometric interpolation by performing interferometric interpolation in the Radon domain, in order to overcome the effects of irregular data sampling and limited source-receiver aperture. We apply both time-space and Radon-domain interferometric interpolation methods to the Sigsbee2B synthetic dataset and a real towed marine dataset from the Baltic Sea, with the primary aim to improve the image of the seabed through extrapolation into the near-offset gap. Radon-domain interferometric interpolation performs better at interpolating the missing near-offset traces than conventional interferometric interpolation when applied to data with irregular geometry and limited source-receiver aperture. We also compare the interferometric interpolated results with those obtained using solely Radon transform (RT) based interpolation and show that interferometry-type interpolation performs better than solely RT-based interpolation when extrapolating the missing near-offset traces. After data processing, we show that the image of the seabed is improved by performing interferometry-type interpolation, especially when Radon-domain interferometric interpolation is applied.

  8. How to approach ballast water management in European seas

    NASA Astrophysics Data System (ADS)

    David, Matej; Gollasch, Stephan

    2018-02-01

    The latest research continues to show that the ballast water issue is very complex, which makes it very challenging to manage. In 2004, the International Convention for the Control and Management of Ships' Ballast Water and Sediments (BWM Convention) was adopted to globally harmonize action against the transfer of harmful aquatic organisms and pathogens via ships' ballast water and related sediments. Analyses of the BWM Convention requirements, conducted through different research projects mainly aiming to provide support for the implementation of the BWM Convention, have shown that there are different steps countries need to take and that there are still some open issues which need to be solved. This paper presents some of the main issues identified and the core theoretical and applied measures required to solve these issues, with the aim to support more efficient and coordinated implementation of the BWM Convention requirements in EU seas. The approaches recommended here for the EU may be universally interesting for similar application in other areas of the world.

  9. Knowledge-based system V and V in the Space Station Freedom program

    NASA Technical Reports Server (NTRS)

    Kelley, Keith; Hamilton, David; Culbert, Chris

    1992-01-01

    Knowledge Based Systems (KBS's) are expected to be heavily used in the Space Station Freedom Program (SSFP). Although SSFP Verification and Validation (V&V) requirements are based on the latest state-of-the-practice in software engineering technology, they may be insufficient for Knowledge Based Systems (KBS's); it is widely stated that there are differences in both approach and execution between KBS V&V and conventional software V&V. In order to better understand this issue, we have surveyed and/or interviewed developers from sixty expert system projects in order to understand the differences and difficulties in KBS V&V. We have used these survey results to analyze the SSFP V&V requirements for conventional software in order to determine which specific requirements are inappropriate for KBS V&V and why they are inappropriate. Further work will result in a set of recommendations that can be used either as guidelines for applying conventional software V&V requirements to KBS's or as modifications to extend the existing SSFP conventional software V&V requirements to include KBS requirements. The results of this work are significant to many projects, in addition to SSFP, which will involve KBS's.

  10. [Human resources requirements for diabetic patients healthcare in primary care clinics of the Mexican Institute of Social Security].

    PubMed

    Doubova, Svetlana V; Ramírez-Sánchez, Claudine; Figueroa-Lara, Alejandro; Pérez-Cuevas, Ricardo

    2013-12-01

    To estimate the requirements of human resources (HR) of two models of care for diabetes patients: conventional and specific, also called DiabetIMSS, which are provided in primary care clinics of the Mexican Institute of Social Security (IMSS). An evaluative research was conducted. An expert group identified the HR activities and time required to provide healthcare consistent with the best clinical practices for diabetic patients. HR were estimated by using the evidence-based adjusted service target approach for health workforce planning; then, comparisons between existing and estimated HRs were made. To provide healthcare in accordance with the patients' metabolic control, the conventional model required increasing the number of family doctors (1.2 times), nutritionists (4.2 times), and social workers (4.1 times). The DiabetIMSS model requires a greater increase than the conventional model. Increasing HR is required to provide evidence-based healthcare to diabetes patients.

  11. Fast and accurate preparation of fatty acid methyl esters by microwave-assisted derivatization in the yeast Saccharomyces cerevisiae.

    PubMed

    Khoomrung, Sakda; Chumnanpuen, Pramote; Jansa-ard, Suwanee; Nookaew, Intawat; Nielsen, Jens

    2012-06-01

    We present a fast and accurate method for preparation of fatty acid methyl esters (FAMEs) using microwave-assisted derivatization of fatty acids present in yeast samples. The esterification of free/bound fatty acids to FAMEs was completed within 5 min, which is 24 times faster than with conventional heating methods. The developed method was validated in two ways: (1) through comparison with a conventional method (hot plate) and (2) through validation with the standard reference material (SRM) 3275-2 omega-3 and omega-6 fatty acids in fish oil (from the National Institute of Standards and Technology, USA). There were no significant differences (P>0.05) in yields of FAMEs with both validations. By performing a simple modification of closed-vessel microwave heating, it was possible to carry out the esterification in Pyrex glass tubes kept inside the closed vessel. Hereby, we are able to increase the number of sample preparations to several hundred samples per day as the time for preparation of reused vessels is eliminated. Pretreatment cell disruption steps are not required, since the direct FAME preparation provides equally quantitative results. The new microwave-assisted derivatization method facilitates the preparation of FAMEs directly from yeast cells, but the method is likely to also be applicable for other biological samples.

  12. Quality evaluation of cook-chilled chicory stems (Cichorium intybus L., Catalogna group) by conventional and sous vide cooking methods.

    PubMed

    Renna, Massimiliano; Gonnella, Maria; Giannino, Donato; Santamaria, Pietro

    2014-03-15

    Chicory stems, appreciated both raw and cooked, represent a nutritious and refined food. In this study the effects on the quality of stems cooked by conventional (boiling, steaming and microwaving) and innovative (sous vide) methods were analysed. Several physical, chemical and sensory traits were compared using two local varieties (Galatina and Molfettese) of southern Italy (Puglia region). Independently of the variety, the sous vide method did not significantly affect (redness, yellowness and hue angle) or had the least impact on (lightness and total colour difference) quality parameters among the four methods as compared with the raw product. Following sensory analysis, the sous vide product always showed the highest score among the cooking methods. Moreover, this innovative method did not affect total phenol (TP) content and antioxidant activity (AA) compared with uncooked stems of both varieties. Microwaving increased TP content and AA (though associated with higher weight loss), while different responses depending on the chicory variety were observed after boiling and steaming. The results indicate the sous vide technique as optimal to preserve several traits, including organoleptic ones, for the quality of cook-chilled chicory stems. They also provide product-specific information usually required for cooking process strategies in the industrial sector of ready-to-eat vegetables. © 2013 Society of Chemical Industry.

  13. Increased Sensitivity of HIV-1 p24 ELISA Using a Photochemical Signal Amplification System.

    PubMed

    Bystryak, Simon; Santockyte, Rasa

    2015-10-01

    In this study we describe a photochemical signal amplification method (PSAM) for increasing the sensitivity of the enzyme-linked immunosorbent assay (ELISA) for determination of HIV-1 p24 antigen. The photochemical signal amplification method is based on an autocatalytic photochemical reaction of a horseradish peroxidase (HRP) substrate, orthophenylenediamine (OPD). To compare the performance of PSAM-boosted ELISA with a conventional colorimetric ELISA for determination of HIV-1 p24 antigen, we employed a PerkinElmer HIV-1 p24 ELISA kit, using conventional ELISA alongside ELISA + PSAM. In the present study, we show that PSAM technology allows one to increase the analytical sensitivity and dynamic range of a commercial HIV-1 p24 ELISA kit, with and without immune-complex disruption, by approximately 40-fold. ELISA + PSAM is compatible with commercially available microtiter plate readers, requires only an inexpensive illumination device, and the PSAM amplification step takes no longer than 15 min. This method can be used for both commercially available and in-house ELISA tests, and has the advantage of being considerably simpler and less costly than alternative signal amplification methods.

  14. Faraday forcing of high-temperature levitated liquid metal drops for the measurement of surface tension.

    PubMed

    Brosius, Nevin; Ward, Kevin; Matsumoto, Satoshi; SanSoucie, Michael; Narayanan, Ranga

    2018-01-01

    In this work, a method for the measurement of surface tension using continuous periodic forcing is presented. To reduce gravitational effects, samples are electrostatically levitated prior to forcing. The method, called Faraday forcing, is particularly well suited for fluids that require high temperature measurements, such as liquid metals, where conventional surface tension measurement methods are not possible. It offers distinct advantages over the conventional pulse-decay analysis method when the sample viscosity is high or the levitation feedback control system is noisy. In the current method, levitated drops are continuously translated about a mean position at a small, constant forcing amplitude over a range of frequencies. At a particular frequency in this range, the drop suddenly enters a state of resonance, which is confirmed by large excursions of prolate/oblate deformations about the mean spherical shape. The arrival at this resonant condition is a signature that the parametric forcing frequency is equal to the drop's natural frequency, the latter being a known function of surface tension. A description of the experimental procedure is presented. A proof of concept is given using pure Zr and a Ti39.5Zr39.5Ni21 alloy as examples. The results compare favorably with accepted literature values obtained using the pulse-decay method.
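
    The link between the measured resonance and surface tension is, to first order, Rayleigh's relation for the l = 2 mode of a free, uncharged, inviscid drop; the sketch below applies it with assumed numbers. Corrections for drop charge and the levitation field, which the actual experiment must account for, are ignored here.

    ```python
    import numpy as np

    def surface_tension_from_resonance(f2_hz, radius_m, density_kg_m3):
        """Rayleigh l=2 relation: (2*pi*f2)^2 = 8*sigma/(rho*R^3)."""
        return density_kg_m3 * radius_m**3 * (2.0 * np.pi * f2_hz) ** 2 / 8.0

    rho = 5800.0   # kg/m^3, assumed liquid-metal density
    R = 1.2e-3     # m, assumed drop radius
    f_res = 160.0  # Hz, assumed resonance found by the frequency sweep

    sigma = surface_tension_from_resonance(f_res, R, rho)
    print(f"estimated surface tension ~ {sigma:.2f} N/m")
    ```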

  15. Conventional vs Biomimetic Approaches to the Exploration of Mars

    NASA Astrophysics Data System (ADS)

    Ellery, A.

    It is not usual to refer to convention in planetary exploration missions by virtue of the innovation required for such projects. The term conventional refers to the methodologies, tools and approaches typically adopted in engineering that are applied to such missions. Presented is a "conventional" Mars rover mission in which the author was involved - ExoMars - into which are interspersed references to examples where biomimetic approaches may yield superior capabilities. Biomimetics is a relatively recent and active area of research which seeks to examine how biological systems solve the problem of survival in the natural environment. Biological organisms are autonomous entities that must survive in a hostile world, exhibiting both adaptivity and robustness. It is not then surprising that biomimetics is particularly useful when applied to robotic elements of a Mars exploration mission. I present a number of areas in which biomimetics may yield new solutions to the problem of Mars exploration - optic flow navigation, potential field navigation, genetically-evolved neuro-controllers, legged locomotion, electric motors implementing muscular behaviour, and a biomimetic drill based on the wood wasp ovipositor. Each of these techniques offers an alternative approach to conventional ones. However, the perceptive hurdles are likely to dwarf the technical hurdles in implementing many of these methods in the near future.

  16. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization

    NASA Astrophysics Data System (ADS)

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-01

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.

  17. Study on Impact of Electric Vehicles Charging Models on Power Load

    NASA Astrophysics Data System (ADS)

    Cheng, Chen; Hui-mei, Yuan

    2017-05-01

    The rapid increase in the number of electric vehicles will increase the load on the power grid and may affect it adversely. This paper gives a detailed analysis of the relevant factors, including the scale of the electric vehicle fleet, charging mode, initial charging time, initial state of charge, and charging power. The Monte Carlo simulation method is used to compare two charging modes, conventional charging and fast charging, and MATLAB is used to model and simulate the electric vehicle charging load. The results show that, compared with the conventional charging mode, fast charging can meet the requirement for rapid recharging, but it also places a heavy load on the distribution network, which will affect the reliability of the power grid.
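
    A minimal Monte Carlo sketch of the kind of load model described above, with assumed fleet size, battery capacity, start-time distributions, and charging powers (none taken from the record): charging start times and initial states of charge are sampled and the aggregate demand profile is accumulated for each mode.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N_EV = 10_000          # assumed fleet size
    BATTERY_KWH = 40.0     # assumed battery capacity

    def daily_load(power_kw, mean_start_h, std_start_h, steps_per_h=4):
        """Aggregate charging load (kW) over 24 h for one charging mode."""
        load = np.zeros(24 * steps_per_h)
        start = rng.normal(mean_start_h, std_start_h, N_EV) % 24
        soc0 = rng.uniform(0.2, 0.8, N_EV)              # initial state of charge
        duration_h = (1.0 - soc0) * BATTERY_KWH / power_kw
        for s, d in zip(start, duration_h):
            i0 = int(s * steps_per_h)
            idx = np.arange(i0, i0 + max(1, int(d * steps_per_h))) % len(load)
            load[idx] += power_kw
        return load

    conventional = daily_load(power_kw=7.0, mean_start_h=19.0, std_start_h=2.0)
    fast = daily_load(power_kw=60.0, mean_start_h=18.0, std_start_h=1.5)
    print(f"peak load, conventional: {conventional.max()/1e3:.1f} MW")
    print(f"peak load, fast:         {fast.max()/1e3:.1f} MW")
    ```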

  18. Advances in Candida detection platforms for clinical and point-of-care applications

    PubMed Central

    Safavieh, Mohammadali; Coarsey, Chad; Esiobu, Nwadiuto; Memic, Adnan; Vyas, Jatin Mahesh; Shafiee, Hadi; Asghar, Waseem

    2016-01-01

    Invasive candidiasis remains one of the most serious community and healthcare-acquired infections worldwide. Conventional Candida detection methods based on blood and plate culture are time-consuming and require at least 2–4 days to identify various Candida species. Despite considerable advances for candidiasis detection, the development of simple, compact and portable point-of-care diagnostics for rapid and precise testing that automatically performs cell lysis, nucleic acid extraction, purification and detection still remains a challenge. Here, we systematically review most prominent conventional and nonconventional techniques for the detection of various Candida species, including Candida staining, blood culture, serological testing and nucleic acid-based analysis. We also discuss the most advanced lab on a chip devices for candida detection. PMID:27093473

  19. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization.

    PubMed

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-28

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.

  20. Thwarting science by protecting the received wisdom on tobacco addiction from the scientific method.

    PubMed

    Difranza, Joseph R

    2010-11-04

    In their commentary, Dar and Frenk call into question the validity of all published data that describe the onset of nicotine addiction. They argue that the data that describe the early onset of nicotine addiction is so different from the conventional wisdom that it is irrelevant. In this rebuttal, the author argues that the conventional wisdom cannot withstand an application of the scientific method that requires that theories be tested and discarded when they are contradicted by data. The author examines the origins of the threshold theory that has represented the conventional wisdom concerning the onset of nicotine addiction for 4 decades. The major tenets of the threshold theory are presented as hypotheses followed by an examination of the relevant literature. Every tenet of the threshold theory is contradicted by all available relevant data and yet it remains the conventional wisdom. The author provides an evidence-based account of the natural history of nicotine addiction, including its onset and development as revealed by case histories, focus groups, and surveys involving tens of thousands of smokers. These peer-reviewed and replicated studies are the work of independent researchers from around the world using a variety of measures, and they provide a consistent and coherent clinical picture. The author argues that the scientific method demands that the fanciful conventional wisdom be discarded and replaced with the evidence-based description of nicotine addiction that is backed by data. The author charges that in their attempt to defend the conventional wisdom in the face of overwhelming data to the contrary, Dar and Frenk attempt to destroy the credibility of all who have produced these data. Dar and Frenk accuse other researchers of committing methodological errors and showing bias in the analysis of data when in fact Dar and Frenk commit several errors and reveal their bias by using a few outlying data points to misrepresent an entire body of research, and by grossly and consistently mischaracterizing the claims of those whose research they attack.

  1. Design Tool Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, such methods depend on their initial conditions and risk falling into local solutions. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require experience-based tuning. We applied the new optimization method to a hang glider design. In this problem, both the hang glider design and its flight trajectory were optimized. The numerical results show that the performance of the method is sufficient for practical use.
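
    The record gives no algorithmic detail, so the sketch below is only a generic illustration of "the solution as a stochastic average": candidate designs are sampled, weighted by a Boltzmann factor of the objective, and the weighted mean is taken as the estimate while the temperature is lowered. The objective and all parameters are invented; this is not the authors' path-integral formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def objective(x):
        """Multimodal test function (not from the paper)."""
        return (x - 1.7) ** 2 + 0.8 * np.sin(6.0 * x)

    x_lo, x_hi, temperature = -4.0, 6.0, 2.0
    for _ in range(30):
        xs = rng.uniform(x_lo, x_hi, 5000)
        f = objective(xs)
        w = np.exp(-(f - f.min()) / temperature)    # Boltzmann-like weights
        x_mean = np.average(xs, weights=w)          # solution as stochastic average
        spread = np.sqrt(np.average((xs - x_mean) ** 2, weights=w))
        x_lo, x_hi = x_mean - 3.0 * spread, x_mean + 3.0 * spread
        temperature *= 0.8

    print(f"estimated optimum near x = {x_mean:.3f}")
    ```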

  2. Aquifer water abundance evaluation using a fuzzy- comprehensive weighting method

    NASA Astrophysics Data System (ADS)

    Wei, Z.

    2016-08-01

    Aquifer water abundance evaluation is a highly relevant issue that has been researched for many years. Despite prior research, problems with the conventional evaluation method remain. This paper establishes an aquifer water abundance evaluation method that combines fuzzy evaluation with a comprehensive weighting method to overcome both the subjectivity of expert judgment and the lack of conformity involved in determining weights by pure data analysis alone. First, this paper introduces the principle of the fuzzy-comprehensive weighting method. Second, the example of well field no. 3 (of a coalfield) is used to illustrate the method's process. The evaluation results show that this method can better meet the practical requirements of aquifer water abundance assessment, leading to more precise and accurate evaluations. Ultimately, this paper provides a new method for aquifer water abundance evaluation.
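
    A compact sketch of a fuzzy comprehensive evaluation with blended weights, using invented membership degrees and weights for a single well field; it illustrates the arithmetic of the approach, not the actual criteria or data of the study.

    ```python
    import numpy as np

    # Membership degrees of four hypothetical criteria (rows) in three
    # water-abundance grades (columns): weak, medium, strong.
    R = np.array([[0.1, 0.5, 0.4],
                  [0.2, 0.6, 0.2],
                  [0.0, 0.3, 0.7],
                  [0.3, 0.5, 0.2]])

    # Comprehensive weights: a blend of assumed subjective and objective weights.
    w_subjective = np.array([0.30, 0.25, 0.30, 0.15])
    w_objective = np.array([0.20, 0.30, 0.35, 0.15])
    w = 0.5 * w_subjective + 0.5 * w_objective
    w /= w.sum()

    b = w @ R                        # fuzzy comprehensive evaluation vector
    grades = ["weak", "medium", "strong"]
    print("grade memberships:", np.round(b, 3))
    print("assessed water abundance:", grades[int(np.argmax(b))])
    ```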

  3. Sensors vs. experts - a performance comparison of sensor-based fall risk assessment vs. conventional assessment in a sample of geriatric patients.

    PubMed

    Marschollek, Michael; Rehwald, Anja; Wolf, Klaus-Hendrik; Gietzelt, Matthias; Nemitz, Gerhard; zu Schwabedissen, Hubertus Meyer; Schulze, Mareike

    2011-06-28

    Fall events contribute significantly to mortality, morbidity and costs in our ageing population. In order to identify persons at risk and to target preventive measures, many scores and assessment tools have been developed. These often require expertise and are costly to implement. Recent research investigates the use of wearable inertial sensors to provide objective data on motion features which can be used to assess individual fall risk automatically. So far it is unknown how well this new method performs in comparison with conventional fall risk assessment tools. The aim of our research is to compare the predictive performance of our new sensor-based method with conventional and established methods, based on prospective data. In a first study phase, 119 inpatients of a geriatric clinic took part in motion measurements using a wireless triaxial accelerometer during a Timed Up&Go (TUG) test and a 20 m walk. Furthermore, the St. Thomas Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY) was performed, and the multidisciplinary geriatric care team estimated the patients' fall risk. In a second follow-up phase of the study, 46 of the participants were interviewed after one year, including a fall and activity assessment. The predictive performances of the TUG, the STRATIFY and team scores are compared. Furthermore, two automatically induced logistic regression models based on conventional clinical and assessment data (CONV) as well as sensor data (SENSOR) are matched. Among the risk assessment scores, the geriatric team score (sensitivity 56%, specificity 80%) outperforms STRATIFY and TUG. The induced logistic regression models CONV and SENSOR achieve similar performance values (sensitivity 68%/58%, specificity 74%/78%, AUC 0.74/0.72, +LR 2.64/2.61). Both models are able to identify more persons at risk than the simple scores. Sensor-based objective measurements of motion parameters in geriatric patients can be used to assess individual fall risk, and our prediction model's performance matches that of a model based on conventional clinical and assessment data. Sensor-based measurements using a small wearable device may contribute significant information to conventional methods and are feasible in an unsupervised setting. More prospective research is needed to assess the cost-benefit relation of our approach.
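
    A schematic of how the two induced models might be compared, using scikit-learn with synthetic stand-in features (the real CONV features would be clinical and assessment data, the SENSOR features accelerometer-derived gait parameters); numbers produced by this sketch are meaningless except to show the evaluation workflow.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(7)
    n = 119                                   # cohort size reported above
    X_conv = rng.normal(size=(n, 5))          # stand-in clinical features
    X_sensor = rng.normal(size=(n, 8))        # stand-in accelerometer features
    y_fall = rng.integers(0, 2, size=n)       # 1 = fell during follow-up

    def evaluate(X, y, threshold=0.5):
        """Cross-validated AUC, sensitivity, and specificity of a logistic model."""
        proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                                  cv=5, method="predict_proba")[:, 1]
        pred = proba >= threshold
        sens = (pred & (y == 1)).sum() / (y == 1).sum()
        spec = (~pred & (y == 0)).sum() / (y == 0).sum()
        return roc_auc_score(y, proba), sens, spec

    for name, X in [("CONV", X_conv), ("SENSOR", X_sensor)]:
        auc, sens, spec = evaluate(X, y_fall)
        print(f"{name}: AUC={auc:.2f} sensitivity={sens:.2f} specificity={spec:.2f}")
    ```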

  4. Hybrid Technology of Hard Coal Mining from Seams Located at Great Depths

    NASA Astrophysics Data System (ADS)

    Czaja, Piotr; Kamiński, Paweł; Klich, Jerzy; Tajduś, Antoni

    2014-10-01

    Learning to control fire changed the life of man considerably. Learning to convert the energy derived from combustion of coal or hydrocarbons into another type of energy, such as steam pressure or electricity, has put him on the path of scientific and technological revolution, stimulating dynamic development. Since the dawn of time, fossil fuels have been serving as the mankind's natural reservoir of energy in an increasingly great capacity. A completely incomprehensible refusal to use fossil fuels causes some local populations, who do not possess a comprehensive knowledge of the subject, to protest and even generate social conflicts as an expression of their dislike for the extraction of minerals. Our times are marked by the search for more efficient ways of utilizing fossil fuels by introducing non-conventional technologies of exploiting conventional energy sources. During apartheid, South Africa demonstrated that cheap coal can easily satisfy total demand for liquid and gaseous fuels. In consideration of current high prices of hydrocarbon media (oil and gas), gasification or liquefaction of coal seems to be the innovative technology convergent with contemporary expectations of both energy producers as well as environmentalists. Known mainly from literature reports, underground coal gasification technologies can be brought down to two basic methods: - shaftless method - drilling, in which the gasified seam is uncovered using boreholes drilled from the surface, - shaft method, in which the existing infrastructure of underground mines is used to uncover the seams. This paper presents a hybrid shaft-drilling approach to the acquisition of primary energy carriers (methane and syngas) from coal seams located at great depths. A major advantage of this method is the fact that the use of conventional coal mining technology requires the seams located at great depths to be placed on the off-balance sheet, while the hybrid method of underground gasification enables them to become a source of additional energy for the economy. It should be noted, however, that the shaft-drilling method cannot be considered as an alternative to conventional methods of coal extraction, but rather as a complementary and cheaper way of utilizing resources located almost beyond the technical capabilities of conventional extraction methods due to the associated natural hazards and high costs of combating them. This article presents a completely different approach to the issue of underground coal gasification. Repurposing of the already fully depreciated mining infrastructure for the gasification process may result in a large value added of synthesis gas production and very positive economic effect.

  5. An experimental investigation of wastewater treatment using electron beam irradiation

    NASA Astrophysics Data System (ADS)

    Emami-Meibodi, M.; Parsaeian, M. R.; Amraei, R.; Banaei, M.; Anvari, F.; Tahami, S. M. R.; Vakhshoor, B.; Mehdizadeh, A.; Fallah Nejad, N.; Shirmardi, S. P.; Mostafavi, S. J.; Mousavi, S. M. J.

    2016-08-01

    Electron beam (EB) is used for disinfection and treatment of different types of sewage and industrial wastewater. However, high capital investment required and the abundant energy consumed by this process raise doubts about its cost-effectiveness. In this paper, different wastewaters, including two textile sewages and one municipal wastewater are experimentally studied under different irradiation strategies (i.e. batch, 60 l/min and 1000 m3/day) in order to establish the reliability and the optimum conditions for the treatment process. According to the results, EB improves the efficiency of traditional wastewater treatment methods, but, for textile samples, coagulation before EB irradiation is recommended. The cost estimation of EB treatment compared to conventional methods shows that EB has been more expensive than chlorination and less expensive than activated sludge. Therefore, EB irradiation is advisable if and only if conventional methods of textile wastewater treatment are insufficient or chlorination of municipal wastewater is not allowed for health reasons. Nevertheless, among the advanced oxidation processes (AOP), EB irradiation process may be the most suitable one in industrial scale operations.

  6. Peptide arrays on cellulose support: SPOT synthesis, a time and cost efficient method for synthesis of large numbers of peptides in a parallel and addressable fashion.

    PubMed

    Hilpert, Kai; Winkler, Dirk F H; Hancock, Robert E W

    2007-01-01

    Peptide synthesis on cellulose using SPOT technology allows the parallel synthesis of large numbers of addressable peptides in small amounts. In addition, the cost per peptide is less than 1% of peptides synthesized conventionally on resin. The SPOT method follows standard fluorenyl-methoxy-carbonyl chemistry on conventional cellulose sheets, and can utilize more than 600 different building blocks. The procedure involves three phases: preparation of the cellulose membrane, stepwise coupling of the amino acids and cleavage of the side-chain protection groups. If necessary, peptides can be cleaved from the membrane for assays performed using soluble peptides. These features make this method an excellent tool for screening large numbers of peptides for many different purposes. Potential applications range from simple binding assays, to more sophisticated enzyme assays and studies with living microbes or cells. The time required to complete the protocol depends on the number and length of the peptides. For example, 400 9-mer peptides can be synthesized within 6 days.

  7. Dynamic leaching and fractionation of trace elements from environmental solids exploiting a novel circulating-flow platform.

    PubMed

    Mori, Masanobu; Nakano, Koji; Sasaki, Masaya; Shinozaki, Haruka; Suzuki, Shiho; Okawara, Chitose; Miró, Manuel; Itabashi, Hideyuki

    2016-02-01

    A dynamic flow-through microcolumn extraction system based on extractant re-circulation is herein proposed as a novel analytical approach for simplification of bioaccessibility tests of trace elements in sediments. On-line metal leaching is undertaken in the format of all injection (AI) analysis, which is a sequel of flow injection analysis, but involving extraction under steady-state conditions. The minimum circulation times and flow rates required to determine the maximum bioaccessible pools of target metals (viz., Cu, Zn, Cd, and Pb) from lake and river sediment samples were estimated using Tessier's sequential extraction scheme and an acid single extraction test. The on-line AIA method was successfully validated by mass balance studies of CRM and real sediment samples. Tessier's test in the on-line AI format was shown to require only one third of the extraction time (6 h versus more than 17 h for the conventional method), with better analytical precision (<9.2% versus >15% for the conventional method) and a significant decrease in blank readouts as compared with the manual batch counterpart. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Ion beam figuring of Φ520mm convex hyperbolic secondary mirror

    NASA Astrophysics Data System (ADS)

    Meng, Xiaohui; Wang, Yonggang; Li, Ang; Li, Wenqing

    2016-10-01

    The convex hyperbolic secondary mirror is a Φ520-mm Zerodur lightweight hyperbolic convex mirror. Typically, conventional methods such as CCOS and stressed-lap polishing are used to manufacture this secondary mirror. Nevertheless, the required surface accuracy cannot be achieved through the use of conventional polishing methods because of the unpredictable behavior of the polishing tools, which leads to an unstable removal rate. Ion beam figuring is an optical fabrication method that provides highly controlled correction of previously polished surfaces using a directed, inert and neutralized ion beam to physically sputter material from the optic surface. Several iterations with different ion beam sizes are selected and optimized to fit different stages of surface figure error and spatial frequency components. Before ion beam figuring, the surface figure error of the secondary mirror was 2.5λ p-v, 0.23λ rms, and it was improved to 0.12λ p-v, 0.014λ rms over several process iterations. The demonstration clearly shows that ion beam figuring can not only be used for the final correction of aspheric surfaces but is also suitable for polishing the coarse surface of large, complex mirrors.

  9. New reporting procedures based on long-term method detection levels and some considerations for interpretations of water-quality data provided by the U.S. Geological Survey National Water Quality Laboratory

    USGS Publications Warehouse

    Childress, Carolyn J. Oblinger; Foreman, William T.; Connor, Brooke F.; Maloney, Thomas J.

    1999-01-01

    This report describes the U.S. Geological Survey National Water Quality Laboratory's approach for determining long-term method detection levels and establishing reporting levels, details relevant new reporting conventions, and provides preliminary guidance on interpreting data reported with the new conventions. At the long-term method detection level concentration, the risk of a false positive detection (analyte reported present at the long-term method detection level when not in sample) is no more than 1 percent. However, at the long-term method detection level, the risk of a false negative occurrence (analyte reported not present when present at the long-term method detection level concentration) is up to 50 percent. Because this false negative rate is too high for use as a default 'less than' reporting level, a more reliable laboratory reporting level is set at twice the determined long-term method detection level. For all methods, concentrations measured between the laboratory reporting level and the long-term method detection level will be reported as estimated concentrations. Non-detections will be censored to the laboratory reporting level. Adoption of the new reporting conventions requires a full understanding of how low-concentration data can be used and interpreted and places responsibility for using and presenting final data with the user rather than with the laboratory. Users must consider that (1) new laboratory reporting levels may differ from previously established minimum reporting levels, (2) long-term method detection levels and laboratory reporting levels may change over time, and (3) estimated concentrations are less certain than concentrations reported above the laboratory reporting level. The availability of uncensored but qualified low-concentration data for interpretation and statistical analysis is a substantial benefit to the user. A decision to censor data after they are reported from the laboratory may still be made by the user, if merited, on the basis of the intended use of the data.
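
    A schematic of the reporting convention described above (not USGS software): the laboratory reporting level is twice the long-term method detection level, values between the two are flagged as estimated, and non-detections are censored to the reporting level. The LT-MDL value used is hypothetical.

    ```python
    def report(conc, lt_mdl):
        """Apply the reporting convention: LRL = 2 * LT-MDL."""
        lrl = 2.0 * lt_mdl
        if conc is None or conc < lt_mdl:
            return f"< {lrl:g}"      # censored non-detection
        if conc < lrl:
            return f"E {conc:g}"     # estimated concentration
        return f"{conc:g}"           # quantified above the LRL

    # Example with a hypothetical LT-MDL of 0.02 ug/L.
    for c in [None, 0.015, 0.03, 0.12]:
        print(report(c, 0.02))
    ```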

  10. Digital Cellular Solid Pressure Vessels: A Novel Approach for Human Habitation in Space

    NASA Technical Reports Server (NTRS)

    Cellucci, Daniel; Jenett, Benjamin; Cheung, Kenneth C.

    2017-01-01

    It is widely assumed that human exploration beyond Earth's orbit will require vehicles capable of providing long duration habitats that simulate an Earth-like environment - consistent artificial gravity, breathable atmosphere, and sufficient living space- while requiring the minimum possible launch mass. This paper examines how the qualities of digital cellular solids - high-performance, repairability, reconfigurability, tunable mechanical response - allow the accomplishment of long-duration habitat objectives at a fraction of the mass required for traditional structural technologies. To illustrate the impact digital cellular solids could make as a replacement to conventional habitat subsystems, we compare recent proposed deep space habitat structural systems with a digital cellular solids pressure vessel design that consists of a carbon fiber reinforced polymer (CFRP) digital cellular solid cylindrical framework that is lined with an ultra-high molecular weight polyethylene (UHMWPE) skin. We use the analytical treatment of a linear specific modulus scaling cellular solid to find the minimum mass pressure vessel for a structure and find that, for equivalent habitable volume and appropriate safety factors, the use of digital cellular solids provides clear methods for producing structures that are not only repairable and reconfigurable, but also higher performance than their conventionally manufactured counterparts.
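
    For scale, a first-order thin-wall estimate of the skin mass that any pressure vessel carries is sketched below; this is a textbook hoop-stress sizing with assumed numbers, not the digital-cellular-solid analysis of the record, which additionally accounts for the lattice framework that reacts the loads.

    ```python
    import math

    p = 101_325.0   # internal pressure, Pa (1 atm, assumed)
    r = 3.0         # cylinder radius, m (assumed)
    length = 10.0   # cylinder length, m (assumed)
    sigma = 200e6   # allowable skin stress incl. safety factor, Pa (assumed)
    rho = 1500.0    # skin material density, kg/m^3 (composite-class, assumed)

    t_wall = p * r / sigma                            # hoop stress: sigma = p*r/t
    m_skin = rho * 2.0 * math.pi * r * length * t_wall
    print(f"wall thickness ~ {t_wall*1e3:.2f} mm, cylindrical skin mass ~ {m_skin:.0f} kg")
    ```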

  11. An Investigation of a Photographic Technique of Measuring High Surface Temperatures

    NASA Technical Reports Server (NTRS)

    Siviter, James H., Jr.; Strass, H. Kurt

    1960-01-01

    A photographic method of temperature determination has been developed to measure elevated temperatures of surfaces. The technique presented herein minimizes calibration procedures and permits wide variation in emulsion developing techniques. The present work indicates that the lower limit of applicability is approximately 1,400 F when conventional cameras, emulsions, and moderate exposures are used. The upper limit is determined by the calibration technique and the accuracy required.

  12. Optical data transmission technology for fixed and drag-on STS payload umbilicals, volume 2

    NASA Technical Reports Server (NTRS)

    St.denis, R. W.

    1981-01-01

    Optical data handling methods are studied as applicable to payload communications checkout and monitoring. Both payload umbilicals and interconnecting communication lines carrying payload data are examined for the following: (1) ground checkout requirements; (2) optical approach (technical survey of optical approaches, selection of optimum approach); (3) survey and select components; (4) compare with conventional approach; and (5) definition of follow on activity.

  13. A Unique Method of Retention for Gum Stripper- A Case Report

    PubMed Central

    T.S., Priyanka

    2014-01-01

    Successful restoration of partially edentulous situations, especially Kennedy's Class I, II, and IV, requires a range of contemporary and conventional treatment approaches. Semi-precision attachments play a major role in the retention of clinically challenging partially edentulous situations. Attachment-retained partial dentures can be one of the successful treatment options in prosthodontics. This article presents a unique technique for retaining a gum stripper using semi-precision attachments. PMID:25654046

  14. Sequential strand displacement beacon for detection of DNA coverage on functionalized gold nanoparticles.

    PubMed

    Paliwoda, Rebecca E; Li, Feng; Reid, Michael S; Lin, Yanwen; Le, X Chris

    2014-06-17

    Functionalizing nanomaterials for diverse analytical, biomedical, and therapeutic applications requires determination of surface coverage (or density) of DNA on nanomaterials. We describe a sequential strand displacement beacon assay that is able to quantify specific DNA sequences conjugated or coconjugated onto gold nanoparticles (AuNPs). Unlike the conventional fluorescence assay that requires the target DNA to be fluorescently labeled, the sequential strand displacement beacon method is able to quantify multiple unlabeled DNA oligonucleotides using a single (universal) strand displacement beacon. This unique feature is achieved by introducing two short unlabeled DNA probes for each specific DNA sequence and by performing sequential DNA strand displacement reactions. Varying the relative amounts of the specific DNA sequences and spacing DNA sequences during their coconjugation onto AuNPs results in different densities of the specific DNA on AuNP, ranging from 90 to 230 DNA molecules per AuNP. Results obtained from our sequential strand displacement beacon assay are consistent with those obtained from the conventional fluorescence assays. However, labeling of DNA with some fluorescent dyes, e.g., tetramethylrhodamine, alters DNA density on AuNP. The strand displacement strategy overcomes this problem by obviating direct labeling of the target DNA. This method has broad potential to facilitate more efficient design and characterization of novel multifunctional materials for diverse applications.

  15. Detection of 22 common leukemic fusion genes using a single-step multiplex qRT-PCR-based assay.

    PubMed

    Lyu, Xiaodong; Wang, Xianwei; Zhang, Lina; Chen, Zhenzhu; Zhao, Yu; Hu, Jieying; Fan, Ruihua; Song, Yongping

    2017-07-25

    Fusion genes generated from chromosomal translocation play an important role in hematological malignancies. Detection of fusion genes currently employs either conventional RT-PCR methods or fluorescence in situ hybridization (FISH); both are tedious and require prior characterization of chromosomal translocation events as determined by cytogenetic analysis. In this study, we describe a real-time quantitative reverse transcription PCR (qRT-PCR)-based multi-fusion gene screening method with the capacity to detect 22 fusion genes commonly found in leukemia. This method does not require pre-characterization of gene translocation events, thereby facilitating immediate diagnosis and therapeutic management. We performed fluorescent qRT-PCR (F-qRT-PCR) using a commercially-available multi-fusion gene detection kit on a patient cohort of 345 individuals comprising 108 cases diagnosed with acute myeloid leukemia (AML) for initial evaluation; remaining patients within the cohort were assayed for confirmatory diagnosis. Results obtained by F-qRT-PCR were compared with cytogenetic characterization. Gene translocations detected by F-qRT-PCR in AML cases were diagnosed in 69.4% of the patient cohort, comparable to the 68.5% diagnosed by cytogenetic analysis, thereby demonstrating 99.1% concordance. Overall gene fusion was detected in 53.7% of the overall patient population by F-qRT-PCR, 52.9% by cytogenetic prediction in leukemia, and 9.1% in non-leukemia patients by both methods. The overall concordance rate was calculated to be 99.0%. Fusion genes were detected by F-qRT-PCR in 97.3% of patients with CML, followed by 69.4% with AML, 33.3% with acute lymphoblastic leukemia (ALL), 9.1% with myelodysplastic syndromes (MDS), and 0% with chronic lymphocytic leukemia (CLL). We describe the use of an F-qRT-PCR-based multi-fusion gene screening method as an efficient one-step diagnostic procedure and an effective alternative to lengthy conventional diagnostic procedures that require cytogenetic analysis followed by targeted qRT-PCR, thus allowing timely patient management.

  16. A randomized trial of early versus delayed mediastinal drain removal after cardiac surgery using silastic and conventional tubes

    PubMed Central

    Moss, Emmanuel; Miller, Corey S.; Jensen, Henrik; Basmadjian, Arsène; Bouchard, Denis; Carrier, Michel; Perrault, Louis P.; Cartier, Raymond; Pellerin, Michel; Demers, Philippe

    2013-01-01

    OBJECTIVES Mediastinal drainage following cardiac surgery with traditional large-bore plastic tubes can be painful and cumbersome. This study was designed to determine whether prolonged drainage (5 days) with a silastic tube decreased the incidence of significant pericardial effusion and tamponade following aortic or valvular surgery. METHODS One hundred and fifty patients undergoing valvular or aortic surgery in a tertiary cardiac surgery institution were randomized to receive a conventional mediastinal tube plus a silastic Blake drain (n = 75), or two conventional tubes (n = 75). Conventional drains were removed on postoperative day (POD) 1, while Blake drains were removed on POD 5. The primary end-point was the combined incidence of significant pericardial effusion (≥15 mm) or tamponade through POD 5. Secondary end-points included total mediastinal drainage, postoperative atrial fibrillation (AF) and pain. RESULTS Analysis was performed for 67 patients in the Blake group and 73 in the conventional group. There was no difference between the two groups in the combined end-point of significant effusion or tamponade (7.4 vs 8.3%, P = 0.74), or in the incidence of AF (47 vs 46%, P = 0.89). Mean 24-h drainage was greater in the Blake group than in the conventional group (749 ± 444 ml vs 645 ± 618 ml, P < 0.01). Overall incidence of significant pericardial effusion at 30 days was 12.1% (n = 17), with 5% (n = 7) requiring drainage. The Blake group had a numerically lower incidence of effusion requiring drainage at POD 30 (3.0 vs 6.8%, P = 0.44). Postoperative pain was similar between groups. CONCLUSIONS In patients undergoing ascending aortic or valvular surgery, prolonged drainage with silastic tubes is safe and does not increase postoperative pain. There was no difference between the Blake and conventional drains with regard to significant pericardial effusion or tamponade in this cohort; however, this conclusion is limited by the low overall incidence of the primary outcome in this cohort. PMID:23575759

  17. Fast focus estimation using frequency analysis in digital holography.

    PubMed

    Oh, Seungtaik; Hwang, Chi-Young; Jeong, Il Kwon; Lee, Sung-Keun; Park, Jae-Hyeung

    2014-11-17

    A novel fast frequency-based method to estimate the focus distance of a digital hologram for a single object is proposed. The focus distance is computed by analyzing the distribution of intersections of smoothed rays. The smoothed rays are determined by the directions of energy flow, which are computed from the local spatial frequency spectrum based on the windowed Fourier transform. Thus, our method uses only the intrinsic frequency information of the optical field on the hologram and therefore requires neither sequential numerical reconstructions nor the focus-detection techniques of conventional photography, both of which are essential parts of previous methods. To show the effectiveness of our method, numerical results and analysis are presented as well.
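
    The following toy one-dimensional sketch conveys the ray-intersection idea under strong simplifying assumptions (a complex-valued field, one dominant local frequency per window, invented function names); it is not the authors' algorithm. Each window of a windowed Fourier transform yields a local spatial frequency, each frequency defines a ray whose transverse slope is the wavelength times that frequency, and the focus distance is the propagation distance at which the ray positions are most tightly clustered.

      import numpy as np

      def local_frequencies(field: np.ndarray, dx: float, win: int = 64, step: int = 16):
          """Dominant local spatial frequency in sliding windows (a crude
          windowed Fourier transform); `field` is assumed complex-valued."""
          centers, freqs = [], []
          w = np.hanning(win)
          fgrid = np.fft.fftfreq(win, d=dx)
          for start in range(0, field.size - win, step):
              spec = np.abs(np.fft.fft(field[start:start + win] * w))
              spec[0] = 0.0                             # ignore the DC term
              centers.append((start + win / 2) * dx)
              freqs.append(fgrid[np.argmax(spec)])
          return np.asarray(centers), np.asarray(freqs)

      def focus_distance(field: np.ndarray, dx: float, wavelength: float) -> float:
          """Rays x(z) = x0 + s*z with slope s = wavelength * f_local converge where
          the spread of ray positions is minimal; the least-squares minimiser of
          Var[x0 + s*z] is z = -Cov(x0, s) / Var(s).  The sign of the result
          depends on the chosen propagation convention."""
          x0, f = local_frequencies(field, dx)
          s = wavelength * f
          return -np.cov(x0, s, bias=True)[0, 1] / np.var(s)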

  18. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard real-time response specifications. Use of the method is illustrated by a remote teleoperation case study that assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
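
    The abstract does not reproduce the scheduling formulation, but a standard fixed-priority schedulability check gives the flavor of this kind of hard-real-time analysis. The sketch below uses the classic Liu and Layland utilization bound together with exact response-time analysis; the task set (servo, vision, and telemetry tasks with made-up periods and execution times) is purely hypothetical.

      import math

      def rm_utilization_bound(n: int) -> float:
          """Liu & Layland sufficient utilization bound for rate-monotonic
          scheduling of n periodic tasks."""
          return n * (2 ** (1 / n) - 1)

      def response_times(tasks: list[tuple[float, float]]) -> list[float]:
          """Exact response-time analysis for fixed-priority periodic tasks.
          tasks: (execution_time, period) pairs, highest priority first, with
          total utilization at most 1 so the fixed point exists.  A task set
          meets its hard deadlines if every response time <= its period."""
          results = []
          for i, (c_i, _t_i) in enumerate(tasks):
              r = c_i
              while True:
                  r_next = c_i + sum(math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
                  if r_next == r:
                      break
                  r = r_next
              results.append(r)
          return results

      # Hypothetical node: 10 ms servo loop, 40 ms vision update, 100 ms telemetry (ms).
      tasks = [(3, 10), (10, 40), (25, 100)]
      print(sum(c / t for c, t in tasks), rm_utilization_bound(len(tasks)))
      print(response_times(tasks))   # worst-case response time of each task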

  19. Esculin hydrolysis by Gram positive bacteria. A rapid test and its comparison with other methods.

    PubMed

    Qadri, S M; Smith, J C; Zubairi, S; DeSilva, M I

    1981-01-01

    A number of bacteria hydrolyze esculin enzymatically to esculetin. This characteristic is used by taxonomists and clinical microbiologists in the differentiation and identification of bacteria, especially to distinguish Lancefield group D streptococci from non-group D organisms and Listeria monocytogenes from morphologically similar Erysipelothrix rhusiopathiae and diphtheroids. Conventional methods used for esculin hydrolysis require 4-48 h for completion. We developed and evaluated a medium which gives positive results more rapidly. The 2,330 isolates used in this study consisted of 1,680 esculin positive and 650 esculin negative organisms. The sensitivity and specificity of this method were compared with the PathoTec esculin hydrolysis strip and the procedure of Vaughn and Levine (VL). Of the 1,680 esculin positive organisms, 97% gave positive reactions within 30 minutes with the rapid test, whereas PathoTec required 3-4 h incubation for the same number of organisms to yield a positive reaction.

  20. Multisensory visual servoing by a neural network.

    PubMed

    Wei, G Q; Hirzinger, G

    1999-01-01

    Conventional computer vision methods for determining a robot's end-effector motion based on sensory data need sensor calibration (e.g., camera calibration) and sensor-to-hand calibration (e.g., hand-eye calibration). This involves substantial computation and can be difficult, especially when different kinds of sensors are involved. In this correspondence, we present a neural network approach to the motion determination problem without any calibration. Two kinds of sensory data, namely, camera images and laser range data, are used as the input to a multilayer feedforward network to learn the direct transformation from the sensory data to the required motions. This provides a practical sensor fusion method. Using a recursive motion strategy together with a network correction, we relax the requirement for the exactness of the learned transformation. Another important feature of our work is that the goal position can be changed without retraining the network. Experimental results show the effectiveness of our method.
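
    A minimal sketch of the kind of network described is given below. The feature dimensions, the residual (current-minus-goal) input parameterization that makes the goal changeable without retraining, and the gain of the recursive motion loop are all assumptions for illustration; the weights are random stand-ins for values that would be learned from recorded sensor/motion pairs.

      import numpy as np

      rng = np.random.default_rng(0)

      # Assumed sizes: 8 image features, 4 laser-range features, 6-DOF motion output.
      N_IMAGE, N_RANGE, N_HIDDEN = 8, 4, 16
      W1 = rng.normal(0.0, 0.1, (N_HIDDEN, N_IMAGE + N_RANGE))
      b1 = np.zeros(N_HIDDEN)
      W2 = rng.normal(0.0, 0.1, (6, N_HIDDEN))
      b2 = np.zeros(6)

      def predict_motion(image_feat, range_feat, goal_feat):
          """Fused sensory input (relative to the goal features) -> end-effector
          motion increment (dx, dy, dz, rx, ry, rz).  Feeding the residual rather
          than raw readings is what lets the goal change without retraining here."""
          x = np.concatenate([image_feat, range_feat]) - goal_feat
          return W2 @ np.tanh(W1 @ x + b1) + b2

      def servo(sense, move, goal_feat, steps=50, gain=0.5, tol=1e-3):
          """Recursive motion strategy: apply only a fraction of the predicted
          motion, re-sense, and repeat, so that small errors in the learned
          mapping are corrected over successive iterations."""
          for _ in range(steps):
              image_feat, range_feat = sense()
              if np.linalg.norm(np.concatenate([image_feat, range_feat]) - goal_feat) < tol:
                  break
              move(gain * predict_motion(image_feat, range_feat, goal_feat))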

  1. Non-orthogonal internally contracted multi-configurational perturbation theory (NICPT): Dynamic electron correlation for large, compact active spaces

    NASA Astrophysics Data System (ADS)

    Kähler, Sven; Olsen, Jeppe

    2017-11-01

    A computational method is presented for systems that require high-level treatments of static and dynamic electron correlation but cannot be treated using conventional complete active space self-consistent field-based methods due to the required size of the active space. Our method introduces an efficient algorithm for perturbative dynamic correlation corrections for compact non-orthogonal MCSCF calculations. In the algorithm, biorthonormal expansions of orbitals and CI-wave functions are used to reduce the scaling of the performance determining step from quadratic to linear in the number of configurations. We describe a hierarchy of configuration spaces that can be chosen for the active space. Potential curves for the nitrogen molecule and the chromium dimer are compared for different configuration spaces. Already the most compact spaces yield qualitatively correct potentials that with increasing size of configuration spaces systematically approach complete active space results.

  2. Are the classic diagnostic methods in mycology still state of the art?

    PubMed

    Wiegand, Cornelia; Bauer, Andrea; Brasch, Jochen; Nenoff, Pietro; Schaller, Martin; Mayser, Peter; Hipler, Uta-Christina; Elsner, Peter

    2016-05-01

    The diagnostic workup of cutaneous fungal infections is traditionally based on microscopic KOH preparations as well as culturing of the causative organism from sample material. Another possible option is the detection of fungal elements by dermatohistology. If performed correctly, these methods are generally suitable for the diagnosis of mycoses. However, the advent of personalized medicine and the tasks arising therefrom require new procedures marked by simplicity, specificity, and swiftness. The additional use of DNA-based molecular techniques further enhances sensitivity and diagnostic specificity, and reduces the diagnostic interval to 24-48 hours, compared to weeks required for conventional mycological methods. Given the steady evolution in the field of personalized medicine, simple analytical PCR-based systems are conceivable, which allow for instant diagnosis of dermatophytes in the dermatology office (point-of-care tests). © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  3. Comparison of effectiveness of convection-, transpiration-, and film-cooling methods with air as coolant

    NASA Technical Reports Server (NTRS)

    Eckert, E R G; Livingood, N B

    1954-01-01

    Various parts of aircraft propulsion engines that are in contact with hot gases often require cooling. Transpiration and film cooling, new methods that supposedly utilize cooling air more effectively than conventional convection cooling, have already been proposed. This report presents material necessary for a comparison of the cooling requirements of these three methods. Correlations that are regarded by the authors as the most reliable today are employed in evaluating each of the cooling processes. Calculations for the special case in which the gas velocity is constant along the cooled wall (flat plate) are presented. The calculations reveal that a comparison of the three cooling processes can be made on quite a general basis. The superiority of transpiration cooling is clearly shown for both laminar and turbulent flow. This superiority is reduced when the effects of radiation are included; for gas-turbine blades, however, there is evidence indicating that radiation may be neglected.

  4. Color-Coded Prefilled Medication Syringes Decrease Time to Delivery and Dosing Error in Simulated Emergency Department Pediatric Resuscitations.

    PubMed

    Moreira, Maria E; Hernandez, Caleb; Stevens, Allen D; Jones, Seth; Sande, Margaret; Blumen, Jason R; Hopkins, Emily; Bakes, Katherine; Haukoos, Jason S

    2015-08-01

    The Institute of Medicine has called on the US health care system to identify and reduce medical errors. Unfortunately, medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients when dosing requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national health care priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared with conventional medication administration, in simulated pediatric emergency department (ED) resuscitation scenarios. We performed a prospective, block-randomized, crossover study in which 10 emergency physician and nurse teams managed 2 simulated pediatric arrest scenarios in situ, using either prefilled, color-coded syringes (intervention) or conventional drug administration methods (control). The ED resuscitation room and the intravenous medication port were video recorded during the simulations. Data were extracted from video review by blinded, independent reviewers. Median time to delivery of all doses for the conventional and color-coded delivery groups was 47 seconds (95% confidence interval [CI] 40 to 53 seconds) and 19 seconds (95% CI 18 to 20 seconds), respectively (difference=27 seconds; 95% CI 21 to 33 seconds). With the conventional method, 118 doses were administered, with 20 critical dosing errors (17%); with the color-coded method, 123 doses were administered, with 0 critical dosing errors (difference=17%; 95% CI 4% to 30%). A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by emergency physician and nurse teams during simulated pediatric ED resuscitations. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  5. A new approach for the calculation of response spectral density of a linear stationary random multidegree of freedom system

    NASA Astrophysics Data System (ADS)

    Sharan, A. M.; Sankar, S.; Sankar, T. S.

    1982-08-01

    A new approach for the calculation of response spectral density for a linear stationary random multidegree of freedom system is presented. The method is based on modifying the stochastic dynamic equations of the system by using a set of auxiliary variables. The response spectral density matrix obtained by using this new approach contains the spectral densities and the cross-spectral densities of the system generalized displacements and velocities. The new method requires significantly less computation time as compared to the conventional method for calculating response spectral densities. Two numerical examples are presented to compare quantitatively the computation time.
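
    For reference, the conventional frequency-domain route that the new approach is benchmarked against can be summarized as follows: for a linear system $M\ddot{x} + C\dot{x} + Kx = f(t)$ driven by stationary random excitation with spectral density matrix $S_f(\omega)$, the response spectral density matrix is

      $S_x(\omega) = H(\omega)\, S_f(\omega)\, H^{\mathrm{H}}(\omega), \qquad H(\omega) = \left(K - \omega^{2} M + i\omega C\right)^{-1},$

    which requires forming and multiplying the full frequency-response matrix at every frequency of interest; the auxiliary-variable formulation is reported to reach the same quantities with substantially less computation.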

  6. Coherent mode decomposition using mixed Wigner functions of Hermite-Gaussian beams.

    PubMed

    Tanaka, Takashi

    2017-04-15

    A new method of coherent mode decomposition (CMD) is proposed that is based on a Wigner-function representation of Hermite-Gaussian beams. In contrast to the well-known method using the cross spectral density (CSD), it directly determines the mode functions and their weights without solving the eigenvalue problem. This facilitates the CMD of partially coherent light whose Wigner functions (and thus CSDs) are not separable, in which case the conventional CMD requires solving an eigenvalue problem with a large matrix and thus is numerically formidable. An example is shown regarding the CMD of synchrotron radiation, one of the most important applications of the proposed method.
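
    For context, the conventional CMD referred to here diagonalizes the cross spectral density by solving the homogeneous Fredholm integral equation

      $\int W(\mathbf{r}_1, \mathbf{r}_2)\, \phi_n(\mathbf{r}_1)\, d\mathbf{r}_1 = \lambda_n\, \phi_n(\mathbf{r}_2), \qquad W(\mathbf{r}_1, \mathbf{r}_2) = \sum_n \lambda_n\, \phi_n^{*}(\mathbf{r}_1)\, \phi_n(\mathbf{r}_2),$

    whose discretized form is an eigenvalue problem in a matrix that grows with the square of the number of sampling points; the Wigner-function approach proposed here avoids forming and diagonalizing that matrix.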

  7. MEMS piezoresistive cantilever for the direct measurement of cardiomyocyte contractile force

    NASA Astrophysics Data System (ADS)

    Matsudaira, Kenei; Nguyen, Thanh-Vinh; Hirayama Shoji, Kayoko; Tsukagoshi, Takuya; Takahata, Tomoyuki; Shimoyama, Isao

    2017-10-01

    This paper reports on a method to directly measure the contractile forces of cardiomyocytes using MEMS (micro electro mechanical systems)-based force sensors. The fabricated sensor chip consists of piezoresistive cantilevers that can measure contractile forces with high frequency (several tens of kHz) and high sensing resolution (less than 0.1 nN). Moreover, the proposed method does not require a complex observation system or image processing, which are necessary in conventional optical-based methods. This paper describes the design, fabrication, and evaluation of the proposed device and demonstrates the direct measurements of contractile forces of cardiomyocytes using the fabricated device.

  8. Novel method to sample very high power CO2 lasers: II Continuing Studies

    NASA Astrophysics Data System (ADS)

    Eric, John; Seibert, Daniel B., II; Green, Lawrence I.

    2005-04-01

    For the past 28 years, the Laser Hardened Materials Evaluation Laboratory (LHMEL) at the Wright-Patterson Air Force Base, OH, has worked with CO2 lasers capable of producing continuous output power of up to 150 kW. These lasers are used in a number of advanced materials processing applications that require accurate spatial energy measurements of the laser. Conventional non-electronic methods are not satisfactory for determining the spatial energy profile. This paper describes continuing efforts in qualifying the new method in which a continuous, real-time electronic spatial energy profile can be obtained for very high power (VHP) CO2 lasers.

  9. Near-Net Forging Technology Demonstration Program

    NASA Technical Reports Server (NTRS)

    Hall, I. Keith

    1996-01-01

    Significant advantages in specific mechanical properties, when compared to conventional aluminum (Al) alloys, make aluminum-lithium (Al-Li) alloys attractive candidate materials for use in cryogenic propellant tanks and dry bay structures. However, the cost of Al-Li alloys is typically five times that of 2219 aluminum. If conventional fabrication processes are employed to fabricate launch vehicle structure, the material costs will restrict their utilization. In order to fully exploit the potential cost and performance benefits of Al-Li alloys, it is necessary that near-net manufacturing methods be developed to offset or reduce raw material costs. Near-net forging is an advanced manufacturing method that uses elevated temperature metal movement (forging) to fabricate a single-piece, near-net-shape structure. This process is termed 'near-net' because only a minimal amount of post-forge machining is required. The near-net forging process was developed to reduce the material scrap rate (buy-to-fly ratio) and fabrication costs associated with conventional manufacturing methods. The goal for the near-net forging process, when mature, is to achieve an overall cost reduction of approximately 50 percent compared with conventional manufacturing options for producing structures fabricated from Al-Li alloys. This NASA Marshall Space Flight Center (MSFC) sponsored program has been a part of a unique government / industry partnership, coordinated to develop and demonstrate near-net forging technology. The objective of this program was to demonstrate scale-up of the near-net forging process. This objective was successfully achieved by fabricating four integrally stiffened, 170-inch diameter by 20-inch tall, Al-Li alloy 2195, Y-ring adapters. Initially, two 2195 Al-Li ingots were converted and back extruded to produce four cylindrical blockers. Conventional ring rolling of the blockers was performed to produce ring preforms, which were then contour ring rolled to produce 'contour preforms'. All of the contour preforms on this first-of-a-kind effort were imperfect, and the ingot used to fabricate two of the preforms was of an earlier vintage. As lessons were learned throughout the program, the tooling and procedures evolved, and the preform quality improved accordingly. Two of the best contour preforms were near-net forged to produce a process pathfinder Y-ring adapter and a 'mechanical properties pathfinder' Y-ring adapter. At this point, Lockheed Martin Astronautics elected to procure additional 2195 aluminum-lithium ingot of the latest vintage, produce two additional preforms, and substitute them for the imperfectly filled preforms made from older-vintage material that had already been produced under this contract. The existing preforms could have been used to fulfill the requirements of the contract.

  10. Sensor-less pseudo-sinusoidal drive for a permanent-magnet brushless ac motor

    NASA Astrophysics Data System (ADS)

    Liu, Li-Hsiang; Chern, Tzuen-Lih; Pan, Ping-Lung; Huang, Tsung-Mou; Tsay, Der-Min; Kuang, Jao-Hwa

    2012-04-01

    Precise rotor-position information is required for a permanent-magnet brushless ac motor (BLACM) drive. In the conventional sinusoidal drive method, either an encoder or a resolver is usually employed. For position sensor-less vector control schemes, the rotor flux estimation and torque components are obtained by complicated coordinate transformations. These computationally intensive methods are susceptible to current distortions and parameter variations. To reduce this complexity, this work presents a sensor-less pseudo-sinusoidal drive scheme with speed control for a three-phase BLACM. Based on the sinusoidal drive scheme, a floating period of each phase current is inserted for back electromotive force detection. The zero-crossing point is determined directly by the proposed scheme, and the rotor magnetic position and rotor speed can be estimated simultaneously. Several experiments for various active angle periods are undertaken. Furthermore, a current feedback control is included to minimize and compensate for torque fluctuation. The experimental results show that the proposed method performs competitively with conventional drive methods for BLACMs. The proposed scheme is straightforward, bringing the benefits of sensor-less drive and eliminating the need for coordinate transformations during operation.
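
    A minimal numerical sketch of the back-EMF zero-crossing step is shown below (assumed variable names and sampling; the actual drive also schedules the floating windows, handles commutation, and compensates detection delays, all of which are omitted here).

      import numpy as np

      def zero_crossings(times, v_float, v_dc):
          """Times at which the floating phase's terminal voltage crosses the
          virtual neutral (v_dc / 2), i.e. where its back-EMF changes sign."""
          s = np.sign(v_float - v_dc / 2.0)
          idx = np.where(np.diff(s) != 0)[0]
          return times[idx]

      def speed_from_crossings(crossing_times, pole_pairs):
          """Consecutive zero crossings of one phase are half an electrical period
          apart, so the electrical frequency and hence the mechanical speed (rpm)
          follow directly from the crossing intervals."""
          half_periods = np.diff(crossing_times)
          elec_freq = 1.0 / (2.0 * np.mean(half_periods))    # Hz, electrical
          return 60.0 * elec_freq / pole_pairs               # rpm, mechanical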

  11. The Role of 16S rRNA Gene Sequencing in Identification of Microorganisms Misidentified by Conventional Methods

    PubMed Central

    Petti, C. A.; Polage, C. R.; Schreckenberger, P.

    2005-01-01

    Traditional methods for microbial identification require the recognition of differences in morphology, growth, enzymatic activity, and metabolism to define genera and species. Full and partial 16S rRNA gene sequencing methods have emerged as useful tools for identifying phenotypically aberrant microorganisms. We report on three bacterial blood isolates from three different College of American Pathologists-certified laboratories that were referred to ARUP Laboratories for definitive identification. Because phenotypic identification suggested unusual organisms not typically associated with the submitted clinical diagnosis, consultation with the Medical Director was sought and further testing was performed including partial 16S rRNA gene sequencing. All three patients had endocarditis, and conventional methods identified isolates from patients A, B, and C as a Facklamia sp., Eubacterium tenue, and a Bifidobacterium sp. 16S rRNA gene sequencing identified the isolates as Enterococcus faecalis, Cardiobacterium valvarum, and Streptococcus mutans, respectively. We conclude that the initial identifications of these three isolates were erroneous, may have misled clinicians, and potentially impacted patient care. 16S rRNA gene sequencing is a more objective identification tool, unaffected by phenotypic variation or technologist bias, and has the potential to reduce laboratory errors. PMID:16333109

  12. Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting

    PubMed Central

    Husen, Mohd Nizam; Lee, Sukhan

    2016-01-01

    A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics here represent the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides a 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is 40% finer, and the execution time is more than an order of magnitude faster than that of the conventional methods. These results are also backed up by theoretical analysis. PMID:27845711
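
    A compact sketch of the fingerprinting step is given below; it is hedged in that Gaussian per-access-point statistics stand in for the paper's invariant RSS distributions, and the location names, access-point counts, and RSS values are invented for the example.

      import numpy as np

      def build_reference(calib):
          """calib maps location name -> (samples x access points) RSS matrix
          collected under minimal disturbance; each reference pattern class is
          summarised here as a per-AP mean and standard deviation."""
          return {loc: (rss.mean(axis=0), rss.std(axis=0) + 1e-6) for loc, rss in calib.items()}

      def locate(reference, observation):
          """Return the calibration location whose reference statistics maximally
          support the spontaneous RSS observation (maximum log-likelihood)."""
          def loglik(mu, sigma):
              return float(np.sum(-0.5 * ((observation - mu) / sigma) ** 2 - np.log(sigma)))
          return max(reference, key=lambda loc: loglik(*reference[loc]))

      # Toy example: two calibration locations, three Wi-Fi sources (RSS in dBm).
      calib = {
          "lobby":  np.array([[-40, -70, -60], [-42, -68, -61], [-41, -69, -59]], float),
          "office": np.array([[-65, -45, -55], [-66, -44, -54], [-64, -46, -56]], float),
      }
      print(locate(build_reference(calib), np.array([-41.0, -70.0, -60.0])))   # -> lobby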

  13. Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting.

    PubMed

    Husen, Mohd Nizam; Lee, Sukhan

    2016-11-11

    A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics here represent the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides a 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is 40% finer, and the execution time is more than an order of magnitude faster than that of the conventional methods. These results are also backed up by theoretical analysis.

  14. Acceleration of FDTD mode solver by high-performance computing techniques.

    PubMed

    Han, Lin; Xi, Yanping; Huang, Wei-Ping

    2010-06-21

    A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against the benchmark finite-difference (FD) eigenmode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than 30 times improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigenmode solver, yet requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides even when the conventional eigenvalue mode solvers are no longer applicable due to memory limitation.
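
    To see why FDTD maps so naturally onto a GPU, consider a minimal two-dimensional scalar leapfrog update (a generic illustration, not the paper's compact wave-equation formulation): every grid point is updated independently from its four neighbours, so all points can be processed concurrently. The NumPy version below expresses the update as whole-array operations; a CUDA kernel would assign one thread per grid point.

      import numpy as np

      def fdtd_step(u_prev, u, c, dx, dt):
          """One leapfrog step of the 2D scalar wave equation with periodic edges:
          u_next = 2*u - u_prev + (c*dt)^2 * laplacian(u)."""
          lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / dx**2
          return 2.0 * u - u_prev + (c * dt) ** 2 * lap

      # Toy run: 256x256 grid, point excitation, CFL-satisfying time step.
      n, dx, c = 256, 1e-6, 3e8
      dt = 0.95 * dx / (c * np.sqrt(2.0))
      u_prev = np.zeros((n, n))
      u = np.zeros((n, n))
      u[n // 2, n // 2] = 1.0
      for _ in range(100):
          u_prev, u = u, fdtd_step(u_prev, u, c, dx, dt)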

  15. Rapid phenotypic antimicrobial susceptibility testing using nanoliter arrays.

    PubMed

    Avesar, Jonathan; Rosenfeld, Dekel; Truman-Rosentsvit, Marianna; Ben-Arye, Tom; Geffen, Yuval; Bercovici, Moran; Levenberg, Shulamit

    2017-07-18

    Antibiotic resistance is a major global health concern that requires action across all sectors of society. In particular, to allow conservative and effective use of antibiotics clinical settings require better diagnostic tools that provide rapid determination of antimicrobial susceptibility. We present a method for rapid and scalable antimicrobial susceptibility testing using stationary nanoliter droplet arrays that is capable of delivering results in approximately half the time of conventional methods, allowing its results to be used the same working day. In addition, we present an algorithm for automated data analysis and a multiplexing system promoting practicality and translatability for clinical settings. We test the efficacy of our approach on numerous clinical isolates and demonstrate a 2-d reduction in diagnostic time when testing bacteria isolated directly from urine samples.

  16. Wavelength calibration of dispersive near-infrared spectrometer using relative k-space distribution with low coherence interferometer

    NASA Astrophysics Data System (ADS)

    Kim, Ji-hyun; Han, Jae-Ho; Jeong, Jichai

    2016-05-01

    The commonly employed calibration methods for laboratory-made spectrometers have several disadvantages, including poor calibration when the number of characteristic spectral peaks is low. Therefore, we present a wavelength calibration method using relative k-space distribution with low coherence interferometer. The proposed method utilizes an interferogram with a perfect sinusoidal pattern in k-space for calibration. Zero-crossing detection extracts the k-space distribution of a spectrometer from the interferogram in the wavelength domain, and a calibration lamp provides information about absolute wavenumbers. To assign wavenumbers, wavelength-to-k-space conversion is required for the characteristic spectrum of the calibration lamp with the extracted k-space distribution. Then, the wavelength calibration is completed by inverse conversion of the k-space into wavelength domain. The calibration performance of the proposed method was demonstrated with two experimental conditions of four and eight characteristic spectral peaks. The proposed method elicited reliable calibration results in both cases, whereas the conventional method of third-order polynomial curve fitting failed to determine wavelengths in the case of four characteristic peaks. Moreover, for optical coherence tomography imaging, the proposed method could improve axial resolution due to higher suppression of sidelobes in point spread function than the conventional method. We believe that our findings can improve not only wavelength calibration accuracy but also resolution for optical coherence tomography.
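
    A schematic version of the calibration pipeline described above is sketched below, with assumed function names and a simple linear fit from relative to absolute wavenumber; a real implementation would also handle edge effects, noise in the zero-crossing detection, and more than two lamp lines.

      import numpy as np

      def relative_k_index(interferogram):
          """Sub-pixel positions of the zero crossings of an AC-coupled
          low-coherence interferogram.  Because the interferogram is a pure
          sinusoid in k, consecutive crossings are separated by a constant
          Delta-k, so the crossing count acts as a relative k coordinate."""
          s = interferogram - interferogram.mean()
          idx = np.where(np.diff(np.sign(s)) != 0)[0]
          frac = s[idx] / (s[idx] - s[idx + 1])       # linear sub-pixel refinement
          return idx + frac, np.arange(idx.size, dtype=float)

      def calibrate(interferogram, lamp_pixels, lamp_wavelengths_nm):
          """Assign a wavelength (nm) to every detector pixel.  Absolute anchoring
          uses at least two characteristic lamp lines of known wavelength observed
          at known pixel positions; relative k is affine in absolute k."""
          cross_px, rel_k = relative_k_index(interferogram)
          rel_k_lamp = np.interp(lamp_pixels, cross_px, rel_k)
          k_lamp = 2.0 * np.pi / np.asarray(lamp_wavelengths_nm)
          a, b = np.polyfit(rel_k_lamp, k_lamp, 1)    # absolute k = a*rel_k + b
          pixels = np.arange(interferogram.size, dtype=float)
          k_abs = a * np.interp(pixels, cross_px, rel_k) + b
          return 2.0 * np.pi / k_abs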

  17. Planar-focusing cathodes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewellen, J. W.; Noonan, J.; Accelerator Systems Division

    2005-01-01

    Conventional π-mode rf photoinjectors typically use magnetic solenoids for emittance compensation. This provides independent focusing strength but can complicate rf power feed placement, introduce asymmetries (due to coil crossovers), and greatly increase the cost of the photoinjector. Cathode-region focusing can also provide for a form of emittance compensation. Typically this method strongly couples focusing strength to the field gradient on the cathode, however, and usually requires altering the longitudinal position of the cathode to change the focusing. We propose a new method for achieving cathode-region variable-strength focusing for emittance compensation. The new method reduces the coupling to the gradient on the cathode and does not require a change in the longitudinal position of the cathode. Expected performance for an S-band system is similar to conventional solenoid-based designs. This paper presents the results of rf cavity and beam dynamics simulations of the new design. We have proposed a method for performing emittance compensation using a cathode-region focusing scheme. This technique allows the focusing strength to be adjusted somewhat independently of the on-axis field strength. Beam dynamics calculations indicate performance should be comparable to presently in-use emittance compensation schemes, with a simpler configuration and fewer possibilities for emittance degradation due to the focusing optics. There are several potential difficulties with this approach, including cathode material selection, cathode heating, and peak fields in the gun. We hope to begin experimenting with a cathode of this type in the near future, and several possibilities exist for reducing the peak gradients to more acceptable levels.

  18. Ion beam figuring of silicon aspheres

    NASA Astrophysics Data System (ADS)

    Demmler, Marcel; Zeuner, Michael; Luca, Alfonz; Dunger, Thoralf; Rost, Dirk; Kiontke, Sven; Krüger, Marcus

    2011-03-01

    Silicon lenses are widely used for infrared applications. Especially for portable devices the size and weight of the optical system are very important factors. The use of aspherical silicon lenses instead of spherical silicon lenses results in a significant reduction of weight and size. The manufacture of silicon lenses is more challenging than the manufacture of standard glass lenses. Typically conventional methods like diamond turning, grinding and polishing are used. However, due to the high hardness of silicon, diamond turning is very difficult and requires a lot of experience. To achieve surfaces of a high quality a polishing step is mandatory within the manufacturing process. Nevertheless, the required surface form accuracy cannot be achieved through the use of conventional polishing methods because of the unpredictable behavior of the polishing tools, which leads to an unstable removal rate. To overcome these disadvantages a method called Ion Beam Figuring can be used to manufacture silicon lenses with high surface form accuracies. The general advantage of the Ion Beam Figuring technology is a contactless polishing process without any aging effects of the tool. Due to this an excellent stability of the removal rate without any mechanical surface damage is achieved. The related physical process - called sputtering - can be applied to any material and is therefore also applicable to materials of high hardness like Silicon (SiC, WC). The process is realized through the commercially available ion beam figuring system IonScan 3D. During the process, the substrate is moved in front of a focused broad ion beam. The local milling rate is controlled via a modulated velocity profile, which is calculated specifically for each surface topology in order to mill the material at the associated positions to the target geometry. The authors will present aspherical silicon lenses with very high surface form accuracies compared to conventionally manufactured lenses.

  19. A comparison of the clinical effectiveness of spinal orthoses manufactured using the conventional manual method and CAD/CAM method in the management of AIS.

    PubMed

    Wong, M S; Cheng, C Y; Ng, B K W; Lam, T P; Chiu, S W

    2006-01-01

    Spinal orthoses are commonly prescribed to patients with moderate AIS for prevention of further deterioration. In a conventional manufacturing method, plaster bandages are used to capture the patient's body contour, and the plaster cast is rectified manually. With the introduction of a CAD/CAM system, a series of automated processes from body scanning to digital rectification and milling of the positive model can be performed in a fast and accurate fashion. This project studied the impact of the CAD/CAM method compared with the conventional method. In assessing the 147 recruited subjects fitted with spinal orthoses (43 subjects using the conventional method and 104 subjects using the CAD/CAM method), significant decreases (p<0.05) were found in the Cobb angles when comparing the pre-intervention data with that of the first year of intervention. Regarding the learning curve, orthotists became more competent with the CAD/CAM technique over four years. The mean productivity of the CAD/CAM method is 2.75 times higher than that of the conventional method. The CAD/CAM method could achieve similar clinical outcomes and, with its high efficiency, could be considered a substitute for conventional methods in fabricating spinal orthoses for patients with AIS.

  20. 47 CFR 90.633 - Conventional systems loading requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 47 (Telecommunication), Part 90, Private Land Mobile Radio Services, Regulations Governing Licensing and Use of Frequencies; § 90.633 Conventional systems loading requirements (2011-10-01). Excerpt: "... of need. In a ribbon, regional or statewide system, a mobile station will be counted for channel..."
