ERIC Educational Resources Information Center
Elrick, Mike
2003-01-01
Traditional techniques and gear are better suited for comfortable extended wilderness trips with high school students than are emerging technologies and techniques based on low-impact camping and petroleum-based clothing, which send students the wrong messages about ecological relatedness and sustainability. Traditional travel techniques and…
NASA Technical Reports Server (NTRS)
Ricks, Wendell R.; Abbott, Kathy H.
1987-01-01
A traditional programming technique for controlling the display of optional flight information in a civil transport cockpit is compared to a rule-based technique for the same function. This application required complex decision logic and a frequently modified rule base. The techniques were evaluated for execution efficiency and implementation ease. The criterion for execution efficiency was the total number of steps required to isolate the hypotheses that were true; the criteria for implementation ease were ease of modification, ease of verification, and explanation capability. The traditional program was more efficient than the rule-based program; however, the rule-based programming technique is more effective at improving programmer productivity.
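The efficiency-vs-modifiability trade-off described above can be sketched in miniature. Everything below is hypothetical (invented display rules, not the study's cockpit logic): the procedural version hard-codes the branches, while the rule-based version scans a rule table, costing more condition evaluations but isolating each decision in data that is easy to modify and explain.

```python
def traditional(flight):
    # Hard-coded decision logic: efficient, but changes require editing control flow.
    if flight["phase"] == "approach":
        if flight["altitude_ft"] < 1000:
            return ["gear_status", "flap_setting"]
        return ["approach_speed"]
    if flight["fuel_lb"] < 2000:
        return ["fuel_warning"]
    return []

# Rule-based variant: each rule pairs a condition with a display item,
# so modifying behavior means editing data rather than nested branches.
RULES = [
    (lambda f: f["phase"] == "approach" and f["altitude_ft"] < 1000, "gear_status"),
    (lambda f: f["phase"] == "approach" and f["altitude_ft"] < 1000, "flap_setting"),
    (lambda f: f["phase"] == "approach" and f["altitude_ft"] >= 1000, "approach_speed"),
    (lambda f: f["fuel_lb"] < 2000, "fuel_warning"),
]

def rule_based(flight):
    # Every rule is tested in turn, which costs more steps than the branches above.
    return [item for cond, item in RULES if cond(flight)]

state = {"phase": "approach", "altitude_ft": 800, "fuel_lb": 5000}
assert traditional(state) == rule_based(state) == ["gear_status", "flap_setting"]
```

The rule list is scanned in full on every call, mirroring the study's finding that rule evaluation costs more execution steps while making modification and explanation easier.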
NASA Technical Reports Server (NTRS)
Ricks, Wendell R.; Abbott, Kathy H.
1987-01-01
The software design community is greatly concerned with the costs associated with a program's execution time and implementation. It is always desirable, and sometimes imperative, to choose the programming technique that minimizes all costs for a given application or type of application. A study is described that compared cost-related factors of traditional programming techniques with those of rule-based programming techniques for a specific application. The results favored the traditional approach for execution efficiency but the rule-based approach for programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.
Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa
2015-04-13
Current methods for the development of pelvic finite element (FE) models are generally based upon specimen-specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity-based material properties were assigned to the morphed/mapped model and to the traditionally segmented target model. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data from the same target cadaver pelvis. The CT landmark-based morphing and mapping techniques were efficiently applied to create a specimen-specific pelvic FE model that was geometrically similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient, validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
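The landmark-based morphing step can be illustrated with a deliberately simplified sketch. This fits only a per-axis scale-and-shift from source to target landmarks (real pipelines use richer transforms such as thin-plate splines), and all coordinates are invented:

```python
def fit_axis(src_vals, dst_vals):
    # Least-squares fit of dst ≈ a*src + b along one coordinate axis.
    n = len(src_vals)
    ms, md = sum(src_vals) / n, sum(dst_vals) / n
    a = (sum((s - ms) * (d - md) for s, d in zip(src_vals, dst_vals))
         / sum((s - ms) ** 2 for s in src_vals))
    return a, md - a * ms

def morph(nodes, src_landmarks, dst_landmarks):
    # Fit one transform per axis from the landmarks, then move every mesh node.
    dims = len(nodes[0])
    fits = [fit_axis([p[k] for p in src_landmarks],
                     [p[k] for p in dst_landmarks]) for k in range(dims)]
    return [tuple(a * p[k] + b for k, (a, b) in enumerate(fits)) for p in nodes]

src = [(0.0, 0.0), (10.0, 0.0), (0.0, 20.0)]   # hypothetical source landmarks
dst = [(1.0, 2.0), (21.0, 2.0), (1.0, 42.0)]   # same landmarks on the target
print(morph([(5.0, 10.0)], src, dst))           # a source node mapped into target space
```

The subsequent mesh-mapping refinement (snapping nodes to the bone boundary) is the step this global fit cannot capture, which is why the study applies it after morphing.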
Ultrasound Guidance for Botulinum Neurotoxin Chemodenervation Procedures.
Alter, Katharine E; Karp, Barbara I
2017-12-28
Injections of botulinum neurotoxins (BoNTs) are prescribed by clinicians for a variety of disorders that cause over-activity of muscles, glands, and other structures, as well as pain. Accurately targeting the structure for injection is one of the principal goals when performing BoNT procedures. Traditionally, injections have been guided by anatomic landmarks, palpation, range of motion, electromyography, or electrical stimulation. Ultrasound (US)-based imaging guidance overcomes some of the limitations of traditional techniques. US, alone or combined with traditional guidance techniques, is utilized and recommended by many expert clinicians and authors, and in practice guidelines from professional academies. This article reviews the advantages and disadvantages of available guidance techniques, including US, as well as technical aspects of US guidance, and provides a focused literature review related to US guidance for chemodenervation procedures, including BoNT injection.
Dai, Yunchao; Nasir, Mubasher; Zhang, Yulin; Gao, Jiakai; Lv, Yamin; Lv, Jialong
2018-01-01
Several predictive models and methods have been used for heavy metal bioavailability, but there is no universally accepted approach for evaluating the bioavailability of arsenic (As) in soil. The technique of diffusive gradients in thin-films (DGT) is a promising tool, but there is considerable debate about its suitability. The DGT method was compared with traditional chemical extraction techniques (soil solution, NaHCO3, NH4Cl, HCl, and total As) for estimating As bioavailability in soil, based on a greenhouse experiment using Brassica chinensis grown in various soils from 15 provinces in China. In addition, we assessed whether these methods are independent of soil properties. The correlations between plant and soil As concentrations measured with traditional extraction techniques were pH and iron oxide (Fe-ox) dependent, indicating that these methods are influenced by soil properties. In contrast, DGT measurements were independent of soil properties and also showed a better correlation coefficient than the traditional techniques. Thus, the DGT technique is superior to traditional techniques and should be preferred for evaluating As bioavailability in different types of soil. Copyright © 2017 Elsevier Ltd. All rights reserved.
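The method comparison above hinges on correlating plant As with soil As as measured by each technique. A minimal sketch of that step, with made-up illustrative numbers rather than the study's data:

```python
def pearson_r(x, y):
    # Pearson correlation coefficient between two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

plant_as = [0.8, 1.5, 2.1, 3.0, 4.2]        # hypothetical plant As concentrations
methods = {
    "DGT": [0.4, 0.8, 1.1, 1.6, 2.2],        # tracks plant uptake closely
    "HCl": [2.0, 1.0, 3.5, 2.5, 4.0],        # scattered: soil-property dependent
}
for name, soil_as in methods.items():
    print(f"{name}: r = {pearson_r(plant_as, soil_as):.3f}")
```

A higher r for DGT than for an extraction method is the kind of evidence the abstract reports; the study additionally tested whether residuals depended on pH and Fe-ox.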
Chen, Shilin; Guo, Baolin; Zhang, Guijun; Yan, Zhuyun; Luo, Guangming; Sun, Suqin; Wu, Hezhen; Huang, Linfang; Pang, Xiaohui; Chen, Jianbo
2012-04-01
In this review, the authors summarize new technologies and methods for identifying traditional Chinese medicinal materials, including molecular, chemical, morphological, and microscopic identification, as well as identification based on biological effects. The authors introduce the principle, characteristics, application, and prospects of each new technology or method and compare their advantages and disadvantages. In general, the new methods make results more objective and accurate. DNA barcoding and spectroscopic identification have obvious strengths in universality and digitalization; in the near future, these two techniques are likely to become the main approaches for identifying traditional Chinese medicinal materials. Identification techniques based on microscopy, liquid chromatography, PCR, biological effects, and DNA chips will be indispensable supplements. Bionic identification technology, however, is still at the development stage.
Mabry, C D
2001-03-01
Vascular surgeons have had to contend with rising costs while their reimbursements have undergone steady reductions. The use of newer accounting techniques can help vascular surgeons better manage their practices, plan for future expansion, and control costs. This article reviews traditional accounting methods, together with activity-based costing (ABC) principles that have been used in the past for practice expense analysis. The main focus is on a new technique, resource-based costing (RBC), which uses the widely available Resource-Based Relative Value Scale (RBRVS) as its basis. The RBC technique promises easier implementation as well as more flexibility in determining the true costs of performing various procedures, as opposed to more traditional accounting methods. It is hoped that RBC will assist vascular surgeons in coping with decreasing reimbursement. Copyright 2001 by W.B. Saunders Company
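The core of resource-based costing, allocating practice expense to procedures in proportion to their relative value units, can be sketched as follows. All dollar figures and RVU values here are illustrative assumptions, not published RBRVS rates:

```python
# Resource-based costing sketch: divide annual practice expense by annual
# RVUs billed to get a cost per RVU, then cost each procedure by its RVUs.
ANNUAL_PRACTICE_EXPENSE = 600_000.0   # hypothetical overhead for the year
ANNUAL_TOTAL_RVUS = 12_000.0          # hypothetical practice-expense RVUs billed
COST_PER_RVU = ANNUAL_PRACTICE_EXPENSE / ANNUAL_TOTAL_RVUS  # $50 per RVU

procedures = {                         # hypothetical practice-expense RVUs
    "carotid endarterectomy": 18.0,
    "office visit": 1.2,
}
for name, rvus in procedures.items():
    print(f"{name}: estimated cost ${rvus * COST_PER_RVU:,.2f}")
```

The appeal the article describes follows from this structure: the RVU weights are already published, so only the practice's own expense total must be measured, unlike ABC, which requires tracing each activity's cost.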
Overview of innovative remediation of emerging contaminants
NASA Astrophysics Data System (ADS)
Keller, A. A.; Adeleye, A. S.; Huang, Y.; Garner, K.
2015-12-01
The application of nanotechnology in drinking water treatment and pollution cleanup is promising, as demonstrated by a number of field-based (pilot and full scale) and bench scale studies. A number of reviews exist for these nanotechnology-based applications; but to better illustrate its importance and guide its development, a direct comparison between traditional treatment technologies and emerging approaches using nanotechnology is needed. In this review, the performances of traditional technologies and nanotechnology for water treatment and environmental remediation were compared with the goal of providing an up-to-date reference on the state of treatment techniques for researchers, industry, and policy makers. Pollutants were categorized into broad classes, and the most cost-effective techniques (traditional and nanotechnology-based) in each category reported in the literature were compared. Where information was available, cost and environmental implications of both technologies were also compared. Traditional treatment technologies were found to currently offer the most cost-effective choices for removal of several common pollutants from drinking water and polluted sites. Nano-based techniques may however become important in complicated remediation conditions and in meeting increasingly stringent water quality standards, especially in removal of emerging pollutants and low levels of contaminants. Challenges facing the environmental application of nanotechnology and potential solutions are also discussed.
NASA Astrophysics Data System (ADS)
Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal
2014-06-01
This study presents Artificial Intelligence (AI)-based modeling of total bed material load, aimed at improving the prediction accuracy of traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for the estimations. Sediment data from the Qotur River (northwestern Iran) were used for developing and validating the applied techniques. To assess the applied techniques against traditional models, stream power-based and shear stress-based physical models were also applied to the studied case. The obtained results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. Nonetheless, it was revealed that the k-fold test is a practical but computationally costly technique for completely scanning the applied data and avoiding over-fitting.
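The k-fold test mentioned above partitions the data so that each observation is held out exactly once, at the cost of training the model k times. A minimal sketch (the mean predictor below stands in for the GEP/ANFIS models, and the data are synthetic):

```python
def k_fold_indices(n, k):
    # Assign observations to k folds round-robin; yield (train, test) index lists.
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

data = [(q, 2.0 * q) for q in range(10)]   # hypothetical (discharge, load) pairs
errors = []
for train, test in k_fold_indices(len(data), 5):
    # "Model": predict the mean load of the training fold (stand-in for GEP/ANFIS).
    mean_load = sum(data[j][1] for j in train) / len(train)
    errors += [abs(data[j][1] - mean_load) for j in test]
print(f"mean abs error: {sum(errors) / len(errors):.2f}")
```

The "high-cost" caveat in the abstract is visible here: the model-fitting line runs once per fold, so k-fold multiplies training cost by k.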
Quality assurance paradigms for artificial intelligence in modelling and simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oren, T.I.
1987-04-01
New classes of quality assurance concepts and techniques are required for advanced knowledge-processing paradigms (such as artificial intelligence, expert systems, or knowledge-based systems) and for the complex problems that only simulative systems can cope with. A systematization of quality assurance problems is given, along with examples of traditional and cognizant quality assurance techniques in traditional and cognizant modelling and simulation.
Techniques for cesarean section.
Hofmeyr, Justus G; Novikova, Natalia; Mathai, Matthews; Shah, Archana
2009-11-01
The effects of complete methods of cesarean section (CS) were compared in a meta-analysis of randomized controlled trials of intention to perform CS using different techniques. Joel-Cohen-based CS compared with Pfannenstiel CS was associated with reduced blood loss, operating time, time to oral intake, fever, duration of postoperative pain, analgesic injections, and time from skin incision to birth of the baby. The Misgav-Ladach method compared with the traditional method was associated with reduced blood loss, operating time, time to mobilization, and length of postoperative stay for the mother. Joel-Cohen-based methods have advantages compared with Pfannenstiel and traditional (lower midline) CS techniques. However, these trials do not provide information on serious and long-term outcomes.
Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M
2007-02-19
A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
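The synthetic-spectra idea can be sketched as a parametric model whose parameters are tuned to minimize the misfit to a measured spectrum. The single Gaussian line and the grid search below are simplifying assumptions; real arc-welding plasma spectra contain many lines and use more capable optimizers:

```python
import math

def synthetic(wavelengths, amp, center, width):
    # Toy synthetic spectrum: one Gaussian emission line.
    return [amp * math.exp(-((w - center) / width) ** 2) for w in wavelengths]

wl = [float(w) for w in range(500, 521)]
measured = synthetic(wl, 3.0, 510.0, 4.0)   # stand-in for a recorded spectrum

# Optimization step: pick the candidate parameters with the smallest
# sum-of-squares misfit between synthetic and measured spectra.
best = min(
    ((amp, c) for amp in (1.0, 2.0, 3.0, 4.0) for c in (505.0, 510.0, 515.0)),
    key=lambda p: sum((m - s) ** 2
                      for m, s in zip(measured, synthetic(wl, p[0], p[1], 4.0))),
)
print(best)  # → (3.0, 510.0)
```

The fitted parameters (line intensities, widths) are what a quality-assurance system would then monitor for deviations during welding.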
Townsend, F I; Ralphs, S C; Coronado, G; Sweet, D C; Ward, J; Bloch, C P
2012-01-01
To compare the hydro-surgical technique with traditional techniques for removal of subcutaneous tissue in the preparation of full-thickness skin grafts. Ex vivo experimental study and a single clinical case report. Four canine cadavers and a single clinical case. Four sections of skin were harvested from the lateral flank of recently euthanized dogs. Traditional preparation methods included both a blade and a scissors technique, each of which was compared to the hydro-surgical technique individually. Preparation methods were compared based on the length of time for removal of the subcutaneous tissue from the graft, histologic grading, and measurable thickness compared to an untreated sample. The hydro-surgical technique had the shortest skin graft preparation time compared to traditional techniques (p = 0.002). There was no significant difference in histological grading or measurable subcutaneous thickness between skin specimens. The hydro-surgical technique provides rapid, effective debridement of subcutaneous tissue in the preparation of full-thickness skin grafts, with no significant changes in histological grade or subcutaneous tissue remaining among treatment types. Additionally, the hydro-surgical technique was successfully used to prepare a full-thickness meshed free skin graft in the reconstruction of a traumatic medial tarsal wound in a dog.
Relational Data Bases--Are You Ready?
ERIC Educational Resources Information Center
Marshall, Dorothy M.
1989-01-01
Migrating from a traditional to a relational database technology requires more than traditional project management techniques. An overview of what to consider before migrating to relational database technology is presented. Leadership, staffing, vendor support, hardware, software, and application development are discussed. (MLW)
Way to nanogrinding technology
NASA Astrophysics Data System (ADS)
Miyashita, Masakazu
1990-11-01
The precision finishing of hard and brittle material components, such as single-crystal silicon wafers and magnetic heads, consists of lapping and polishing, which depend heavily on skilled labor. This process is based on traditional optical production technology and is entirely different from the automated mass production techniques of automobile production. Instead of traditional lapping and polishing, nanogrinding is proposed as a new stock-removal machining process to generate optical surfaces on brittle materials. With this new technology, a damage-free surface equivalent to that produced by lapping and polishing can be obtained on brittle materials, and free curvature can also be generated. The technology is based on the motion-copying principle, the same principle used in metal parts machining. Nanogrinding is anticipated to be adopted as a machining technique suitable for automated mass production, because it is expected to deliver stable machining at the quality level of traditional lapping and polishing.
Liu, Liang Qin; Mehigan, Sinead
2016-05-01
This systematic review aimed to critically appraise and synthesize updated evidence regarding the effect of surgical-scrub techniques on skin integrity and the incidence of surgical site infections. Databases searched include the Cumulative Index to Nursing and Allied Health Literature, MEDLINE, Embase, and Cochrane Central. Our review was limited to eight peer-reviewed, randomized controlled trials and two nonrandomized controlled trials published in English from 1990 to 2015. Comparison models included traditional hand scrubbing with chlorhexidine gluconate or povidone-iodine against alcohol-based hand rubbing, scrubbing with a brush versus without a brush, and detergent-based antiseptics alone versus antiseptics incorporating alcohol solutions. Evidence showed that hand rubbing techniques are as effective as traditional scrubbing and seem to be better tolerated. Hand rubbing appears to cause less skin damage than traditional scrub protocols, and scrub personnel tolerated brushless techniques better than scrubbing using a brush. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Wijerathne, Buddhika; Rathnayake, Geetha
2013-01-01
Background Most universities currently practice traditional practical spot tests to evaluate students. However, traditional methods have several disadvantages. Computer-based examination techniques are becoming more popular among medical educators worldwide. Therefore incorporating the computer interface in practical spot testing is a novel concept that may minimize the shortcomings of traditional methods. Assessing students’ attitudes and perspectives is vital in understanding how students perceive the novel method. Methods One hundred and sixty medical students were randomly allocated to either a computer-based spot test (n=80) or a traditional spot test (n=80). The students rated their attitudes and perspectives regarding the spot test method soon after the test. The results were described comparatively. Results Students had higher positive attitudes towards the computer-based practical spot test compared to the traditional spot test. Their recommendations to introduce the novel practical spot test method for future exams and to other universities were statistically significantly higher. Conclusions The computer-based practical spot test is viewed as more acceptable to students than the traditional spot test. PMID:26451213
Zheng, Shiqi; Tang, Xiaoqi; Song, Bao; Lu, Shaowu; Ye, Bosheng
2013-07-01
In this paper, a stable adaptive PI control strategy based on an improved just-in-time learning (IJITL) technique is proposed for a permanent magnet synchronous motor (PMSM) drive. First, the traditional JITL technique is improved. The new IJITL technique has less computational burden and is more suitable than traditional JITL for online identification of the PMSM drive system, which has strict real-time requirements. The PMSM drive system is thus identified by the IJITL technique, which provides information to an adaptive PI controller. Second, the adaptive PI controller is designed in the discrete-time domain and is composed of a PI controller and a supervisory controller. The PI controller automatically tunes the control gains online based on the gradient descent method, and the supervisory controller is developed to eliminate the effect of the approximation error introduced by the PI controller on system stability in the Lyapunov sense. Finally, experimental results on the PMSM drive system show accurate identification and favorable tracking performance. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
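The gradient-descent gain tuning can be sketched on a toy first-order plant. The IJITL identification stage and the supervisory controller are omitted, and the plant model, initial gains, and learning rate are all invented for illustration:

```python
# Discrete PI controller whose gains are nudged each step by gradient descent
# on the squared tracking error (sign-simplified update rules).
kp, ki, eta = 0.5, 0.1, 1e-3
integral, y = 0.0, 0.0
setpoint = 1.0
for _ in range(200):
    e = setpoint - y
    integral += e
    u = kp * e + ki * integral          # PI control law
    kp += eta * e * e                   # gradient-style update for the P gain
    ki += eta * e * integral            # gradient-style update for the I gain
    y += 0.1 * (u - y)                  # toy plant: first-order lag response
print(round(y, 3))                      # output settles near the setpoint
```

The integral term drives steady-state error to zero, while the adaptation slowly raises the gains when the error is large; the paper's supervisory controller exists precisely to keep such adaptation from destabilizing the loop.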
Collaborative Learning in the Dance Technique Class
ERIC Educational Resources Information Center
Raman, Tanja
2009-01-01
This research was designed to enhance dance technique learning by promoting critical thinking amongst students studying on a degree programme at the University of Wales Institute, Cardiff. Students were taught Cunningham-based dance technique using pair work together with the traditional demonstration/copying method. To evaluate the study,…
DDC 10 Year Requirements and Planning Study. Interagency Survey Report
1975-12-12
RDT&E Management Information Services Traditional bibliographic information storage and retrieval techniques are insufficient for satisfaction of...a private source. 3.3 Economics and Marketing ERDA's tradition of absorbing costs for information processing shows no indication of changing...Libraries. Its products, in addition to traditional library services, include three prime data bases: • MEDLINE - journal citations and subject
Wind Gust Measurement Techniques-From Traditional Anemometry to New Possibilities.
Suomi, Irene; Vihma, Timo
2018-04-23
Information on wind gusts is needed for assessment of wind-induced damage and risks to safety. The measurement of wind gust speed requires a high temporal resolution of the anemometer system, because the gust is defined as a short-duration (seconds) maximum of the fluctuating wind speed. Until the digitalization of wind measurements in the 1990s, wind gust measurements suffered from limited recording and data processing resources; therefore, the majority of continuous wind gust records extend back at most about 30 years. Although the response characteristics of anemometer systems are good enough today, the traditional measurement techniques at weather stations, based on cup and sonic anemometers, are limited to heights and regions that supporting structures can reach. Therefore, existing measurements are mainly concentrated over densely populated land areas, whereas from remote locations, such as the marine Arctic, wind gust information is available only from sparse coastal sites. Recent developments in wind gust measurement based on turbulence measurements from research aircraft and from Doppler lidar can potentially provide new information from heights and locations unreachable by traditional techniques. Moreover, fast-developing measurement methods based on Unmanned Aircraft Systems (UASs) may provide better coverage of wind gust measurements in the future. In this paper, we provide an overview of the history and the current status of anemometry from the perspective of wind gusts, together with a discussion of potential future directions of wind gust measurement techniques.
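The definition of a gust as a short-duration maximum of the fluctuating wind speed translates directly into code. A sketch with a 3-sample moving average standing in for the standard 3 s gust window, applied to synthetic 1 Hz wind speeds:

```python
def gust(speeds, window=3):
    # Gust speed: maximum of the moving average over the gust window.
    means = [sum(speeds[i:i + window]) / window
             for i in range(len(speeds) - window + 1)]
    return max(means)

record = [5.0, 5.2, 7.9, 9.1, 8.8, 5.1, 4.9]   # synthetic 1 Hz speeds, m/s
print(round(gust(record), 2))                   # peak 3-sample mean
```

This is also why anemometer response time matters: a slow sensor smooths the record before this windowing is even applied, biasing the gust low.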
2014-03-01
The purpose of the study was to determine if the use of a simulator is at least as effective in marksmanship training as traditional dry fire techniques. A between-groups study with a...marksmanship. Naval commands could use the information to effectively maintain gun qualifications for inport duty section watch bills and constant anti
[An object-oriented intelligent engineering design approach for lake pollution control].
Zou, Rui; Zhou, Jing; Liu, Yong; Zhu, Xiang; Zhao, Lei; Yang, Ping-Jian; Guo, Huai-Cheng
2013-03-01
To address the shortcomings of traditional lake pollution control engineering techniques, a new approach based on object-oriented intelligent design (OOID) was proposed in this study. It can provide a new methodology and framework for effectively controlling lake pollution and improving water quality. The differences between traditional engineering techniques and the OOID approach were compared. The key points of OOID are the object perspective, a cause-and-effect foundation, integrating points into surfaces, and temporal and spatial optimization. Control of blue-green algae in a lake was taken as an example. The effects on algae control and water quality improvement were analyzed in detail from the OOID perspective for two engineering techniques (a vertical hydrodynamic mixer and pumped algaecide recharge). The modeling results showed that the traditional engineering design paradigm cannot provide scientific and effective guidance for engineering design and decision-making regarding lake pollution. The intelligent design approach in this case is based on the object perspective and quantitative causal analysis. It identified that the efficiency of mixers was much higher than that of pumps in achieving low to moderate water quality improvement. However, when the water quality objective exceeded a certain value (for example, when the control objective for peak Chl-a concentration exceeded 100 µg·L⁻¹ in this experimental water), the mixer could not achieve the goal. The pump technique could achieve the goal, but at higher cost. The efficiency of combining the two techniques was higher than that of using either technique alone. Moreover, the quantitative scale control of the two engineering techniques has a significant impact on actual project benefits and costs.
IACOANGELI, Maurizio; NOCCHI, Niccolò; NASI, Davide; DI RIENZO, Alessandro; DOBRAN, Mauro; GLADI, Maurizio; COLASANTI, Roberto; ALVARO, Lorenzo; POLONARA, Gabriele; SCERRATI, Massimo
2016-01-01
The most important goal of minimally invasive surgery is to obtain the best therapeutic effect with the least iatrogenic injury. In this context, a pivotal role in contemporary neurosurgery is played by the supraorbital keyhole approach proposed by Perneczky for anterior cranial base surgery. In this article, it is presented as a possible valid alternative to traditional craniotomies for removal of anterior cranial fossa meningiomas. From January 2008 to January 2012, 56 patients underwent anterior cranial base meningioma removal at our department. Thirty-three patients underwent traditional approaches and 23 the supraorbital keyhole technique. Clinical and neuroradiological pre- and postoperative evaluations were performed, with attention to complications, length of surgical procedure, and hospitalization. Compared to traditional approaches, the supraorbital keyhole approach was associated with neither a greater range of postoperative complications nor a longer surgical procedure and hospitalization, while permitting the same lesion control. With this technique, minimization of brain exposure and manipulation with reduction of unwanted iatrogenic injuries, preservation of neurovascular structures, and a better aesthetic result are possible. The supraorbital keyhole approach according to Perneczky could represent a valid alternative to traditional approaches in anterior cranial base meningioma surgery. PMID:26804334
Application of Digital Anthropometry for Craniofacial Assessment
Jayaratne, Yasas S. N.; Zwahlen, Roger A.
2014-01-01
Craniofacial anthropometry is an objective technique based on a series of measurements and proportions, which facilitate the characterization of phenotypic variation and quantification of dysmorphology. With the introduction of stereophotography, it is possible to acquire a lifelike three-dimensional (3D) image of the face with natural color and texture. Most of the traditional anthropometric landmarks can be identified on these 3D photographs using specialized software. Therefore, it has become possible to compute new digital measurements, which were not feasible with traditional instruments. The term “digital anthropometry” has been used by researchers based on such systems to separate their methods from conventional manual measurements. Anthropometry has been traditionally used as a research tool. With the advent of digital anthropometry, this technique can be employed in several disciplines as a noninvasive tool for quantifying facial morphology. The aim of this review is to provide a broad overview of digital anthropometry and discuss its clinical applications. PMID:25050146
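A digital anthropometric measurement reduces to geometry on landmark coordinates. A sketch with invented coordinates and a simple facial index (the landmark names follow anthropometric convention, but the numbers are hypothetical):

```python
import math

def dist(a, b):
    # Euclidean distance between two 3D landmark coordinates.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

landmarks = {                         # hypothetical stereophotogrammetry output, mm
    "exocanthion_r": (-45.0, 30.0, 80.0),
    "exocanthion_l": (45.0, 30.0, 80.0),
    "nasion": (0.0, 35.0, 95.0),
    "gnathion": (0.0, -70.0, 85.0),
}
face_width = dist(landmarks["exocanthion_r"], landmarks["exocanthion_l"])
face_height = dist(landmarks["nasion"], landmarks["gnathion"])
print(f"facial index: {100 * face_height / face_width:.1f}")
```

Measurements like these, impractical with calipers on a moving patient, become repeatable queries on a stored 3D photograph, which is the advantage the review emphasizes.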
Interventional urology: endourology in small animal veterinary medicine.
Berent, Allyson C
2015-07-01
The use of novel image-guided techniques in veterinary medicine has become more widespread, especially in urologic diseases. With the common incidence of urinary tract obstructions, stones disease, renal disease, and urothelial malignancies, combined with the recognized invasiveness and morbidity associated with traditional surgical techniques, the use of minimally invasive alternatives using interventional radiology and interventional endoscopy techniques has become incredibly appealing to owners and clinicians. This article provides a brief overview of some of the most common procedures done in endourology in veterinary medicine to date, providing as much evidence-based medicine as possible when comparing with traditional surgical alternatives. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin
2018-03-01
Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including depth perception. However, producing traditional computer-generated holograms (CGHs) often takes a long computation time, without complex and photorealistic rendering. The backward ray-tracing technique is able to render photorealistic, high-quality images and noticeably reduces computation time thanks to its high degree of parallelism. Here, a high-efficiency photorealistic computer-generated hologram method based on the ray-tracing technique is presented. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point-cloud CGH, the computation time is decreased to 24 s to reconstruct a 3D object of 100×100 rays with continuous depth change.
Mitigating the Hook Effect in Lateral Flow Sandwich Immunoassays Using Real-Time Reaction Kinetics.
Rey, Elizabeth G; O'Dell, Dakota; Mehta, Saurabh; Erickson, David
2017-05-02
The quantification of analyte concentrations using lateral flow assays is a low-cost and user-friendly alternative to traditional lab-based assays. However, sandwich-type immunoassays are often limited by the high-dose hook effect, which causes falsely low results when analytes are present at very high concentrations. In this paper, we present a reaction kinetics-based technique that solves this problem, significantly increasing the dynamic range of these devices. With the use of a traditional sandwich lateral flow immunoassay, a portable imaging device, and a mobile interface, we demonstrate the technique by quantifying C-reactive protein concentrations in human serum over a large portion of the physiological range. The technique could be applied to any hook effect-limited sandwich lateral flow assay and has a high level of accuracy even in the hook effect range.
Plasticity models of material variability based on uncertainty quantification techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Reese E.; Rizzi, Francesco; Boyce, Brad
The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.
Magnetic resonance angiography of the pediatric abdomen and pelvis: techniques and imaging findings.
Sada, David M; Vellody, Ranjith; Liu, Peter S
2013-11-01
Although traditional catheter-based angiography has been the gold standard for pediatric abdominal and pelvic vascular imaging for the past several decades, advances in magnetic resonance angiography (MRA) have made it a viable alternative. MRA offers several advantages in that it is noninvasive, can be performed without ionizing radiation, and does not necessarily rely on contrast administration. The ability of modern MRA techniques to define variant vascular anatomy and detect vascular disease may obviate traditional angiography in some patients. Copyright © 2013 Elsevier Inc. All rights reserved.
Preservice Elementary Teachers' Beliefs about Science Teaching
ERIC Educational Resources Information Center
Yilmaz-Tuzun, Ozgul
2008-01-01
In this study, a Beliefs About Teaching (BAT) scale was created to examine preservice elementary science teachers' self-reported comfort level with both traditional and reform-based teaching methods, assessment techniques, classroom management techniques, and science content. Participants included 166 preservice teachers from three different US…
Precision Learning Assessment: An Alternative to Traditional Assessment Techniques.
ERIC Educational Resources Information Center
Caltagirone, Paul J.; Glover, Christopher E.
1985-01-01
A continuous and curriculum-based assessment method, Precision Learning Assessment (PLA), which integrates precision teaching and norm-referenced techniques, was applied to a math computation curriculum for 214 third graders. The resulting districtwide learning curves defining average annual progress through the computation curriculum provided…
Inherent secure communications using lattice based waveform design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pugh, Matthew Owen
2013-12-01
The wireless communications channel is innately insecure due to the broadcast nature of the electromagnetic medium. Many techniques have been developed and implemented in order to combat insecurities and ensure the privacy of transmitted messages. Traditional methods include encrypting the data via cryptographic methods, hiding the data in the noise floor as in wideband communications, or nulling the signal in the spatial direction of the adversary using array processing techniques. This work analyzes the design of signaling constellations, i.e. modulation formats, to combat eavesdroppers from correctly decoding transmitted messages. It has been shown that in certain channel models the ability of an adversary to decode the transmitted messages can be degraded by a clever signaling constellation based on lattice theory. This work attempts to optimize certain lattice parameters in order to maximize the security of the data transmission. These techniques are of interest because they are orthogonal to, and can be used in conjunction with, traditional security techniques to create a more secure communication channel.
Wang, Mei; Wang, Hongxia; Zhao, Namula
2015-02-01
To explore the unique ideas, properties, and standards of fracture repositioning with osteopathy in traditional Mongolian medicine in China. Based on the natural life concept of "integration of universe and man", osteopathy in traditional Mongolian medicine in China uses the modern principles and methods of physiology, psychology, and biomechanics. Against this background, we explored the unique ideas, properties, and standards of fracture repositioning in traditional Mongolian medicine. Fracture treatment with osteopathy in traditional Mongolian medicine in China is based on (a) the ideas of natural, sealed, self and dynamic repositioning of fractures; (b) the properties of structural continuity and functional completeness; (c) the standards of "integration of movement and stillness" and "force to force". The unique ideas, properties, and standards of fracture repositioning with osteopathy in traditional Mongolian medicine in China have resulted in the widespread use of such techniques and represent the future direction of the development of fracture repositioning.
Wind Gust Measurement Techniques—From Traditional Anemometry to New Possibilities
2018-01-01
Information on wind gusts is needed for assessment of wind-induced damage and risks to safety. The measurement of wind gust speed requires a high temporal resolution of the anemometer system, because the gust is defined as a short-duration (seconds) maximum of the fluctuating wind speed. Until the digitalization of wind measurements in the 1990s, the wind gust measurements suffered from limited recording and data processing resources. Therefore, the majority of continuous wind gust records date back at most only by 30 years. Although the response characteristics of anemometer systems are good enough today, the traditional measurement techniques at weather stations based on cup and sonic anemometers are limited to heights and regions where the supporting structures can reach. Therefore, existing measurements are mainly concentrated over densely-populated land areas, whereas from remote locations, such as the marine Arctic, wind gust information is available only from sparse coastal locations. Recent developments of wind gust measurement techniques based on turbulence measurements from research aircraft and from Doppler lidar can potentially provide new information from heights and locations unreachable by traditional measurement techniques. Moreover, fast-developing measurement methods based on Unmanned Aircraft Systems (UASs) may add to better coverage of wind gust measurements in the future. In this paper, we provide an overview of the history and the current status of anemometry from the perspective of wind gusts. Furthermore, a discussion on the potential future directions of wind gust measurement techniques is provided. PMID:29690647
Unbundling in Current Broadband and Next-Generation Ultra-Broadband Access Networks
NASA Astrophysics Data System (ADS)
Gaudino, Roberto; Giuliano, Romeo; Mazzenga, Franco; Valcarenghi, Luca; Vatalaro, Francesco
2014-05-01
This article overviews the methods that are currently under investigation for implementing multi-operator open-access/shared-access techniques in next-generation access ultra-broadband architectures, starting from the traditional "unbundling-of-the-local-loop" techniques implemented in legacy twisted-pair digital subscriber line access networks. A straightforward replication of these copper-based unbundling-of-the-local-loop techniques is usually not feasible on next-generation access networks, including fiber-to-the-home point-to-multipoint passive optical networks. To investigate this issue, the article first gives a concise description of traditional copper-based unbundling-of-the-local-loop solutions, then focuses on both next-generation access hybrid fiber-copper digital subscriber line fiber-to-the-cabinet scenarios and on fiber to the home, by accounting for the mix of regulatory and technological reasons driving the next-generation access migration path, focusing mostly on the European situation.
Density-matrix-based algorithm for solving eigenvalue problems
NASA Astrophysics Data System (ADS)
Polizzi, Eric
2009-03-01
A fast and stable numerical algorithm for solving the symmetric eigenvalue problem is presented. The technique deviates fundamentally from the traditional Krylov subspace iteration based techniques (Arnoldi and Lanczos algorithms) or other Davidson-Jacobi techniques and takes its inspiration from the contour integration and density-matrix representation in quantum mechanics. It will be shown that this algorithm—named FEAST—exhibits high efficiency, robustness, accuracy, and scalability on parallel architectures. Examples from electronic structure calculations of carbon nanotubes are presented, and numerical performances and capabilities are discussed.
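The contour-integration idea behind FEAST can be sketched as follows: a quadrature of the resolvent along a contour enclosing the search interval approximates the spectral projector (the density matrix), and Rayleigh-Ritz on the filtered subspace extracts the enclosed eigenpairs. The quadrature rule, subspace size, and iteration count below are illustrative choices for a toy dense solver, not the published FEAST algorithm or library.

```python
import numpy as np

def feast_like(A, emin, emax, m0=8, n_quad=8, iters=3, seed=0):
    """Toy contour-integration eigensolver in the spirit of FEAST.

    Approximates the spectral projector onto the eigenspace of the real
    symmetric matrix A for eigenvalues in (emin, emax) by quadrature of
    the resolvent on a circular contour, then applies Rayleigh-Ritz.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Y = rng.standard_normal((n, m0))          # random initial subspace
    c, r = 0.5 * (emin + emax), 0.5 * (emax - emin)
    for _ in range(iters):
        Q = np.zeros((n, m0))
        for k in range(n_quad):
            # midpoint quadrature on the upper half circle; the lower
            # half is the complex conjugate for real symmetric A
            theta = np.pi * (k + 0.5) / n_quad
            z = c + r * np.exp(1j * theta)
            Q += np.real(r * np.exp(1j * theta)
                         * np.linalg.solve(z * np.eye(n) - A, Y)) / n_quad
        Q, _ = np.linalg.qr(Q)                # orthonormalize filtered basis
        lam, S = np.linalg.eigh(Q.T @ A @ Q)  # Rayleigh-Ritz step
        Y = Q @ S
    inside = (lam > emin) & (lam < emax)
    return lam[inside], Y[:, inside]
```

Each pass applies a rational filter that is close to 1 on (emin, emax) and small outside, so a few subspace iterations isolate the wanted eigenpairs; production implementations solve the shifted linear systems in parallel across quadrature nodes.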
Emotional Design Tutoring System Based on Multimodal Affective Computing Techniques
ERIC Educational Resources Information Center
Wang, Cheng-Hung; Lin, Hao-Chiang Koong
2018-01-01
In a traditional class, the role of the teacher is to teach and that of the students is to learn. However, the constant and rapid technological advancements have transformed education in numerous ways. For instance, in addition to traditional, face to face teaching, E-learning is now possible. Nevertheless, face to face teaching is unavailable in…
Point-of-care ultrasound (POCUS): unnecessary gadgetry or evidence-based medicine?
Smallwood, Nicholas; Dachsel, Martin
2018-06-01
Over the last decade there has been increasing interest and enthusiasm in point-of-care ultrasound (POCUS) as an aide to traditional examination techniques in assessing acutely unwell adult patients. However, it currently remains the domain of a relatively small handful of physicians within the UK. There are numerous reasons for this, notably a lack of training pathways and supervisors but also a lack of understanding of the evidence base behind this imaging modality. This review article aims to explore some of the evidence base behind POCUS for a number of medical pathologies, and where possible compare it to evidenced traditional examination techniques. We discuss the issues around training in bedside ultrasound and recommend a push to integrate POCUS training into internal medicine curricula and support trainers to comprehensively deliver this. © Royal College of Physicians 2018. All rights reserved.
Symmetric Phase Only Filtering for Improved DPIV Data Processing
NASA Technical Reports Server (NTRS)
Wernet, Mark P.
2006-01-01
The standard approach in Digital Particle Image Velocimetry (DPIV) data processing is to use Fast Fourier Transforms to obtain the cross-correlation of two single exposure subregions, where the location of the cross-correlation peak is representative of the most probable particle displacement across the subregion. This standard DPIV processing technique is analogous to Matched Spatial Filtering, a technique commonly used in optical correlators to perform the crosscorrelation operation. Phase only filtering is a well known variation of Matched Spatial Filtering, which when used to process DPIV image data yields correlation peaks which are narrower and up to an order of magnitude larger than those obtained using traditional DPIV processing. In addition to possessing desirable correlation plane features, phase only filters also provide superior performance in the presence of DC noise in the correlation subregion. When DPIV image subregions contaminated with surface flare light or high background noise levels are processed using phase only filters, the correlation peak pertaining only to the particle displacement is readily detected above any signal stemming from the DC objects. Tedious image masking or background image subtraction are not required. Both theoretical and experimental analyses of the signal-to-noise ratio performance of the filter functions are presented. In addition, a new Symmetric Phase Only Filtering (SPOF) technique, which is a variation on the traditional phase only filtering technique, is described and demonstrated. The SPOF technique exceeds the performance of the traditionally accepted phase only filtering techniques and is easily implemented in standard DPIV FFT based correlation processing with no significant computational performance penalty. An "Automatic" SPOF algorithm is presented which determines when the SPOF is able to provide better signal to noise results than traditional PIV processing. 
The SPOF-based optical correlation processing approach is presented as a new paradigm for more robust cross-correlation processing of low signal-to-noise ratio DPIV image data.
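The generic phase-only correlation idea described above can be sketched in a few lines: whitening the cross-spectrum discards magnitude content (including DC objects such as flare light) and keeps only the phase, which encodes the displacement. This illustrates plain phase-only filtering on a synthetic shifted subregion, not the specific SPOF variant of the paper.

```python
import numpy as np

def phase_only_shift(a, b, eps=1e-12):
    """Estimate the displacement between two image subregions by
    phase-only correlation: the cross-spectrum is normalized to unit
    magnitude so only phase (i.e., displacement) information remains,
    and the correlation peak marks the most probable shift."""
    cross = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    corr = np.fft.ifft2(cross / (np.abs(cross) + eps)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates to signed shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))
```

For a pair where `b` is `a` circularly shifted by (3, -5) pixels, the function recovers (3, -5); in DPIV data the peak location corresponds to the most probable particle displacement across the subregion.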
USDA-ARS's Scientific Manuscript database
Ambient desorption ionization techniques, such as laser desorption with electrospray ionization assistance (ELDI), direct analysis in real time (DART) and desorption electrospray ionization (DESI) have been developed as alternatives to traditional mass spectrometric-based methods. Such techniques al...
[Construction of multiple drug release system based on components of traditional Chinese medicine].
Liu, Dan; Jia, Xiaobin; Yu, Danhong; Zhang, Zhenhai; Sun, E
2012-08-01
With the development of the modernization drive of traditional Chinese medicine (TCM) preparations, research on new-type TCM dosage forms has become a hot spot in the field. Because of the complexity of TCM components and the uncertainty of their material base, there is still no scientific system for modern TCM dosage forms. Modern TCM preparations must inevitably take into account the multi-component nature of TCM and its general functional characteristics of multiple links and multiple targets. The author suggests building a multiple drug release system for TCM using diverse preparation techniques and drug release methods at multiple levels, on the basis of the nature and functional characteristics of TCM components. This essay elaborates the ideas behind building such a multiple TCM release system, its theoretical basis, preparation techniques and assessment system, and current problems and solutions, with a view to enhancing the bioavailability of TCM components and providing a new form for TCM preparations.
NASA Astrophysics Data System (ADS)
Li, Ning; Wang, Yan; Xu, Kexin
2006-08-01
Combined with Fourier transform infrared (FTIR) spectroscopy and three kinds of pattern recognition techniques, 53 traditional Chinese medicine danshen samples were rapidly discriminated according to geographical origins. The results showed that it was feasible to discriminate using FTIR spectroscopy ascertained by principal component analysis (PCA). An effective model was built by employing the Soft Independent Modeling of Class Analogy (SIMCA) and PCA, and 82% of the samples were discriminated correctly. Through use of the artificial neural network (ANN)-based back propagation (BP) network, the origins of danshen were completely classified.
Thunder among the pines: defining a pan-Asian soma.
Dannaway, Frederick
2009-03-01
Many ancient cultures and religions engaged in various techniques and used various substances to instigate religious experience and to alter perception. These techniques of psycho-sexual drug yoga reached an unparalleled level of sophistication that arose and was often cloaked in practical terms of alchemy and metallurgy. The Vedic tradition describes this plant-based ritualism as soma, which has been identified by Gordon Wasson as the mushroom Amanita muscaria. This article traces these soma-influenced sects of esoteric Buddhism that exerted influences from India, China and Tibet to Japan. Some of the key components, practices and symbolism are retained despite numerous cultural filters. Japan's tradition of esoteric Buddhism can thus be seen to have preserved and incorporated the soma/amrita mushroom lore into its own traditions of mountain ascetic mystics.
Tunable Infrared Metasurface on a Soft Polymer Scaffold.
Reeves, Jeremy B; Jayne, Rachael K; Stark, Thomas J; Barrett, Lawrence K; White, Alice E; Bishop, David J
2018-05-09
The fabrication of metallic electromagnetic meta-atoms on a soft microstructured polymer scaffold using a MEMS-based stencil lithography technique is demonstrated. Using this technique, complex metasurfaces that are generally impossible to fabricate with traditional photolithographic techniques are created. By engineering the mechanical deformation of the polymer scaffold, the metasurface reflectivity in the mid-infrared can be tuned by the application of moderate strains.
A voxel-based approach to gray matter asymmetries.
Luders, E; Gaser, C; Jancke, L; Schlaug, G
2004-06-01
Voxel-based morphometry (VBM) was used to analyze gray matter (GM) asymmetries in a large sample (n = 60) of male and female professional musicians with and without absolute pitch (AP). We chose to examine these particular groups because previous studies using traditional region-of-interest (ROI) analyses have shown differences in hemispheric asymmetry related to AP and gender. Voxel-based methods may have advantages over traditional ROI-based methods since the analysis can be performed across the whole brain with minimal user bias. After determining that the VBM method was sufficiently sensitive for the detection of differences in GM asymmetries between groups, we found that male AP musicians were more leftward lateralized in the anterior region of the planum temporale (PT) than male non-AP musicians. This confirmed the results of previous studies using ROI-based methods that showed an association between PT asymmetry and the AP phenotype. We further observed that male non-AP musicians revealed an increased leftward GM asymmetry in the postcentral gyrus compared to female non-AP musicians, again corroborating results of a previously published study using ROI-based methods. By analyzing hemispheric GM differences across our entire sample, we were able to partially confirm findings of previous studies using traditional morphometric techniques, as well as more recent, voxel-based analyses. In addition, we found some unusually pronounced GM asymmetries in our musician sample not previously detected in subjects unselected for musical training. Since we were able to validate gender- and AP-related brain asymmetries previously described using traditional ROI-based morphometric techniques, the results of our analyses support the use of VBM for examinations of GM asymmetries.
NASA Astrophysics Data System (ADS)
Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin
2015-10-01
The performance of kernel-based techniques depends on the selection of kernel parameters; suitable parameter selection is therefore an important problem for many kernel-based techniques. This article presents a novel technique to learn the kernel parameters of a kernel Fukunaga-Koontz Transform based (KFKT) classifier. The proposed approach determines appropriate values of the kernel parameters by optimizing an objective function constructed from the discrimination ability of KFKT. For this purpose we utilize the differential evolution algorithm (DEA). The new technique overcomes some disadvantages of the traditional cross-validation method, such as high time consumption, and it can be applied to any type of data. Experiments on target detection applications with hyperspectral images verify the effectiveness of the proposed method.
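The general scheme of learning kernel parameters by differential evolution can be sketched as below. The discrimination score here is a simple within- versus between-class kernel-similarity criterion standing in for the paper's KFKT-based objective (which is not reproduced), and the two-class Gaussian data are synthetic.

```python
import numpy as np
from scipy.optimize import differential_evolution

def rbf(X, Y, gamma):
    """RBF (Gaussian) kernel matrix between row-sample sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def discrimination(gamma, X0, X1):
    """Separability score to minimize: reward high within-class kernel
    similarity and low between-class similarity (a simple stand-in for
    a KFKT-style discrimination criterion)."""
    within = rbf(X0, X0, gamma).mean() + rbf(X1, X1, gamma).mean()
    between = rbf(X0, X1, gamma).mean()
    return between - within / 2

rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 0.5, (40, 2))        # class 0 (background) samples
X1 = rng.normal(2.0, 0.5, (40, 2))        # class 1 (target) samples

# DEA searches the kernel-parameter space globally, with no
# cross-validation loop required
res = differential_evolution(discrimination, bounds=[(1e-3, 10.0)],
                             args=(X0, X1), seed=1, tol=1e-6)
gamma_opt = res.x[0]
```

The same pattern extends to multiple kernel parameters by widening `bounds`; the key design point is that the objective is evaluated directly on training data rather than via repeated cross-validation folds.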
Murdoch, B E; Pitt, G; Theodoros, D G; Ward, E C
1999-01-01
The efficacy of traditional and physiological biofeedback methods for modifying abnormal speech breathing patterns was investigated in a child with persistent dysarthria following severe traumatic brain injury (TBI). An A-B-A-B single-subject experimental research design was utilized to provide the subject with two exclusive periods of therapy for speech breathing, based on traditional therapy techniques and physiological biofeedback methods, respectively. Traditional therapy techniques included establishing optimal posture for speech breathing, explanation of the movement of the respiratory muscles, and a hierarchy of non-speech and speech tasks focusing on establishing an appropriate level of sub-glottal air pressure, and improving the subject's control of inhalation and exhalation. The biofeedback phase of therapy utilized variable inductance plethysmography (or Respitrace) to provide real-time, continuous visual biofeedback of ribcage circumference during breathing. As in traditional therapy, a hierarchy of non-speech and speech tasks were devised to improve the subject's control of his respiratory pattern. Throughout the project, the subject's respiratory support for speech was assessed both instrumentally and perceptually. Instrumental assessment included kinematic and spirometric measures, and perceptual assessment included the Frenchay Dysarthria Assessment, Assessment of Intelligibility of Dysarthric Speech, and analysis of a speech sample. The results of the study demonstrated that real-time continuous visual biofeedback techniques for modifying speech breathing patterns were not only effective, but superior to the traditional therapy techniques for modifying abnormal speech breathing patterns in a child with persistent dysarthria following severe TBI. These results show that physiological biofeedback techniques are potentially useful clinical tools for the remediation of speech breathing impairment in the paediatric dysarthric population.
An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques
ERIC Educational Resources Information Center
de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.
2017-01-01
This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…
Alternative evaluation of innovations’ effectiveness in mechanical engineering
NASA Astrophysics Data System (ADS)
Puryaev, A. S.
2017-09-01
The aim of the present work is the approbation of the developed technique for assessing innovations' effectiveness. We demonstrate an alternative assessment of the effectiveness of innovations (innovation projects) in mechanical engineering on an illustrative example. As an alternative to the traditional method, a technique based on the value concept and the cash-flow method is proposed.
Integrating Traditional Learning and Games on Large Displays: An Experimental Study
ERIC Educational Resources Information Center
Ardito, Carmelo; Lanzilotti, Rosa; Costabile, Maria F.; Desolda, Giuseppe
2013-01-01
Current information and communication technology (ICT) has the potential to bring further changes to education. New learning techniques must be identified to take advantage of recent technological tools, such as smartphones, multimodal interfaces, multi-touch displays, etc. Game-based techniques that capitalize on ICT have proven to be very…
NASA Astrophysics Data System (ADS)
Bae, Albert; Westendorf, Christian; Erlenkamper, Christoph; Galland, Edouard; Franck, Carl; Bodenschatz, Eberhard; Beta, Carsten
2010-03-01
Eukaryotic cell flattening is valuable for improving microscopic observations, ranging from bright field to total internal reflection fluorescence microscopy. In this talk, we will discuss traditional overlay techniques and more modern, microfluidic-based flattening, which provides a greater level of control. We demonstrate these techniques on the social amoeba Dictyostelium discoideum, comparing the advantages and disadvantages of each method.
USDA-ARS's Scientific Manuscript database
Streptococcus iniae is among the major pathogens of a large number of fish species cultured in fresh and marine recirculating and net pen production systems. The traditional plate culture technique to detect and identify S. iniae is time consuming and may be problematic due to phenotypic variations...
Biochemistry and Molecular Biology Techniques for Person Characterization
ERIC Educational Resources Information Center
Herrero, Salvador; Ivorra, Jose Luis; Garcia-Sogo, Magdalena; Martinez-Cortina, Carmen
2008-01-01
Using the traditional serological tests and the most novel techniques for DNA fingerprinting, forensic scientists scan different traits that vary from person to person and use the data to include or exclude suspects based on matching with the evidence obtained in a criminal case. Although the forensic application of these methods is well known,…
Visualization techniques for tongue analysis in traditional Chinese medicine
NASA Astrophysics Data System (ADS)
Pham, Binh L.; Cai, Yang
2004-05-01
Visual inspection of the tongue has been an important diagnostic method of Traditional Chinese Medicine (TCM). Clinic data have shown significant connections between various viscera cancers and abnormalities in the tongue and the tongue coating. Visual inspection of the tongue is simple and inexpensive, but the current practice in TCM is mainly experience-based and the quality of the visual inspection varies between individuals. The computerized inspection method provides quantitative models to evaluate color, texture and surface features on the tongue. In this paper, we investigate visualization techniques and processes to allow interactive data analysis with the aim to merge computerized measurements with human expert's diagnostic variables based on five-scale diagnostic conditions: Healthy (H), History Cancers (HC), History of Polyps (HP), Polyps (P) and Colon Cancer (C).
Lee, Yi-Hsuan; von Davier, Alina A
2013-07-01
Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
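One standard quality control chart suited to this kind of continuous score monitoring is the tabular CUSUM, which accumulates small deviations from a target scale score and signals an abrupt or drifting shift. The sketch below is a generic illustration of the chart type; the paper's specific charts, models, and time series techniques are not reproduced here.

```python
def cusum(scores, target, sigma, k=0.5, h=4.0):
    """Tabular CUSUM for monitoring mean scale scores across test
    administrations. k (reference value) and h (decision interval)
    are expressed in standard-deviation units; returns the index of
    the first out-of-control administration, or None if the series
    never signals."""
    hi = lo = 0.0
    for i, s in enumerate(scores):
        z = (s - target) / sigma
        hi = max(0.0, hi + z - k)   # accumulates upward drift
        lo = max(0.0, lo - z - k)   # accumulates downward drift
        if hi > h or lo > h:
            return i
    return None
```

With the default `k` and `h`, a sustained two-sigma upward shift starting at administration 30 is flagged within a few administrations, while a stable series produces no signal; autocorrelation adjustment, as the abstract notes, would require modeling the series before charting.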
Green-noise halftoning with dot diffusion
NASA Astrophysics Data System (ADS)
Lippens, Stefaan; Philips, Wilfried
2007-02-01
Dot diffusion is a halftoning technique that is based on the traditional error diffusion concept, but offers a high degree of parallel processing by its block based approach. Traditional dot diffusion however suffers from periodicity artifacts. To limit the visibility of these artifacts, we propose grid diffusion, which applies different class matrices for different blocks. Furthermore, in this paper we will discuss two approaches in the dot diffusion framework to generate green-noise halftone patterns. The first approach is based on output dependent feedback (hysteresis), analogous to the standard green-noise error diffusion techniques. We observe that the resulting halftones are rather coarse and highly dependent on the used dot diffusion class matrices. In the second approach we don't limit the diffusion to the nearest neighbors. This leads to less coarse halftones, compared to the first approach. The drawback is that it can only cope with rather limited cluster sizes. We can reduce these drawbacks by combining the two approaches.
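The block-based parallelism of dot diffusion can be sketched as follows: a class matrix fixes the processing order inside every block, and each pixel's quantization error is diffused only to neighbors with a higher class number (not yet processed), so all blocks can run the same class step concurrently. The class matrix below is an illustrative ordering, not Knuth's original matrix, and the sketch implements plain dot diffusion rather than the paper's grid-diffusion or green-noise variants.

```python
import numpy as np

# 4x4 class matrix: the processing order of pixels within each block
# (an illustrative ordering, not Knuth's original class matrix)
CLASS = np.array([[ 0,  8,  2, 10],
                  [12,  4, 14,  6],
                  [ 3, 11,  1,  9],
                  [15,  7, 13,  5]])

def dot_diffuse(img):
    """Minimal dot-diffusion halftoning of a grayscale image in [0, 1].
    Assumes image dimensions are multiples of the class-matrix size."""
    h, w = img.shape
    f = img.astype(float).copy()
    out = np.zeros((h, w))
    n = CLASS.shape[0]
    for step in range(n * n):
        cy, cx = [int(v[0]) for v in np.where(CLASS == step)]
        for by in range(0, h, n):       # every block does the same step,
            for bx in range(0, w, n):   # hence the parallel potential
                y, x = by + cy, bx + cx
                out[y, x] = 1.0 if f[y, x] >= 0.5 else 0.0
                err = f[y, x] - out[y, x]
                # diffuse error only to 8-neighbors processed later
                nbrs = [(y + dy, x + dx)
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                        if (dy or dx)
                        and 0 <= y + dy < h and 0 <= x + dx < w
                        and CLASS[(y + dy) % n, (x + dx) % n] > step]
                if nbrs:                # last class has no later neighbors
                    for ny, nx in nbrs:
                        f[ny, nx] += err / len(nbrs)
    return out
```

The periodicity artifacts discussed in the abstract stem from every block sharing this one class matrix; grid diffusion varies the class matrix per block to break that regularity.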
Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George
2017-06-26
We propose a fully automatic technique to obtain aberration free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. The traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This would be a drawback in real time implementation and for dynamic processes such as cell migration phenomena. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion. However, it corrects spherical/elliptical aberration only and disregards the higher order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations. Ideally, automatic image segmentation techniques make real time measurement possible. However, existing automatic unsupervised segmentation techniques have poor performance when applied to DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique with a convolutional neural network (CNN) and Zernike polynomial fitting (ZPF). The deep learning CNN is implemented to perform automatic background region detection that allows for ZPF to compute the self-conjugated phase to compensate for most aberrations.
Modeling corneal surfaces with rational functions for high-speed videokeratoscopy data compression.
Schneider, Martin; Iskander, D Robert; Collins, Michael J
2009-02-01
High-speed videokeratoscopy is an emerging technique that enables study of the corneal surface and tear-film dynamics. Unlike its static predecessor, this new technique results in a very large amount of digital data for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to parsimoniously fit corneal surface data with a minimum number of coefficients. Since the Zernike polynomial functions that have been traditionally used for modeling corneal surfaces may not necessarily correctly represent given corneal surface data in terms of its optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the rms surface error as well as the point spread function cross-correlation. The parameters of approximations were estimated using a nonlinear least-squares procedure based on the Levenberg-Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform the traditional Zernike polynomial approximations with the same number of coefficients.
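The rational-function fitting machinery can be sketched as below: the model is a ratio of two linear-in-coefficient expansions with the denominator's constant term fixed to 1 for identifiability, estimated by Levenberg-Marquardt nonlinear least squares. A simple 1-D radial polynomial basis and synthetic noisy data stand in for the Zernike polynomials over the pupil and the videokeratoscopic measurements used in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def basis(r, n):
    """Stand-in radial polynomial basis 1, r, r^2, ... (the paper uses
    Zernike polynomials; the fitting machinery is the same)."""
    return np.vstack([r ** k for k in range(n)]).T

def rational_resid(theta, r, z, n_num, n_den):
    """Residual of num(r)/den(r) against surface data z; the
    denominator's constant term is fixed to 1 for identifiability."""
    p = theta[:n_num]
    q = np.concatenate(([0.0], theta[n_num:]))   # denominator starts at r^1
    return basis(r, n_num) @ p / (1.0 + basis(r, n_den) @ q) - z

# synthetic "corneal profile": a rational shape plus measurement noise
rng = np.random.default_rng(0)
r = np.linspace(0.0, 1.0, 200)
z = (1.0 + 0.5 * r - 0.3 * r ** 2) / (1.0 + 0.4 * r ** 2)
z = z + rng.normal(0.0, 1e-3, r.size)

n_num, n_den = 3, 3
theta0 = np.zeros(n_num + n_den - 1)
theta0[0] = 1.0                                   # flat initial guess
fit = least_squares(rational_resid, theta0, method='lm',
                    args=(r, z, n_num, n_den))    # Levenberg-Marquardt
rms = np.sqrt(np.mean(fit.fun ** 2))              # rms surface error
```

The compression argument is that the fitted coefficient vector `fit.x` (five numbers here) replaces the full sampled surface, with `rms` quantifying the reconstruction fidelity; the paper additionally scores fits by point-spread-function cross-correlation.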
COMPUTERIZED RISK AND BIOACCUMULATION SYSTEM (VERSION 1.0)
CRABS is a combination of a rule-based expert system and more traditional procedural programming techniques. Rule-based expert systems attempt to emulate the decision-making process of human experts within a clearly defined subject area. Expert systems consist of an "inference engi...
Performance comparison of optical interference cancellation system architectures.
Lu, Maddie; Chang, Matt; Deng, Yanhua; Prucnal, Paul R
2013-04-10
The performance of three optics-based interference cancellation systems is compared and contrasted, both among the systems themselves and with traditional electronic techniques for interference cancellation. The comparison is based on a set of common performance metrics that we have developed for this purpose. It is shown that a thorough evaluation of our optical approaches takes into account the traditional notions of depth of cancellation and dynamic range, along with notions of link loss and uniformity of cancellation. Our evaluation shows that our use of optical components affords performance that surpasses traditional electronic approaches, and that the optimal choice of an optical interference canceller requires taking into account the performance metrics discussed in this paper.
Contrasting faith-based and traditional substance abuse treatment programs.
Neff, James Alan; Shorkey, Clayton T; Windsor, Liliane Cambraia
2006-01-01
This article (a) discusses the definition of faith-based substance abuse treatment programs, (b) juxtaposes Durkheim's theory of religion with a treatment process model to highlight key dimensions of faith-based and traditional programs, and (c) presents results from a study of seven programs to identify key program dimensions and differences/similarities between program types. Focus group/Concept Mapping techniques yielded a clear "spiritual activities, beliefs, and rituals" dimension, rated as significantly more important to faith-based programs. Faith-based program staff also rated "structure and discipline" as more important and "work readiness" as less important. No differences were found for "group activities/cohesion," "role modeling/mentoring," "safe, supportive environment," and "traditional treatment modalities." Programs showed substantial similarities with regard to core social processes of treatment such as mentoring, role modeling, and social cohesion. Implications are considered for further research on treatment engagement, retention, and other outcomes.
Efficacy of problem based learning in a high school science classroom
NASA Astrophysics Data System (ADS)
Rissi, James Ryan
At the high school level, the maturity of the students, as well as the constraints of the traditional high school (in terms of both class time and number of students), impedes the use of problem-based instruction. But with more coaching, guidance, and planning, problem-based learning may be an effective teaching technique with secondary students. In recent years, the State of Michigan High School Content Expectations have emphasized the importance of inquiry and problem solving in the high school science classroom. In order to help students gain inquiry and problem-solving skills, a move towards a problem-based curriculum and away from the didactic approach may lead to favorable results. In this study, the problem-based learning framework was implemented in a high school Anatomy and Physiology classroom. Using pre-tests and post-tests over the material presented with the problem-based technique, student comprehension and long-term retention of the material were monitored. It was found that problem-based learning produced test performance comparable to traditional lecture, note-taking, and enrichment activities. In addition, students showed evidence of gaining research and team-working skills.
Zhuang, Yan; Xie, Bangtie; Weng, Shengxin; Xie, Yanming
2011-10-01
To construct a real-world integrated data warehouse for the re-evaluation of post-marketing traditional Chinese medicine, supporting research on key techniques of clinical re-evaluation: mainly indications of traditional Chinese medicine, dosage and usage, course of treatment, unit medication, combined diseases, and adverse reactions. The warehouse provides data for retrospective research on safety, availability, and economy, and provides a foundation for prospective research. The integrated data warehouse extracts and integrates data from HIS using an information collection system and data warehouse techniques, producing standardized structures and data; further research proceeds from these data. A main data warehouse and several sub-warehouses were built, focused on patients' main records, doctors' orders, disease diagnoses, laboratory results, and economic indicators in hospital. These data warehouses can provide research data for the re-evaluation of post-marketing traditional Chinese medicine and have clinical value. Moreover, they point out the direction for further research.
Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering
NASA Astrophysics Data System (ADS)
Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki
2018-03-01
We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
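The multiple importance sampling idea the abstract describes, combining a stratified sampling method with an importance sampling method, can be illustrated with a one-dimensional analogue. The sketch below estimates an integral on [0, 1] with the balance heuristic, under which every sample contributes f(x) / sum_j n_j p_j(x) regardless of which technique drew it; the integrand, sample counts, and pdfs are illustrative assumptions, not the paper's SSAO kernel.

```python
import math
import random

def mis_estimate(f, n1=512, n2=512, seed=0):
    """Balance-heuristic MIS estimate of the integral of f over [0, 1],
    combining stratified-uniform sampling (p1 = 1) with importance
    sampling from p2(x) = 2x."""
    rng = random.Random(seed)
    p1 = lambda x: 1.0           # uniform pdf on [0, 1]
    p2 = lambda x: 2.0 * x       # importance pdf favoring large x
    total = 0.0
    # technique 1: stratified uniform samples (one per stratum)
    for i in range(n1):
        x = (i + rng.random()) / n1
        total += f(x) / (n1 * p1(x) + n2 * p2(x))
    # technique 2: importance samples; x = sqrt(u) has pdf 2x
    for _ in range(n2):
        x = math.sqrt(rng.random())
        total += f(x) / (n1 * p1(x) + n2 * p2(x))
    return total
```

Writing the balance-heuristic weight w_i = n_i p_i / sum_j n_j p_j directly into the estimator cancels the per-technique pdf, which is both simpler and numerically safe where one pdf vanishes.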
Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong
2014-01-01
Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980
The influence of surface finishing methods on touch-sensitive reactions
NASA Astrophysics Data System (ADS)
Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.
2017-02-01
This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and finishing techniques are appearing. Today's information-oriented society enhances the visual aesthetics of new jewelry forms, decoration techniques (depth and surface), and the synthesis of different materials, all of which reveal a bias towards the positive effects of visual design. The jewelry industry now includes not only traditional techniques but also such improved techniques as computer-assisted design, 3D prototyping, and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface), and a comparative analysis of different approaches to surface finishing. Evaluation of the appearance and effect of jewelry is based on the proposed criteria, and an advanced basis for visual aesthetics is predicated on touch-sensitive responses.
NASA Technical Reports Server (NTRS)
Barnden, John; Srinivas, Kankanahalli
1990-01-01
Symbol manipulation as used in traditional Artificial Intelligence has been criticized by neural net researchers for being excessively inflexible and sequential. On the other hand, the application of neural net techniques to the types of high-level cognitive processing studied in traditional artificial intelligence presents major problems as well. A promising way out of this impasse is to build neural net models that accomplish massively parallel case-based reasoning. Case-based reasoning, which has received much attention recently, is essentially the same as analogy-based reasoning, and avoids many of the problems leveled at traditional artificial intelligence. Further problems are avoided by doing many strands of case-based reasoning in parallel, and by implementing the whole system as a neural net. In addition, such a system provides an approach to some aspects of the problems of noise, uncertainty and novelty in reasoning systems. The current neural net system (Conposit), which performs standard rule-based reasoning, is being modified into a massively parallel case-based reasoning version.
Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A
2015-10-01
Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales.
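The per-sample detection probabilities reported above translate directly into survey effort through the standard occupancy identity P(at least one detection in k samples) = 1 - (1 - p)^k. A minimal sketch, using the best per-trap estimate (0.26) and the lowest per-sample eDNA estimate (0.29) from the abstract purely as illustrative inputs:

```python
import math

def cumulative_detection(p, k):
    """Probability of at least one detection in k independent samples,
    each with per-sample detection probability p."""
    return 1.0 - (1.0 - p) ** k

def samples_needed(p, target=0.95):
    """Smallest number of independent samples giving at least the target
    cumulative detection probability."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))
```

Under the (strong) independence assumption, even the best-case trap probability requires more sampling effort than the worst-case eDNA probability to reach 95% cumulative detection, which is the practical sense in which eDNA sampling is the more sensitive method.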
[Modified Misgav-Ladach technique at a tertiary hospital].
Martínez Ceccopieri, David Alejandro; Barrios Prieto, Ernesto; Martínez Ríos, David
2012-08-01
According to several studies from around the globe, the modified Misgav-Ladach technique simplifies the surgical procedure for cesarean section; reduces operation time, costs, and complications; and optimizes obstetric and perinatal outcomes. The aim was to compare obstetric outcomes between patients operated on using the traditional cesarean section technique and those operated on using the modified Misgav-Ladach technique. The study included 49 patients operated on using the traditional technique and 47 patients operated on using the modified Misgav-Ladach technique. The modified Misgav-Ladach technique was associated with more benefits than the traditional technique: less surgical bleeding, shorter operation time, lower total analgesic doses, fewer rescue analgesic doses, and less need for more than one analgesic drug. The modified Misgav-Ladach surgical technique was associated with better obstetric results than the traditional surgical technique, which concurs with the results reported by other national and international studies.
Consani, Rafael Leonardo Xediek; Domitti, Saide Sarckis; Consani, Simonides
2002-09-01
The pressure of final closure may be released when the flask is removed from the mechanical or pneumatic press and placed in the spring clamp. This release in pressure may result in dimensional changes that distort the denture base. The purpose of this study was to investigate differences between the dimensional stability of standardized simulated denture bases processed by traditional moist heat-polymerization and those processed by use of a new tension system. A metal master die was fabricated to simulate an edentulous maxillary arch without irregularities in the alveolar ridge walls. A silicone mold of this metallic die was prepared, and 40 stone casts were formed from the mold with type III dental stone. The casts were randomly assigned to 4 test groups (A-D) of 10 specimens each. A uniform denture base pattern was made on each stone cast with a 1.5-mm thickness of base-plate wax, measured with a caliper. The patterns were invested for traditional hot water processing. A polymethyl methacrylate dough was prepared and packed for processing. The flasks in groups A and B were closed with the traditional pressure technique and placed in spring clamps after final closure. The flasks in groups C and D were pressed between the metallic plates of the new tension system after the final closure. The group A and C flasks were immediately immersed in the water processing unit at room temperature (25 degrees +/- 2 degrees C). The unit was programmed to raise the temperature to 74 degrees C over 1 hour, and then maintained the temperature at 74 degrees C for 8 hours. The group B and D flasks were bench stored at room temperature (25 degrees +/- 2 degrees C) for 6 hours and were then subjected to the same moist heat polymerization conditions as groups A and C. All processed dentures were bench cooled for 3 hours. 
After recovery from the flasks, the base-cast sets were transversally sectioned into 3 parts (corresponding to 3 zones): (1) distal of the canines, (2) mesial of the first molars, and (3) mesial of the posterior palate. These areas had been previously established and standardized by use of a pattern denture in the sawing device to determine the sections in each base-cast set. Base-cast gaps were measured at 5 predetermined points on each section with an optical micrometer that had a tolerance of 0.001 mm. Collected data were analyzed with analysis of variance and Tukey's test. Denture bases processed with the new tension system exhibited significantly better base adaptation than those processed with traditional acrylic resin packing. Immediately after polymerization (Groups A and C), mean dimensional change values were 0.213 +/- 0.055 mm for the traditional packing technique and 0.173 +/- 0.050 mm for the new tension system. After delayed polymerization (Groups B and D), the values were 0.216 +/- 0.074 mm for the traditional packing technique and 0.164 +/- 0.032 mm for the new tension system. With both techniques, dimensional changes in the posterior palatal zone were greater (conventional = 0.286 +/- 0.038 mm; new system = 0.214 +/- 0.024 mm) than those elsewhere on the base-cast set. Within the limitations of this study, the new tension packing system was associated with decreased dimensional changes in the simulated maxillary denture bases processed with heat polymerization.
A raman microprobe investigation of the molecular architecture of loblolly pine tracheids
James S. Bond; Rajai H. Atalla
1999-01-01
Our understanding of the molecular architecture of intact, native plant cell walls is very limited. Traditional methods of investigation disturb the tissue to varying degrees and conclusions based on these methods may be intimately related to the technique used. A promising new technique to study native-state organization is polarized Raman spectroscopy. In this...
Antimicrobial efficacy of soap and water hand washing versus an alcohol-based hand cleanser.
Holton, Ronald H; Huber, Michaell A; Terezhalmy, Geza T
2009-12-01
The emergence of alcohol-based hand cleansers may represent an alternative to soap and water in the clinical dental setting. In this study, the antimicrobial efficacy of traditional hand washing vs. a unique alcohol-based hand cleanser with persistence was evaluated. Two experienced dentists participated over a 10-day period. On days 1-5, each clinician used an antibacterial liquid soap (Dial, Dial Corporation, Scottsdale, AZ). On days 6-10, an alcohol-based hand cleanser (Triseptin Water Optional, Healthpoint Surgical, Fort Worth, TX) was used. Sampling was performed by a modified glove-juice technique. The results indicate that the alcohol-based hand cleanser dramatically outperforms the traditional hand washing agent in the general dental setting.
A novel data processing technique for image reconstruction of penumbral imaging
NASA Astrophysics Data System (ADS)
Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin
2011-06-01
A CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with other traditional processing techniques for penumbral coded pinhole images, such as Wiener, Lucy-Richardson, and blind deconvolution, this approach is new. In this method, the coded aperture processing method was used, for the first time, independently of the point spread function of the image diagnostic system. In this way, the technical obstacles were overcome that arise in traditional coded pinhole image processing from the uncertainty of the point spread function of the image diagnostic system. Based on this theoretical study, a simulation of penumbral imaging and image reconstruction was carried out and provided fairly good results. In the visible light experiment, a point source of light was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering, and the penumbral image was made with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was used for image reconstruction and provided a fairly good result.
DNA Barcoding for the Identification and Authentication of Animal Species in Traditional Medicine.
Yang, Fan; Ding, Fei; Chen, Hong; He, Mingqi; Zhu, Shixin; Ma, Xin; Jiang, Li; Li, Haifeng
2018-01-01
Animal-based traditional medicine not only plays a significant role in therapeutic practices worldwide but also provides a potential compound library for drug discovery. However, persistent hunting and illegal trade markedly threaten numerous medicinal animal species, and increasing demand further provokes the emergence of various adulterants. As the conventional methods are difficult and time-consuming to detect processed products or identify animal species with similar morphology, developing novel authentication methods for animal-based traditional medicine represents an urgent need. During the last decade, DNA barcoding offers an accurate and efficient strategy that can identify existing species and discover unknown species via analysis of sequence variation in a standardized region of DNA. Recent studies have shown that DNA barcoding as well as minibarcoding and metabarcoding is capable of identifying animal species and discriminating the authentics from the adulterants in various types of traditional medicines, including raw materials, processed products, and complex preparations. These techniques can also be used to detect the unlabelled and threatened animal species in traditional medicine. Here, we review the recent progress of DNA barcoding for the identification and authentication of animal species used in traditional medicine, which provides a reference for quality control and trade supervision of animal-based traditional medicine.
Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali
2018-06-01
Text categorization has been used extensively in recent years to classify plain-text clinical reports. This study employs text categorization techniques for the classification of open narrative forensic autopsy reports. One of the key steps in text classification is document representation. In document representation, a clinical report is transformed into a format that is suitable for classification. The traditional document representation technique for text categorization is the bag-of-words (BoW) technique. In this study, the traditional BoW technique is ineffective in classifying forensic autopsy reports because it merely extracts frequent but not necessarily discriminative features from clinical reports. Moreover, this technique fails to capture word inversion, as well as word-level synonymy and polysemy, when classifying autopsy reports. Hence, the BoW technique suffers from low accuracy and low robustness unless it is improved with contextual and application-specific information. To overcome the aforementioned limitations of the BoW technique, this research aims to develop an effective conceptual graph-based document representation (CGDR) technique to classify 1500 forensic autopsy reports from four (4) manners of death (MoD) and sixteen (16) causes of death (CoD). Term-based and Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT)-based conceptual features were extracted and represented through graphs. These features were then used to train a two-level text classifier. The first-level classifier was responsible for predicting MoD. In addition, the second-level classifier was responsible for predicting CoD using the proposed conceptual graph-based document representation technique. To demonstrate the significance of the proposed technique, its results were compared with those of six (6) state-of-the-art document representation techniques. Lastly, this study compared the effects of one-level classification and two-level classification on the experimental results.
The experimental results indicated that the CGDR technique achieved 12% to 15% improvement in accuracy compared with fully automated document representation baseline techniques. Moreover, two-level classification obtained better results compared with one-level classification. The promising results of the proposed conceptual graph-based document representation technique suggest that pathologists can adopt the proposed system as their basis for second opinion, thereby supporting them in effectively determining CoD. Copyright © 2018 Elsevier Inc. All rights reserved.
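For contrast with the conceptual-graph representation, the baseline BoW representation the study criticizes can be sketched in a few lines. Whitespace tokenization and the sample phrases are simplifying assumptions; the point is that each report becomes a vector of term counts over a shared vocabulary, losing word order, synonymy, and polysemy.

```python
from collections import Counter

def bag_of_words(docs):
    """Minimal bag-of-words representation: map each document to a vector
    of term counts over the shared, sorted vocabulary."""
    vocab = sorted({w for d in docs for w in d.lower().split()})
    counts_per_doc = [Counter(d.lower().split()) for d in docs]
    vectors = [[counts.get(w, 0) for w in vocab] for counts in counts_per_doc]
    return vocab, vectors
```

Two reports that use different words for the same finding (word-level synonymy) produce orthogonal vectors here, which is exactly the failure mode the CGDR technique addresses by mapping terms to shared SNOMED CT concepts.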
Pre-Nursing Students Perceptions of Traditional and Inquiry Based Chemistry Laboratories
NASA Astrophysics Data System (ADS)
Rogers, Jessica
This paper describes a process that attempted to meet the needs of undergraduate students in a pre-nursing chemistry class. The laboratory was taught in the traditional verification style, and students were surveyed to assess their perceptions of the educational goals of the laboratory. A literature review resulted in an inquiry-based method, and an analysis of the needs of nurses resulted in more application-based activities. This new inquiry format was implemented the next semester; the students were surveyed at the end of the semester, and the results were compared to the previous method. Student and instructor response to the change in format was positive. Students in the traditional format placed goals concerning technique above critical thinking and felt the lab was easy to understand and carry out. Students in the inquiry-based lab felt they learned more critical thinking skills and enjoyed the independence of designing experiments and answering their own questions.
Remote sensing as a source of land cover information utilized in the universal soil loss equation
NASA Technical Reports Server (NTRS)
Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.; Scarpace, F. L.
1979-01-01
In this study, methods for gathering the land use/land cover information required by the USLE were investigated with medium altitude, multi-date color and color infrared 70-mm positive transparencies using human and computer-based interpretation techniques. Successful results, which compare favorably with traditional field study methods, were obtained within the test site watershed with airphoto data sources and human airphoto interpretation techniques. Computer-based interpretation techniques were not capable of identifying soil conservation practices but were successful to varying degrees in gathering other types of desired land use/land cover information.
NASA Technical Reports Server (NTRS)
Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.
1984-01-01
Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data. The two techniques are Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that the two techniques have potential for providing map products which can be incorporated into existing inventory procedures and automated alternatives to traditional inventory techniques and those which currently employ Landsat imagery.
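Band Ratio Thresholding as described is essentially a per-pixel ratio test between two spectral bands. A minimal sketch follows; the band pairing and threshold value are hypothetical illustrations, not values taken from the study.

```python
def band_ratio_threshold(band_a, band_b, threshold):
    """Classify each pixel by thresholding the ratio of two spectral bands.
    Pixels with band_a / band_b >= threshold are flagged True (e.g. the
    target land cover class); zero-valued denominators are flagged False."""
    mask = []
    for a, b in zip(band_a, band_b):
        ratio = a / b if b else 0.0
        mask.append(ratio >= threshold)
    return mask
```

In the semi-automated workflow the abstract describes, an analyst would inspect the histogram of ratios to pick the threshold, which is the manual step the layered LCLS classifier automates away.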
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
NASA Astrophysics Data System (ADS)
Azrina Talik, Noor; Boon Kar, Yap; Noradhlia Mohamad Tukijan, Siti; Wong, Chuan Ling
2017-10-01
To date, state-of-the-art organic semiconductor distributed feedback (DFB) lasers have gained tremendous interest in the organic device industry. This paper presents a short review of the fabrication techniques of DFB-based lasers, focusing on the fabrication of the DFB corrugated structure and the deposition of the organic gain medium on the nano-patterned DFB resonator. Fabrication techniques such as Laser Direct Writing (LDW), ultrafast photo-excitation dynamics, Laser Interference Lithography (LIL), and Nanoimprint Lithography (NIL) for DFB patterning are presented. In addition, the gain medium deposition method is discussed. The technical procedures of the stated fabrication techniques are summarized together with their benefits and comparisons to traditional fabrication techniques.
Applying Case-Based Reasoning in Knowledge Management to Support Organizational Performance
ERIC Educational Resources Information Center
Wang, Feng-Kwei
2006-01-01
Research and practice in human performance technology (HPT) has recently accelerated the search for innovative approaches to supplement or replace traditional training interventions for improving organizational performance. This article examines a knowledge management framework built upon the theories and techniques of case-based reasoning (CBR)…
NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.
ERIC Educational Resources Information Center
Zhou, Lina; Zhang, Dongsong
2003-01-01
Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…
ORGANIZATIONAL RISK COMMUNICATION
Risk communication tools in organizations differ in several ways from many of the tools and techniques developed for public meetings. The traditional view of risk communication seeks to manage the public outrage associated with site-based issues. Organizational risk communication seek...
Hospital positioning: a strategic tool for the 1990s.
San Augustine, A J; Long, W J; Pantzallis, J
1992-03-01
The authors extend the process of market positioning in the health care sector by focusing on the simultaneous use of traditional research methods and emerging computer-based adaptive perceptual mapping technologies and techniques.
Analysis of preparation of Chinese traditional medicine based on the fiber fingerprint drop trace
NASA Astrophysics Data System (ADS)
Zhang, Zhilin; Wang, Jialu; Sun, Weimin; Yan, Qi
2010-11-01
The purpose of the fiber micro-drop analyzing technique is to measure the characteristics of liquids using optical methods. The fiber fingerprint drop trace (FFDT) is a curve of light intensity vs. time that indicates the forming, growing and dripping processes of liquid drops. A pair of fibers was used to monitor the dripping process, and the FFDTs were acquired and analyzed by a computer. Liquid samples of many kinds of traditional Chinese medicine preparations were tested using the fiber micro-drop sensor in the experiments. The FFDTs of preparations with different concentrations were analyzed in different ways. Based on the characteristics of the FFDTs, a novel method is proposed to identify different traditional Chinese medicine preparations and their concentrations from the correspondence between the FFDTs and the physical and chemical parameters of the liquids.
ERIC Educational Resources Information Center
Garrett, Michael Tlanusta; Brubaker, Michael; Torres-Rivera, Edil; West-Olatunji, Cirecie; Conwill, William L.
2008-01-01
This article provides group counselors a description of Ayeli, a culturally-based centering technique rooted in Native American traditions. Ayeli is a process that allows participants an opportunity to experience and reflect on four crucial elements relevant to wellness from a Native American perspective: belonging, mastery, independence, and…
Multigrid Strategies for Viscous Flow Solvers on Anisotropic Unstructured Meshes
NASA Technical Reports Server (NTRS)
Mavriplis, Dimitri J.
1998-01-01
Unstructured multigrid techniques for relieving the stiffness associated with high-Reynolds number viscous flow simulations on extremely stretched grids are investigated. One approach consists of employing a semi-coarsening or directional-coarsening technique, based on the directions of strong coupling within the mesh, in order to construct more optimal coarse grid levels. An alternate approach is developed which employs directional implicit smoothing with regular fully coarsened multigrid levels. The directional implicit smoothing is obtained by constructing implicit lines in the unstructured mesh based on the directions of strong coupling. Both approaches yield large increases in convergence rates over the traditional explicit full-coarsening multigrid algorithm. However, maximum benefits are achieved by combining the two approaches in a coupled manner into a single algorithm. An order of magnitude increase in convergence rate over the traditional explicit full-coarsening algorithm is demonstrated, and convergence rates for high-Reynolds number viscous flows which are independent of the grid aspect ratio are obtained. Further acceleration is provided by incorporating low-Mach-number preconditioning techniques, and a Newton-GMRES strategy which employs the multigrid scheme as a preconditioner. The compounding effects of these various techniques on speed of convergence are documented through several example test cases.
NASA Astrophysics Data System (ADS)
Lyu, Jiang-Tao; Zhou, Chen
2017-12-01
Ionospheric refraction is one of the principal error sources limiting the accuracy of radar systems for space target detection. High-accuracy measurement of the ionospheric electron density along the propagation path of the radar wave is the most important procedure for ionospheric refraction correction. Traditionally, ionospheric models and ionospheric detection instruments, such as ionosondes or GPS receivers, are employed for obtaining the electron density. However, neither method is capable of satisfying the correction accuracy requirements of advanced space target radar systems. In this study, we propose a novel technique for ionospheric refraction correction based on radar dual-frequency detection. Radar target range measurements at two adjacent frequencies are utilized for calculating the electron density integral exactly along the propagation path of the radar wave, which yields an accurate ionospheric range correction. The implementation of radar dual-frequency detection is validated by a P-band radar located in midlatitude China. The experimental results show that this novel technique is more accurate than traditional ionospheric model correction. The technique proposed in this study is very promising for high-accuracy radar detection and tracking of objects in geospace.
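The dual-frequency principle described in this abstract can be sketched numerically: the first-order group refraction delay scales as 40.3·TEC/f², so two range measurements at adjacent frequencies determine both the path-integrated electron density (TEC) and the ionosphere-free range. The frequency pair, range and TEC value below are illustrative assumptions, not values from the paper.

```python
K = 40.3  # first-order ionospheric constant, m^3/s^2

def dual_freq_correction(r1, r2, f1, f2):
    """Given target ranges r1, r2 (m) measured at two adjacent carrier
    frequencies f1, f2 (Hz), return the path-integrated electron
    density (TEC, el/m^2) and the ionosphere-free range (m)."""
    # r_i = r_true + K*TEC/f_i**2  (first-order group refraction delay)
    tec = (r1 - r2) * f1**2 * f2**2 / (K * (f2**2 - f1**2))
    return tec, r1 - K * tec / f1**2

# Synthetic check with an assumed P-band frequency pair and slant TEC.
f1, f2 = 430e6, 450e6                    # Hz (illustrative)
r, tec_true = 1.0e6, 1e17                # true range (m), TEC (el/m^2)
r1 = r + K * tec_true / f1**2            # ~22 m ionospheric delay
r2 = r + K * tec_true / f2**2
tec_est, r_est = dual_freq_correction(r1, r2, f1, f2)
print(tec_est, r_est - r)
```

The algebra is exact for the first-order delay model, so the recovered TEC and corrected range match the synthetic truth to floating-point precision; in practice the accuracy is limited by range measurement noise at the two frequencies.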
A procedure to achieve fine control in MW processing of foods
NASA Astrophysics Data System (ADS)
Cuccurullo, G.; Cinquanta, L.; Sorrentino, G.
2007-01-01
A two-dimensional analytical model for predicting the unsteady temperature field in a cylindrical body affected by spatially varying heat generation is presented. The dimensionless problem is solved analytically using both partial solutions and the variation-of-parameters technique. With industrial microwave heating for food pasteurization in mind, the easy-to-handle solution is used to confirm the intrinsic lack of spatial uniformity of such a treatment in comparison to the traditional one. From an experimental point of view, a batch pasteurization treatment was realized to compare the effect of two different control techniques, both based on IR thermography readout: the former assured classical PID control, while the latter was based on a "shadowing" technique, which consists of covering portions of the sample that are hot enough with a mobile metallic screen. A measure of the effectiveness of the two control techniques was obtained by evaluating the thermal death curves of a strain of Lactobacillus plantarum subjected to pasteurization temperatures. Preliminary results showed meaningful increases in the microwave thermal inactivation of L. plantarum and similarly significant decreases in thermal inactivation time with respect to the traditional pasteurization thermal treatment.
Extending the knowledge in histochemistry and cell biology.
Heupel, Wolfgang-Moritz; Drenckhahn, Detlev
2010-01-01
Central to modern Histochemistry and Cell Biology stands the need for visualization of cellular and molecular processes. In the past several years, a variety of techniques has emerged that bridges traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from a bird's-eye down to a worm's-eye view, focusing on tissues, cells, proteins or, finally, single molecules. Applications of new approaches in combination with well-established traditional techniques of mRNA, DNA or protein analysis have led to enlightening and prudent studies which have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles representing the progress made in "histo-biochemical" techniques and their manifold applications.
Web-Based Interactive 3D Visualization as a Tool for Improved Anatomy Learning
ERIC Educational Resources Information Center
Petersson, Helge; Sinkvist, David; Wang, Chunliang; Smedby, Orjan
2009-01-01
Despite a long tradition, conventional anatomy education based on dissection is declining. This study tested a new virtual reality (VR) technique for anatomy learning based on virtual contrast injection. The aim was to assess whether students value this new three-dimensional (3D) visualization method as a learning tool and what value they gain…
Generalized query-based active learning to identify differentially methylated regions in DNA.
Haque, Md Muksitul; Holder, Lawrence B; Skinner, Michael K; Cook, Diane J
2013-01-01
Active learning is a supervised learning technique that reduces the number of examples required for building a successful classifier, because it can choose the data it learns from. This technique holds promise for many biological domains in which classified examples are expensive and time-consuming to obtain. Most traditional active learning methods ask very specific queries of the Oracle (e.g., a human expert) to label an unlabeled example. The example may consist of numerous features, many of which are irrelevant. Removing such features creates a shorter query with only relevant features, which is easier for the Oracle to answer. We propose a generalized query-based active learning (GQAL) approach that constructs generalized queries based on multiple instances. By constructing appropriately generalized queries, we can achieve higher accuracy compared to traditional active learning methods. We apply our active learning method to find differentially methylated DNA regions (DMRs). DMRs are DNA locations in the genome that are known to be involved in tissue differentiation, epigenetic regulation, and disease. We also apply our method to 13 other data sets and show that our method is better than another popular active learning technique.
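As a point of reference for the generalized queries discussed above, the following is a minimal sketch of the standard pool-based active learning loop the abstract contrasts against: plain uncertainty sampling on a toy 1-D threshold problem. GQAL itself constructs generalized multi-instance queries, which this sketch does not attempt; the oracle, threshold and pool here are invented for illustration.

```python
import random

# Toy oracle: 1-D points labelled by an unknown threshold at 0.35.
def oracle(x):
    return int(x > 0.35)

random.seed(0)
pool = [random.random() for _ in range(200)]   # unlabeled pool
labeled = []

# Seed the learner with the two extreme points (one per class).
for x0 in (min(pool), max(pool)):
    pool.remove(x0)
    labeled.append((x0, oracle(x0)))

def boundary(labeled):
    # Current hypothesis: midpoint between the largest negative
    # and the smallest positive labelled example.
    lo = max(x for x, y in labeled if y == 0)
    hi = min(x for x, y in labeled if y == 1)
    return (lo + hi) / 2

for _ in range(10):
    b = boundary(labeled)
    # Uncertainty sampling: query the pool point nearest the boundary.
    x = min(pool, key=lambda p: abs(p - b))
    pool.remove(x)
    labeled.append((x, oracle(x)))

print(boundary(labeled))
```

After ten queries the estimated boundary sits close to the true threshold, using far fewer labels than labelling the whole pool; the point of GQAL is to make each such query cheaper for the Oracle to answer by generalizing away irrelevant features.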
Fitting Prony Series To Data On Viscoelastic Materials
NASA Technical Reports Server (NTRS)
Hill, S. A.
1995-01-01
Improved method of fitting Prony series to data on viscoelastic materials involves use of least-squares optimization techniques. Method based on optimization techniques yields closer correlation with data than traditional method. Involves no assumptions regarding the gamma'(sub i)s and higher-order terms, and provides for as many Prony terms as needed to represent higher-order subtleties in data. Curve-fitting problem treated as design-optimization problem and solved by use of partially-constrained-optimization techniques.
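The least-squares idea in this brief can be illustrated with a simplified sketch: if the relaxation times tau_i are held fixed (an assumption made here for brevity; the method above optimizes further), the Prony coefficients of G(t) = g_inf + sum_i g_i·exp(-t/tau_i) follow from ordinary linear least squares via the normal equations.

```python
import math

def prony_lstsq(times, g_data, taus):
    """Fit G(t) = g_inf + sum_i g_i*exp(-t/tau_i) by linear least
    squares with the relaxation times tau_i held fixed."""
    # Design matrix: a column of ones (g_inf) plus one column per tau.
    A = [[1.0] + [math.exp(-t / tau) for tau in taus] for t in times]
    m, n = len(A), len(A[0])
    # Normal equations (A^T A) c = A^T b, solved by Gaussian elimination.
    M = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]
    v = [sum(A[k][i] * g_data[k] for k in range(m)) for i in range(n)]
    for i in range(n):                  # forward elimination with pivoting
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p], v[i], v[p] = M[p], M[i], v[p], v[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for j in range(i, n):
                M[r][j] -= f * M[i][j]
            v[r] -= f * v[i]
    coef = [0.0] * n
    for i in range(n - 1, -1, -1):      # back substitution
        coef[i] = (v[i] - sum(M[i][j] * coef[j]
                              for j in range(i + 1, n))) / M[i][i]
    return coef[0], coef[1:]            # g_inf, [g_i]

# Synthetic relaxation data generated from a known two-term series.
taus = [0.1, 1.0]
times = [0.01 * k for k in range(1, 301)]
g_data = [2.0 + 1.5 * math.exp(-t / 0.1) + 0.5 * math.exp(-t / 1.0)
          for t in times]
g_inf, g = prony_lstsq(times, g_data, taus)
print(round(g_inf, 4), [round(x, 4) for x in g])
```

On noise-free synthetic data the fit recovers the generating coefficients; the brief's contribution is to go beyond this linear case by also treating the exponents and higher-order terms as design variables in a constrained optimization.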
Resolution-improved in situ DNA hybridization detection based on microwave photonic interrogation.
Cao, Yuan; Guo, Tuan; Wang, Xudong; Sun, Dandan; Ran, Yang; Feng, Xinhuan; Guan, Bai-ou
2015-10-19
An in situ bio-sensing system based on a microwave photonics filter (MPF) interrogation method with improved resolution is proposed and experimentally demonstrated. A microfiber Bragg grating (mFBG) is used as the sensing probe for DNA hybridization detection. Unlike the traditional wavelength monitoring technique, we use a frequency interrogation scheme for resolution-improved bio-sensing detection. Experimental results show that the frequency shift of the MPF notch presents a linear response to the surrounding refractive index (SRI) change over the range of 1.33 to 1.38, with an SRI resolution up to 2.6 × 10(-5) RIU, an improvement of almost two orders of magnitude over the traditional fundamental-mode monitoring technique (~3.6 × 10(-3) RIU). Due to the high Q value (about 27), the whole process of DNA hybridization can be monitored in situ. The proposed MPF-based bio-sensing system provides a new interrogation method in the frequency domain with improved sensing resolution and a rapid interrogation rate for biochemical and environmental measurement.
A rapid method for the sampling of atmospheric water vapour for isotopic analysis.
Peters, Leon I; Yakir, Dan
2010-01-01
Analysis of the stable isotopic composition of atmospheric moisture is widely applied in the environmental sciences. Traditional methods for obtaining isotopic compositional data from ambient moisture have required complicated sampling procedures, expensive and sophisticated distillation lines, hazardous consumables, and lengthy treatments prior to analysis. Newer laser-based techniques are expensive and usually not suitable for large-scale field campaigns, especially in cases where access to mains power is not feasible or high spatial coverage is required. Here we outline the construction and usage of a novel vapour-sampling system based on a battery-operated Stirling cycle cooler, which is simple to operate, does not require any consumables, or post-collection distillation, and is light-weight and highly portable. We demonstrate the ability of this system to reproduce delta(18)O isotopic compositions of ambient water vapour, with samples taken simultaneously by a traditional cryogenic collection technique. Samples were collected over 1 h directly into autosampler vials and were analysed by mass spectrometry after pyrolysis of 1 microL aliquots to CO. This yielded an average error of < +/-0.5 per thousand, approximately equal to the signal-to-noise ratio of traditional approaches. This new system provides a rapid and reliable alternative to conventional cryogenic techniques, particularly in cases requiring high sample throughput or where access to distillation lines, slurry maintenance or mains power is not feasible. Copyright 2009 John Wiley & Sons, Ltd.
Teaching strategies and student achievement in high school block scheduled biology classes
NASA Astrophysics Data System (ADS)
Louden, Cynthia Knapp
The objectives of this study included determining whether teachers in block or traditionally scheduled biology classes (1) implement inquiry-based instruction more often or with different methods, (2) understand the concept of inquiry-based instruction as it is described in the National Science Standards, (3) have classes with significantly different student achievement, and (4) believe that their school schedule facilitates their use of inquiry-based instruction in the classroom. Biology teachers in block and non-block scheduled classes were interviewed, surveyed, and observed to determine the degree to which they implement inquiry-based instructional practices in their classrooms. State biology exams were used to indicate student achievement. Teachers in block scheduled and traditional classes used inquiry-based instruction with nearly the same frequency. Approximately 30% of all teachers do not understand the concept of inquiry-based instruction as described by the National Science Standards. No significant achievement differences between block and traditionally scheduled biology classes were found using ANCOVA analyses and a nonequivalent control-group quasi-experimental design. Using the same analysis techniques, significant achievement differences were found between biology classes with teachers who used inquiry-based instruction frequently and infrequently. Teachers in block schedules believed that their schedules facilitated inquiry-based instruction more than teachers in traditional schedules.
Figure analysis: A teaching technique to promote visual literacy and active Learning.
Wiles, Amy M
2016-07-08
Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of content-heavy courses, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is no technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.
Exploration of video-based structural health monitoring techniques.
DOT National Transportation Integrated Search
2014-10-01
Structural health monitoring (SHM) has become a viable tool to provide owners with objective data for maintenance and repair. Traditionally, discrete contact sensors such as strain gages or accelerometers have been used : for SHM. However, distribute...
Edge enhancement and noise suppression for infrared image based on feature analysis
NASA Astrophysics Data System (ADS)
Jiang, Meng
2018-06-01
Infrared images often suffer from background noise, blurred edges, few details and low signal-to-noise ratios. To improve infrared image quality, it is essential to suppress noise and enhance edges simultaneously. To achieve both, in this paper we propose a novel algorithm based on feature analysis in the shearlet domain. Firstly, we introduce the theory and advantages of the shearlet transform, one of the multi-scale geometric analysis (MGA) tools. Secondly, after analyzing the defects of the traditional thresholding technique for noise suppression, we propose a novel feature extraction that distinguishes image structures from noise well and use it to improve the traditional thresholding technique. Thirdly, by computing the correlations between neighboring shearlet coefficients, feature attribute maps identifying weak details and strong edges are constructed to improve generalized unsharp masking (GUM). Finally, experimental results with infrared images captured in different scenes demonstrate that the proposed algorithm suppresses noise efficiently and enhances image edges adaptively.
NASA Astrophysics Data System (ADS)
Zellweger, Christoph; Emmenegger, Lukas; Firdaus, Mohd; Hatakka, Juha; Heimann, Martin; Kozlova, Elena; Spain, T. Gerard; Steinbacher, Martin; van der Schoot, Marcel V.; Buchmann, Brigitte
2016-09-01
Until recently, atmospheric carbon dioxide (CO2) and methane (CH4) measurements were made almost exclusively using nondispersive infrared (NDIR) absorption and gas chromatography with flame ionisation detection (GC/FID) techniques, respectively. Recently, commercially available instruments based on spectroscopic techniques such as cavity ring-down spectroscopy (CRDS), off-axis integrated cavity output spectroscopy (OA-ICOS) and Fourier transform infrared (FTIR) spectroscopy have become more widely available and affordable. This resulted in a widespread use of these techniques at many measurement stations. This paper is focused on the comparison between a CRDS "travelling instrument" that has been used during performance audits within the Global Atmosphere Watch (GAW) programme of the World Meteorological Organization (WMO) with instruments incorporating other, more traditional techniques for measuring CO2 and CH4 (NDIR and GC/FID). We demonstrate that CRDS instruments and likely other spectroscopic techniques are suitable for WMO/GAW stations and allow a smooth continuation of historic CO2 and CH4 time series. Moreover, the analysis of the audit results indicates that the spectroscopic techniques have a number of advantages over the traditional methods which will lead to the improved accuracy of atmospheric CO2 and CH4 measurements.
NASA Astrophysics Data System (ADS)
George, Paul; Kemeny, Andras; Colombet, Florent; Merienne, Frédéric; Chardonnet, Jean-Rémy; Thouvenin, Indira Mouttapa
2014-02-01
Immersive digital project reviews consist in using virtual reality (VR) as a tool for discussion between the various stakeholders of a project. In the automotive industry, the digital car prototype model is the common thread that binds them. It is used during immersive digital project reviews between designers, engineers, ergonomists, etc. The digital mockup is also used to assess future car architecture, habitability or perceived quality requirements, with the aim of reducing the use of physical mockups to optimize cost, delay and quality. Among the difficulties identified by users, handling the mockup is a major one. Inspired by current uses of nomad devices (multi-touch gestures, iPhone UI look and feel, and AR applications), we designed a navigation technique taking advantage of these popular input devices: space scrolling allows moving around the mockup. In this paper, we present the results of a study on the usability and acceptability of the proposed smartphone-based interaction metaphor compared to a traditional technique, and we provide indications of the most efficient choices for different use cases. The study was carried out in a traditional 4-sided CAVE, and its purpose is to assess a chosen set of interaction techniques to be implemented in Renault's new 5-sided 4K x 4K wall high-performance CAVE. The proposed metaphor using nomad devices is well accepted by novice VR users, and future implementations should allow efficient industrial use. Their use is an easy and user-friendly alternative to existing traditional control devices such as a joystick.
Semantically transparent fingerprinting for right protection of digital cinema
NASA Astrophysics Data System (ADS)
Wu, Xiaolin
2003-06-01
Digital cinema, a new frontier and crown jewel of digital multimedia, has the potential of revolutionizing the science, engineering and business of movie production and distribution. The advantages of digital cinema technology over traditional analog technology are numerous and profound. But without effective and enforceable copyright protection measures, digital cinema can be more susceptible to widespread piracy, which can dampen or even prevent the commercial deployment of digital cinema. In this paper we propose a novel approach of fingerprinting each individual distribution copy of a digital movie for the purpose of tracing pirated copies back to their source. The proposed fingerprinting technique presents a fundamental departure from the traditional digital watermarking/fingerprinting techniques. Its novelty and uniqueness lie in a so-called semantic or subjective transparency property. The fingerprints are created by editing those visual and audio attributes that can be modified with semantic and subjective transparency to the audience. Semantically-transparent fingerprinting or watermarking is the most robust kind among all existing watermarking techniques, because it is content-based not sample-based, and semantically-recoverable not statistically-recoverable.
Sanz, Laura M.; Crespo, Benigno; De-Cózar, Cristina; Ding, Xavier C.; Llergo, Jose L.; Burrows, Jeremy N.; García-Bustos, Jose F.; Gamo, Francisco-Javier
2012-01-01
Chemotherapy is still the cornerstone of malaria control. Developing drugs against Plasmodium parasites and monitoring their efficacy requires methods to accurately determine the parasite killing rate in response to treatment. Commonly used techniques essentially measure metabolic activity as a proxy for parasite viability. However, these approaches are susceptible to artefacts, as viability and metabolism are two parameters that are coupled during the parasite life cycle but can be differentially affected in response to drug actions. Moreover, traditional techniques do not make it possible to measure the speed of action of compounds on parasite viability, which is an essential efficacy determinant. We present here a comprehensive methodology to measure in vitro the direct effect of antimalarial compounds on parasite viability, based on limiting serial dilution of treated parasites and monitoring of regrowth. This methodology makes it possible to precisely determine the killing rate of antimalarial compounds, quantified by the parasite reduction ratio and parasite clearance time, which are key mode-of-action parameters. Importantly, we demonstrate that this technique readily permits the determination of compound killing activities that might otherwise be missed by traditional, metabolism-based techniques. The analysis of a large set of antimalarial drugs reveals that this viability-based assay can discriminate compounds on the basis of their antimalarial mode of action. The approach has been adapted to medium-throughput screening, facilitating the identification of fast-acting antimalarial compounds, which are crucially needed for the control and possibly the eradication of malaria. PMID:22383983
Navigating Microbiological Food Safety in the Era of Whole-Genome Sequencing
Nasheri, Neda; Petronella, Nicholas; Pagotto, Franco
2016-01-01
SUMMARY The epidemiological investigation of a foodborne outbreak, including identification of related cases, source attribution, and development of intervention strategies, relies heavily on the ability to subtype the etiological agent at a high enough resolution to differentiate related from nonrelated cases. Historically, several different molecular subtyping methods have been used for this purpose; however, emerging techniques, such as single nucleotide polymorphism (SNP)-based techniques, that use whole-genome sequencing (WGS) offer a resolution that was previously not possible. With WGS, unlike traditional subtyping methods that lack complete information, data can be used to elucidate phylogenetic relationships and disease-causing lineages can be tracked and monitored over time. The subtyping resolution and evolutionary context provided by WGS data allow investigators to connect related illnesses that would be missed by traditional techniques. The added advantage of data generated by WGS is that these data can also be used for secondary analyses, such as virulence gene detection, antibiotic resistance gene profiling, synteny comparisons, mobile genetic element identification, and geographic attribution. In addition, several software packages are now available to generate in silico results for traditional molecular subtyping methods from the whole-genome sequence, allowing for efficient comparison with historical databases. Metagenomic approaches using next-generation sequencing have also been successful in the detection of nonculturable foodborne pathogens. This review addresses state-of-the-art techniques in microbial WGS and analysis and then discusses how this technology can be used to help support food safety investigations. Retrospective outbreak investigations using WGS are presented to provide organism-specific examples of the benefits, and challenges, associated with WGS in comparison to traditional molecular subtyping techniques. PMID:27559074
Nguyen, Phung Anh; Yang, Hsuan-Chia; Xu, Rong; Li, Yu-Chuan Jack
2018-01-01
Traditional Chinese Medicine (TCM) utilization has rapidly increased worldwide. However, few databases provide information linking TCM herbs and diseases. This study aims to identify and evaluate meaningful associations between TCM herbs and breast cancer using association rule mining (ARM) techniques. We applied ARM to 19.9 million TCM prescriptions from the Taiwan National Health Insurance claims database from 1999 to 2013. From those prescriptions, 364 TCM herb-breast cancer associations were derived and then filtered by a support threshold of 20. The resulting 296 associations were evaluated against a gold standard curated from Chinese Wikipedia using the terms cancer, tumor, and malignant. All 14 TCM herb-breast cancer associations with a confidence of at least 1% were valid when compared to the gold standard. For other confidence levels, the statistical results showed consistently high precision. We thus succeeded in identifying TCM herb-breast cancer associations with these techniques.
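The support and confidence measures used in the filtering step above can be illustrated on toy data. The herb names and prescription records below are invented for illustration, not drawn from the NHI claims database, and the support threshold is scaled down to fit the toy set.

```python
from collections import Counter

# Toy prescription records: (set of herbs, diagnosis). Herb names and
# records are invented, not from the claims database.
rx = [
    ({"hedyotis", "astragalus"}, "breast cancer"),
    ({"hedyotis", "licorice"},   "breast cancer"),
    ({"astragalus"},             "common cold"),
    ({"hedyotis"},               "breast cancer"),
    ({"licorice"},               "common cold"),
]

herb_dx, herb_ct = Counter(), Counter()
for herbs, dx in rx:
    for h in herbs:
        herb_ct[h] += 1
        herb_dx[(h, dx)] += 1

n = len(rx)
min_support_count = 2        # the study used a support count of 20
for (h, dx), c in sorted(herb_dx.items()):
    if dx != "breast cancer" or c < min_support_count:
        continue
    support = c / n              # P(herb and diagnosis co-occur)
    confidence = c / herb_ct[h]  # P(diagnosis | herb)
    print(h, round(support, 2), round(confidence, 2))  # hedyotis 0.6 1.0
```

Support filters out rare, unreliable co-occurrences, while confidence ranks how strongly a herb's use predicts the diagnosis; the study validates the surviving rules against the Wikipedia-derived gold standard rather than treating them as causal.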
ERIC Educational Resources Information Center
Charconnet, Marie-George
This study describes various patterns of peer tutoring and is based on the use of cultural traditions and endogenous methods, on techniques and equipment acquired from other cultures, on problems presented by the adoption of educational technologies, and on methods needing little sophisticated equipment. A dozen peer tutoring systems are…
Software risk estimation and management techniques at JPL
NASA Technical Reports Server (NTRS)
Hihn, J.; Lum, K.
2002-01-01
In this talk we discuss how uncertainty has been incorporated into the JPL software model to produce probabilistic estimates, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists to detailed WBS-based risk estimates to the Defect Detection and Prevention (DDP) tool.
Teaching Life: Re-Creating and Re-Teaching Literature: Conception to Instruction.
ERIC Educational Resources Information Center
Kerner, Howard A.
Suggesting that instructors approach literature from a multidisciplinary life-based stance, this paper presents syllabi, pedagogical techniques, and a student essay which illustrates a life-based approach to literary themes. The first section of the paper deals with creative curricular re-packaging of great literature in which traditional course…
Considering the Efficacy of Web-Based Worked Examples in Introductory Chemistry
ERIC Educational Resources Information Center
Crippen, Kent J.; Earl, Boyd L.
2004-01-01
Theory suggests that studying worked examples and engaging in self-explanation will improve learning and problem solving. A growing body of evidence supports the use of web-based assessments for improving undergraduate performance in traditional large enrollment courses. This article describes a study designed to investigate these techniques in a…
Mobile Formative Assessment Tool Based on Data Mining Techniques for Supporting Web-Based Learning
ERIC Educational Resources Information Center
Chen, Chih-Ming; Chen, Ming-Chuan
2009-01-01
Current trends clearly indicate that online learning has become an important learning mode. However, no effective assessment mechanism for learning performance yet exists for e-learning systems. Learning performance assessment aims to evaluate what learners learned during the learning process. Traditional summative evaluation only considers final…
Career Goal-Based E-Learning Recommendation Using Enhanced Collaborative Filtering and PrefixSpan
ERIC Educational Resources Information Center
Ma, Xueying; Ye, Lu
2018-01-01
This article describes how e-learning recommender systems nowadays apply different kinds of techniques to recommend personalized learning content for users based on their preferences, goals, interests and background information. However, the cold-start problem that exists in traditional recommendation algorithms is still left over in…
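The collaborative-filtering side of such a recommender can be sketched in a few lines: predict an unseen course's rating for a learner as a similarity-weighted average of other learners' ratings. This is a minimal illustration only; the learner names, courses and ratings are hypothetical, and the paper's enhancements (career goals, PrefixSpan sequence mining) are not shown.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity computed over the items both users rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    num = sum(u[i] * v[i] for i in shared)
    den = sqrt(sum(u[i] ** 2 for i in shared)) * sqrt(sum(v[i] ** 2 for i in shared))
    return num / den if den else 0.0

def predict(ratings, user, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else None   # None: cold start, no neighbour rated it

# Toy data: learner -> {course: rating}.
ratings = {
    "ann": {"python": 5, "stats": 3, "ml": 4},
    "bob": {"python": 4, "stats": 2, "ml": 5},
    "cat": {"python": 5, "stats": 3},
}
print(round(predict(ratings, "cat", "ml"), 2))  # 4.5
```

The `None` branch is exactly the cold-start problem the abstract mentions: a brand-new item (or user) has no overlapping ratings to weight.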
Sun, Lei; Jia, Yun-xian; Cai, Li-ying; Lin, Guo-yu; Zhao, Jin-song
2013-09-01
Spectrometric oil analysis (SOA) is an important technique for machine state monitoring, fault diagnosis and prognosis, and SOA-based remaining useful life (RUL) prediction has the advantage of identifying the optimal maintenance strategy for a machine system. Because of the complexity of machine systems, the health-state degradation process cannot be simply characterized by a linear model, while particle filtering (PF) possesses obvious advantages over traditional Kalman filtering in dealing with nonlinear and non-Gaussian systems. The PF approach was therefore applied to state forecasting from SOA data, and an RUL prediction technique based on SOA and the PF algorithm is proposed. In the prediction model, the prior probability distribution is obtained from the estimate of the system's posterior probability, and a multi-step-ahead prediction model based on the PF algorithm is established. Finally, practical SOA data from an engine were analyzed and forecast by the above method, and the result was compared with that of the traditional Kalman filtering method, fully showing the superiority and effectiveness of the proposed approach.
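The PF-based RUL idea can be sketched as a bootstrap particle filter on a scalar wear metric: propagate particles through a degradation model, weight them by the likelihood of each new SOA reading, resample, and then run the surviving particles forward until they cross a failure threshold. Everything below (threshold, drift rate, noise levels, synthetic readings) is a hypothetical toy, not the paper's model.

```python
import math
import random

random.seed(0)

FAIL_AT = 10.0   # hypothetical failure threshold on the wear metric
DRIFT = 0.5      # assumed wear increase per inspection interval

def pf_step(particles, obs, proc_sd=0.1, obs_sd=0.3):
    """One bootstrap-filter cycle: propagate, weight by likelihood, resample."""
    moved = [x + DRIFT + random.gauss(0, proc_sd) for x in particles]
    w = [math.exp(-(obs - x) ** 2 / (2 * obs_sd ** 2)) for x in moved]
    total = sum(w) or 1.0
    return random.choices(moved, weights=[wi / total for wi in w], k=len(moved))

def mean_rul(particles, proc_sd=0.1, cap=100):
    """Multi-step-ahead prediction: mean steps until particles cross FAIL_AT."""
    steps = []
    for x in particles:
        k = 0
        while x < FAIL_AT and k < cap:
            x += DRIFT + random.gauss(0, proc_sd)
            k += 1
        steps.append(k)
    return sum(steps) / len(steps)

particles = [random.gauss(0.0, 0.2) for _ in range(500)]
for t in range(1, 11):                     # ten synthetic SOA readings
    particles = pf_step(particles, obs=DRIFT * t)
est = mean_rul(particles)
print(round(est))                          # ≈ 10 intervals to threshold
```

With the state near 5.0 after ten readings and 0.5 wear per interval, roughly ten more intervals remain, which is what the forward simulation recovers.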
Advanced Navigation Strategies For Asteroid Sample Return Missions
NASA Technical Reports Server (NTRS)
Getzandanner, K.; Bauman, J.; Williams, B.; Carpenter, J.
2010-01-01
Flyby and rendezvous missions to asteroids have been accomplished using navigation techniques derived from experience gained in planetary exploration. This paper presents analysis of advanced navigation techniques required to meet unique challenges for precision navigation to acquire a sample from an asteroid and return it to Earth. These techniques rely on tracking data types such as spacecraft-based laser ranging and optical landmark tracking in addition to the traditional Earth-based Deep Space Network radio metric tracking. A systematic study of navigation strategy, including the navigation event timeline and reduction in spacecraft-asteroid relative errors, has been performed using simulation and covariance analysis on a representative mission.
The application analysis of the multi-angle polarization technique for ocean color remote sensing
NASA Astrophysics Data System (ADS)
Zhang, Yongchao; Zhu, Jun; Yin, Huan; Zhang, Keli
2017-02-01
The multi-angle polarization technique, which uses the intensity of polarized radiation as the observed quantity, is a new remote sensing means for earth observation. With this method, not only can multi-angle light intensity data be provided, but multi-angle information on polarized radiation can also be obtained. The technique may therefore solve problems that cannot be solved with traditional remote sensing methods, and it has become one of the hot topics in international quantitative remote sensing research. In this paper, we first introduce the principles of the multi-angle polarization technique, and then summarize and analyse the state of basic research and engineering applications in 1) the removal of sun glitter based on polarization, 2) ocean color remote sensing based on polarization, 3) oil spill detection using the polarization technique, and 4) ocean aerosol monitoring based on polarization. Finally, based on the previous work, we briefly present the problems and prospects of the multi-angle polarization technique for China's ocean color remote sensing.
NASA Technical Reports Server (NTRS)
Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel
2012-01-01
In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.
Defining the questions: a research agenda for nontraditional authentication in arms control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauck, Danielle K; Mac Arthur, Duncan W; Smith, Morag K
Many traditional authentication techniques have been based on hardware solutions. Thus authentication of measurement system hardware has been considered in terms of physical inspection and destructive analysis. Software authentication has implied hash function analysis or authentication tools such as Rose. Continuity of knowledge is maintained through TIDs and cameras. Although there is ongoing progress improving all of these authentication methods, there has been little discussion of the human factors involved in authentication. Issues of non-traditional authentication include sleight-of-hand substitutions, monitor perception vs. reality, and visual diversions. Since monitor confidence in a measurement system depends on the product of their confidences in each authentication element, it is important to investigate all authentication techniques, including the human factors. This paper will present an initial effort to identify the most important problems that traditional authentication approaches in safeguards have not addressed and that are especially relevant to arms control verification. This will include a survey of the literature and direct engagement with nontraditional experts in areas like psychology and human factors. Based on the identification of problem areas, potential research areas will be identified and a possible research agenda will be developed.
Teaching medical students ultrasound-guided vascular access - which learning method is best?
Lian, Alwin; Rippey, James C R; Carr, Peter J
2017-05-15
Ultrasound is recommended to guide insertion of peripheral intravenous vascular cannulae (PIVC) where difficulty is experienced. Ultrasound machines are now commonplace and junior doctors are often expected to be able to use them. The educational standards for this skill are highly varied, ranging from no education, to self-guided internet-based education, to formal, face-to-face traditional education. In an attempt to decide which educational technique our institution should introduce, a small pilot trial comparing educational techniques was designed. Thirty medical students were enrolled and allocated to one of three groups. PIVC placing ability was then observed, tested and graded on vascular access phantoms. The formal, face-to-face traditional education was rated best by the students and had the highest success rate in PIVC placement, with the improvement statistically significant compared to no education (p = 0.01) and trending towards significance compared to self-directed internet-based education (p < 0.06). The group receiving traditional face-to-face teaching on ultrasound-guided vascular access performed significantly better than those not receiving education. As the number of ultrasound machines in clinical areas increases, it is important that education programs to support their safe and appropriate use are developed.
Yu, Wei; Clyne, Melinda; Dolan, Siobhan M; Yesupriya, Ajay; Wulf, Anja; Liu, Tiebin; Khoury, Muin J; Gwinn, Marta
2008-04-22
Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge.
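The keyword-scoring step behind such a screener can be illustrated in miniature: compute a two-sample z score per word (how much more often it appears in on-topic than off-topic abstracts), then flag new abstracts whose summed keyword scores pass a cutoff. This is a hypothetical sketch of the z-score feature-selection idea only, not GAPscreener's actual SVM pipeline, and the toy abstracts and cutoff are invented.

```python
import math
from collections import Counter

def z_scores(pos_docs, neg_docs):
    """Two-sample z score per word, comparing document frequency in
    positive (on-topic) vs. negative (off-topic) abstracts."""
    pos, neg = Counter(), Counter()
    for d in pos_docs:
        pos.update(set(d.lower().split()))
    for d in neg_docs:
        neg.update(set(d.lower().split()))
    n1, n2 = len(pos_docs), len(neg_docs)
    scores = {}
    for w in set(pos) | set(neg):
        p1, p2 = pos[w] / n1, neg[w] / n2
        p = (pos[w] + neg[w]) / (n1 + n2)
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2)) or 1.0  # guard p == 1
        scores[w] = (p1 - p2) / se
    return scores

def screen(doc, scores, cutoff=1.0):
    """Flag an abstract when its summed keyword score passes the cutoff."""
    return sum(scores.get(w, 0.0) for w in set(doc.lower().split())) > cutoff

pos = ["genetic association study of variant", "association of gene variant with risk"]
neg = ["surgical technique for repair", "imaging protocol for diagnosis"]
s = z_scores(pos, neg)
print(screen("gene variant association with disease", s))  # True
```

A real screener would feed such scores into an SVM and tune the cutoff for the recall/precision trade-off the abstract reports.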
Nano-Al Based Energetics: Rapid Heating Studies and a New Preparation Technique
NASA Astrophysics Data System (ADS)
Sullivan, Kyle; Kuntz, Josh; Gash, Alex; Zachariah, Michael
2011-06-01
Nano-Al based thermites have become an attractive alternative to traditional energetic formulations due to their increased energy density and high reactivity. Understanding the intrinsic reaction mechanism has been a difficult task, largely due to the lack of experimental techniques capable of rapidly and uniformly heating a sample (~10^4-10^8 K/s). The current work presents several studies on nano-Al based thermites using rapid heating techniques. A new mechanism, termed a Reactive Sintering Mechanism, is proposed for nano-Al based thermites. In addition, new experimental techniques for nanocomposite thermite deposition onto thin Pt electrodes will be discussed. This combined technique will offer more precise control of the deposition and will serve to further our understanding of the intrinsic reaction mechanism of rapidly heated energetic systems. An improved mechanistic understanding will lead to the development of optimized formulations and architectures. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
NASA Astrophysics Data System (ADS)
Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael
2013-05-01
This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize the model quality. These methods also can measure the conformity of this non-traditional data with fusion system products including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources and therefore can improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data to enable operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture supports effective assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods to discover unknown models from non-traditional and 'big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development and the demonstration of context assessment of non-traditional data against an intelligence, surveillance and reconnaissance fusion product based upon an IED POI workflow.
NASA Technical Reports Server (NTRS)
Klarer, P.
1994-01-01
An alternative methodology for designing an autonomous navigation and control system is discussed. This generalized hybrid system is based on a less sequential and less anthropomorphic approach than that used in the more traditional artificial intelligence (AI) technique. The architecture is designed to allow both synchronous and asynchronous operations between various behavior modules. This is accomplished by intertask communications channels which implement each behavior module and each interconnection node as a stand-alone task. The proposed design architecture allows for construction of hybrid systems which employ both subsumption and traditional AI techniques as well as providing for a teleoperator's interface. Implementation of the architecture is planned for the prototype Robotic All Terrain Lunar Explorer Rover (RATLER) which is described briefly.
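The subsumption idea the abstract contrasts with traditional AI can be shown in a few lines: behavior modules are stacked in priority order, and the first one that produces a command suppresses everything below it. The behaviors, sensor keys and commands here are all hypothetical; a real implementation (as on RATLER) would run the modules as concurrent tasks with intertask channels rather than a sequential scan.

```python
# Hypothetical behaviour modules; higher-priority behaviours subsume lower ones.
def avoid_obstacle(sensors):
    if sensors.get("obstacle"):
        return "turn_left"          # reflex layer wins when triggered
    return None                     # not applicable: defer to lower layers

def follow_waypoint(sensors):
    if sensors.get("waypoint"):
        return "steer_to_waypoint"
    return None

def wander(sensors):
    return "drive_forward"          # default layer, always applicable

def arbitrate(behaviours, sensors):
    """Subsumption arbitration: first behaviour with an output suppresses the rest."""
    for b in behaviours:
        cmd = b(sensors)
        if cmd is not None:
            return cmd

stack = [avoid_obstacle, follow_waypoint, wander]   # priority order
print(arbitrate(stack, {"obstacle": True, "waypoint": True}))  # turn_left
print(arbitrate(stack, {"waypoint": True}))                    # steer_to_waypoint
print(arbitrate(stack, {}))                                    # drive_forward
```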
Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.
Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L
2008-06-01
Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools--multiscale entropy and multiscale time irreversibility--are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.
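Multiscale entropy reduces to two steps: coarse-grain the series by averaging non-overlapping windows, then compute sample entropy at each scale. The sketch below is a minimal pure-Python version with a toy white-noise series; the tolerance fraction and series length are illustrative choices, not the values used in the cited work.

```python
import math
import random
import statistics

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r_frac=0.5):
    """SampEn: -ln(A/B), where B and A count m- and (m+1)-length template
    pairs matching within tolerance r (a fraction of the series' SD)."""
    r = r_frac * statistics.pstdev(x)
    def matches(mm):
        c = 0
        for i in range(len(x) - mm):
            for j in range(i + 1, len(x) - mm + 1):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(300)]
e1 = sample_entropy(coarse_grain(noise, 1))
e2 = sample_entropy(coarse_grain(noise, 2))
print(round(e1, 2), round(e2, 2))
```

Plotting entropy against scale is what distinguishes, for example, uncorrelated noise from long-range-correlated physiological signals.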
Multiscale Analysis of Heart Rate Dynamics: Entropy and Time Irreversibility Measures
Peng, Chung-Kang; Goldberger, Ary L.
2016-01-01
Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and nonequilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools— multiscale entropy and multiscale time irreversibility—are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs. PMID:18172763
Change detection from remotely sensed images: From pixel-based to object-based approaches
NASA Astrophysics Data System (ADS)
Hussain, Masroor; Chen, Dongmei; Cheng, Angela; Wei, Hui; Stanley, David
2013-06-01
The appetite for up-to-date information about earth's surface is ever increasing, as such information provides a base for a large number of applications, including local, regional and global resources monitoring, land-cover and land-use change monitoring, and environmental studies. The data from remote sensing satellites provide opportunities to acquire information about land at varying resolutions and have been widely used for change detection studies. A large number of change detection methodologies and techniques utilizing remotely sensed data have been developed, and newer techniques are still emerging. This paper begins with a discussion of the traditional pixel-based and (mostly) statistics-oriented change detection techniques, which focus mainly on spectral values and mostly ignore the spatial context. This is succeeded by a review of object-based change detection techniques. Finally, there is a brief discussion of spatial data mining techniques in image processing and change detection from remote sensing data. The merits and issues of different techniques are compared. The importance of the exponential increase in the image data volume and multiple sensors and associated challenges on the development of change detection techniques are highlighted. With the wide use of very-high-resolution (VHR) remotely sensed images, object-based methods and data mining techniques may have more potential in change detection.
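The pixel-based baseline that the review starts from is simple image differencing: subtract co-registered before/after bands pixel by pixel and threshold the absolute difference. The tiny 3x3 "images" and threshold below are hypothetical; note how the method looks only at spectral values, with no spatial context — exactly the limitation object-based methods address.

```python
def change_mask(before, after, threshold=30):
    """Pixel-based change detection: absolute band difference, thresholded.
    Each pixel is judged in isolation (no spatial context)."""
    return [[1 if abs(a - b) > threshold else 0
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(after, before)]

before = [[100, 102, 98],
          [101, 99, 100],
          [100, 100, 97]]
after  = [[100, 103, 99],
          [101, 180, 182],   # bright new rooftop, centre-right
          [100, 179, 96]]

mask = change_mask(before, after)
print(mask)                           # [[0, 0, 0], [0, 1, 1], [0, 1, 0]]
changed = sum(sum(r) for r in mask)
print(changed)                        # 3 pixels flagged
```

An object-based method would first segment the image and then classify whole segments by shape and texture properties, rather than flagging isolated pixels.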
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
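The reachability-tree construction underlying the paper can be illustrated on the untimed skeleton: breadth-first search over markings, firing any enabled transition. This is a deliberate simplification — real TPNs attach firing intervals to transitions, and the paper's CS-classes additionally stamp each class with clock information — so the toy net and marking encoding below are illustrative only.

```python
from collections import deque

def fire(marking, transition):
    """Fire a transition (pre, post) if enabled; return the new marking or None."""
    pre, post = transition
    if all(marking[p] >= n for p, n in pre.items()):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m
    return None

def reachable(m0, transitions, limit=1000):
    """Breadth-first reachability over markings (untimed simplification)."""
    seen = {tuple(sorted(m0.items()))}
    queue = deque([m0])
    while queue and len(seen) < limit:
        m = queue.popleft()
        for t in transitions:
            nxt = fire(m, t)
            if nxt is not None:
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)
    return seen

# Toy net: a task moves from "ready" through "running" to "done".
transitions = [({"ready": 1}, {"running": 1}),
               ({"running": 1}, {"done": 1})]
states = reachable({"ready": 1, "running": 0, "done": 0}, transitions)
print(len(states))  # 3 reachable markings
```

In the CS-class technique, each node of this tree would carry clock stamps, so end-to-end delays can be read directly off a path instead of being lost in the state-class abstraction.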
Using qPCR for Water Microbial Risk Assessments
Microbial risk assessment (MRA) has traditionally utilized microbiological data that was obtained by culture-based techniques that are expensive and time consuming. With the advent of PCR methods there is a realistic opportunity to conduct MRA studies economically, in less time,...
Non-linear eigensolver-based alternative to traditional SCF methods
NASA Astrophysics Data System (ADS)
Gavin, Brendan; Polizzi, Eric
2013-03-01
The self-consistent iterative procedure in Density Functional Theory calculations is revisited using a new, highly efficient and robust algorithm for solving the non-linear eigenvector problem (i.e., H(X)X = EX) of the Kohn-Sham equations. This new scheme is derived from a generalization of the FEAST eigenvalue algorithm, and provides a fundamental and practical numerical solution for addressing the non-linearity of the Hamiltonian with the occupied eigenvectors. In contrast to SCF techniques, the traditional outer iterations are replaced by subspace iterations that are intrinsic to the FEAST algorithm, while the non-linearity is handled at the level of a projected reduced system which is orders of magnitude smaller than the original one. Using a series of numerical examples, it will be shown that our approach can outperform the traditional SCF mixing techniques such as Pulay-DIIS by providing a high convergence rate and by converging to the correct solution regardless of the choice of the initial guess. We also discuss a practical implementation of the technique that can be achieved effectively using the FEAST solver package. This research is supported by NSF under Grant #ECCS-0846457 and Intel Corporation.
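The traditional SCF mixing being compared against is a damped fixed-point iteration: x_{k+1} = (1 - α)x_k + αF(x_k). The sketch below shows that pattern on a toy scalar self-consistency problem (not a Kohn-Sham system); the update function and mixing parameter are hypothetical stand-ins for the density update and linear-mixing factor.

```python
def scf(update, x0, alpha=0.5, tol=1e-10, max_iter=200):
    """Self-consistent loop with linear mixing:
    x_{k+1} = (1 - alpha) * x_k + alpha * F(x_k)."""
    x = x0
    for k in range(max_iter):
        fx = update(x)
        if abs(fx - x) < tol:       # self-consistency reached
            return x, k
        x = (1 - alpha) * x + alpha * fx
    return x, max_iter

# Toy self-consistency condition x = 1 / (1 + x);
# fixed point is (sqrt(5) - 1) / 2 ≈ 0.618034.
x, iters = scf(lambda x: 1.0 / (1.0 + x), x0=1.0)
print(round(x, 6))  # 0.618034
```

Pulay-DIIS replaces the single damped step with an extrapolation over several previous iterates; the FEAST-based scheme in the paper removes this outer loop entirely.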
Algorithms for Efficient Computation of Transfer Functions for Large Order Flexible Systems
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Giesy, Daniel P.
1998-01-01
An efficient and robust computational scheme is given for the calculation of the frequency response function of a large order, flexible system implemented with a linear, time invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new computational scheme is a linear function of system size, a significant improvement over traditional, full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits the practical frequency domain analysis of systems of much larger order than by traditional, full-matrix techniques. Formulations are given for both open- and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, the present method was up to two orders of magnitude faster than a traditional method. The present method generally showed good to excellent accuracy throughout the range of test frequencies, while traditional methods gave adequate accuracy for lower frequencies, but generally deteriorated in performance at higher frequencies with worst case errors being many orders of magnitude times the correct values.
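The linear-in-size cost comes from working in normal mode coordinates, where the frequency response is a sum of decoupled single-mode terms rather than a full matrix inverse per frequency. A minimal sketch of that modal superposition, with two hypothetical modes and an assumed uniform damping ratio (the paper's actual formulation, including the closed-loop case, is more general):

```python
def frf(modes, omega, zeta=0.01):
    """Receptance H_jk(omega) by modal superposition; cost is linear in the
    number of retained modes. `modes` holds (omega_i, phi_j, phi_k) per mode."""
    h = 0j
    for w_i, phi_j, phi_k in modes:
        h += (phi_j * phi_k) / (w_i ** 2 - omega ** 2 + 2j * zeta * w_i * omega)
    return h

# Two hypothetical normal modes at 10 and 25 rad/s.
modes = [(10.0, 1.0, 0.8), (25.0, 0.6, -0.5)]
mags = [abs(frf(modes, w)) for w in (5.0, 10.0, 25.0)]
print(mags)  # magnitude peaks at the two resonant frequencies
```

A full-matrix approach would instead factor (K - omega^2 M + j omega C) at every frequency point, which is where the quadratic-to-cubic cost arises.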
Towards Semantic e-Science for Traditional Chinese Medicine
Chen, Huajun; Mao, Yuxin; Zheng, Xiaoqing; Cui, Meng; Feng, Yi; Deng, Shuiguang; Yin, Aining; Zhou, Chunying; Tang, Jinming; Jiang, Xiaohong; Wu, Zhaohui
2007-01-01
Background Recent advances in Web and information technologies, together with the increasing decentralization of organizational structures, have resulted in massive amounts of information resources and domain-specific services in Traditional Chinese Medicine. The massive volume and diversity of available information and services have made it difficult to achieve seamless and interoperable e-Science for knowledge-intensive disciplines like TCM. Therefore, information integration and service coordination are two major challenges in e-Science for TCM. We still lack sophisticated approaches to integrate scientific data and services for TCM e-Science. Results We present a comprehensive approach to build dynamic and extendable e-Science applications for knowledge-intensive disciplines like TCM based on semantic and knowledge-based techniques. The semantic e-Science infrastructure for TCM supports large-scale database integration and service coordination in a virtual organization. We use domain ontologies to integrate TCM database resources and services in a semantic cyberspace and deliver a semantically superior experience, including browsing, searching, querying and knowledge discovery, to users. We have developed a collection of semantic-based toolkits to facilitate information sharing and collaborative research among TCM scientists and researchers. Conclusion Semantic and knowledge-based techniques are suitable for knowledge-intensive disciplines like TCM. It is possible to build an on-demand e-Science system for TCM based on existing semantic and knowledge-based techniques. The presented approach integrates heterogeneous distributed TCM databases and services, and provides scientists with a semantically superior experience to support collaborative research in the TCM discipline. PMID:17493289
NASA Astrophysics Data System (ADS)
Demigha, Souâd.
2016-03-01
The paper presents a Case-Based Reasoning tool for breast cancer knowledge management to improve breast cancer screening. To develop this tool, we combine concepts and techniques of Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnoses in their expertise (past experience) based on clinical cases. Case-Based Reasoning is the process of solving new problems based on the solutions of similar past problems, structured as cases, and is well suited to medical use. On the other hand, existing traditional Hospital Information Systems (HIS), Radiological Information Systems (RIS) and Picture Archiving and Communication Systems (PACS) do not support efficient management of medical information because of its complexity and heterogeneity. Data Mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with Data Mining techniques will facilitate the diagnosis and decision-making of medical experts.
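The "retrieve" step at the heart of CBR can be sketched as weighted nearest-neighbour matching over case features. The features, weights and case base below are invented toys for illustration, not clinical rules or the paper's actual similarity measure.

```python
def similarity(case_a, case_b, weights):
    """Weighted feature agreement between two cases (1.0 = identical)."""
    score = sum(w * (1.0 if case_a[f] == case_b[f] else 0.0)
                for f, w in weights.items())
    return score / sum(weights.values())

def retrieve(base, query, weights, k=1):
    """The 'retrieve' step of CBR: the k most similar past cases."""
    return sorted(base, key=lambda c: similarity(c, query, weights), reverse=True)[:k]

# Hypothetical screening features; "mass_shape" weighted highest.
weights = {"mass_shape": 2.0, "density": 1.0, "age_band": 1.0}
base = [
    {"mass_shape": "spiculated", "density": "high", "age_band": "50s", "diagnosis": "malignant"},
    {"mass_shape": "round", "density": "low", "age_band": "40s", "diagnosis": "benign"},
    {"mass_shape": "round", "density": "high", "age_band": "50s", "diagnosis": "benign"},
]
query = {"mass_shape": "spiculated", "density": "high", "age_band": "40s"}
best = retrieve(base, query, weights)[0]
print(best["diagnosis"])  # malignant
```

In the proposed tool, data mining would supply the feature weights and case structures that this retrieval step consumes.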
A Comparison of Collaborative and Traditional Instruction in Higher Education
ERIC Educational Resources Information Center
Gubera, Chip; Aruguete, Mara S.
2013-01-01
Although collaborative instructional techniques have become popular in college courses, it is unclear whether collaborative techniques can replace more traditional instructional methods. We examined the efficacy of collaborative courses (in-class, collaborative activities with no lectures) compared to traditional lecture courses (in-class,…
Improving Iranian High School Students' Reading Comprehension Using the Tenets of Genre Analysis
ERIC Educational Resources Information Center
Adelnia, Rezvan; Salehi, Hadi
2016-01-01
This study is an attempt to investigate the impact of a technique, namely the genre-based approach, on improving the reading ability and achievement of Iranian EFL learners. Therefore, an attempt was made to compare the genre-based approach to teaching reading with traditional approaches. For this purpose, by administering the Oxford Quick Placement…
Student-Led Engagement of Journal Article Authors in the Classroom Using Web-Based Videoconferencing
ERIC Educational Resources Information Center
Stockman, Brian J.
2015-01-01
The learning environment described here uses Web-based videoconferencing technology to merge the traditional classroom journal article discussion with student-led interviews of journal article authors. Papers that describe recent applications of a given technique are selected, with the author engagement occurring at the end of a three or four week…
ERIC Educational Resources Information Center
Long, Donna R.
1985-01-01
Describes the implementation of a first-year comprehension-based Spanish language program at New Mexico State University. Includes a discussion of the history of the program and of the problems encountered in changing a traditional curriculum. Also describes the materials, classroom practice, and testing and evaluation techniques used in the…
ERIC Educational Resources Information Center
Brown, Nicola
2017-01-01
While teaching methods tend to be updated frequently, the implementation of new innovative assessment tools is much slower. For example project based learning has become popular as a teaching technique, however, the assessment tends to be via traditional reports. This paper reports on the implementation and evaluation of using website development…
Dave, Vivek S; Shahin, Hend I; Youngren-Ortiz, Susanne R; Chougule, Mahavir B; Haware, Rahul V
2017-10-30
The density, porosity, breaking force, viscoelastic properties, and the presence or absence of any structural defects or irregularities are important physical-mechanical quality attributes of popular solid dosage forms like tablets. The irregularities associated with these attributes may influence the drug product functionality. Thus, an accurate and efficient characterization of these properties is critical for the successful development and manufacturing of robust tablets. These properties are mainly analyzed and monitored with traditional pharmacopeial and non-pharmacopeial methods. Such methods are associated with several challenges, such as a lack of spatial resolution, efficiency, or sample-sparing attributes. Recent advances in technology, design, instrumentation, and software have led to the emergence of newer techniques for non-invasive characterization of physical-mechanical properties of tablets. These techniques include near infrared spectroscopy, Raman spectroscopy, X-ray microtomography, nuclear magnetic resonance (NMR) imaging, terahertz pulsed imaging, laser-induced breakdown spectroscopy, and various acoustic- and thermal-based techniques. Such state-of-the-art techniques are currently applied at various stages of development and manufacturing of tablets at industrial scale. Each technique has specific advantages or challenges with respect to operational efficiency and cost, compared to traditional analytical methods. Currently, most of these techniques are used as secondary analytical tools to support the traditional methods in characterizing or monitoring tablet quality attributes. Therefore, further development in the instrumentation and software, and studies on the applications, are necessary for their adoption in routine analysis and monitoring of tablet physical-mechanical properties. Copyright © 2017 Elsevier B.V. All rights reserved.
Cessford, Tara; Meneilly, Graydon S; Arishenkoff, Shane; Eddy, Christopher; Chen, Luke Y C; Kim, Daniel J; Ma, Irene W Y
2017-12-08
To determine whether sonographic versions of physical examination techniques can accurately identify splenomegaly, Castell's method (Ann Intern Med 1967; 67:1265-1267), the sonographic Castell's method, spleen tip palpation, and the sonographic spleen tip technique were compared with reference measurements. Two clinicians trained in bedside sonography examined patients recruited from an urban hematology clinic. Each patient was examined for splenomegaly using conventional percussion and palpation techniques (Castell's method and spleen tip palpation, respectively), as well as the sonographic versions of these maneuvers (sonographic Castell's method and sonographic spleen tip technique). Results were compared with a reference standard based on professional sonographer measurements. The sonographic Castell's method had greater sensitivity (91.7% [95% confidence interval, 61.5% to 99.8%]) than the traditional Castell's method (83.3% [95% confidence interval, 51.6% to 97.9%]) but took longer to perform (mean ± SD, 28.8 ± 18.6 versus 18.8 ± 8.1 seconds; P = .01). Palpable and positive sonographic spleen tip results were both 100% specific, but the sonographic spleen tip method was more sensitive (58.3% [95% confidence interval, 27.7% to 84.8%] versus 33.3% [95% confidence interval, 9.9% to 65.1%]). Sonographic versions of traditional physical examination maneuvers have greater diagnostic accuracy than the physical examination maneuvers from which they are derived but may take longer to perform. We recommend a combination of traditional physical examination and sonographic techniques when evaluating for splenomegaly at the bedside. © 2017 by the American Institute of Ultrasound in Medicine.
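The headline figures in such accuracy studies are straightforward to reproduce: 91.7% sensitivity corresponds to 11 of 12 cases detected, and a confidence interval for that proportion follows from the counts. The sketch below computes the point estimate and a Wilson score interval; the abstract's own intervals were presumably exact binomial, so the Wilson bounds shown here are one common alternative, not a reproduction of the published numbers.

```python
import math

def sensitivity(tp, fn):
    """True-positive rate from a 2x2 diagnostic table's disease column."""
    return tp / (tp + fn)

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a proportion; behaves better than the
    plain Wald interval when n is small, as it is here (n = 12)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# 11 of 12 splenomegaly cases detected by the sonographic Castell's method.
sens = sensitivity(tp=11, fn=1)
lo, hi = wilson_ci(11, 12)
print(round(100 * sens, 1))                      # 91.7
print(round(100 * lo, 1), round(100 * hi, 1))    # wide interval: only 12 cases
```

The width of the interval is the point: with twelve cases, a "more sensitive" method can still have heavily overlapping uncertainty with its comparator.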
Space station advanced automation
NASA Technical Reports Server (NTRS)
Woods, Donald
1990-01-01
In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. To use AA technology for the augmentation of system management functions requires a development model consisting of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way in which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.
Travis, Fred; Shear, Jonathan
2010-12-01
This paper proposes a third meditation-category--automatic self-transcending--to extend the dichotomy of focused attention and open monitoring proposed by Lutz. Automatic self-transcending includes techniques designed to transcend their own activity. This contrasts with focused attention, which keeps attention focused on an object; and open monitoring, which keeps attention involved in the monitoring process. Each category was assigned EEG bands, based on reported brain patterns during mental tasks, and meditations were categorized based on their reported EEG. Focused attention, characterized by beta/gamma activity, included meditations from Tibetan Buddhist, Buddhist, and Chinese traditions. Open monitoring, characterized by theta activity, included meditations from Buddhist, Chinese, and Vedic traditions. Automatic self-transcending, characterized by alpha1 activity, included meditations from Vedic and Chinese traditions. Between categories, the included meditations differed in focus, subject/object relation, and procedures. These findings shed light on the common mistake of averaging meditations together to determine mechanisms or clinical effects. Copyright © 2010 Elsevier Inc. All rights reserved.
Research on BIM-based building information value chain reengineering
NASA Astrophysics Data System (ADS)
Hui, Zhao; Weishuang, Xie
2017-04-01
The achievement of value and value-added factors in building engineering information is accomplished through a chain flow, that is, the building information value chain. Based on a deconstruction of the information chain for construction information in the traditional information mode, this paper clarifies the value characteristics and requirements of each stage of a construction project. To achieve building information value-add, the paper deconstructs the traditional building information value chain, reengineers the information value chain model on the basis of BIM theory and techniques, builds a value-added management model, and analyses the value of that model.
NASA Astrophysics Data System (ADS)
Tinney, Charles Evan
2007-12-01
By using the book "Physics for Scientists and Engineers" by Raymond A. Serway as a guide, CD problem sets for teaching a calculus-based physics course were developed, programmed, and evaluated for homework assignments during the 2003-2004 academic year at Utah State University. These CD sets were used to replace the traditionally handwritten and submitted homework sets. They included a research-based format that guided the students through problem-solving techniques using response-activated helps and suggestions. The CD contents were designed to help the student improve his/her physics problem-solving skills. The analyzed score results showed a direct correlation between the scores obtained on the homework and the students' time spent per problem, as well as the number of helps used per problem.
Injection and adhesion palatoplasty: a preliminary study in a canine model.
Martínez-Álvarez, Concepción; González-Meli, Beatriz; Berenguer-Froehner, Beatriz; Paradas-Lara, Irene; López-Gordillo, Yamila; Rodríguez-Bobada, Cruz; González, Pablo; Chamorro, Manuel; Arias, Pablo; Hilborn, Jöns; Casado-Gómez, Inmaculada; Martínez-Sanz, Elena
2013-08-01
Raising mucoperiosteal flaps in traditional palatoplasty impairs mid-facial growth. Hyaluronic acid-based hydrogels have been successfully tested for minimally invasive craniofacial bone generation in vivo as carriers of bone morphogenetic protein-2 (BMP-2). We aimed to develop a novel flapless technique for cleft palate repair by injecting a BMP-2 containing hydrogel. Dog pups with congenital cleft palate were either non-treated (n=4) or treated with two-flap palatoplasty (n=6) or with the proposed injection/adhesion technique (n=5). The experimental approach was to inject a hyaluronic acid-based hydrogel containing hydroxyapatite and BMP-2 subperiosteally at the cleft palate margins of pups aged six weeks. At week ten, a thin strip of the medial edge mucosa was removed and the margins were closed directly. Occlusal photographs and computed tomography (CT) scans were obtained up to week 20. Four weeks after the gel injection the cleft palate margins had reached the midline and engineered bone had enlarged the palatal bones. Removal of the medial edge mucosa and suturing allowed complete closure of the cleft. Compared to traditional palatoplasty, the injection/adhesion technique was easier, and the post-surgical recovery was faster. CT on week 20 revealed some overlapping or "bending" of palatal shelves in the two-flap repair group, which was not observed in either the experimental or the control groups. A minimally invasive technique for cleft palate repair based on injectable scaffolds in a dog model of congenital cleft palate is feasible. Results suggest better growth of palatal bones. This represents an attractive clinical alternative to traditional palatoplasty for cleft palate patients. Copyright © 2013 Elsevier Inc. All rights reserved.
Nondestructive detection of infested chestnuts based on NIR spectroscopy
USDA-ARS?s Scientific Manuscript database
Insect feeding is a significant postharvest problem for processors of Chestnuts (Castanea sativa, Miller). In most cases, damage from insects is 'hidden', i.e. not visually detectable on the fruit surface. Consequently, traditional sorting techniques, including manual sorting, are generally inadequa...
Shokrollahi, K; Cooper, M A; Hiew, L Y
2009-06-01
Pinnaplasty using the Mustardé and Furnas techniques is increasingly popular. One adjunctive technique that is often used involves the elevation of a fascial flap beneath which sutures are placed for additional cover, potentially reducing the risk of complications and possibly recurrence. This flap is traditionally raised with a proximal base, but it can be raised distally with a number of advantages as we illustrate. More significantly, we demonstrate the potential for an entirely new operation to correct prominent ears with benefits including a natural end result, simplicity, reduced operative time, less tissue trauma and the use of a single buried knot. We illustrate the use of this procedure adjunctively to reinforce suture-based techniques in a series of 15 patients.
Lightweight and Statistical Techniques for Petascale Debugging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems were purchased.
We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.
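The equivalence-class idea described above can be sketched in a few lines: group tasks by a generic classification parameter, here the call stack, and select one representative rank per class as a debug target. This is an illustrative sketch of the concept only, not STAT's actual implementation; the traces and rank numbers are hypothetical.

```python
from collections import defaultdict

def equivalence_classes(task_traces):
    """Group MPI task ranks by identical call stacks (tuples of frames)."""
    classes = defaultdict(list)
    for rank, trace in task_traces.items():
        classes[tuple(trace)].append(rank)
    return classes

def debug_targets(task_traces):
    """Pick one representative rank per behavioral class, shrinking the
    debug set from the whole job to a handful of tasks."""
    return sorted(ranks[0] for ranks in equivalence_classes(task_traces).values())

# Hypothetical job: 100 ranks blocked in MPI_Waitall, one rank elsewhere.
traces = {r: ["main", "solve", "MPI_Waitall"] for r in range(100)}
traces[37] = ["main", "solve", "compute_halo"]  # the outlier
print(debug_targets(traces))  # one target per distinct behavior
```

Attaching a traditional debugger only to the returned ranks covers every distinct behavior in the job, which is the reduction this project's tools automate.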
Seaweed cultivation: Traditional way and its reformation
NASA Astrophysics Data System (ADS)
Fei, Xiu-Geng; Bao, Ying; Lu, Shan
1999-09-01
Seaweed cultivation, or phycoculture, has developed rather fast in recent years. The total production of cultivated seaweed at present is about 6250×10³ tons fresh weight. The total cultivation area is estimated at 200×10³ hectares. The annual total value of cultivated seaweeds has been estimated at more than 3 billion US dollars. Phycoculture provides many job opportunities for people in coastal regions and has the potential to improve marine environments and thus even influence global change. All traditional cultivation methods and techniques are based on, or start from, the individual plant or the cultivated seaweed population. Modern biological science and biotechnology achievements have benefited agriculture greatly, but traditional seaweed cultivation has not changed much since its inception, because the practice has been conservative for a long period and has accumulated many problems requiring solution. Four main problems may be the most universal ones holding back further development of the industry. New ways of seaweed cultivation must be developed, new techniques perfected, and new problems solved. This paper mainly discusses the main problems of traditional seaweed cultivation at present and its possible further development and reformation in the future.
X-ray spatial frequency heterodyne imaging of protein-based nanobubble contrast agents
Rand, Danielle; Uchida, Masaki; Douglas, Trevor; Rose-Petruck, Christoph
2014-01-01
Spatial Frequency Heterodyne Imaging (SFHI) is a novel x-ray scatter imaging technique that utilizes nanoparticle contrast agents. The enhanced sensitivity of this new technique relative to traditional absorption-based x-ray radiography makes it promising for applications in biomedical and materials imaging. Although previous studies on SFHI have utilized only metal nanoparticle contrast agents, we show that nanomaterials with a much lower electron density are also suitable. We prepared protein-based “nanobubble” contrast agents that are comprised of protein cage architectures filled with gas. Results show that these nanobubbles provide contrast in SFHI comparable to that of gold nanoparticles of similar size. PMID:25321797
Schulz, Alexandra; Daali, Samira; Javed, Mehreen; Fuchs, Paul Christian; Brockmann, Michael; Igressa, Alhadi; Charalampaki, Patra
2016-12-01
At present, no ideal diagnostic tools exist in the market to excise cancer tissue with the required safety margins and to achieve optimal aesthetic results using tissue-conserving techniques. In this prospective study, confocal laser endomicroscopy (CLE) and the traditional gold standard of magnifying glasses (MG) were compared regarding the boundaries of in vivo basal cell carcinoma and squamous cell carcinoma. Tumour diameters defined by both methods were measured and compared with those determined by histopathological examination. Nineteen patients were included in the study. The CLE technique was found to be superior to excisional margins based on MG only. Re-excision was required in 68% of the cases following excision based on MG evaluation, but only in 27% of the cases for whom excision margins were based on CLE. Our results are promising regarding the distinction between tumour and healthy surrounding tissue, and indicate that presurgical mapping of basal cell carcinoma and squamous cell carcinoma is possible. The tool itself should be developed further with special attention to early detection of skin cancer.
Acoustics based assessment of respiratory diseases using GMM classification.
Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J
2010-01-01
The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech processing domain were utilized to evaluate lung sounds obtained with a digital stethoscope. Traditional methods for the evaluation of asthma involve auscultation and spirometry, but utilization of the more sensitive electronic stethoscopes currently available, combined with quantitative signal analysis methods, offers opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in broader analysis, identification, and diagnosis of asthma based on frequency domain analysis of wheezing and crackles.
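A minimal sketch of such a GMM classification scheme: fit one Gaussian mixture per sound class to frame-level acoustic features and label a recording by whichever class model gives the highest total log-likelihood. The diagonal-covariance EM implementation, synthetic features, and class names below are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def fit_gmm(X, k, iters=50, seed=0):
    """Fit a diagonal-covariance Gaussian mixture with k components via EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, k, replace=False)]        # init means from data
    var = np.tile(X.var(axis=0) + 1e-6, (k, 1))       # init per-dim variances
    w = np.full(k, 1.0 / k)                           # init mixture weights
    for _ in range(iters):
        # E-step: responsibilities from component log-densities
        logp = (-0.5 * (((X[:, None, :] - means) ** 2) / var
                        + np.log(2 * np.pi * var)).sum(-1) + np.log(w))
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances from responsibilities
        nk = r.sum(axis=0) + 1e-12
        w = nk / n
        means = (r.T @ X) / nk[:, None]
        var = np.maximum((r.T @ (X ** 2)) / nk[:, None] - means ** 2, 1e-6)
    return w, means, var

def loglik(X, params):
    """Per-frame log-likelihood under a fitted mixture (log-sum-exp)."""
    w, means, var = params
    logp = (-0.5 * (((X[:, None, :] - means) ** 2) / var
                    + np.log(2 * np.pi * var)).sum(-1) + np.log(w))
    m = logp.max(axis=1, keepdims=True)
    return m[:, 0] + np.log(np.exp(logp - m).sum(axis=1))

def classify(frames, models):
    """Pick the class whose GMM gives the highest total log-likelihood."""
    return max(models, key=lambda label: loglik(frames, models[label]).sum())

# Hypothetical 2-D acoustic features for two sound classes.
rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, (200, 2))
wheeze = rng.normal(5.0, 1.0, (200, 2))
models = {"normal": fit_gmm(normal, 2), "wheeze": fit_gmm(wheeze, 2)}
print(classify(wheeze[:50], models))
```

In practice the frames would be spectral or cepstral features computed from digital-stethoscope recordings, with one GMM trained per diagnostic class.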
Temporomandibular joint arthroscopy technique using a single working cannula.
Srouji, S; Oren, D; Zoabi, A; Ronen, O; Zraik, H
2016-11-01
The traditional arthroscopy technique includes the creation of three ports in order to enable visualization, operation, and arthrocentesis. The aim of this study was to assess an advanced temporomandibular joint (TMJ) arthroscopy technique that requires only a single cannula, through which a one-piece instrument containing a visualization canal, irrigation canal, and a working canal is inserted, as an alternative to the traditional double-puncture technique. This retrospective study assessed eight patients (13 TMJs) with pain and/or limited range of movement that was refractory to conservative therapy, who were treated between June 2015 and December 2015. The temporomandibular joint disorder (TMD) was diagnosed by physical examination and mouth opening measurements. The duration of surgery was recorded and compared to that documented for traditional arthroscopies performed by the same surgeon. Operative single-cannula arthroscopy (OSCA) was performed using a holmium YAG (Ho:YAG) 230μm fibre laser for ablation. The OSCA technique proved effective in improving mouth opening in all patients (mean increase 9.12±1.96mm) and in reducing pain (mean visual analogue scale decrease of 3.25±1.28). The operation time was approximately half that of the traditional technique. The OSCA technique is as efficient as the traditional technique, is simple to learn, and is simpler to execute. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Not another boring lecture: engaging learners with active learning techniques.
Wolff, Margaret; Wagner, Mary Jo; Poznanski, Stacey; Schiller, Jocelyn; Santen, Sally
2015-01-01
Core content in Emergency Medicine Residency Programs is traditionally covered in didactic sessions, despite evidence suggesting that learners do not retain a significant portion of what is taught during lectures. We describe techniques that medical educators can use when leading teaching sessions to foster engagement and encourage self-directed learning, based on current literature and evidence about learning. When these techniques are incorporated, sessions can be effective in delivering core knowledge, contextualizing content, and explaining difficult concepts, leading to increased learning. Copyright © 2015 Elsevier Inc. All rights reserved.
Development of a Transportable Gravity Gradiometer Based on Atom Interferometry
NASA Astrophysics Data System (ADS)
Yu, N.; Kohel, J. M.; Aveline, D. C.; Kellogg, J. R.; Thompson, R. J.; Maleki, L.
2007-12-01
JPL is developing a transportable gravity gradiometer based on light-pulse atom interferometers for NASA's Earth Science Technology Office's Instrument Incubator Program. The inertial sensors in this instrument employ a quantum interference measurement technique, analogous to the precise phase measurements in atomic clocks, which offers increased sensitivity and improved long-term stability over traditional mechanical devices. We report on the implementation of this technique in JPL's gravity gradiometer, and on the current performance of the mobile instrument. We also discuss the prospects for satellite-based gravity field mapping, including high-resolution monitoring of time-varying fields from a single satellite platform and multi-component measurements of the gravitational gradient tensor, using atom interferometer-based instruments.
Low-grade fibrosarcoma of the anterior skull base: endoscopic resection and repair.
Kuhn, Frederick A; Javer, Amin R
2003-01-01
Fibrosarcomas of the paranasal sinuses and skull base are uncommon tumors. Traditionally, "open approach" surgery has remained the treatment of choice for these tumors. A 49-year-old man underwent resection of a right anterior skull base fibrosarcoma using the endoscopic approach. Close follow-up using both endoscopic and imaging methods over a period of four years has revealed a well-healed skull base with no evidence of recurrence. Significant resistance exists at present to using such a technique for malignant diseases of the head and neck, but results from advanced centers continue to show that this may be a technique worth mastering and improving on.
The contemporary mindfulness movement and the question of nonself1.
Samuel, Geoffrey
2015-08-01
Mindfulness-based stress reduction (MBSR), mindfulness-based cognitive therapy (MBCT), and other "mindfulness"-based techniques have rapidly gained a significant presence within contemporary society. Clearly these techniques, which derive or are claimed to derive from Buddhist meditational practices, meet genuine human needs. However, questions are increasingly raised regarding what these techniques meant in their original context(s), how they have been transformed in relation to their new Western and global field of activity, what might have been lost (or gained) on the way, and how the entire contemporary mindfulness phenomenon might be understood. The article points out that first-generation mindfulness practices, such as MBSR and MBCT, derive from modernist versions of Buddhism, and omit or minimize key aspects of the Buddhist tradition, including the central Buddhist philosophical emphasis on the deconstruction of the self. Nonself (or no self) fits poorly into the contemporary therapeutic context, but is at the core of the Buddhist enterprise from which contemporary "mindfulness" has been abstracted. Instead of focussing narrowly on the practical efficacy of the first generation of mindfulness techniques, we might see them as an invitation to explore the much wider range of practices available in the traditions from which they originate. Rather, too, than simplifying and reducing these practices to fit current Western conceptions of knowledge, we might seek to incorporate more of their philosophical basis into our Western adaptations. This might lead to a genuine and productive expansion of both scientific knowledge and therapeutic possibilities. © The Author(s) 2014.
Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation
NASA Astrophysics Data System (ADS)
Sleesongsom, S.; Bureerat, S.
2018-03-01
This paper proposes an extension of a new concept for path generation from our previous work by adding a new constraint handling technique. The proposed technique was initially designed for problems without prescribed timing, avoiding the timing constraint, while the remaining constraints are handled with a new constraint handling technique, a kind of penalty technique. In a comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning based optimization (SAP-TLBO) and the original TLBO. Two traditional path generation test problems are used to test the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original TLBO.
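The abstract identifies the constraint handling scheme only as a kind of penalty technique. A generic static-penalty formulation, shown here as an assumed stand-in rather than the paper's exact scheme, converts a constrained objective into an unconstrained one that TLBO-style population optimizers can minimize directly.

```python
def penalized(objective, constraints, weight=1e3):
    """Static penalty: add weight * sum of squared violations of g_i(x) <= 0.

    `objective` maps a design vector to a cost (e.g., tracking error of a
    four-bar coupler path); `constraints` is a list of functions g_i with
    g_i(x) <= 0 when feasible. Generic illustration, not the paper's scheme.
    """
    def f(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + weight * violation
    return f

# Toy example: minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0.
f = penalized(lambda x: x ** 2, [lambda x: 1.0 - x])
print(f(1.0), f(2.0), f(0.0))  # feasible points keep their cost; x=0 is penalized
```

A population-based optimizer then simply ranks candidates by the penalized value, so infeasible designs survive only if their violations are small.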
Rapid Monitoring of Bacteria and Fungi aboard the International Space Station (ISS)
NASA Technical Reports Server (NTRS)
Gunter, D.; Flores, G.; Effinger, M.; Maule, J.; Wainwright, N.; Steele, A.; Damon, M.; Wells, M.; Williams, S.; Morris, H.;
2009-01-01
Microorganisms within spacecraft have traditionally been monitored with culture-based techniques. These techniques involve growth of environmental samples (cabin water, air, or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies or return of samples to Earth for ground-based analysis. Data obtained over the past four decades have enhanced our understanding of the microbial ecology within space stations. However, the approach has been limited by the following factors: i) many microorganisms (estimated > 95%) in the environment cannot grow on conventional growth media; ii) significant time lags (3-5 days for incubation and up to several months to return samples to ground); iii) condensation in contact slides hinders colony counting by crew; and iv) growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and beta-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. The technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device, known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). LOCAD-PTS was launched to the ISS in December 2006, and here we present data obtained from March 2007 until the present day. These data include a comparative study between LOCAD-PTS analysis and existing culture-based methods, and an exploratory survey of surface endotoxin and beta-1,3-glucan throughout the ISS. While a general correlation between LOCAD-PTS and traditional culture-based methods should not be expected, we will suggest new requirements for microbial monitoring based upon culture-independent parameters measured by LOCAD-PTS.
Fortuna, A O; Gurd, J R
1999-01-01
During certain medical procedures, it is important to continuously measure the respiratory flow of a patient, as lack of proper ventilation can cause brain damage and ultimately death. The monitoring of the ventilatory condition of a patient is usually performed with the aid of flowmeters. However, water and other secretions present in the expired air can build up and ultimately block a traditional, restriction-based flowmeter; by using an orifice plate flowmeter, such blockages are minimized. This paper describes the design of an orifice plate flowmetering system including, especially, a description of the numerical and computational techniques adopted in order to simulate human respiratory and sinusoidal air flow across various possible designs for the orifice plate flowmeter device. Parallel computation and multigrid techniques were employed in order to reduce execution time. The simulated orifice plate was later built and tested under unsteady sinusoidal flows. Experimental tests show reasonable agreement with the numerical simulation, thereby reinforcing the general hypothesis that computational exploration of the design space is sufficiently accurate to allow designers of such systems to use this in preference to the more traditional, mechanical prototyping techniques.
Matsukawa, Keitaro; Yato, Yoshiyuki; Kato, Takashi; Imabayashi, Hideaki; Asazuma, Takashi; Nemoto, Koichi
2014-02-15
The insertional torque of pedicle screws using the cortical bone trajectory (CBT) was measured in vivo, to investigate the effectiveness of the CBT technique. The CBT follows a mediolateral and caudocephalad directed path, engaging with cortical bone maximally from the pedicle to the vertebral body. Some biomechanical studies have demonstrated favorable characteristics of the CBT technique in the cadaveric lumbar spine. However, no in vivo study has been reported on the mechanical behavior of this new trajectory. The insertional torques of pedicle screws placed using the CBT and traditional techniques were measured intraoperatively in 48 consecutive patients. A total of 162 screws placed using the CBT technique and 36 screws placed using the traditional technique were compared. In 8 of the 48 patients, a side-by-side comparison of the 2 insertional techniques for each vertebra was performed; these patients formed the H group. In addition, the insertional torque was correlated with bone mineral density. The mean maximum insertional torques of CBT screws and traditional screws were 2.49 ± 0.99 Nm and 1.24 ± 0.54 Nm, respectively. The CBT screws showed 2.01 times higher torque, and the difference between the 2 techniques was significant (P < 0.01). In the H group, the insertional torques were 2.71 ± 1.36 Nm for the CBT screws and 1.58 ± 0.44 Nm for the traditional screws. The CBT screws demonstrated 1.71 times higher torque, and statistical significance was achieved (P < 0.01). Positive linear correlations between maximum insertional torque and bone mineral density were found for both techniques; the correlation coefficient of the traditional screws (r = 0.63, P < 0.01) was higher than that of the CBT screws (r = 0.59, P < 0.01). The insertional torque using the CBT technique is about 1.7 times higher than with the traditional technique.
Wilk, Brian L
2015-01-01
Over the course of the past two to three decades, intraoral digital impression systems have gained acceptance due to high accuracy and ease of use as they have been incorporated into the fabrication of dental implant restorations. The use of intraoral digital impressions enables the clinician to produce accurate restorations without the unpleasant aspects of traditional impression materials and techniques. This article discusses the various types of digital impression systems and their accuracy compared to traditional impression techniques. The cost, time, and patient satisfaction components of both techniques will also be reviewed.
Zhang, Qing-Wen; Li, Yong
2014-05-01
Accelerated soil erosion is considered as a major land degradation process resulting in increased sediment production and sediment-associated nutrient inputs to the rivers. Over the last decade, several soil conservation programs for erosion control have been conducted throughout Northeastern China. Reliable information on soil erosion rates is an essential prerequisite to assess the effectiveness of soil conservation measures. A study was carried out in Baiquan County of Northeastern China to assess the effectiveness of soil conservation measures in reducing soil erosion using the (137)Cs tracer technique and related techniques. This study reports the use of (137)Cs measurements to quantify medium-term soil erosion rates in traditional slope farmland, contour cropping farmland and terrace farmland in the Dingjiagou catchment and the Xingsheng catchment of Baiquan County. The (137)Cs reference inventory of 2532 ± 670 Bq m(-2) was determined. Based on the principle of the (137)Cs tracer technique, soil erosion rates were estimated. The results showed that severe erosion on traditional slope farmland is the dominant soil erosion process in the area. The terrace measure reduced soil erosion rates by 16% for the entire slope. Typical net soil erosion rates are estimated to be 28.97 Mg per hectare per year for traditional slope farmland and 25.04 Mg per hectare per year for terrace farmland in the Dingjiagou catchment. In contrast to traditional slope farmland with a soil erosion rate of 34.65 Mg per hectare per year, contour cultivation reduced the soil erosion rate by 53% resulting in a soil erosion rate of 22.58 Mg per hectare per year in the Xingsheng catchment. These results indicated that soil losses can be controlled by changing tillage practices from the traditional slope farmland cultivation to the terrace or contour cultivation.
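The abstract does not state which conversion model was used to turn a ¹³⁷Cs inventory deficit into an erosion rate; the proportional model for cultivated soils (Walling and Quine) is a common choice and is sketched below as an assumption. The bulk density, plough depth, and elapsed-time values are hypothetical parameters, not values from the study.

```python
def erosion_rate_proportional(A, A_ref, bulk_density=1300.0,
                              plough_depth=0.2, years=45):
    """Soil loss (t ha^-1 yr^-1) via the proportional model.

    A      : measured 137Cs inventory at the sampling point (Bq m^-2)
    A_ref  : local reference inventory, e.g. 2532 Bq m^-2 in this study
    Assumes fallout was mixed through the plough layer and lost in
    proportion to the soil removed. Parameter defaults are illustrative.
    """
    X = 100.0 * (A_ref - A) / A_ref                      # percent inventory loss
    # 10 converts kg m^-2 to t ha^-1; plough_depth * bulk_density is the
    # plough-layer mass depth (kg m^-2).
    return 10.0 * plough_depth * bulk_density * X / (100.0 * years)

# Hypothetical point with half the reference inventory remaining:
print(erosion_rate_proportional(1266.0, 2532.0))
```

An undisturbed point (A equal to A_ref) yields zero loss; larger deficits map linearly to higher erosion rates, which is why a reliable reference inventory is the critical input.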
Detection and discrimination of four aspergillus section nigri species by pcr
USDA-ARS?s Scientific Manuscript database
Species of Aspergillus section Nigri are not easily distinguished by traditional morphological techniques, and typically are identified by DNA sequencing methods. We developed four PCR primers to distinguish between A. niger, A. awamori, A. carbonarius and A. tubingensis, based on species-conserved...
AI tools in computer based problem solving
NASA Technical Reports Server (NTRS)
Beane, Arthur J.
1988-01-01
The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing software identical to that of large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples of both the development and deployment of applications involving a blending of AI and traditional techniques are given.
Zeinalizadeh, Mehdi; Sadrehosseini, Seyed Mousa; Habibi, Zohreh; Nejat, Farideh; Silva, Harley Brito da; Singh, Harminder
2017-03-01
OBJECTIVE Congenital transsphenoidal encephaloceles are rare malformations, and their surgical treatment remains challenging. This paper reports 3 cases of transsphenoidal encephalocele in 8- to 24-month-old infants, who presented mainly with airway obstruction, respiratory distress, and failure to thrive. METHODS The authors discuss the surgical management of these lesions via a minimally invasive endoscopic endonasal approach, as compared with the traditional transcranial and transpalatal approaches. A unique endonasal management algorithm for these lesions is outlined. The lesions were repaired with no resection of the encephalocele sac, and the cranial base defects were reconstructed with titanium mesh plates and vascular nasoseptal flaps. RESULTS Reduction of the encephalocele and reconstruction of the skull base was successfully accomplished in all 3 cases, with favorable results. CONCLUSIONS The described endonasal management algorithm for congenital transsphenoidal encephaloceles is a safe, viable alternative to traditional transcranial and transpalatal approaches, and avoids much of the morbidity associated with these open techniques.
Decomposition-Based Decision Making for Aerospace Vehicle Design
NASA Technical Reports Server (NTRS)
Borer, Nicholas K.; Mavris, Dimitri N.
2005-01-01
Most practical engineering systems design problems have multiple and conflicting objectives. Furthermore, the satisfactory attainment level for each objective (requirement) is likely uncertain early in the design process. Systems with long design cycle times will exhibit more of this uncertainty throughout the design process. This is further complicated if the system is expected to perform for a relatively long period of time, as now it will need to grow as new requirements are identified and new technologies are introduced. These points identify a need for a systems design technique that enables decision making amongst multiple objectives in the presence of uncertainty. Traditional design techniques deal with a single objective or a small number of objectives that are often aggregates of the overarching goals sought through the generation of a new system. Other requirements, although uncertain, are viewed as static constraints to this single or multiple objective optimization problem. With either of these formulations, enabling tradeoffs between the requirements, objectives, or combinations thereof is a slow, serial process that becomes increasingly complex as more criteria are added. This research proposal outlines a technique that attempts to address these and other idiosyncrasies associated with modern aerospace systems design. The proposed formulation first recasts systems design into a multiple criteria decision making problem. The now multiple objectives are decomposed to discover the critical characteristics of the objective space. Tradeoffs between the objectives are considered amongst these critical characteristics by comparison to a probabilistic ideal tradeoff solution. The proposed formulation represents a radical departure from traditional methods. A pitfall of this technique is in the validation of the solution: in a multi-objective sense, how can a decision maker justify a choice between non-dominated alternatives?
A series of examples help the reader to observe how this technique can be applied to aerospace systems design and compare the results of this so-called Decomposition-Based Decision Making to more traditional design approaches.
Effects-Based Operations in the Cyber Domain
2017-05-03
as the joint targeting methodology. The description that Batschelet gave of the traditional targeting methodology included a process of "Decide, Detect...technology, requires new planning and methodology to fight back. This paper evaluates current Department of Defense doctrine to look at ways to conduct...developing its cyber tactics, techniques, and procedures, which includes various targeting methodologies, such as the use of effects-based
Mining Student Data Captured from a Web-Based Tutoring Tool: Initial Exploration and Results
ERIC Educational Resources Information Center
Merceron, Agathe; Yacef, Kalina
2004-01-01
In this article we describe the initial investigations that we have conducted on student data collected from a web-based tutoring tool. We have used some data mining techniques such as association rule and symbolic data analysis, as well as traditional SQL queries to gain further insight on the students' learning and deduce information to improve…
Adaptivity in Game-Based Learning: A New Perspective on Story
NASA Astrophysics Data System (ADS)
Berger, Florian; Müller, Wolfgang
Game-based learning as a novel form of e-learning still faces fundamental open questions, the lack of a general model for adaptivity being one of them. Since adaptive techniques in traditional e-learning applications bear close similarity to certain interactive storytelling approaches, we propose a new notion of story as the joining element of arbitrary learning paths.
OpenMP Parallelization and Optimization of Graph-Based Machine Learning Algorithms
Meng, Zhaoyi; Koniges, Alice; He, Yun Helen; ...
2016-09-21
In this paper, we investigate the OpenMP parallelization and optimization of two novel data classification algorithms. The new algorithms are based on graph and PDE solution techniques and provide significant accuracy and performance advantages over traditional data classification algorithms in serial mode. The methods leverage the Nystrom extension to calculate eigenvalues/eigenvectors of the graph Laplacian, and this is a self-contained module that can be used in conjunction with other graph-Laplacian-based methods such as spectral clustering. We use performance tools to collect the hotspots and memory access patterns of the serial codes and use OpenMP as the parallelization language to parallelize the most time-consuming parts. Where possible, we also use library routines. We then optimize the OpenMP implementations and detail the performance on traditional supercomputer nodes (in our case a Cray XC30), and test the optimization steps on emerging testbed systems based on Intel's Knights Corner and Landing processors. We show both performance improvement and strong scaling behavior. Finally, a large number of optimization techniques and analyses are necessary before the algorithm reaches almost ideal scaling.
Microscope self-calibration based on micro laser line imaging and soft computing algorithms
NASA Astrophysics Data System (ADS)
Apolinar Muñoz Rodríguez, J.
2018-06-01
A technique to perform microscope self-calibration via micro laser line and soft computing algorithms is presented. In this technique, the microscope vision parameters are computed by means of soft computing algorithms based on laser line projection. To implement the self-calibration, a microscope vision system is constructed by means of a CCD camera and a 38 μm laser line. From this arrangement, the microscope vision parameters are represented via Bezier approximation networks, which are accomplished through the laser line position. In this procedure, a genetic algorithm determines the microscope vision parameters by means of laser line imaging. Also, the approximation networks compute the three-dimensional vision by means of the laser line position. Additionally, the soft computing algorithms re-calibrate the vision parameters when the microscope vision system is modified during the vision task. The proposed self-calibration improves accuracy of the traditional microscope calibration, which is accomplished via external references to the microscope system. The capability of the self-calibration based on soft computing algorithms is determined by means of the calibration accuracy and the micro-scale measurement error. This contribution is corroborated by an evaluation based on the accuracy of the traditional microscope calibration.
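The abstract above represents vision parameters via Bezier approximation, whose numerical primitive is Bezier curve evaluation. As a minimal sketch (the control points and mapping below are illustrative, not taken from the paper), de Casteljau's algorithm evaluates such a curve:

```python
# Minimal sketch: evaluating a 1-D Bezier curve with de Casteljau's algorithm,
# the primitive behind Bezier-based approximation of calibration parameters.
# Control points here are made-up illustrative values.

def de_casteljau(control_points, t):
    """Evaluate a 1-D Bezier curve at parameter t in [0, 1]."""
    pts = list(control_points)
    while len(pts) > 1:
        # Repeated linear interpolation between neighbours.
        pts = [(1 - t) * a + t * b for a, b in zip(pts, pts[1:])]
    return pts[0]

# A Bezier "network" would map an observed laser-line position u to a vision
# parameter by evaluating a fitted curve at u.
params = [0.0, 2.0, 2.0, 4.0]      # illustrative control points
print(de_casteljau(params, 0.0))   # -> 0.0 (curve starts at first control point)
print(de_casteljau(params, 1.0))   # -> 4.0 (curve ends at last control point)
```

The curve interpolates its endpoints, which makes the fitted mapping easy to anchor at known calibration positions.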
Hansen, J V; Nelson, R D
1997-01-01
Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern that then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.
Tensor-GMRES method for large sparse systems of nonlinear equations
NASA Technical Reports Server (NTRS)
Feng, Dan; Pulliam, Thomas H.
1994-01-01
This paper introduces a tensor-Krylov method, the tensor-GMRES method, for large sparse systems of nonlinear equations. This method is a coupling of tensor model formation and solution techniques for nonlinear equations with Krylov subspace projection techniques for unsymmetric systems of linear equations. Traditional tensor methods for nonlinear equations are based on a quadratic model of the nonlinear function, a standard linear model augmented by a simple second order term. These methods are shown to be significantly more efficient than standard methods both on nonsingular problems and on problems where the Jacobian matrix at the solution is singular. A major disadvantage of the traditional tensor methods is that the solution of the tensor model requires the factorization of the Jacobian matrix, which may not be suitable for problems where the Jacobian matrix is large and has a 'bad' sparsity structure for an efficient factorization. We overcome this difficulty by forming and solving the tensor model using an extension of a Newton-GMRES scheme. Like traditional tensor methods, we show that the new tensor method has significant computational advantages over the analogous Newton counterpart. Consistent with Krylov subspace based methods, the new tensor method does not depend on the factorization of the Jacobian matrix. As a matter of fact, the Jacobian matrix is never needed explicitly.
Ding, Weifu; Zhang, Jiangshe; Leung, Yee
2016-10-01
In this paper, we predict air pollutant concentration using a feedforward artificial neural network inspired by the mechanism of the human brain as a useful alternative to traditional statistical modeling techniques. The neural network is trained based on sparse response back-propagation, in which only a small number of neurons respond to the specified stimulus simultaneously, providing a high convergence rate for the trained network in addition to low energy consumption and greater generalization. Our method is evaluated on Hong Kong air monitoring station data and corresponding meteorological variables: five air quality parameters gathered at four monitoring stations in Hong Kong over 4 years (2012-2015). Our results show that our training method has advantages in prediction precision, effectiveness, and generalization over traditional linear regression algorithms and over a feedforward artificial neural network trained using traditional back-propagation.
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
Forest control and regulation ... a comparison of traditional methods and alternatives
LeRoy C. Hennes; Michael J. Irving; Daniel I. Navon
1971-01-01
Two traditional techniques of forest control and regulation (formulas and area-volume check) are compared to linear programming, as used in a new computerized planning system called Timber Resource Allocation Method (Timber RAM). Inventory data from a National Forest in California illustrate how each technique is used. The traditional methods are simpler to apply and...
NASA Technical Reports Server (NTRS)
Maule, J.; Wainwright, N.; Steele, A.; Gunter, D.; Flores, G.; Effinger, M.; Danibm, N.; Wells, M.; Williams, S.; Morris, H.;
2008-01-01
Microorganisms within the space stations Salyut, Mir, and the International Space Station (ISS) have traditionally been monitored with culture-based techniques. These techniques involve growing environmental samples (cabin water, air, or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies, and return of samples to Earth for ground-based analysis. This approach has provided a wealth of useful data and enhanced our understanding of the microbial ecology within space stations. However, the approach is also limited by the following: i) more than 95% of microorganisms in the environment cannot grow on conventional growth media; ii) significant time lags occur between onboard sampling and colony visualization (3-5 days) and ground-based analysis (as long as several months); iii) colonies are often difficult to visualize due to condensation within contact slide media plates; and iv) the techniques involve growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and β-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. This technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device. This handheld device and sampling system is known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). A poster will be presented that describes a comparative study between LOCAD-PTS analysis and existing culture-based methods onboard the ISS, together with an exploratory survey of surface endotoxin throughout the ISS.
It is concluded that while a general correlation between LOCAD-PTS and traditional culture-based methods should not necessarily be expected, a combinatorial approach can be adopted where both sets of data are used together to generate a more complete story of the microbial ecology on the ISS.
System identification through nonstationary data using Time-Frequency Blind Source Separation
NASA Astrophysics Data System (ADS)
Guo, Yanlin; Kareem, Ahsan
2016-06-01
Classical output-only system identification (SI) methods are based on the assumption of stationarity of the system response. However, the measured response of buildings and bridges is usually non-stationary due to strong winds (e.g., typhoons and thunderstorms), earthquakes, and time-varying vehicle motions. Accordingly, the response data may have time-varying frequency contents and/or overlapping of modal frequencies due to non-stationary colored excitation. This renders traditional methods problematic for modal separation and identification. To address these challenges, a new SI technique based on Time-Frequency Blind Source Separation (TFBSS) is proposed. By selectively utilizing "effective" information in local regions of the time-frequency plane, where only one mode contributes to energy, the proposed technique can successfully identify mode shapes and recover modal responses from the non-stationary response where traditional SI methods often encounter difficulties. This technique can also handle responses with closely spaced modes, a well-known challenge in the identification of large-scale structures. Based on the separated modal responses, frequency and damping can be easily identified using SI methods based on a single-degree-of-freedom (SDOF) system. In addition to the exclusive advantage of handling non-stationary data and closely spaced modes, the proposed technique also benefits from the absence of end effects and low sensitivity to noise in modal separation. The efficacy of the proposed technique is demonstrated in several simulation-based studies and compared to the popular Second-Order Blind Identification (SOBI) scheme. It is also noted that even some non-stationary response data can be analyzed by the stationary method SOBI. This paper also delineates non-stationary cases where SOBI and the proposed scheme perform comparably and highlights cases where the proposed approach is more advantageous.
Finally, the proposed method is evaluated using the full-scale non-stationary response of a tall building during an earthquake and is found to perform satisfactorily.
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
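A toy illustration of the metamodel idea surveyed above (not an example from the paper): a quadratic response surface y ≈ b0 + b1·x + b2·x² fitted by ordinary least squares to samples of an "expensive" analysis, here a made-up polynomial:

```python
# Fit a quadratic response-surface metamodel by solving the 3x3 normal
# equations with Gaussian elimination. Sample data are synthetic.

def fit_quadratic(xs, ys):
    def basis(x):
        return [1.0, x, x * x]
    # Normal equations: (X^T X) coef = X^T y for the basis [1, x, x^2].
    A = [[sum(basis(x)[i] * basis(x)[j] for x in xs) for j in range(3)]
         for i in range(3)]
    b = [sum(basis(x)[i] * y for x, y in zip(xs, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef

# Samples from y = 1 + 2x + 3x^2 are recovered exactly.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [1 + 2 * x + 3 * x * x for x in xs]
print([round(c, 6) for c in fit_quadratic(xs, ys)])  # -> [1.0, 2.0, 3.0]
```

Once fitted, the cheap polynomial stands in for the expensive code during optimization, which is the core trade the survey examines.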
Endoscopic versus traditional saphenous vein harvesting: a prospective, randomized trial.
Allen, K B; Griffith, G L; Heimansohn, D A; Robison, R J; Matheny, R G; Schier, J J; Fitzgerald, E B; Shaar, C J
1998-07-01
Saphenous vein harvested with a traditional longitudinal technique often results in leg wound complications. An alternative endoscopic harvest technique may decrease these complications. One hundred twelve patients scheduled for elective coronary artery bypass grafting were prospectively randomized to have vein harvested using either an endoscopic (group A, n = 54) or traditional technique (group B, n = 58). Groups A and B, respectively, were similar with regard to length of vein harvested (41 ± 8 cm versus 40 ± 14 cm), bypasses done (4.1 ± 1.1 versus 4.2 ± 1.4), age, preoperative risk stratification, and risks for wound complication (diabetes, sex, obesity, preoperative anemia, hypoalbuminemia, and peripheral vascular disease). Leg wound complications were significantly (p ≤ 0.02) reduced in group A (4% [2 of 51] versus 19% [11 of 58]). Univariate analysis identified traditional incision (p ≤ 0.02) and diabetes (p ≤ 0.05) as wound complication risk factors. Multiple logistic regression analysis identified only the traditional harvest technique as a risk factor for leg wound complications, with no significant interaction between harvest technique and any preoperative risk factor (p ≤ 0.03). Harvest rate (0.9 ± 0.4 cm/min versus 1.2 ± 0.5 cm/min) was slower for group A (p ≤ 0.02), and conversion from endoscopic to a traditional harvest occurred in 5.6% (3 of 54) of patients. In a prospective, randomized trial, saphenous vein harvested endoscopically was associated with fewer wound complications than the traditional longitudinal method.
Hou, Feixia; Wen, Longlian; Peng, Cheng; Guo, Jinlin
2018-01-01
Seahorses documented in the Chinese pharmacopeia possess important medicinal efficacy and are used as an ingredient in traditional Chinese medicines. Growing international trade threatens the species. DNA barcoding holds great potential for application in wildlife conservation and might prevent illegal trade of threatened species. The COI gene was used to identify seahorses, and nine Hippocampus species were found in three large traditional Chinese medicine markets in China. All inter-specific genetic variations were larger than 2%. Mean genetic distances between species were 17-fold larger than those within species. The phylogenetic tree showed that each species clustered in the appropriate branch. All results demonstrated that the COI-based barcoding technique can be used to identify seahorse species and can play a major role in monitoring the seahorse trade.
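The species-level distinction reported above rests on pairwise genetic distance. A minimal sketch of the uncorrected "p-distance" with the ~2% inter-specific threshold (the sequences below are invented for demonstration, not real COI barcodes):

```python
# Uncorrected p-distance between aligned barcode fragments: the fraction of
# sites that differ. Sequences are toy data, not real Hippocampus COI genes.

def p_distance(seq1, seq2):
    """Fraction of differing sites between two aligned, equal-length sequences."""
    assert len(seq1) == len(seq2)
    diffs = sum(1 for a, b in zip(seq1, seq2) if a != b)
    return diffs / len(seq1)

sp_a = "ATGCTTGGTACCTTATACCTAGTATTCGG"
sp_b = "ATGCTTGGTACCTTATACCTAGTATTCGG"  # conspecific: identical here
sp_c = "ATACTTGGTGCCTTATATCTAGTGTTCGG"  # different species: several substitutions

print(p_distance(sp_a, sp_b))         # -> 0.0
print(p_distance(sp_a, sp_c) > 0.02)  # -> True (above the 2% threshold)
```

In a barcoding workflow, distances below the threshold group specimens into one species; distances well above it flag distinct species, supporting trade monitoring.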
Error-free holographic frames encryption with CA pixel-permutation encoding algorithm
NASA Astrophysics Data System (ADS)
Li, Xiaowei; Xiao, Dan; Wang, Qiong-Hua
2018-01-01
The security of video data is necessary in network transmission; cryptography is a technique for making video data secure and unreadable to unauthorized users. In this paper, we propose a holographic-frames encryption technique based on a cellular automata (CA) pixel-permutation encoding algorithm. The concise pixel-permutation algorithm addresses the drawbacks of traditional CA encoding methods. The effectiveness of the proposed video encoding method is demonstrated with simulation examples.
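The paper's exact encoding is not reproduced here, but the general idea of CA pixel-permutation encryption can be sketched as follows: a cellular automaton (rule 30 in this assumed example) generates a keystream that deterministically derives an invertible permutation of pixel positions:

```python
# Hedged sketch of CA-driven pixel permutation. Rule 30 and the key-derivation
# scheme below are illustrative choices, not the paper's algorithm.

def rule30_stream(seed_bits, steps):
    """Yield one pseudo-random bit per CA generation (center cell of rule 30)."""
    cells = list(seed_bits)
    n = len(cells)
    for _ in range(steps):
        yield cells[n // 2]
        # Rule 30: new cell = left XOR (center OR right), wrap-around edges.
        cells = [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
                 for i in range(n)]

def ca_permutation(n, seed_bits):
    """Derive a permutation of range(n) by sorting indices on CA-generated keys."""
    bits = rule30_stream(seed_bits, n * 8)
    keys = [sum(next(bits) << k for k in range(8)) for _ in range(n)]
    return sorted(range(n), key=lambda i: (keys[i], i))  # stable -> bijective

def apply_perm(pixels, perm):
    return [pixels[p] for p in perm]

def invert_perm(perm):
    inv = [0] * len(perm)
    for dst, src in enumerate(perm):
        inv[src] = dst
    return inv

seed = [1, 0, 0, 1, 1, 0, 1, 0, 1]   # shared secret "key"
pixels = list(range(16))             # a toy 4x4 frame, flattened
perm = ca_permutation(len(pixels), seed)
cipher = apply_perm(pixels, perm)
plain = apply_perm(cipher, invert_perm(perm))
print(plain == pixels)               # -> True: decryption recovers the frame
```

Because both ends regenerate the same CA keystream from the shared seed, the receiver can rebuild and invert the permutation without transmitting it.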
Epileptic seizure detection in EEG signal using machine learning techniques.
Jaiswal, Abeg Kumar; Banka, Haider
2018-03-01
Epilepsy is a well-known nervous system disorder characterized by seizures. Electroencephalograms (EEGs), which capture brain neural activity, can detect epilepsy. Traditional methods for analyzing an EEG signal for epileptic seizure detection are time-consuming. Recently, several automated seizure detection frameworks using machine learning techniques have been proposed to replace these traditional methods. The two basic steps involved in machine learning are feature extraction and classification. Feature extraction reduces the input pattern space by keeping informative features, and the classifier assigns the appropriate class label. In this paper, we propose two effective approaches involving subpattern-based PCA (SpPCA) and cross-subpattern correlation-based PCA (SubXPCA) with Support Vector Machine (SVM) for automated seizure detection in EEG signals. Feature extraction was performed using SpPCA and SubXPCA. Both techniques explore the subpattern correlation of EEG signals, which helps in the decision-making process. SVM is used for classification of seizure and non-seizure EEG signals. The SVM was trained with a radial basis kernel. All experiments were carried out on the benchmark epilepsy EEG dataset, which consists of 500 EEG signals recorded under different scenarios. Seven different experimental cases for classification were conducted. The classification accuracy was evaluated using tenfold cross-validation. The classification results of the proposed approaches were compared with those of some existing techniques proposed in the literature to establish the claim.
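The two-step pipeline described above can be sketched with deliberately simplified stand-ins: per-subpattern summary statistics in place of SpPCA/SubXPCA, and a nearest-centroid rule in place of the SVM. The signals and labels are toy data, not the benchmark dataset:

```python
# Simplified stand-in for the feature-extraction + classification pipeline:
# per-subpattern mean/variance features and a nearest-centroid classifier.

def subpattern_features(signal, n_sub=4):
    """Split the signal into subpatterns and summarize each one."""
    step = len(signal) // n_sub
    feats = []
    for k in range(n_sub):
        seg = signal[k * step:(k + 1) * step]
        mean = sum(seg) / len(seg)
        var = sum((v - mean) ** 2 for v in seg) / len(seg)
        feats += [mean, var]
    return feats

def nearest_centroid(feats, centroids):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(feats, centroids[label]))

# Toy "EEG": seizure-like segments have much larger amplitude swings.
normal = [0.1, -0.1] * 16
seizure = [2.0, -2.0] * 16
centroids = {
    "non-seizure": subpattern_features(normal),
    "seizure": subpattern_features(seizure),
}
print(nearest_centroid(subpattern_features([1.9, -2.1] * 16), centroids))  # -> seizure
```

The point of the sketch is the division of labor: feature extraction compresses each signal into a short vector, and the classifier only ever sees those vectors.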
Mo, Yun; Zhang, Zhongzhao; Meng, Weixiao; Ma, Lin; Wang, Yao
2014-01-01
Indoor positioning systems based on the fingerprint method are widely used due to the large number of existing devices with a wide range of coverage. However, extensive positioning regions with a massive fingerprint database may cause high computational complexity and large error margins; therefore, clustering methods are widely applied as a solution. However, traditional clustering methods in positioning systems can only measure the similarity of the Received Signal Strength without being concerned with the continuity of physical coordinates. Moreover, outages of access points can result in asymmetric matching problems which severely affect the fine positioning procedure. To solve these issues, in this paper we propose a positioning system based on the Spatial Division Clustering (SDC) method for clustering the fingerprint dataset subject to physical distance constraints. With the Genetic Algorithm and Support Vector Machine techniques, SDC can achieve higher coarse positioning accuracy than traditional clustering algorithms. In terms of fine localization, based on the Kernel Principal Component Analysis method, the proposed positioning system outperforms its counterparts based on other feature extraction methods in low dimensionality. Apart from balancing the online matching computational burden, the new positioning system exhibits advantageous performance on radio map clustering, and also shows better robustness and adaptability in the asymmetric matching problem. PMID:24451470
Lan, Yihua; Li, Cunhua; Ren, Haozheng; Zhang, Yong; Min, Zhifang
2012-10-21
A new heuristic algorithm based on the so-called geometric distance sorting technique is proposed for solving the fluence map optimization with dose-volume constraints, which is one of the most essential tasks for inverse planning in IMRT. The framework of the proposed method is basically an iterative process which begins with a simple linearly constrained quadratic optimization model without considering any dose-volume constraints; the dose constraints for the voxels violating the dose-volume constraints are then gradually added into the quadratic optimization model step by step until all the dose-volume constraints are satisfied. In each iteration step, an interior point method is adopted to solve each new linearly constrained quadratic program. For choosing the proper candidate voxels for the current dose constraint adding, a so-called geometric distance defined in the transformed standard quadratic form of the fluence map optimization model is used to guide the selection of the voxels. The new geometric distance sorting technique mostly reduces the unexpected increase of the objective function value caused inevitably by the constraint adding, and can be regarded as an upgrade of the traditional dose sorting technique. The geometric explanation for the proposed method is also given, and a proposition is proved to support our heuristic idea. In addition, a smart constraint adding/deleting strategy is designed to ensure stable iteration convergence. The new algorithm is tested on four cases (a head-and-neck, a prostate, a lung, and an oropharyngeal case) and compared with the algorithm based on the traditional dose sorting technique. Experimental results showed that the proposed method is more suitable for guiding the selection of new constraints than the traditional dose sorting method, especially for cases whose target regions have non-convex shapes.
It is a more efficient optimization technique to some extent for choosing constraints than the dose sorting method. By integrating a smart constraint adding/deleting scheme within the iteration framework, the new technique builds up an improved algorithm for solving the fluence map optimization with dose-volume constraints.
On prognostic models, artificial intelligence and censored observations.
Anand, S S; Hamilton, P W; Hughes, J G; Bell, D A
2001-03-01
The development of prognostic models for assisting medical practitioners with decision making is not a trivial task. Models need to possess a number of desirable characteristics and few, if any, current modelling approaches based on statistical or artificial intelligence can produce models that display all these characteristics. The inability of modelling techniques to provide truly useful models has led to interest in these models being purely academic in nature. This in turn has resulted in only a very small percentage of models that have been developed being deployed in practice. On the other hand, new modelling paradigms are being proposed continuously within the machine learning and statistical community and claims, often based on inadequate evaluation, being made on their superiority over traditional modelling methods. We believe that for new modelling approaches to deliver true net benefits over traditional techniques, an evaluation centric approach to their development is essential. In this paper we present such an evaluation centric approach to developing extensions to the basic k-nearest neighbour (k-NN) paradigm. We use standard statistical techniques to enhance the distance metric used and a framework based on evidence theory to obtain a prediction for the target example from the outcome of the retrieved exemplars. We refer to this new k-NN algorithm as Censored k-NN (Ck-NN). This reflects the enhancements made to k-NN that are aimed at providing a means for handling censored observations within k-NN.
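The base k-NN paradigm that the Ck-NN work extends can be illustrated as follows (the censoring and evidence-theory enhancements are not reproduced; the exemplars and weighting are assumptions for demonstration):

```python
# Distance-weighted k-nearest-neighbour prediction over toy exemplars; a
# sketch of the base paradigm only, not the paper's Ck-NN algorithm.

def knn_predict(query, exemplars, k=3):
    """exemplars: list of (feature_vector, outcome). Returns the weighted vote."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(exemplars, key=lambda e: dist(query, e[0]))[:k]
    votes = {}
    for feats, outcome in nearest:
        w = 1.0 / (dist(query, feats) + 1e-9)   # closer exemplars count more
        votes[outcome] = votes.get(outcome, 0.0) + w
    return max(votes, key=votes.get)

cases = [([1.0, 1.0], "good"), ([1.2, 0.9], "good"),
         ([5.0, 5.0], "poor"), ([4.8, 5.2], "poor")]
print(knn_predict([1.1, 1.0], cases))  # -> good
```

Ck-NN replaces the plain Euclidean metric with a statistically enhanced one and combines retrieved outcomes through evidence theory, but the retrieve-then-vote skeleton is the same.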
Lom, Barbara
2012-01-01
The traditional science lecture, where an instructor delivers a carefully crafted monolog to a large audience of students who passively receive the information, has been a popular mode of instruction for centuries. Recent evidence on the science of teaching and learning indicates that learner-centered, active teaching strategies can be more effective learning tools than traditional lectures. Yet most colleges and universities retain lectures as their central instructional method. This article highlights several simple collaborative teaching techniques that can be readily deployed within traditional lecture frameworks to promote active learning. Specifically, this article briefly introduces the techniques of: reader’s theatre, think-pair-share, roundtable, jigsaw, in-class quizzes, and minute papers. Each technique is broadly applicable well beyond neuroscience courses and easily modifiable to serve an instructor’s specific pedagogical goals. The benefits of each technique are described along with specific examples of how each technique might be deployed within a traditional lecture to create more active learning experiences. PMID:23494568
Is a Team-based Learning Approach to Anatomy Teaching Superior to Didactic Lecturing?
Ghorbani, Naghme; Karbalay-Doust, Saied; Noorafshan, Ali
2014-02-01
Team-based learning (TBL) is used in the medical field to implement interactive learning in small groups. The learning of anatomy and its subsequent application requires the students to recall a great deal of factual content. The aims of this study were to evaluate the students' satisfaction, engagement and knowledge gain in anatomy through the medium of TBL in comparison to the traditional lecture method. This study, carried out from February to June 2012, included 30 physical therapy students of the Shiraz University of Medical Science, School of Rehabilitation Sciences. Classic TBL techniques were modified to cover lower limb anatomy topics in the first year of the physical therapy curriculum. Anatomy lectures were replaced with TBL, which required the preparation of assigned content, specific discussion topics, an individual self-assessment test (IRAT) and the analysis of discussion topics. The teams then subsequently retook the assessment test as a group (GRAT). The first eight weeks of the curriculum were taught using traditional didactic lecturing, while during the second eight weeks the modified TBL method was used. The students evaluated these sessions through a questionnaire. The impact of TBL on student engagement and educational achievement was determined using numerical data, including the IRAT, GRAT and final examination scores. Students had a higher satisfaction rate with the TBL teaching according to the Likert scale. Additionally, higher scores were obtained in the TBL-based final examination in comparison to the lecture-based midterm exam. The students' responses showed that the TBL technique could be used alone or in conjunction with traditional didactic lecturing in order to teach anatomy more effectively.
Is Peer Interaction Necessary for Optimal Active Learning?
ERIC Educational Resources Information Center
Linton, Debra L.; Farmer, Jan Keith; Peterson, Ernie
2014-01-01
Meta-analyses of active-learning research consistently show that active-learning techniques result in greater student performance than traditional lecture-based courses. However, some individual studies show no effect of active-learning interventions. This may be due to inexperienced implementation of active learning. To minimize the effect of…
Perceptions of Saudi Students towards Electronic and Traditional Writing Groups
ERIC Educational Resources Information Center
Alqurashi, Fahad
2008-01-01
This paper reports the findings of an experiment that investigated the reactions of Saudi college students to collaborative learning techniques introduced in two modalities: face-to-face and web-based learning. Quantitative data were collected with a questionnaire that examined the changes of three constructs: attitudes toward collaboration,…
Inquiry in the Physical Geology Classroom: Supporting Students' Conceptual Model Development
ERIC Educational Resources Information Center
Miller, Heather R.; McNeal, Karen S.; Herbert, Bruce E.
2010-01-01
This study characterizes the impact of an inquiry-based learning (IBL) module versus a traditionally structured laboratory exercise. Laboratory sections were randomized into experimental and control groups. The experimental group was taught using IBL pedagogical techniques and included manipulation of large-scale data-sets, use of multiple…
Multiyear, Multi-Instructor Evaluation of a Large-Class Interactive-Engagement Curriculum
ERIC Educational Resources Information Center
Cahill, Michael J.; Hynes, K. Mairin; Trousil, Rebecca; Brooks, Lisa A.; McDaniel, Mark A.; Repice, Michelle; Zhao, Jiuqing; Frey, Regina F.
2014-01-01
Interactive-engagement (IE) techniques consistently enhance conceptual learning gains relative to traditional-lecture courses, but attitudinal gains typically emerge only in small, inquiry-based curricula. The current study evaluated whether a "scalable IE" curriculum--a curriculum used in a large course (~130 students per section) and…
Multicultural Choral Music Pedagogy Based on the Facets Model
ERIC Educational Resources Information Center
Yoo, Hyesoo
2017-01-01
Multicultural choral music has distinct characteristics in that indigenous folk elements are frequently incorporated into a Western European tonal system. Because of this, multicultural choral music is often taught using Western styles (e.g., "bel canto") rather than through traditional singing techniques from their cultures of origin.…
THE TRANSCULTURAL RESEARCH AND TRAINING INSTITUTE (TCI).
ERIC Educational Resources Information Center
LOUBERT, J. DANIEL
MANY AMERICANS EMPLOYED OVERSEAS, ESPECIALLY NAVY AND MARINE PERSONNEL, NEED KNOWLEDGE OF THE CULTURES IN WHICH THEY LIVE. THERE IS CRITICISM OF TRADITIONAL WAYS OF SELECTING PERSONS AND TRAINING THEM. A NUMBER OF NEW TECHNIQUES, BASED ON EXPERIMENTAL TRAINING IN SIMULATION OF FOREIGN SOCIETIES, SEEM TO PROVIDE FOR OVERCOMING INTERNALIZATION…
Assessing Students in the Margin: Challenges, Strategies, and Techniques
ERIC Educational Resources Information Center
Russell, Michael; Kavanaugh, Maureen
2011-01-01
The importance of student assessment, particularly for summative purposes, has increased greatly over the past thirty years. At the same time, emphasis on including all students in assessment programs has also increased. Assessment programs, whether they are large-scale, district-based, or teacher developed, have traditionally attempted to assess…
On Using Various Mathematics Instructions versus Traditional Instruction: An Action Research
ERIC Educational Resources Information Center
Alzabut, Jehad
2017-01-01
In this research, I provide an overview of potentially selected interactive mathematical instructions that help learners-educators identifying the most effective practices for teaching a course on differential equations. Based on my practical experience, positive and negative aspects of the used techniques are discussed. Immediate reactions on the…
Pig Mandible as a Valuable Tool to Improve Periodontal Surgery Techniques
ERIC Educational Resources Information Center
Zangrando, Mariana S. Ragghianti; Sant'Ana, Adriana C. P.; Greghi, Sebastião L. A.; de Rezende, Maria Lucia R.; Damante, Carla A.
2014-01-01
Clinical education in dental practice is a challenge for professionals and students. The traditional method of clinical training in Periodontology usually is based on following the procedure and practicing under supervision, until achieving proficiency. However, laboratory practice is required before direct care in patients. Specific anatomic…
The patient relationship and therapeutic techniques of the South Sotho traditional healer.
Pinkoane, M G; Greeff, M; Williams, M J S
2005-11-01
Until 1996 the practice of traditional healers was outlawed in South Africa and not afforded a legal position in the community of health care providers. In 1978 the World Health Organization (WHO) identified traditional healers as people forming an essential core of primary health care workers for rural people in Third World countries. However, in 1994 the new South African government identified traditional healers as forming an essential element of primary health care. It is estimated that 80% of the black population uses traditional medicine because it is deeply rooted in their culture, which is linked to their religion. The traditional healer shares with the patient a world view which is completely alien to biomedical personnel. Therapeutic techniques typically used in traditional healing conflict with those used in biomedicine. The patients' perceptions of traditional healing, and their needs and expectations, may be the driving force behind their persistence in consulting a traditional healer, even after these patients may have sought the therapeutic techniques of biomedical personnel. The operation of both systems in the same society creates a problem for both providers and recipients of health care. Confusion then arises, and the consumer consequently chooses whichever service is closer. The researcher aimed to investigate the characteristics of the relationship between the traditional healers and the patients, to explore the therapeutic techniques used in the South Sotho traditional healing process, and to examine the views of both the traditional healers and the patients about the South Sotho traditional healing process, in order to facilitate incorporation of the traditional healers in the National Health Care Delivery System. A qualitative research design was followed. Participants were identified by means of a non-probability, purposive voluntary sample.
Data were collected by means of a video camera and semi-structured interviews with the six traditional healers and twelve patients, as well as by taking field notes after each session. Data analysis was achieved by means of a checklist for the video recordings, and decoding was done for the interviews. A co-coder and the researcher analysed the data independently, after which three consensus discussions took place to finalise the analysed data. The researcher drew conclusions, identified shortcomings, and made recommendations for application to nursing education, nursing research and nursing practice. The recommendations for nursing are presented in the form of guidelines for the incorporation of the traditional healers in the National Health Care Delivery System.
A systematic mapping study of process mining
NASA Astrophysics Data System (ADS)
Maita, Ana Rocío Cárdenas; Martins, Lucas Corrêa; López Paz, Carlos Ramón; Rafferty, Laura; Hung, Patrick C. K.; Peres, Sarajane Marques; Fantinato, Marcelo
2018-05-01
This study systematically assesses the process mining scenario from 2005 to 2014. The analysis of 705 papers evidenced 'discovery' (71%) as the main type of process mining addressed and 'categorical prediction' (25%) as the main mining task solved. The most applied traditional techniques are the 'graph structure-based' ones (38%). Concerning computational intelligence and machine learning techniques specifically, we concluded that little relevance has been given to them; the most applied are 'evolutionary computation' (9%) and 'decision tree' (6%), respectively. Process mining challenges, such as balancing among robustness, simplicity, accuracy and generalization, could benefit from a larger use of such techniques.
Neutron-gamma discrimination via PSD plastic scintillator and SiPMs
NASA Astrophysics Data System (ADS)
Taggart, M. P.; Payne, C.; Sellin, P. J.
2016-10-01
The reduction in availability and inevitable increase in cost of traditional neutron detectors based on the 3He neutron capture reaction has resulted in a concerted effort to seek out new techniques and detection media to meet the needs of national nuclear security. Traditionally, the alternative has been provided through pulse shape discrimination (PSD) using liquid scintillators. However, these are not without their own inherent issues, primarily concerning user safety and ongoing maintenance. A potential system devised to separate neutron and gamma-ray pulses utilising the PSD technique takes advantage of recent improvements in silicon photomultiplier (SiPM) technology and the development of plastic scintillators exhibiting the PSD phenomenon. In this paper we present the current iteration of this ongoing work, having achieved a Figure of Merit (FoM) of 1.39 at 1.5 MeVee.
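The figure of merit quoted above is conventionally computed from the separation between the neutron and gamma peaks of the PSD parameter, divided by the sum of their widths. A minimal sketch of that calculation, assuming a charge-comparison PSD parameter and Gaussian peak shapes (the tail window and all numbers below are illustrative, not taken from the paper):

```python
import numpy as np

def tail_total_ratio(pulse, tail_start):
    """PSD parameter from charge comparison: fraction of the pulse
    integral contained in the delayed tail (window is illustrative)."""
    pulse = np.asarray(pulse, dtype=float)
    return pulse[tail_start:].sum() / pulse.sum()

def figure_of_merit(psd_neutron, psd_gamma):
    """FoM = peak separation / (FWHM_n + FWHM_g), with FWHM taken as
    2.355 * sigma under a Gaussian assumption for each population."""
    mu_n, mu_g = np.mean(psd_neutron), np.mean(psd_gamma)
    fwhm_n = 2.355 * np.std(psd_neutron)
    fwhm_g = 2.355 * np.std(psd_gamma)
    return abs(mu_n - mu_g) / (fwhm_n + fwhm_g)

# Synthetic PSD-parameter distributions for two event populations
rng = np.random.default_rng(0)
psd_g = rng.normal(0.15, 0.02, 10000)  # gamma-like events
psd_n = rng.normal(0.30, 0.03, 10000)  # neutron-like events
print(f"FoM = {figure_of_merit(psd_n, psd_g):.2f}")
```

A FoM above roughly 1.27 is often taken as adequate peak separation, which puts the paper's 1.39 at 1.5 MeVee in a usable range.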
Current application of chemometrics in traditional Chinese herbal medicine research.
Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke
2016-07-15
Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometric methods are highly useful tools in the quality control of TCHMs, harnessing mathematics, statistics and other approaches to extract the maximum information from data obtained through various analytical techniques. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. The application of chemometric techniques to the classification of TCHMs based on their efficacy and usage is introduced, and recent advances of chemometrics applied in the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.
Efficient Computation of Closed-loop Frequency Response for Large Order Flexible Systems
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Giesy, Daniel P.
1997-01-01
An efficient and robust computational scheme is given for the calculation of the frequency response function of a large order, flexible system implemented with a linear, time invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new computational scheme is a linear function of system size, a significant improvement over traditional, full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits the practical frequency domain analysis of systems of much larger order than by traditional, full-matrix techniques. Formulations are given for both open-loop and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, a speed-up of almost two orders of magnitude was observed while accuracy improved by up to 5 decimal places.
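The linear-per-frequency-point cost follows from the structure of normal-mode coordinates: the modal equations decouple, so the frequency response can be accumulated mode by mode instead of solving a dense complex system at every frequency. A sketch of the open-loop case under standard modal-damping assumptions (the 3-mode data are hypothetical, not from the paper):

```python
import numpy as np

def modal_frf(omega, wn, zeta, b, c):
    """Open-loop frequency response of a flexible structure in normal-mode
    coordinates.  Each mode contributes an independent second-order term,
    so each frequency point costs O(n_modes) rather than the O(n^3) of a
    dense solve of C (jw I - A)^{-1} B:
        H(jw) = sum_i c_i * b_i / (wn_i^2 - w^2 + 2j * zeta_i * wn_i * w)
    """
    w = np.asarray(omega)[:, None]                # shape (n_freq, 1)
    denom = wn**2 - w**2 + 2j * zeta * wn * w     # shape (n_freq, n_modes)
    return (c * b / denom).sum(axis=1)            # shape (n_freq,)

# Hypothetical 3-mode model
wn   = np.array([1.0, 5.0, 12.0])    # natural frequencies (rad/s)
zeta = np.array([0.01, 0.02, 0.02])  # modal damping ratios
b    = np.array([0.8, 0.5, 0.3])     # input influence coefficients
c    = np.array([1.0, 0.7, 0.4])     # output influence coefficients
freqs = np.linspace(0.1, 15.0, 500)
H = modal_frf(freqs, wn, zeta, b, c)
print(abs(H).max())                   # peak near the first resonance
```

For a closed-loop system the controller couples the modes and a pure mode-by-mode sum no longer applies; the paper's contribution is to retain the sparsity advantage in that case as well.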
Simplified Microarray Technique for Identifying mRNA in Rare Samples
NASA Technical Reports Server (NTRS)
Almeida, Eduardo; Kadambi, Geeta
2007-01-01
Two simplified methods of identifying messenger ribonucleic acid (mRNA), and compact, low-power apparatuses to implement the methods, are at the proof-of-concept stage of development. These methods are related to traditional methods based on hybridization of nucleic acid, but whereas the traditional methods must be practiced in laboratory settings, these methods could be practiced in field settings. Hybridization of nucleic acid is a powerful technique for detection of specific complementary nucleic acid sequences, and is increasingly being used for detection of changes in gene expression in microarrays containing thousands of gene probes. A traditional microarray study entails at least the following six steps: 1. Purification of cellular RNA, 2. Amplification of complementary deoxyribonucleic acid [cDNA] by polymerase chain reaction (PCR), 3. Labeling of cDNA with fluorophores of Cy3 (a green cyanine dye) and Cy5 (a red cyanine dye), 4. Hybridization to a microarray chip, 5. Fluorescence scanning the array(s) with dual excitation wavelengths, and 6. Analysis of the resulting images. This six-step procedure must be performed in a laboratory because it requires bulky equipment.
Yu, Wei; Clyne, Melinda; Dolan, Siobhan M; Yesupriya, Ajay; Wulf, Anja; Liu, Tiebin; Khoury, Muin J; Gwinn, Marta
2008-01-01
Background Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. Results The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. Conclusion GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge. PMID:18430222
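The abstract names a "two-way z score" keyword-selection step feeding the SVM, but does not spell out the formula. The following is a plausible reading, a two-proportion z statistic over per-term document frequencies in relevant versus irrelevant abstracts (all counts below are invented for illustration):

```python
import numpy as np

def two_way_z_scores(pos_counts, neg_counts, n_pos, n_neg):
    """Two-proportion z statistic per term: compares the fraction of
    relevant vs irrelevant documents containing each term.  Large
    positive z marks a keyword enriched in the relevant class."""
    p1 = pos_counts / n_pos
    p2 = neg_counts / n_neg
    p = (pos_counts + neg_counts) / (n_pos + n_neg)      # pooled proportion
    se = np.sqrt(p * (1 - p) * (1.0 / n_pos + 1.0 / n_neg))
    return (p1 - p2) / np.where(se == 0, 1.0, se)        # guard se == 0

# Toy example: 4 terms over 200 relevant and 800 irrelevant abstracts
terms = ["polymorphism", "genotype", "cohort", "the"]
pos = np.array([150, 120,  60, 200])   # relevant docs containing the term
neg = np.array([ 80, 100, 300, 800])   # irrelevant docs containing the term
z = two_way_z_scores(pos, neg, 200, 800)
for t, score in sorted(zip(terms, z), key=lambda x: -x[1]):
    print(f"{t:14s} z = {score:6.2f}")
```

Terms with high scores would then be weighted up as SVM features; a ubiquitous term like "the" scores near zero and carries no discriminative weight.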
ERIC Educational Resources Information Center
Walsh, Jeffrey A.; Braithwaite, Jeremy
2008-01-01
This work, drawing on the literature on alcohol consumption, sexual behavior, and researching sensitive topics, tests the efficacy of the unmatched-count technique (UCT) in establishing higher rates of truthful self-reporting when compared to traditional survey techniques. Traditional techniques grossly underestimate the scope of problems…
[Comparative trial between traditional cesarean section and Misgav-Ladach technique].
Gutiérrez, José Gabriel Tamayo; Coló, José Antonio Sereno; Arreola, María Sandra Huape
2008-02-01
The cesarean section was designed to extract to the neoborn, when the childbirth becomes difficult by the natural routes. The institutional obstetrical work demands long surgical time and high raw materials; therefore, simpler procedures must be implemented. To compare traditional cesarean section vs Misgav-Ladach technique to assess surgical time, and hospital stay and costs. Forty-eight pregnant patients at term with obstetrical indication for cesarean delivery were randomized in two groups: 24 were submitted to traditional cesarean and 24 to Misgav-Ladach technique. The outcomes included surgical time, bleeding, amount of sutures employed, pain intensity and some others adverse effects. The surgical time with Misgav-Ladach technique was shorter compared with traditional cesarean section, bleeding was consistently lesser and pain was also low. None adverse effects were registered in both groups. Although short follow-up showed significant operative time reduction and less bleeding, longer follow-up should be desirable in order to confirm no abdominal adhesions.
Efficient Algorithms for Handling Nondeterministic Automata
NASA Astrophysics Data System (ADS)
Vojnar, Tomáš
Finite (word, tree, or omega) automata play an important role in different areas of computer science, including, for instance, formal verification. Often, deterministic automata are used, for which traditional algorithms for important operations such as minimisation and inclusion checking are available. However, the use of deterministic automata implies a need to determinise nondeterministic automata that often arise during various computations, even when the computations start with deterministic automata. Unfortunately, determinisation is a very expensive step since deterministic automata may be exponentially bigger than the original nondeterministic automata. That is why it appears advantageous to avoid determinisation and work directly with nondeterministic automata. This, however, brings a need to be able to implement operations traditionally done on deterministic automata on nondeterministic automata instead. In particular, this is the case of inclusion checking and minimisation (or rather reduction of the size of automata). In the talk, we review several recently proposed techniques for inclusion checking on nondeterministic finite word and tree automata as well as Büchi automata. These techniques are based on using the so-called antichains, possibly combined with a use of suitable simulation relations (and, in the case of Büchi automata, the so-called Ramsey-based or rank-based approaches). Further, we discuss techniques for reducing the size of nondeterministic word and tree automata using quotienting based on the recently proposed notion of mediated equivalences. The talk is based on several common works with Parosh Aziz Abdulla, Ahmed Bouajjani, Yu-Fang Chen, Peter Habermehl, Lisa Kaati, Richard Mayr, Tayssir Touili, Lorenzo Clemente, Lukáš Holík, and Chih-Duo Hong.
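The antichain idea can be sketched for word automata: to check L(A) ⊆ L(B), explore pairs of an A-state with the set of B-states reachable on the same word, and prune any pair subsumed by an already-visited pair with a ⊆-smaller B-set. The toy implementation below covers only this basic antichain search (no simulation relations, finite words only); the tuple encoding of automata is this sketch's own convention:

```python
from collections import deque

def nfa_inclusion(A, B):
    """Antichain-based check of L(A) subseteq L(B) for NFAs.
    An automaton is (states, alphabet, delta, init, final), where
    delta[(q, a)] is a set of successor states.  A reachable pair
    (p, S) with p final in A and S containing no final B-state
    witnesses a word in L(A) \\ L(B)."""
    _, alphabet, dA, iA, fA = A
    _, _, dB, iB, fB = B
    antichain = {}  # p -> list of minimal B-sets already visited

    def subsumed(p, S):
        return any(S2 <= S for S2 in antichain.get(p, []))

    def insert(p, S):
        keep = [S2 for S2 in antichain.get(p, []) if not (S <= S2)]
        keep.append(S)
        antichain[p] = keep

    work = deque()
    for p in iA:
        S = frozenset(iB)
        if not subsumed(p, S):
            insert(p, S)
            work.append((p, S))
    while work:
        p, S = work.popleft()
        if p in fA and not (S & fB):
            return False                      # counterexample word found
        for a in alphabet:
            for p2 in dA.get((p, a), ()):
                S2 = frozenset(q2 for q in S for q2 in dB.get((q, a), ()))
                if not subsumed(p2, S2):      # antichain pruning
                    insert(p2, S2)
                    work.append((p2, S2))
    return True

# L(A) = (ab)*, L(B) = all words over {a, b}
A = ({0, 1}, {"a", "b"}, {(0, "a"): {1}, (1, "b"): {0}}, {0}, {0})
B = ({0}, {"a", "b"}, {(0, "a"): {0}, (0, "b"): {0}}, {0}, {0})
print(nfa_inclusion(A, B))   # True:  (ab)* is a subset of Sigma*
print(nfa_inclusion(B, A))   # False: "a" is in Sigma* but not in (ab)*
```

The pruning is sound because any counterexample reachable from (p, S) is also reachable from a visited (p, S') with S' ⊆ S; in the worst case the search still degenerates to the full subset construction, which is why the simulation-based refinements mentioned in the talk matter in practice.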
Peacock, Juandre; Harkrider, Lauren N; Bagdasarov, Zhanna; Connelly, Shane; Johnson, James F; Thiel, Chase E; Macdougall, Alexandra E; Mumford, Michael D; Devenport, Lynn D
2013-09-01
Case-based instruction has been regarded by many as a viable alternative to traditional lecture-based education and training. However, little is known about how case-based training techniques impact training effectiveness. This study examined the effects of two such techniques: (a) presentation of alternative outcome scenarios to a case, and (b) conducting a structured outcome evaluation. Consistent with the hypotheses, results indicate that presentation of alternative outcome scenarios reduced knowledge acquisition, reduced sensemaking and ethical decision-making strategy use, and reduced decision ethicality. Conducting a structured outcome evaluation had no impact on these outcomes. Results indicate that those who use case-based instruction should take care to use clear, less complex cases with only a singular outcome if they are seeking these types of outcomes.
De la Torre González, Francisco Javier; Gutiérrez Avendaño, Daniel Oswaldo; Gschaedler Mathis, Anne Christine; Kirchmayr, Manuel Reinhart
2018-06-06
Non-Saccharomyces yeasts are widespread microorganisms that were once considered contaminants in the beverage industry. Nowadays, however, they have gained importance for their ability to produce aromatic compounds, which in alcoholic beverages improves aromatic complexity and therefore overall quality. Thus, identification and differentiation of the species involved in fermentation processes is vital; the available approaches can be classified into traditional methods and techniques based on molecular biology. Traditional methods, however, can be expensive, laborious and/or unable to accurately discriminate at the strain level. In the present study, a total of 19 strains of Pichia kluyveri isolated from mezcal, tejuino and cacao fermentations were analyzed with rep-PCR fingerprinting and matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS). The comparative analysis between MS spectra and rep-PCR patterns obtained from these strains showed a high similarity between both methods, although minimal differences between the obtained rep-PCR and MALDI-TOF MS clusters could be observed. The data shown suggest that MALDI-TOF MS is a promising alternative technique for rapid, reliable and cost-effective differentiation of native yeast strains isolated from different traditional fermented foods and beverages. This article is protected by copyright. All rights reserved.
Qualitative and quantitative detection of T7 bacteriophages using paper based sandwich ELISA.
Khan, Mohidus Samad; Pande, Tripti; van de Ven, Theo G M
2015-08-01
Viruses cause many infectious diseases and consequently pose epidemic health threats. Paper-based diagnostics and filters can offer attractive options for detecting and deactivating pathogens. However, due to their infectious characteristics, virus detection using paper diagnostics is more challenging than the detection of bacteria, enzymes, DNA or antigens. The major objective of this study was to prepare reliable, degradable and low-cost paper diagnostics to detect viruses without using sophisticated optical or microfluidic analytical instruments. T7 bacteriophage was used as a model virus. A paper-based sandwich ELISA technique was developed to detect and quantify T7 phages in solution, detecting phage concentrations from as low as 100 pfu/mL to as high as 10^9 pfu/mL. The compatibility of the paper-based sandwich ELISA with the conventional titre count was tested using T7 phage solutions of unknown concentrations. The paper-based sandwich ELISA technique is faster and more economical than traditional detection techniques. Therefore, with proper calibration, the right reagents and adherence to biosafety regulations, the paper-based technique can be considered a practical and economical alternative to the sophisticated laboratory diagnostic techniques applied to detect pathogenic viruses and other microorganisms. Copyright © 2015 Elsevier B.V. All rights reserved.
[Construction of biopharmaceutics classification system of Chinese materia medica].
Liu, Yang; Wei, Li; Dong, Ling; Zhu, Mei-Ling; Tang, Ming-Min; Zhang, Lei
2014-12-01
Based on the multicomponent characteristics of traditional Chinese medicine, and drawing lessons from the concepts, methods and techniques of the biopharmaceutics classification system (BCS) in the chemical field, this study proposes the scientific framework of a biopharmaceutics classification system for Chinese materia medica (CMMBCS). Using comparative analysis at the level of individual components together with the CMMBCS method applied to the traditional Chinese medicine as a whole, the study constructs the methodological process while setting forth the underlying academic thinking and theoretical analysis. The basic role of this system is to reveal the interactions and related absorption mechanisms of the multiple components of traditional Chinese medicine. It also provides new ideas and methods for improving the quality of Chinese materia medica and for the development of new drug research.
NASA Technical Reports Server (NTRS)
Lee, S. Daniel
1990-01-01
We propose a distributed agent architecture (DAA) that can support a variety of paradigms based on both traditional real-time computing and artificial intelligence. DAA consists of distributed agents that are classified into two categories: reactive and cognitive. Reactive agents can be implemented directly in Ada to meet hard real-time requirements and be deployed on on-board embedded processors. A traditional real-time computing methodology under consideration is the rate monotonic theory that can guarantee schedulability based on analytical methods. AI techniques under consideration for reactive agents are approximate or anytime reasoning that can be implemented using Bayesian belief networks as in Guardian. Cognitive agents are traditional expert systems that can be implemented in ART-Ada to meet soft real-time requirements. During the initial design of cognitive agents, it is critical to consider the migration path that would allow initial deployment on ground-based workstations with eventual deployment on on-board processors. ART-Ada technology enables this migration while Lisp-based technologies make it difficult if not impossible. In addition to reactive and cognitive agents, a meta-level agent would be needed to coordinate multiple agents and to provide meta-level control.
[Acceptance and mindfulness-based cognitive-behavioral therapies].
Ngô, Thanh-Lan
2013-01-01
Cognitive behavioral therapy (CBT) is one of the main approaches in psychotherapy. It teaches the patient to examine the link between dysfunctional thoughts and maladaptive behaviors and to re-evaluate the cognitive biases involved in the maintenance of symptoms by using strategies such as guided discovery. CBT is constantly evolving, in part to improve its effectiveness and accessibility. Thus, in the last decade, increasingly popular approaches based on mindfulness and acceptance have emerged. These therapies do not attempt to modify cognitions even when they are biased and dysfunctional, but rather seek a change in the relationship between the individual and the symptoms. This article aims to present the historical context that has allowed the emergence of this trend, the points of convergence and divergence with traditional CBT, as well as a brief presentation of the different therapies based on mindfulness meditation and acceptance. Hayes (2004) described three successive waves in behavior therapy, each characterized by "dominant assumptions, methods and goals": traditional behavior therapy, cognitive therapy, and therapies based on mindfulness meditation and acceptance. The latter consider that human suffering occurs when the individual lives a restricted life in order to avoid pain and immediate discomfort, to the detriment of his global wellbeing. These therapies combine mindfulness, experiential and acceptance strategies with traditional behavior principles in order to attain lasting results. There are significant points of convergence between traditional CBT and therapies based on mindfulness meditation and acceptance. Both are empirically validated, both are based upon a theoretical model postulating that avoidance is key in the maintenance of psychopathology, and both recommend an approach strategy in order to overcome the identified problem.
They both use behavioral techniques in the context of a collaborative relationship in order to identify precise problems and to achieve specific goals. They focus on the present moment rather than on historical causes. However, they also present significant differences: control vs acceptance of thoughts, focus on cognition vs behavior, focus on the relationship between the individual and his thoughts vs cognitive content, goal of modifying dysfunctional beliefs vs metacognitive processes, use of experiential vs didactic methods, focus on symptoms vs quality of life, strategies used before vs after the unfolding of full emotional response. The main interventions based on mindfulness meditation and acceptance are: Acceptance and Commitment Therapy, Functional Analytic Therapy, the expanded model of Behavioral Activation, Metacognitive Therapy, Mindfulness based Cognitive Therapy, Dialectic Behavior Therapy, Integrative Behavioral Couples Therapy and Compassionate Mind Training. These are described in this article. They offer concepts and techniques which might enhance therapeutic efficacy. They teach a new way to deploy attention and to enter into a relationship with current experience (for example, defusion) in order to diminish cognitive reactivity, a maintenance factor for psychopathology, and to enhance psychological flexibility. The focus on cognitive process, metacognition as well as cognitive content might yield additional benefits in therapy. It is possible to combine traditional CBT with third wave approaches by using psychoeducation and cognitive restructuring in the beginning phases of therapy in order to establish thought bias and to then encourage acceptance of internal experiences as well as exposure to feared stimuli rather than to continue to use cognitive restructuring techniques. 
Traditional CBT and third wave approaches seem to impact different processes: the former enhance the capacity to observe and describe experiences and the latter diminish experiential avoidance and increase conscious action as well as acceptance. The identification of personal values helps to motivate the individual to undertake actions required in order to enhance quality of life. In the case of chronic illness, it diminishes suffering by increasing acceptance. Although the evidence base supporting the efficacy of third wave approaches is less robust than in the case of traditional cognitive or behavior therapy, therapies based on mindfulness meditation and acceptance are promising interventions that might help to elucidate change process and offer complementary strategies in order to help patients.
López Martín, M Beatriz; Erice Calvo-Sotelo, Alejo
To compare presurgical hand hygiene with hydroalcoholic solution following the WHO protocol with traditional presurgical hand hygiene. Cultures of the hands of surgeons and surgical nurses were performed before and after presurgical hand hygiene and after removing gloves at the end of surgery. Cultures were done on 2 different days: the first day after traditional presurgical hand hygiene, and the second day after presurgical hand hygiene with hydroalcoholic solution following the WHO protocol. The duration of the traditional hand hygiene was measured and compared with the duration (3 min) of the WHO protocol. The cost of the products used in the traditional technique was compared with the cost of the hydroalcoholic solution used. The variability of the traditional technique was determined by observation. Following presurgical hand hygiene with hydroalcoholic solution, colony-forming units (CFU) were detected in 5 (7.3%) subjects, whereas after traditional presurgical hand hygiene CFU were detected in 14 subjects (20.5%) (p < 0.05). After glove removal, the numbers of CFU were similar. The time employed in hand hygiene with hydroalcoholic solution (3 min) was shorter than the time employed in the traditional technique (p < 0.05), its cost was less than half, and there was no variability. Compared with other techniques, presurgical hand hygiene with hydroalcoholic solution significantly decreases CFU, has a similar latency time, a lower cost, and saves time. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Vidya Sagar, R.; Raghu Prasad, B. K.
2012-03-01
This article presents a review of recent developments in parameter-based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parameter-based AE techniques for concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published, and considerable attention has been given to those publications. Some recent studies, such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams, have also been discussed. The formation of the fracture process zone and the AE energy released during the fracture process in concrete beam specimens have been summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parameter-based AE techniques applied to concrete structures may help concerned researchers and engineers to better understand the failure mechanism of concrete and to evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.
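The b-value analysis mentioned above adapts the Gutenberg-Richter relation from seismology to AE hit amplitudes; a falling b-value is commonly read as a shift toward fewer, larger cracking events as damage localises. One common formulation (the exact variant used in the cited studies may differ) fits log10 of the cumulative hit count against amplitude in dB; the synthetic data below are generated so the true b-value is close to 1:

```python
import numpy as np

def ae_b_value(amplitudes_db, bins=20):
    """AE b-value from a Gutenberg-Richter-type fit:
        log10 N(>=A) = a - b * (A_dB / 20)
    where N is the cumulative count of hits at or above amplitude A.
    Estimated here by least squares on the cumulative distribution."""
    amps = np.sort(np.asarray(amplitudes_db, dtype=float))
    levels = np.linspace(amps[0], amps[-1], bins)
    n_cum = np.array([(amps >= L).sum() for L in levels])
    mask = n_cum > 0
    # linear fit of log10(N) against A/20; the slope magnitude is b
    slope, _ = np.polyfit(levels[mask] / 20.0, np.log10(n_cum[mask]), 1)
    return -slope

# Synthetic amplitudes: exponential in dB, chosen so that b is ~1
rng = np.random.default_rng(1)
amps_db = 40.0 + 20.0 * rng.exponential(1.0 / np.log(10), 50000)
print(f"b = {ae_b_value(amps_db):.2f}")
```

In monitoring practice the fit is typically repeated over a sliding window of hits so that the evolution of b can be tracked as loading progresses.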
Verification and Validation of KBS with Neural Network Components
NASA Technical Reports Server (NTRS)
Wen, Wu; Callahan, John
1996-01-01
Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike traditional KBS, which depend on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then type rules. In fact, ANNs demonstrate their superiority precisely when such if-then rules are hard for a human expert to generate. Verification of a traditional knowledge based system is based on proof of the consistency and completeness of the rule knowledge base and the correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.
Visualisation of urban airborne laser scanning data with occlusion images
NASA Astrophysics Data System (ADS)
Hinks, Tommy; Carr, Hamish; Gharibi, Hamid; Laefer, Debra F.
2015-06-01
Airborne Laser Scanning (ALS) was introduced to provide rapid, high resolution scans of landforms for computational processing. More recently, ALS has been adapted for scanning urban areas. The greater complexity of urban scenes necessitates the development of novel methods to exploit urban ALS to best advantage. This paper presents occlusion images: a novel technique that exploits the geometric complexity of the urban environment to improve visualisation of small details for better feature recognition. The algorithm is based on an inversion of traditional occlusion techniques.
Ahn, T; Moon, S; Youk, Y; Jung, Y; Oh, K; Kim, D
2005-05-30
A novel mode analysis method and differential mode delay (DMD) measurement technique for a multimode optical fiber, based on optical frequency domain reflectometry (OFDR), is proposed for the first time. We used a conventional OFDR setup with a tunable external cavity laser and a Michelson interferometer. A few-mode multimode optical fiber was prepared to test the proposed measurement technique. We also compared the OFDR measurement results with those obtained using a traditional time-domain measurement method.
Jain, Rahi; Venkatasubramanian, Padma
2014-01-01
Quality Ayurvedic herbal medicines are potential low-cost solutions for addressing the contemporary healthcare needs of both the Indian and the global community. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help develop new technology, and use appropriate existing technology, for scaling up production of the medicines, which is necessary to meet growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though the Ayurvedic industry has adopted technologies from the food, chemical and pharmaceutical industries, there has been no systematic study correlating the traditional and modern processing methods. This study is an attempt to provide a possible correlation between Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods, collecting information on medicine preparation methods from English editions of classical Ayurveda texts. The correlation between traditional methods and MPPs was made based on the techniques used in Ayurvedic drug processing. It was observed that Ayurvedic medicine preparation involves two major types of processes, namely extraction and separation. Extraction uses membrane-rupturing and solute-diffusion principles, while separation uses volatility, adsorption and size-exclusion principles. The study provides systematic documentation of the methods used in Ayurveda for herbal drug preparation, along with their interpretation in terms of MPPs. This is a first step towards improving or replacing traditional techniques. New or existing technologies can then be used to improve the dosage forms and to scale up production while maintaining the Ayurvedic principles embodied in traditional techniques.
Cognitive techniques and language: A return to behavioral origins.
Froján Parga, María X; Núñez de Prado Gordillo, Miguel; de Pascual Verdú, Ricardo
2017-08-01
The main purpose of this study is to offer an alternative explanatory account of the functioning of cognitive techniques that is based on the principles of associative learning and highlights their verbal nature. The traditional accounts are questioned and analyzed in the light of the situation of psychology in the 1970s. Conceptual analysis is employed to revise the concepts of language, cognition and behavior. Several operant- and Pavlovian-based approaches to these phenomena are presented, with particular emphasis on Mowrer's (1954) approach and Ryle's (1949) and Wittgenstein's (1953) philosophical contributions to the field. Several logical problems are found in the theoretical foundations of cognitive techniques. A combination of operant and Pavlovian paradigms based on the above-mentioned approaches is offered as an alternative explanatory account of cognitive techniques. This new approach could overcome the conceptual fragilities of the cognitive standpoint and its dependence upon constructs of dubious logical and scientific validity.
NASA Astrophysics Data System (ADS)
Ferreira, Tiago Miguel; Maio, Rui; Vicente, Romeu
2017-04-01
The capacity of buildings to maintain minimum structural safety levels during natural disasters, such as earthquakes, is recognisably one of the aspects that most influence urban resilience. Moreover, public investment in risk mitigation strategies is fundamental, not only to promote social and urban resilience, but also to limit consequent material, human and environmental losses. Despite growing awareness of this issue, a vast number of traditional masonry buildings spread throughout many old European city centres still lack adequate seismic resistance, and therefore require urgent retrofitting interventions both to reduce their seismic vulnerability and to cope with the increased seismic requirements of recent code standards. This paper thus aims to contribute to mitigating the social and economic impacts of earthquake damage scenarios through a vulnerability-based comparative analysis of some of the most popular retrofitting techniques applied after the 1998 Azores earthquake. The influence of each technique, individually and globally, is studied using a seismic vulnerability index methodology integrated into a GIS tool, and damage and loss scenarios are constructed and critically discussed. Finally, the economic balance resulting from the implementation of these techniques is also examined.
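Vulnerability index methodologies of the kind referred to above typically score a building on a set of parameters (masonry quality, plan regularity, roof type, and so on) and combine the scores into a normalised weighted sum, so that retrofitting lowers the index. The sketch below illustrates only that generic structure; the parameters, class scores and weights are hypothetical, not those of the cited methodology.

```python
def vulnerability_index(scores, weights):
    """Normalised weighted-sum vulnerability index (0 = best, 100 = worst).
    scores: per-parameter class score (e.g. 0/5/20/50 for classes A-D);
    weights: relative importance of each parameter."""
    raw = sum(s * w for s, w in zip(scores, weights))
    worst = 50 * sum(weights)          # every parameter in the worst class
    return 100.0 * raw / worst

# Hypothetical 4-parameter building, before and after retrofitting:
weights = [1.0, 0.5, 1.5, 1.0]
iv_before = vulnerability_index([50, 20, 50, 20], weights)
iv_after = vulnerability_index([20, 5, 20, 5], weights)
```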
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coletti, Chiara, E-mail: chiara.coletti@studenti.u
During the firing of bricks, mineralogical and textural transformations produce an artificial aggregate characterised by significant porosity. Porosity, particularly pore-size distribution and the interconnection model, is an important parameter for evaluating and predicting the durability of bricks. The pore system is in fact the main element connecting building materials to their environment (especially under aggressive weathering, e.g., salt crystallisation and freeze-thaw cycles) and determines their durability. Four industrial bricks with differing compositions and firing temperatures were analysed with “direct” and “indirect” techniques: traditional methods (mercury intrusion porosimetry, hydric tests, nitrogen adsorption) and new analytical approaches based on digital image reconstruction of 2D and 3D models (back-scattered electrons and computerised X-ray micro-tomography, respectively). Comparing the results from the different analytical methods in the “overlapping ranges” of porosity and carefully reconstructing a cumulative curve made it possible to overcome their specific limitations and achieve better knowledge of the pore system of bricks. - Highlights: •Pore-size distribution and structure of the pore system in four commercial bricks •A multi-analytical approach combining “direct” and “indirect” techniques •Traditional methods vs. new approaches based on 2D/3D digital image reconstruction •The use of “overlapping ranges” to overcome the limitations of various techniques.
Zhao, Guang-Hui; Li, Juan; Blair, David; Li, Xiao-Yan; Elsheikha, Hany M; Lin, Rui-Qing; Zou, Feng-Cai; Zhu, Xing-Quan
2012-01-01
Schistosomiasis is a serious parasitic disease caused by blood-dwelling flukes of the genus Schistosoma. Throughout the world, schistosomiasis is associated with high rates of morbidity and mortality, with close to 800 million people at risk of infection. Precise methods for the identification of Schistosoma species and the diagnosis of schistosomiasis are crucial for an enhanced understanding of parasite epidemiology that informs effective antiparasitic treatment and preventive measures. Traditional approaches to the diagnosis of schistosomiasis include etiological, immunological and imaging techniques. Diagnosis of schistosomiasis has been revolutionized by the advent of new molecular technologies to amplify parasite nucleic acids. Among these, polymerase chain reaction-based methods have been useful in the analysis of genetic variation among Schistosoma spp. Mass spectrometry is now extending the range of biological molecules that can be detected. In this review, we summarize traditional, non-DNA-based diagnostic methods and then describe and discuss current and developing molecular techniques for the diagnosis, species differentiation and phylogenetic analysis of Schistosoma spp. These techniques provide foundations for the further development of more effective and precise approaches to differentiate schistosomes and diagnose schistosomiasis in the clinic, and also have important implications for exploring novel measures to control schistosomiasis in the near future. Copyright © 2012 Elsevier Inc. All rights reserved.
Deformation and Fabric in Compacted Clay Soils
NASA Astrophysics Data System (ADS)
Wensrich, C. M.; Pineda, J.; Luzin, V.; Suwal, L.; Kisi, E. H.; Allameh-Haery, H.
2018-05-01
Hydromechanical anisotropy of clay soils in response to deformation or deposition history is related to the micromechanics of platelike clay particles and their orientations. In this article, we examine the relationship between microstructure, deformation, and moisture content in kaolin clay using a technique based on neutron scattering. This technique allows for the direct characterization of microstructure within representative samples using traditional measures such as orientation density and the soil fabric tensor. From this information, evidence for a simple relationship between components of the deviatoric strain tensor and the deviatoric fabric tensor emerges. This relationship may provide a physical basis for future anisotropic constitutive models based on the micromechanics of these materials.
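The fabric tensor mentioned above is commonly defined as the mean dyadic product of particle orientation unit vectors, with anisotropy carried by its deviatoric part. A minimal sketch with synthetic orientations (the bias factors below are illustrative, not measured values):

```python
import numpy as np

def fabric_tensor(normals):
    """Second-order fabric tensor F = mean(n ⊗ n) over unit particle normals."""
    n = np.asarray(normals, dtype=float)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)
    return np.einsum('ki,kj->ij', n, n) / len(n)

def deviatoric(F):
    """Deviatoric (anisotropy-carrying) part of the fabric tensor."""
    return F - np.trace(F) / 3.0 * np.eye(3)

# Synthetic plate-like particles: normals biased towards the vertical (z) axis.
rng = np.random.default_rng(0)
normals = rng.normal(size=(1000, 3)) * np.array([0.3, 0.3, 1.0])
F = fabric_tensor(normals)
D = deviatoric(F)
```

The trace of F is 1 by construction, so all anisotropy information sits in D.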
Hernández-Rodríguez, Patricia; Díaz, César A; Dalmau, Ernesto A; Quintero, Gladys M
2011-01-01
Leptospirosis is caused by Leptospira, Gram-negative spirochaetes whose microbiologic identification is difficult due to their low rate of growth and metabolic activity. In Colombia, leptospirosis diagnosis is achieved by serological techniques without unified criteria for what positive titers are. In this study we compared polymerase chain reaction (PCR) with microbiological culture and dark field microscopy for the diagnosis of leptospirosis. Microbiological and molecular techniques were performed on 83 samples of urine taken from bovines in the savannahs surrounding Bogotá in Colombia, with a presumptive diagnosis of leptospirosis. 117 samples of urine taken from healthy bovines were used as negative controls. 83 samples were MAT positive with titers ≥ 1:50; 81 with titers ≥ 1:100; and 66 with titers ≥ 1:200. 36% of the total samples (73/200) were Leptospira-positive by microbiological culture, 32% (63/200) by dark field microscopy and 37% (74/200) by PCR. Amplicons obtained by PCR were 482 base pairs long, which is Leptospira-specific. An amplicon of 262 base pairs typical of pathogenic Leptospira was observed in 71 out of the 74 PCR-positive samples. The remaining 3 samples showed a 240 base pair amplicon which is typical of saprophytic Leptospira. PCR as a Leptospira diagnostic technique was 100% sensitive and 99% specific in comparison to microbiological culture. A kappa value of 0.99 indicated excellent concordance between these techniques. The sensitivity and specificity reported for MAT when compared to microbiological culture were 0.95 and 0.89, respectively, with a ≥ 1:50 cut-off. PCR was a reliable method for the rapid and precise diagnosis of leptospirosis when compared to traditional techniques in our study. The research presented here will be helpful to improve diagnosis and control of leptospirosis in Colombia and other endemic countries. Copyright © 2010 Elsevier B.V. All rights reserved.
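The reported sensitivity, specificity and kappa follow directly from the 2×2 contingency table. A sketch using counts consistent with the abstract (all 73 culture-positive samples PCR-positive, plus one PCR-positive among the 127 culture-negative samples, out of 200; the exact cell counts are inferred here, not stated in the abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa for a test (PCR)
    against a reference method (microbiological culture)."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n                                   # raw agreement
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_chance) / (1 - p_chance)
    return sensitivity, specificity, kappa

sens, spec, kappa = diagnostic_metrics(tp=73, fp=1, fn=0, tn=126)
```

These counts reproduce the reported values: sensitivity 1.00, specificity ≈ 0.99, kappa ≈ 0.99.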
Li, Wen-Long; Qu, Hai-Bin
2016-10-01
In this paper, the principle of NIRS (near-infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique include: ① in-line collection of process spectra from the different process stages; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important problems of practical application that need urgent solutions are identified, and the application prospects of NIRS-based process trajectory technology are fully discussed at the end. Copyright© by the Chinese Pharmaceutical Association.
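Step ④, monitoring new batches against normal limits, is often implemented with a Hotelling T² statistic on the batch scores. A minimal sketch with synthetic reference batches follows; the data are hypothetical, and the empirical-quantile control limit is a simplification of the F-distribution limits usually used in MSPC.

```python
import numpy as np

def fit_t2(reference):
    """Fit Hotelling T² monitoring on reference batches
    (rows = batches, columns = process variables or PC scores)."""
    mu = reference.mean(axis=0)
    s_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    d = reference - mu
    t2_ref = np.einsum('ij,jk,ik->i', d, s_inv, d)   # T² of each reference batch
    return mu, s_inv, t2_ref

def t2(x, mu, s_inv):
    d = x - mu
    return float(d @ s_inv @ d)

rng = np.random.default_rng(1)
reference = rng.normal(size=(50, 3))            # 50 in-control reference batches
mu, s_inv, t2_ref = fit_t2(reference)
limit = np.quantile(t2_ref, 0.99)               # simple empirical control limit
alarm = t2(np.array([4.0, 4.0, 4.0]), mu, s_inv) > limit   # deviating batch
```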
3D direct writing fabrication of electrodes for electrochemical storage devices
NASA Astrophysics Data System (ADS)
Wei, Min; Zhang, Feng; Wang, Wei; Alexandridis, Paschalis; Zhou, Chi; Wu, Gang
2017-06-01
Among different printing techniques, direct ink writing is commonly used to fabricate 3D battery and supercapacitor electrodes. The major advantages of direct ink writing include effectively building 3D structures for energy storage devices and providing higher power density and higher energy density than traditional techniques, owing to the increased surface area of the electrode. Nevertheless, direct ink writing places high demands on the printing inks, which require high viscosity, high yield stress under shear and compression, and well-controlled viscoelasticity. Recently, a number of 3D-printed energy storage devices have been reported, and it is very important to understand the printing process and the ink preparation process for further material design and technology development. We discuss the current progress of direct ink writing technologies using various electrode materials, including carbon nanotube-based materials, graphene-based materials, LTO (Li4Ti5O12), LFP (LiFePO4), LiMn1-xFexPO4, and Zn-based metallic oxides. Based on the achieved electrochemical performance, these 3D-printed devices deliver performance comparable to energy storage devices fabricated using traditional methods, still leaving large room for further improvement. Finally, perspectives are provided on potential future directions of 3D printing for all-solid-state electrochemical energy storage devices.
Yin, Mengchen; Ma, Junming; Huang, Quan; Xia, Ye; Shen, Qixing; Zhao, Chenglong; Tao, Jun; Chen, Ni; Yu, Zhingxing; Ye, Jie; Mo, Wen; Xiao, Jianru
2016-10-18
The low-profile angle-stable spacer Zero-P is a new kind of cervical fusion system that is claimed to limit the potential drawbacks and complications. The purpose of this meta-analysis was to compare the clinical and radiological results of the new Zero-P implant with those of the traditional anterior cage and plate in the treatment of symptomatic cervical spondylosis, and to provide clinicians with evidence on which to base their clinical decision making. The following electronic databases were searched: Medline, PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, Evidence Based Medicine Reviews, VIP, and CNKI. Conference posters and abstracts were also electronically searched. Efficacy was evaluated in terms of intraoperative time, intraoperative blood loss, fusion rate and dysphagia. For intraoperative time and intraoperative blood loss, the meta-analysis revealed that the Zero-P surgical technique is not superior to the cage and plate technique. For fusion rate, both techniques achieved good bone fusion, and the difference was not statistically significant. For decrease in JOA score and dysphagia, the pooled data showed that the Zero-P surgical technique is superior to the cage and plate technique. Zero-P interbody fusion can attain good clinical efficacy and a satisfactory fusion rate in the treatment of symptomatic cervical spondylosis. It can also effectively reduce the risk of postoperative dysphagia and its complications. However, owing to the lack of long-term follow-up, its long-term efficacy remains unknown.
Neural net diagnostics for VLSI test
NASA Technical Reports Server (NTRS)
Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.
1990-01-01
This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
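The pass/fail decision rule described above can be sketched with a one-hidden-layer feedforward network trained by gradient descent. Everything below is illustrative, not the paper's setup: the "measurement vectors" are synthetic Gaussians (nominal circuits centred at 0, faulty ones shifted), and the architecture and learning rate are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic test measurements: 200 nominal and 200 faulty circuits, 4 features.
X = np.vstack([rng.normal(0.0, 1.0, (200, 4)), rng.normal(2.0, 1.0, (200, 4))])
y = np.hstack([np.zeros(200), np.ones(200)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, trained by full-batch gradient descent on cross-entropy.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0
lr = 0.3
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                 # forward pass
    p = sigmoid(h @ W2 + b2)
    g = (p - y) / len(y)                     # dLoss/dlogit for cross-entropy
    dW2, db2 = h.T @ g, g.sum()
    dh = np.outer(g, W2) * (1.0 - h ** 2)    # backpropagate through tanh
    dW1, db1 = X.T @ dh, dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

accuracy = ((sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5) == y).mean()
```

On this well-separated synthetic data the network learns the fault/no-fault boundary without any explicit distributional assumption, which is the point made in the abstract.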
USDA-ARS's Scientific Manuscript database
Traditional microbiological techniques for estimating populations of viable bacteria can be laborious and time consuming. The Most Probable Number (MPN) technique is especially tedious as multiple series of tubes must be inoculated at several different dilutions. Recently, an instrument (TEMPO™) ...
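The MPN estimate behind those tube counts is the maximum-likelihood solution of a Poisson dilution model: each tube inoculated with volume v is positive with probability 1 − e^(−λv). A sketch solving the score equation by bisection follows; the 3-tube series used as an example is the classic 3-1-0 pattern at 0.1/0.01/0.001 g, for which standard MPN tables list approximately 43 organisms per gram.

```python
import math

def mpn(positives, tubes, volumes, lo=1e-6, hi=1e6):
    """Maximum-likelihood MPN (organisms per unit volume): root of the
    score equation of the Poisson dilution model, found by bisection."""
    def score(lam):
        s = 0.0
        for p, n, v in zip(positives, tubes, volumes):
            if p:
                s += p * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v))
            s -= (n - p) * v
        return s
    for _ in range(200):
        mid = math.sqrt(lo * hi)     # bisect on a log scale
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Three tubes each at 0.1, 0.01 and 0.001 g, with 3, 1 and 0 positive tubes.
est = mpn([3, 1, 0], [3, 3, 3], [0.1, 0.01, 0.001])
```

The maximum-likelihood value (≈ 42.7/g) agrees with the rounded table entry of 43/g.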
Madu, C N; Quint, D J; Normolle, D P; Marsh, R B; Wang, E Y; Pierce, L J
2001-11-01
To delineate with computed tomography (CT) the anatomic regions containing the supraclavicular (SCV) and infraclavicular (IFV) nodal groups, to define the course of the brachial plexus, to estimate the actual radiation dose received by these regions in a series of patients treated in the traditional manner, and to compare these doses to those received with an optimized dosimetric technique. Twenty patients underwent contrast material-enhanced CT for the purpose of radiation therapy planning. CT scans were used to study the location of the SCV and IFV nodal regions by using outlining of readily identifiable anatomic structures that define the nodal groups. The brachial plexus was also outlined by using similar methods. Radiation therapy doses to the SCV and IFV were then estimated by using traditional dose calculations and optimized planning. A repeated measures analysis of covariance was used to compare the SCV and IFV depths and to compare the doses achieved with the traditional and optimized methods. Coverage by the 90% isodose surface was significantly decreased with traditional planning versus conformal planning as the depth to the SCV nodes increased (P < .001). Significantly decreased coverage by using the 90% isodose surface was demonstrated for traditional planning versus conformal planning with increasing IFV depth (P = .015). A linear correlation was found between brachial plexus depth and SCV depth up to 7 cm. Conformal optimized planning provided improved dosimetric coverage compared with standard techniques.
[Soft-ridged bench terrace design in hilly loess region].
Cao, Shixiong; Chen, Li; Gao, Wangsheng
2005-08-01
Reconfiguration of hillside fields into terraces is regarded as one of the key techniques for water and soil conservation in mountainous regions. On slopes exceeding 30 degrees, traditional terracing techniques are difficult to apply, as the risers (i.e., backslopes), if not reinforced, are so steep that they easily collapse under gravity alone, damaging the terrace. To improve the reconfiguration of hillside fields into terraces, holistic techniques of soft-ridged bench terrace engineering, including revegetation with trees and planting of grasses on riser slopes, were tested between 1997 and 2001 in the Xiabiangou watershed of Yan'an, Shaanxi Province. A "working with Nature" engineering approach, with riser slopes of 45 degrees close to the pre-existing slope of 35 degrees, was employed to radically reduce gravity erosion. Based on the concept of biodiversity and the principles of landscape ecology, terrace benches, bunds, and risers were planted with trees, shrubs, forage grasses, and crops, generating a diverse array of plants and a semi-forested area and stabilizing the terrace bunds. Soft-ridged bench terracing made it possible to significantly reduce hazards arising from gravity erosion, and reduced the costs of individual bench construction and maintenance by 24.9% and 55.5%, respectively, relative to traditional techniques. Such construction allowed an enrichment and concentration of nutrients in the soils of terrace bunds, providing an ideal environment for a range of plants to grow and develop. The terrace riser could be planted with drought-resistant plants ranging from forage grasses to trees, and this riser vegetation would turn the exposed bunds and risers of traditional techniques into plant-covered belts: great green ribbons decorating the farmland and enhancing the landscape.
Sakadjian, Alex; Panchuk, Derek; Pearce, Alan J
2014-06-01
This study investigated the effectiveness of action observation (AO) on facilitating learning of the power clean technique (kinematics) compared with traditional strength coaching methods and whether improvements in performance (kinetics) were associated with an improvement in lifting technique. Fifteen subjects (age, 20.9 ± 2.3 years) with no experience in performing the power clean exercise attended 12 training and testing sessions over a 4-week period. Subjects were assigned to 2 matched groups, based on preintervention power clean performance and performed 3 sets of 5 repetitions of the power clean exercise at each training session. Subjects in the traditional coaching group (TC; n = 7) received the standard coaching feedback (verbal cues and physical practice), whereas subjects in the AO group (n = 8) received similar verbal coaching cues and physical practice but also observed a video of a skilled model before performing each set. Kinematic data were collected from video recordings of subjects who were fitted with joint center markings during testing, whereas kinetic data were collected from a weightlifting analyzer attached to the barbell. Subjects were tested before intervention, at the end of weeks 2 and 3, and at after intervention at the end of week 4. Faster improvements (3%) were observed in power clean technique with AO-facilitated learning in the first week and performance improvements (mean peak power of the subject's 15 repetitions) over time were significant (p < 0.001). In addition, performance improvement was significantly associated (R = 0.215) with technique improvements. In conclusion, AO combined with verbal coaching and physical practice of the power clean exercise resulted in significantly faster technique improvements and improvement in performance compared with traditional coaching methods.
The Method of Anschauung: From Johann H. Pestalozzi to Herbert Spencer.
ERIC Educational Resources Information Center
Takaya, Keiichi
2003-01-01
One of the major inventions of modern education is the instructional use of "Anschauung," an experience-based learning technique that was influential both as a method of instruction (more effective than mere book-learning and rote memorization) and as a rejection of old social arrangements that inculcated traditional values through deductive and…
ERIC Educational Resources Information Center
Sadowsky, Cristina L.; McDonald, John W.
2009-01-01
Physical rehabilitation following spinal cord injury-related paralysis has traditionally focused on teaching compensatory techniques, thus enabling the individual to achieve day-to-day function despite significant neurological deficits. But the concept of an irreparable central nervous system (CNS) is slowly being replaced with evidence related to…
Cotton Island: Students' Learning Motivation Using a Virtual World
ERIC Educational Resources Information Center
Wyss, Jamie; Lee, Seung-Eun; Domina, Tanya; MacGillivray, Maureen
2014-01-01
As technology advances, it is important for teachers to seamlessly integrate technology into their innovative teaching techniques. Using virtual worlds is one alternative to traditional teaching methods that can provide rich learning experiences. The purpose of this article is twofold: (a) to present Cotton Island, an avatar-based 3-D virtual…
The Predictive Validity of CBM Writing Indices for Eighth-Grade Students
ERIC Educational Resources Information Center
Amato, Janelle M.; Watkins, Marley W.
2011-01-01
Curriculum-based measurement (CBM) is an alternative to traditional assessment techniques. Technical work has begun to identify CBM writing indices that are psychometrically sound for monitoring older students' writing proficiency. This study examined the predictive validity of CBM writing indices in a sample of 447 eighth-grade students.…
Interactive Exercises for an Introductory Weather and Climate Course
ERIC Educational Resources Information Center
Carbone, Gregory J.; Power, Helen C.
2005-01-01
Students learn more from introductory weather and climate courses when they can relate theoretical material to personal experience. The ubiquity of weather should make the link obvious but instructors can foster this connection with a variety of simple methods. Here we describe traditional and web-based techniques that encourage students to…
Confidence-Based Assessments within an Adult Learning Environment
ERIC Educational Resources Information Center
Novacek, Paul
2013-01-01
Traditional knowledge assessments rely on multiple-choice type questions that only report a right or wrong answer. The reliance within the education system on this technique infers that a student who provides a correct answer purely through guesswork possesses knowledge equivalent to a student who actually knows the correct answer. A more complete…
Cultural Models of Domestic Violence: Perspectives of Social Work and Anthropology Students
ERIC Educational Resources Information Center
Collins, Cyleste C.; Dressler, William W.
2008-01-01
This study employed a unique theoretical approach and a series of participant-based ethnographic interviewing techniques that are traditionally used in cognitive anthropology to examine and compare social work and anthropology students' cultural models of the causes of domestic violence. The study findings indicate that although social work…
Evidence-Based Behavioral Treatment of Dog Phobia with Young Children: Two Case Examples
ERIC Educational Resources Information Center
May, Anna C.; Rudy, Brittany M.; Davis, Thompson E., III; Matson, Johnny L.
2013-01-01
Specific phobias are among the most common anxiety disorders, especially in children. Unfortunately, a paucity of literature exists regarding the treatment of specific phobia in young children, despite the knowledge that traditional techniques (i.e., cognitive-behavioral therapy [CBT]) may not be practical. Therefore, the purpose of this article…
Muhammad, Saqib; Han, Shengli; Xie, Xiaoyu; Wang, Sicen; Aziz, Muhammad Majid
2017-01-01
Cell membrane chromatography is a simple, specific, and time-saving technique for studying drug-receptor interactions, screening active components from complex mixtures, and quality control of traditional Chinese medicines. However, short column life, low sensitivity, low column efficiency (so that mixtures of compounds cannot be resolved satisfactorily), low peak capacity, and inefficiency in structure identification have been bottlenecks in its application. Combining cell membrane chromatography with multidimensional chromatography, such as two-dimensional liquid chromatography, and with high-sensitivity detectors such as mass spectrometry has significantly reduced many of the above-mentioned shortcomings. This paper provides an overview of current advances in online two-dimensional cell membrane chromatography for screening target components from traditional Chinese medicines, with particular emphasis on the instrumentation, the preparation of the cell membrane stationary phase, and the advantages and disadvantages compared to alternative approaches. The last section of the review summarizes the applications of online two-dimensional high-performance liquid chromatography based cell membrane chromatography reported since its emergence to date (2010-June 2016). © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Counting malaria parasites with a two-stage EM-based algorithm using crowdsourced data.
Cabrera-Bean, Margarita; Pages-Zamora, Alba; Diaz-Vilor, Carles; Postigo-Camps, Maria; Cuadrado-Sanchez, Daniel; Luengo-Oroz, Miguel Angel
2017-07-01
Worldwide malaria eradication is currently one of the WHO's main global goals. In this work, we focus on the use of human-machine interaction strategies for low-cost, fast and reliable malaria diagnosis based on a crowdsourced approach. The technical problem addressed consists of detecting spots in images even under very harsh conditions, when positive objects are very similar to some artifacts. The clicks or tags delivered by several annotators labeling an image are modeled as a robust finite mixture, and techniques based on the Expectation-Maximization (EM) algorithm are proposed for accurately counting malaria parasites on thick blood smears obtained by Giemsa-stained microscopy. This approach outperforms other traditional methods, as shown through experimentation with real data.
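The core idea, separating annotator clicks on true parasites from stray clicks on artifacts via EM on a mixture model, can be illustrated in one dimension with a Gaussian-plus-uniform mixture. This is a simplification of the paper's robust mixture: the 1-D setting, positions and counts below are all illustrative.

```python
import numpy as np

def em_spot(x, lo, hi, iters=200):
    """EM for a Gaussian-plus-uniform mixture: clicks on a real parasite
    cluster around its centre; spurious clicks are uniform over the image."""
    mu, sigma, w = np.median(x), np.std(x), 0.5
    u = 1.0 / (hi - lo)                          # uniform background density
    for _ in range(iters):
        g = w * np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = g / (g + (1.0 - w) * u)              # E-step: spot responsibilities
        mu = (r * x).sum() / r.sum()             # M-step: update parameters
        sigma = np.sqrt((r * (x - mu) ** 2).sum() / r.sum())
        w = r.mean()
    return mu, sigma, w

rng = np.random.default_rng(7)
clicks = np.concatenate([rng.normal(30.0, 1.5, 60),    # tags on one parasite
                         rng.uniform(0.0, 100.0, 20)]) # stray clicks on artifacts
mu, sigma, w = em_spot(clicks, 0.0, 100.0)
```

The recovered cluster centre and mixing weight estimate the spot location and the fraction of reliable clicks, which is the quantity needed for robust counting.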
Wavelet-based audio embedding and audio/video compression
NASA Astrophysics Data System (ADS)
Mendenhall, Michael J.; Claypoole, Roger L., Jr.
2001-12-01
Watermarking, traditionally used for copyright protection, is used in a new and exciting way. An efficient wavelet-based watermarking technique embeds audio information into a video signal. Several effective compression techniques are applied to compress the resulting audio/video signal in an embedded fashion. This wavelet-based compression algorithm incorporates bit-plane coding, index coding, and Huffman coding. To demonstrate the potential of this audio embedding and audio/video compression algorithm, we embed an audio signal into a video signal and then compress. Results show that overall compression rates of 15:1 can be achieved. The video signal is reconstructed with a median PSNR of nearly 33 dB. Finally, the audio signal is extracted from the compressed audio/video signal without error.
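The wavelet decomposition underlying such embedding schemes splits a signal into approximation and detail coefficients; audio bits are then typically hidden by perturbing selected detail coefficients before the bit-plane and entropy coding stages. A minimal single-level sketch using the orthonormal Haar filter (a generic stand-in, not the specific filter bank of the paper):

```python
import numpy as np

def haar_forward(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] + x[1::2]) / np.sqrt(2.0), (x[0::2] - x[1::2]) / np.sqrt(2.0)

def haar_inverse(approx, detail):
    """Perfect reconstruction from one Haar level."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

signal = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
approx, detail = haar_forward(signal)
reconstructed = haar_inverse(approx, detail)
```

Because the transform is orthonormal it preserves signal energy exactly, so small perturbations of the detail coefficients translate into equally small distortions of the reconstructed signal.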
Wong, Alex K; Davis, Gabrielle B; Nguyen, T JoAnna; Hui, Kenneth J W S; Hwang, Brian H; Chan, Linda S; Zhou, Zhao; Schooler, Wesley G; Chandrasekhar, Bala S; Urata, Mark M
2014-07-01
Traditional visualization techniques in microsurgery require strict positioning in order to maintain the field of visualization. However, static posturing over time may lead to musculoskeletal strain and injury. Three-dimensional high-definition (3DHD) visualization technology may be a useful adjunct to limiting static posturing and improving ergonomics in microsurgery. In this study, we aimed to investigate the benefits of using the 3DHD technology over traditional techniques. A total of 14 volunteers consisting of novice and experienced microsurgeons performed femoral anastomoses on male Sprague-Dawley retired breeder rats using traditional techniques as well as the 3DHD technology and compared the two techniques. Participants subsequently completed a questionnaire regarding their preference in terms of operational parameters, ergonomics, overall quality, and educational benefits. Efficiency was also evaluated by mean times to complete the anastomosis with each technique. A total of 27 anastomoses were performed, 14 of 14 using the traditional microscope and 13 of 14 using the 3DHD technology. Preference toward the traditional modality was noted with respect to the parameters of precision, field adjustments, zoom and focus, depth perception, and overall quality. The 3DHD technique was preferred for improved stamina and less back and eye strain. Participants believed that the 3DHD technique was the better method for learning microsurgery. Longer mean time of anastomosis completion was noted in participants utilizing the 3DHD technique. The 3DHD technology may prove to be valuable in improving proper ergonomics in microsurgery. In addition, it may be useful in medical education when applied to the learning of new microsurgical skills. More studies are warranted to determine its efficacy and safety in a clinical setting. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
What defines an Expert? - Uncertainty in the interpretation of seismic data
NASA Astrophysics Data System (ADS)
Bond, C. E.
2008-12-01
Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice, and expert witness testimonies. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences, expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based for example on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here, traditional indicators of expertise have been compared for their significance for effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of a seismic interpretation exercise, the techniques used to interpret the seismic image, and the participants' prior experience have been combined and analysed to answer the question: who is and what defines an expert?
Kopp, Sandra L; Smith, Hugh M
2011-01-01
Little is known about the use of Web-based education in regional anesthesia training. Benefits of Web-based education include the ability to standardize learning material quality and content, build appropriate learning progressions, use interactive multimedia technologies, and individualize delivery of course materials. The goals of this investigation were (1) to determine whether module design influences regional anesthesia knowledge acquisition, (2) to characterize learner preference patterns among anesthesia residents, and (3) to determine whether learner preferences play a role in knowledge acquisition. Direct comparison of knowledge assessments, learning styles, and learner preferences was made between an interactive case-based and a traditional textbook-style module design. Forty-three Mayo Clinic anesthesiology residents completed 2 online modules, a knowledge pretest, a posttest, an Index of Learning Styles assessment, and a participant satisfaction survey. Interscalene and lumbar plexus regional techniques were selected as the learning content for 4 Web modules constructed using the Blackboard Vista coursework application. One traditional textbook-style module and one interactive case-based module were designed for each of the interscalene and lumbar plexus techniques. Participants scored higher on the postmodule knowledge assessment for both the interscalene and lumbar plexus modules. Postmodule knowledge performance scores were independent of both module design (interactive case-based versus traditional textbook style) and learning style preferences. However, nearly all participants reported a preference for Web-based learning and believed that it should be used in anesthesia resident education. Participants did not feel that Web-based learning should replace the current lecture-based curriculum. All residents scored higher on the postmodule knowledge assessment, but this improvement was independent of the module design and individual learning styles.
Although residents believe that online learning should be used in anesthesia training, the results of this study do not demonstrate improved learning or justify the time and expense of developing complex case-based training modules. While there may be practical benefits of Web-based education, educators in regional anesthesia should be cautious about developing curricula based on learner preference data.
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Lu, J.; Hovorka, S. D.; Freifeld, B. M.; Islam, A.
2015-12-01
Monitoring techniques capable of deep subsurface detection are desirable for early warning and leakage pathway identification in geologic carbon storage formations. This work investigates the feasibility of a leakage detection technique based on pulse testing, a traditional hydrogeological characterization tool. In pulse testing, the monitored reservoir is stimulated at a fixed frequency and the acquired pressure perturbation signals are analyzed in the frequency domain to detect potential deviations in the reservoir's frequency response function. Unlike traditional time-domain analyses, the frequency-domain analysis aims to minimize the interference of reservoir noise by imposing coded injection patterns such that the reservoir responses to injection can be uniquely determined. We established the theoretical basis of the approach in previous work. Recently, field validation of this pressure-based leakage detection technique was conducted at a CO2-EOR site located in Mississippi, USA. During the demonstration, two sets of experiments were performed using 90-min and 150-min pulsing periods, for scenarios both with and without a leak. Because no pre-existing leakage pathways were present, a CO2 leak was simulated by rate-controlled venting from one of the monitoring wells. Our results show that leakage events caused a significant deviation in the amplitude of the frequency response function, indicating that pulse testing may be used as a cost-effective monitoring technique with strong potential for automation.
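The frequency-domain analysis described above reduces, at its core, to reading off the amplitude of the pressure response at the known pulsing frequency and watching for deviations between baseline and leak scenarios. A minimal sketch (illustrative function; the actual field analysis involves coded injection patterns and a full response-function estimate):

```python
import numpy as np

def response_amplitude(pressure, fs, f_pulse):
    """Amplitude of the measured response at the pulsing frequency,
    taken from the DFT bin closest to f_pulse.

    pressure : sampled pressure record
    fs       : sampling frequency
    f_pulse  : known stimulation (pulsing) frequency
    """
    n = len(pressure)
    spec = np.fft.rfft(pressure - np.mean(pressure)) / (n / 2)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return np.abs(spec[np.argmin(np.abs(freqs - f_pulse))])
```

Comparing this amplitude between a baseline test and a later test at the same pulsing period flags a leak as a drop (or shift) in the response, while off-frequency reservoir noise is largely rejected.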
Feed-forward frequency offset estimation for 32-QAM optical coherent detection.
Xiao, Fei; Lu, Jianing; Fu, Songnian; Xie, Chenhui; Tang, Ming; Tian, Jinwen; Liu, Deming
2017-04-17
Due to the non-rectangular distribution of the constellation points, traditional fast Fourier transform based frequency offset estimation (FFT-FOE) is no longer suitable for 32-QAM signals. Here, we report a modified FFT-FOE technique that selects and digitally amplifies the inner QPSK ring of 32-QAM after adaptive equalization, which we define as QPSK-selection-assisted FFT-FOE. Simulation results show that no FOE error occurs with an FFT size of only 512 symbols when the signal-to-noise ratio (SNR) is above 17.5 dB using the proposed technique, whereas the error probability of the traditional FFT-FOE scheme for 32-QAM remains intolerable. Finally, the proposed FOE scheme functions well for a 10 Gbaud dual-polarization (DP)-32-QAM signal reaching the 20% forward error correction (FEC) threshold of BER = 2×10^-2 under back-to-back (B2B) transmission.
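The core estimator that the modified scheme builds on is the classic 4th-power FFT-FOE: raising QPSK-like symbols to the 4th power strips the phase modulation, leaving a spectral tone at four times the frequency offset. A minimal sketch, assuming the QPSK-ring selection and amplification steps have already been applied (function name and interface are illustrative):

```python
import numpy as np

def fft_foe_qpsk(symbols, baud_rate, nfft=512):
    """4th-power FFT frequency-offset estimator for QPSK-like symbols.

    symbols^4 removes the quaternary phase modulation, so |FFT(s^4)|
    peaks at 4x the carrier frequency offset (range: +/- baud/8)."""
    s4 = symbols[:nfft] ** 4
    spec = np.abs(np.fft.fft(s4, nfft))
    k = np.argmax(spec)
    if k > nfft // 2:                 # map FFT bin to a signed frequency
        k -= nfft
    return k * baud_rate / (4.0 * nfft)
```

With a 512-point FFT the bin spacing limits resolution to baud/(4·512), which is why the abstract's observation that 512 symbols suffice at SNR above 17.5 dB is a statement about peak reliability, not about resolution.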
Iodine Absorption Cells Purity Testing.
Hrabina, Jan; Zucco, Massimo; Philippe, Charles; Pham, Tuan Minh; Holá, Miroslava; Acef, Ouali; Lazar, Josef; Číp, Ondřej
2017-01-06
This article deals with the evaluation of the chemical purity of iodine-filled absorption cells and the optical frequency references used for the frequency locking of laser standards. We summarize the recent trends and progress in absorption cell technology and we focus on methods for iodine cell purity testing. We compare two independent experimental systems based on the laser-induced fluorescence method, showing an improvement of measurement uncertainty by introducing a compensation system reducing unwanted influences. We show the advantages of this technique, which is relatively simple and does not require extensive hardware equipment. As an alternative to the traditionally used methods we propose an approach of hyperfine transitions' spectral linewidth measurement. The key characteristic of this method is demonstrated on a set of testing iodine cells. The relationship between laser-induced fluorescence and transition linewidth methods will be presented as well as a summary of the advantages and disadvantages of the proposed technique (in comparison with traditional measurement approaches).
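The proposed linewidth-based purity test reduces to estimating the full width at half maximum (FWHM) of a recorded hyperfine-transition profile, since contaminants broaden the observed line. A minimal sketch of an FWHM estimator (illustrative function, not the authors' measurement system):

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile y(x),
    using linear interpolation at the two half-maximum crossings."""
    y = np.asarray(y, float) - np.min(y)       # remove constant baseline
    half = np.max(y) / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    # interpolate the left and right crossings of the half-maximum level
    xl = np.interp(half, [y[i - 1], y[i]], [x[i - 1], x[i]])
    xr = np.interp(half, [y[j + 1], y[j]], [x[j + 1], x[j]])
    return xr - xl
```

For a Lorentzian profile with half-width gamma, the estimator should return close to 2·gamma, which is the check used below.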
Delineation of fault zones using imaging radar
NASA Technical Reports Server (NTRS)
Toksoz, M. N.; Gulen, L.; Prange, M.; Matarese, J.; Pettengill, G. H.; Ford, P. G.
1986-01-01
The assessment of earthquake hazards and mineral and oil potential of a given region requires a detailed knowledge of geological structure, including the configuration of faults. Delineation of faults is traditionally based on three types of data: (1) seismicity data, which show the location and magnitude of earthquake activity; (2) field mapping, which in remote areas is typically incomplete and of insufficient accuracy; and (3) remote sensing, including LANDSAT images and high-altitude photography. Recently, high-resolution radar images of tectonically active regions have been obtained by SEASAT and Shuttle Imaging Radar (SIR-A and SIR-B) systems. These radar images are sensitive to terrain slope variations and emphasize the topographic signatures of fault zones. Techniques were developed for using the radar data in conjunction with the traditional types of data to delineate major faults in well-known test sites, and to extend interpretation techniques to remote areas.
Podsakoff, Nathan P; Podsakoff, Philip M; Mackenzie, Scott B; Klinger, Ryan L
2013-01-01
Several researchers have persuasively argued that the most important evidence to consider when assessing construct validity is whether variations in the construct of interest cause corresponding variations in the measures of the focal construct. Unfortunately, the literature provides little practical guidance on how researchers can go about testing this. Therefore, the purpose of this article is to describe how researchers can use video techniques to test whether their scales measure what they purport to measure. First, we discuss how researchers can develop valid manipulations of the focal construct that they hope to measure. Next, we explain how to design a study to use this manipulation to test the validity of the scale. Finally, comparing and contrasting traditional and contemporary perspectives on validation, we discuss the advantages and limitations of video-based validation procedures. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Reyes, Camilo; Mason, Eric; Solares, C. Arturo
2014-01-01
Introduction A substantial body of literature has been devoted to the distinct characteristics and surgical options to repair the skull base. However, the skull base is an anatomically challenging location that requires a three-dimensional reconstruction approach. Furthermore, advances in endoscopic skull base surgery encompass a wide range of surgical pathology, from benign tumors to sinonasal cancer. This has resulted in the creation of wide defects that yield a new challenge in skull base reconstruction. Progress in technology and imaging has made this approach an internationally accepted method to repair these defects. Objectives Discuss historical developments and flaps available for skull base reconstruction. Data Synthesis Free grafts in skull base reconstruction are a viable option in small defects and low-flow leaks. Vascularized flaps pose a distinct advantage in large defects and high-flow leaks. When open techniques are used, free flap reconstruction techniques are often necessary to repair large entry wound defects. Conclusions Reconstruction of skull base defects requires a thorough knowledge of surgical anatomy, disease, and patient risk factors associated with high-flow cerebrospinal fluid leaks. Various reconstruction techniques are available, from free tissue grafting to vascularized flaps. Possible complications that can arise after these procedures need to be considered. Although endonasal techniques are being used with increasing frequency, open techniques are still necessary in selected cases. PMID:25992142
The design and implementation of hydrographical information management system (HIMS)
NASA Astrophysics Data System (ADS)
Sui, Haigang; Hua, Li; Wang, Qi; Zhang, Anming
2005-10-01
With the development of hydrographical work and information techniques, a large variety of hydrographical information, including electronic charts, documents, and other materials, is widely used, and the traditional management mode and techniques have become unsuitable for the development of the Chinese Marine Safety Administration Bureau (CMSAB). How to manage all kinds of hydrographical information has become an important and urgent problem. A number of advanced techniques, including GIS, RS, spatial database management, and VR techniques, are introduced to solve these problems. Some design principles and key techniques of the HIMS are illustrated in detail, including a mixed mode based on B/S, C/S, and stand-alone computer modes; multi-source and multi-scale data organization and management; multi-source data integration and diverse visualization of digital charts; and efficient security control strategies. Based on the above ideas and strategies, an integrated system named the Hydrographical Information Management System (HIMS) was developed. The HIMS has been applied in the Shanghai Marine Safety Administration Bureau and has received good evaluations.
Forensic identification of resampling operators: A semi non-intrusive approach.
Cao, Gang; Zhao, Yao; Ni, Rongrong
2012-03-10
Recently, several new resampling operators have been proposed that successfully invalidate existing resampling detectors. However, the reliability of such anti-forensic techniques is unknown and needs to be investigated. In this paper, we focus on the forensic identification of digital image resampling operators, including both the traditional type and the anti-forensic type that hides the traces of traditional resampling. Various resampling algorithms, including geometric distortion (GD)-based, dual-path-based, and postprocessing-based methods, are investigated. The identification is achieved in a semi non-intrusive manner, supposing the resampling software can be accessed. Given a monotone signal as the input pattern, the polarity aberration of the GD-based resampled signal's first derivative is analyzed theoretically and measured by an effective feature metric. Dual-path-based and postprocessing-based resampling can also be identified by feeding proper test patterns. Experimental results on various parameter settings demonstrate the effectiveness of the proposed approach. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
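The monotone-input idea above can be sketched in one dimension: an ordinary interpolating resampler keeps a monotone ramp monotone, while a GD-based resampler that jitters sampling positions flips the sign of the first derivative. This is an illustrative simplification (the paper's feature metric is more elaborate; the function name is ours):

```python
import numpy as np

def polarity_aberrations(signal):
    """Count sign flips in the first derivative of a signal.
    A monotone input pattern should yield zero flips; jittered
    (GD-style) sampling positions introduce polarity aberrations."""
    d = np.diff(signal)
    s = np.sign(d[d != 0])                    # ignore flat segments
    return int(np.count_nonzero(s[1:] != s[:-1]))
```

Feeding a ramp through the resampler under test and counting aberrations thus separates plain interpolation (zero flips) from geometric-distortion resampling (many flips).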
Massicotte, Richard; Mafu, Akier A.; Ahmad, Darakhshan; Deshaies, Francis; Pichette, Gilbert; Belhumeur, Pierre
2017-01-01
The present study was undertaken to compare the use of flow cytometry (FCM) and traditional culture methods for efficacy assessment of six disinfectants used in Quebec hospitals, including two quaternary ammonium-based, two activated hydrogen peroxide-based, one phenol-based, and one sodium hypochlorite-based. Four nosocomial bacterial species, Escherichia coli, Staphylococcus aureus, Pseudomonas aeruginosa, and vancomycin-resistant Enterococcus faecalis, were exposed to minimum lethal concentrations (MLCs) and sublethal concentrations (1/2 MLCs) of the disinfectants under study. The results showed a strong correlation between the two techniques for the presence of dead and live cell populations, as well as evidence of injured populations with the FCM. The only exception was observed with sodium hypochlorite at higher concentrations, where fluorescence was diminished and the dead cell population was underestimated. The results also showed that FCM can replace traditional microbiological methods to study disinfectant efficacy on bacteria. Furthermore, FCM profiles for E. coli and E. faecalis cells exposed to sublethal concentrations exhibited distinct populations of injured cells, opening a new avenue for future research to elucidate the role of injured, culturable/nonculturable/resuscitable cell populations in infection control. PMID:28217115
X-ray microtomography-based measurements of meniscal allografts.
Mickiewicz, P; Binkowski, M; Bursig, H; Wróbel, Z
2015-05-01
X-ray microcomputed tomography (XMT) is a technique widely used to image hard and soft tissues. Meniscal allografts, as collagen structures, can be imaged and analyzed using XMT. The aim of this study was to present an XMT scanning protocol that can be used to obtain the 3D geometry of menisci. It was further applied to compare two methods of meniscal allograft measurement: traditional (based on manual measurement) and novel (based on digital measurement of 3D models of menisci obtained with an XMT scanner). XMT-based meniscus measurement is a reliable method for assessing the geometry of a meniscal allograft by measuring the basic meniscal dimensions known from the traditional protocol. Thirteen dissected menisci were measured according to the same principles traditionally applied in a tissue bank. Next, the same specimens were scanned by a laboratory scanner in the XMT Lab. The images were processed to obtain a 3D mesh. 3D models of allograft geometry were then measured using a novel protocol supported by computer software. Both sets of measurements were then compared using statistical tests. The results showed significant differences (P<0.05) between the lengths of the medial and lateral menisci measured in the tissue bank and the XMT Lab. Medial meniscal widths were also significantly different (P<0.05). Differences in meniscal lengths may result from difficulties in measuring dissected menisci in tissue banks, and may be related to the elastic structure of the dissected meniscus. Errors may also be caused by the lack of highlighted landmarks on the meniscal surface in this study. XMT may be a good technique for assessing meniscal dimensions without actually touching the specimen. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
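A comparison of two measurement protocols applied to the same specimens, as above, is naturally a paired test. A minimal sketch of the paired t statistic (illustrative; the abstract does not state which statistical tests were actually used):

```python
import numpy as np

def paired_t_statistic(a, b):
    """Paired t statistic for two measurement protocols applied to the
    same specimens (e.g. tissue-bank manual vs XMT-based digital)."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))
```

For 13 specimens (12 degrees of freedom), a |t| above roughly 2.18 corresponds to a two-sided P<0.05.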
Semiotics and agents for integrating and navigating through multimedia representations of concepts
NASA Astrophysics Data System (ADS)
Joyce, Dan W.; Lewis, Paul H.; Tansley, Robert H.; Dobie, Mark R.; Hall, Wendy
1999-12-01
The purpose of this paper is two-fold. We begin by exploring the emerging trend to view multimedia information in terms of low-level and high-level components; the former being feature-based and the latter the 'semantics' intrinsic to what is portrayed by the media object. Traditionally, this has been approached by employing analogies with generative linguistics. Recently, a new perspective based on the semiotic tradition has been alluded to in several papers. We believe this to be a more appropriate approach. Building on it, we propose an approach for tackling this problem that uses an associative data structure expressing authored information together with intelligent agents acting autonomously over this structure. We then show how neural networks can be used to implement such agents. The agents act as 'vehicles' for bridging the gap between multimedia semantics and concrete expressions of high-level knowledge, but we suggest that traditional neural network techniques for classification are not architecturally adequate.
Measuring suspended sediment: Chapter 10
Gray, J.R.; Landers, M.N.
2013-01-01
Suspended sediment in streams and rivers can be measured using traditional instruments and techniques and (or) surrogate technologies. The former, as described herein, consists primarily of both manually deployed isokinetic samplers and their deployment protocols developed by the Federal Interagency Sedimentation Project. They are used on all continents other than Antarctica. The reliability of the typically spatially rich but temporally sparse data produced by traditional means is supported by a broad base of scientific literature since 1940. However, the suspended sediment surrogate technologies described herein – based on hydroacoustic, nephelometric, laser, and pressure difference principles – tend to produce temporally rich but in some cases spatially sparse datasets. The value of temporally rich data in the accuracy of continuous sediment-discharge records is hard to overstate, in part because such data can often overcome the shortcomings of poor spatial coverage. Coupled with calibration data produced by traditional means, surrogate technologies show considerable promise toward providing the fluvial sediment data needed to increase and bring more consistency to sediment-discharge measurements worldwide.
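The coupling of surrogate records with traditional calibration data mentioned above is, in its simplest form, a regression of concurrent traditionally sampled concentrations against the surrogate signal. A minimal sketch (illustrative linear calibration; operational sediment-surrogate ratings are often nonlinear and log-transformed):

```python
import numpy as np

def calibrate_surrogate(surrogate, ssc):
    """Least-squares calibration of a surrogate record (e.g. turbidity
    or acoustic backscatter) against concurrent traditionally sampled
    suspended-sediment concentrations; returns surrogate -> SSC."""
    slope, intercept = np.polyfit(surrogate, ssc, 1)
    return lambda x: slope * np.asarray(x, float) + intercept
```

Once fitted on the sparse traditional samples, the returned function converts the temporally rich surrogate time series into a continuous sediment-concentration record.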
USDA-ARS?s Scientific Manuscript database
Recently, an instrument (TEMPO™) has been developed to automate the Most Probable Number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique to traditional microbiological plating methods or Petrifilm™ for estimating the t...
Westermann, Robert W; DeBerardino, Thomas; Amendola, Annunziato
2014-01-01
Introduction The High Tibial Osteotomy (HTO) is a reliable procedure for addressing unicompartmental arthritis with associated coronal deformities. With osteotomy of the proximal tibia, there is a risk of altering the tibial slope in the sagittal plane. Surgical techniques continue to evolve, with trends toward procedure reproducibility and simplification. We evaluated a modification of the Arthrex iBalance technique in 18 paired cadaveric knees with the goals of maintaining sagittal slope, increasing procedure efficiency, and decreasing use of intraoperative fluoroscopy. Methods Nine paired cadaveric knees (18 legs) underwent iBalance medial opening wedge high tibial osteotomies. In each pair, the right knee underwent an HTO using the modified technique, while all left knees underwent the traditional technique. Independent observers evaluated postoperative factors including tibial slope, placement of the hinge pin, and implant placement. Specimens were then dissected to evaluate for any gross muscle, nerve, or vessel injury. Results Changes to posterior tibial slope were similar with each technique. The change in slope with the traditional iBalance technique was -0.3° ± 2.3°, and with the modified technique it was -0.4° ± 2.3° (p=0.29). Furthermore, we detected no differences in posterior tibial slope between preoperative and postoperative specimens (p=0.74 traditional, p=0.75 modified). No differences in implant placement were detected between the traditional and modified techniques (p=0.85). No intraoperative iatrogenic complications (i.e., lateral cortex fracture, blood vessel or nerve injury) were observed in either group after gross dissection. Discussion & Conclusions Alterations in posterior tibial slope are associated with HTOs. Both the traditional and modified iBalance techniques appear reliable in coronal plane corrections without changing posterior tibial slope. 
The present modification of the Arthrex iBalance technique may increase the efficiency of the operation and decrease radiation exposure to patients without compromising implant placement or global knee alignment. PMID:25328454
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
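The MCDM step in such a trade study can be sketched as a weighted-sum score over normalized criteria. All numbers below are invented for illustration (the abstract does not report scores or weights), and the function is a generic weighted-sum model, not the RAMSoS-RESIL method itself:

```python
import numpy as np

def weighted_sum_score(scores, weights):
    """Weighted-sum MCDM score.  scores: rows = alternatives,
    columns = normalized criteria (higher is better); weights are
    normalized to sum to one before combining."""
    w = np.asarray(weights, float)
    return np.asarray(scores, float) @ (w / w.sum())

# Hypothetical normalized (performance, risk, cost) scores for a
# distributed vs a monolithic UAV architecture, with illustrative weights.
scores = [[0.9, 0.8, 0.5],   # distributed
          [0.6, 0.4, 0.9]]   # monolithic
totals = weighted_sum_score(scores, [0.5, 0.3, 0.2])
```

Under these made-up inputs the distributed architecture scores 0.79 against 0.60 for the monolithic one; in a real study the weights would come from stakeholder elicitation and the scores from the mission models.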
Applications of artificial intelligence V; Proceedings of the Meeting, Orlando, FL, May 18-20, 1987
NASA Technical Reports Server (NTRS)
Gilmore, John F. (Editor)
1987-01-01
The papers contained in this volume focus on current trends in applications of artificial intelligence. Topics discussed include expert systems, image understanding, artificial intelligence tools, knowledge-based systems, heuristic systems, manufacturing applications, and image analysis. Papers are presented on expert system issues in automated, autonomous space vehicle rendezvous; traditional versus rule-based programming techniques; applications to the control of optional flight information; methodology for evaluating knowledge-based systems; and real-time advisory system for airborne early warning.
Atkins, Stephen J; Bentley, Ian; Brooks, Darrell; Burrows, Mark P; Hurst, Howard T; Sinclair, Jonathan K
2015-06-01
Core stability training traditionally uses stable-base techniques. Less is known as to the use of unstable-base techniques, such as suspension training, to activate core musculature. This study sought to assess the neuromuscular activation of global core stabilizers when using suspension training techniques, compared with more traditional forms of isometric exercise. Eighteen elite level, male youth swimmers (age, 15.5 ± 2.3 years; stature, 163.3 ± 12.7 cm; body mass, 62.2 ± 11.9 kg) participated in this study. Surface electromyography (sEMG) was used to determine the rate of muscle contraction in postural musculature, associated with core stability and torso bracing (rectus abdominus [RA], external obliques [EO], erector spinae [ES]). A maximal voluntary contraction test was used to determine peak amplitude for all muscles. Static bracing of the core was achieved using a modified "plank" position, with and without a Swiss ball, and held for 30 seconds. A mechanically similar "plank" was then held using suspension straps. Analysis of sEMG revealed that suspension produced higher peak amplitude in the RA than using a prone or Swiss ball "plank" (p = 0.04). This difference was not replicated in either the EO or ES musculature. We conclude that suspension training noticeably improves engagement of anterior core musculature when compared with both lateral and posterior muscles. Further research is required to determine how best to activate both posterior and lateral musculature when using all forms of core stability training.
GROUND WATER MONITORING AND SAMPLING: MULTI-LEVEL VERSUS TRADITIONAL METHODS – WHAT’S WHAT?
Recent studies have been conducted to evaluate different sampling techniques for determining VOC concentrations in groundwater. Samples were obtained using multi-level and traditional sampling techniques in three monitoring wells at the Raymark Superfund site in Stratford, CT. Ve...
Lepper, Paul A; D'Spain, Gerald L
2007-08-01
The performance of traditional techniques of passive localization in ocean acoustics, such as time-of-arrival (phase differences) and amplitude ratios measured by multiple receivers, may be degraded when the receivers are placed on an underwater vehicle due to the effects of scattering. However, knowledge of the interference pattern caused by scattering provides a potential enhancement to traditional source localization techniques. Results based on a study using data from a multi-element receiving array mounted on the inner shroud of an autonomous underwater vehicle show that scattering causes the localization ambiguities (side lobes) to decrease in overall level and to move closer to the true source location, thereby improving localization performance, for signals in the frequency band 2-8 kHz. These measurements are compared with numerical modeling results from a two-dimensional time-domain finite difference scheme for scattering from two fluid-loaded cylindrical shells. Measured and numerically modeled results are presented for multiple source aspect angles and frequencies. Matched field processing techniques quantify the source localization capabilities for both measurements and numerical modeling output.
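The time-of-arrival ingredient of such passive localization can be sketched as a cross-correlation delay estimate between two receiver channels (an illustrative building block only; the study itself uses matched field processing over a multi-element array):

```python
import numpy as np

def tdoa_samples(x, y):
    """Time difference of arrival (in samples) of channel x relative
    to channel y, from the peak of their full cross-correlation."""
    c = np.correlate(x, y, mode="full")
    # for equal-length inputs, index len(y)-1 corresponds to zero lag
    return int(np.argmax(c)) - (len(y) - 1)
```

With known receiver geometry, a set of such pairwise delays (times the sound speed) constrains the source bearing and range; scattering from the vehicle body perturbs these delays, which is the degradation the abstract describes.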
Adaptive x-ray optics development at AOA-Xinetics
NASA Astrophysics Data System (ADS)
Lillie, Charles F.; Cavaco, Jeff L.; Brooks, Audrey D.; Ezzo, Kevin; Pearson, David D.; Wellman, John A.
2013-05-01
Grazing-incidence optics for X-ray applications require extremely smooth surfaces with precise mirror figures to provide well focused beams and small image spot sizes for astronomical telescopes and laboratory test facilities. The required precision has traditionally been achieved by time-consuming grinding and polishing of thick substrates with frequent pauses for precise metrology to check the mirror figure. More recently, substrates with high quality surface finish and figures have become available at reasonable cost, and techniques have been developed to mechanically adjust the figure of these traditionally polished substrates for ground-based applications. The beam-bending techniques currently in use are mechanically complex, however, with little control over mid-spatial frequency errors. AOA-Xinetics has been developing techniques for shaping grazing incidence optics with surface-normal and surface-parallel electrostrictive lead magnesium niobate (PMN) actuators bonded to mirror substrates for several years. These actuators are highly reliable; exhibit little to no hysteresis, aging or creep; and can be closely spaced to correct low and mid-spatial frequency errors in a compact package. In this paper we discuss recent development of adaptive x-ray optics at AOA-Xinetics.
Adaptive x-ray optics development at AOA-Xinetics
NASA Astrophysics Data System (ADS)
Lillie, Charles F.; Pearson, David D.; Cavaco, Jeffrey L.; Plinta, Audrey D.; Wellman, John A.
2012-10-01
Grazing-incidence optics for X-ray applications require extremely smooth surfaces with precise mirror figures to provide well focused beams and small image spot sizes for astronomical telescopes and laboratory test facilities. The required precision has traditionally been achieved by time-consuming grinding and polishing of thick substrates with frequent pauses for precise metrology to check the mirror figure. More recently, substrates with high quality surface finish and figures have become available at reasonable cost, and techniques have been developed to mechanically adjust the figure of these traditionally polished substrates for ground-based applications. The beam-bending techniques currently in use are mechanically complex, however, with little control over mid-spatial frequency errors. AOA-Xinetics has been developing techniques for shaping grazing incidence optics with surface-normal and surface-parallel electrostrictive lead magnesium niobate (PMN) actuators bonded to mirror substrates for several years. These actuators are highly reliable; exhibit little to no hysteresis, aging or creep; and can be closely spaced to correct low and mid-spatial frequency errors in a compact package. In this paper we discuss recent development of adaptive x-ray optics at AOA-Xinetics.
Onay, Ulaş; Akpınar, Sercan; Akgün, Rahmi Can; Balçık, Cenk; Tuncay, Ismail Cengiz
2013-01-01
The aim of this study was to compare new knotless single-row and double-row suture anchor techniques with traditional transosseous suture techniques for different sized rotator cuff tears in an animal model. The study included 56 cadaveric sheep shoulders. Supraspinatus cuff tears of 1 cm repaired with the new knotless single-row suture anchor technique and supraspinatus and infraspinatus rotator cuff tears of 3 cm repaired with the double-row suture anchor technique were compared to traditional transosseous suture techniques and control groups. The repaired tendons were loaded at a static velocity of 5 mm/min with a 2.5 kN load cell in an Instron 8874 machine until repair failure. The 1 cm transosseous group was statistically superior to the 1 cm control group (p=0.021, p<0.05) and the 3 cm SpeedBridge group was statistically superior to the 1 cm SpeedFix group (p=0.012, p<0.05). The differences between the other groups were not statistically significant. No significant difference was found between the new knotless suture anchor techniques and traditional transosseous suture techniques.
Durant, Nefertiti H; Joseph, Rodney P; Cherrington, Andrea; Cuffee, Yendelela; Knight, BernNadette; Lewis, Dwight; Allison, Jeroan J
2014-01-16
Innovative approaches are needed to promote physical activity among young adult overweight and obese African American women. We sought to describe key elements that African American women desire in a culturally relevant Internet-based tool to promote physical activity among overweight and obese young adult African American women. A mixed-method approach combining nominal group technique and traditional focus groups was used to elicit recommendations for the development of an Internet-based physical activity promotion tool. Participants, ages 19 to 30 years, were enrolled in a major university. Nominal group technique sessions were conducted to identify themes viewed as key features for inclusion in a culturally relevant Internet-based tool. Confirmatory focus groups were conducted to verify and elicit more in-depth information on the themes. Twenty-nine women participated in nominal group (n = 13) and traditional focus group sessions (n = 16). Features that emerged to be included in a culturally relevant Internet-based physical activity promotion tool were personalized website pages, diverse body images on websites and in videos, motivational stories about physical activity and women similar to themselves in size and body shape, tips on hair care maintenance during physical activity, and online social support through social media (eg, Facebook, Twitter). Incorporating existing social media tools and motivational stories from young adult African American women in Internet-based tools may increase the feasibility, acceptability, and success of Internet-based physical activity programs in this high-risk, understudied population.
Ultra-sensitive Chip-based Photonic Temperature Sensor Using Ring Resonator Structures
2014-02-10
273.15 K to 373 K [15]. An optical analog of this, using infrared light to probe strain-free fiber Bragg gratings (FBG), exhibits temperature...sensors [9, 12, 13]. However, FBGs are susceptible to strain and are relatively large. Instead, we consider the use of ring resonators. In recent years...Traditionally, photonic thermometers such as those based on fiber Bragg gratings (FBG) employ continuous wavelength scanning techniques to measure
Ardley, Nicholas D; Lau, Ken K; Buchan, Kevin
2013-12-01
Cervical spine injuries occur in 4-8 % of adults with head trauma. A dual acquisition technique has traditionally been used for CT scanning of the brain and cervical spine. The purpose of this study was to determine the efficacy of radiation dose reduction by using a single acquisition technique that incorporated both anatomical regions with a dedicated neck detection algorithm. Thirty trauma patients referred for brain and cervical spine CT were included and were scanned with the single acquisition technique. The radiation doses from the single CT acquisition technique with the neck detection algorithm, which allowed appropriate independent dose administration relevant to the brain and cervical spine regions, were recorded. Comparison was made both to the doses calculated from the simulation of the traditional dual acquisitions with matching parameters, and to the doses of the retrospective dual acquisition legacy technique with the same sample size. The mean simulated dose for the traditional dual acquisition technique was 3.99 mSv, comparable to the average dose of 4.2 mSv from 30 previous patients who had CT of the brain and cervical spine as dual acquisitions. The mean dose from the single acquisition technique was 3.35 mSv, resulting in a 16 % overall dose reduction. The images from the single acquisition technique were of excellent diagnostic quality. The new single acquisition CT technique incorporating the neck detection algorithm for the brain and cervical spine significantly reduces the overall radiation dose by eliminating the unavoidable overlapping range between the 2 anatomical regions which occurs with the traditional dual acquisition technique.
Coronal Axis Measurement of the Optic Nerve Sheath Diameter Using a Linear Transducer.
Amini, Richard; Stolz, Lori A; Patanwala, Asad E; Adhikari, Srikar
2015-09-01
The true optic nerve sheath diameter cutoff value for detecting elevated intracranial pressure is variable. The variability may stem from the technique used to acquire sonographic measurements of the optic nerve sheath diameter as well as sonographic artifacts inherent to the technique. The purpose of this study was to compare the traditional visual axis technique to an infraorbital coronal axis technique for assessing the optic nerve sheath diameter using a high-frequency linear array transducer. We conducted a cross-sectional study at an academic medical center. Timed optic nerve sheath diameter measurements were obtained on both eyes of healthy adult volunteers with a 10-5-MHz broadband linear array transducer using both traditional visual axis and coronal axis techniques. Optic nerve sheath diameter measurements were obtained by 2 sonologists who graded the difficulty of each technique and were blinded to each other's measurements for each participant. A total of 42 volunteers were enrolled, yielding 84 optic nerve sheath diameter measurements. There were no significant differences in the measurements between the techniques on either eye (P = .23 [right]; P = .99 [left]). Additionally, there was no difference in the degree of difficulty obtaining the measurements between the techniques (P = .16). There was a statistically significant difference in the time required to obtain the measurements between the traditional and coronal techniques (P < .05). Infraorbital coronal axis measurements are similar to measurements obtained in the traditional visual axis. The infraorbital coronal axis technique is slightly faster to perform and is not technically challenging. © 2015 by the American Institute of Ultrasound in Medicine.
USDA-ARS?s Scientific Manuscript database
Passive acoustic techniques for the measurement of Sediment-Generated Noise (SGN) in gravel-bed rivers present a promising alternative to traditional bedload measurement techniques. Where traditional methods are often prohibitively costly, particularly in labor requirements, and produce point-scale ...
NASA Astrophysics Data System (ADS)
Starkey, Eleanor; Barnes, Mhari; Quinn, Paul; Large, Andy
2016-04-01
Pressures associated with flooding and climate change have significantly increased over recent years. Natural Flood Risk Management (NFRM) is now seen as being a more appropriate and favourable approach in some locations. At the same time, catchment managers are also encouraged to adopt a more integrated, evidence-based and bottom-up approach. This includes engaging with local communities. Although NFRM features are being more readily installed, there is still limited evidence associated with their ability to reduce flood risk and offer multiple benefits. In particular, local communities and land owners are still uncertain about what the features entail and how they will perform, which is a huge barrier affecting widespread uptake. Traditional hydrometric monitoring techniques are well established but they still struggle to successfully monitor and capture NFRM performance spatially and temporally in a visual and more meaningful way for those directly affected on the ground. Two UK-based case studies are presented here where unique NFRM features have been carefully designed and installed in rural headwater catchments. This includes a 1km2 sub-catchment of the Haltwhistle Burn (northern England) and a 2km2 sub-catchment of Eddleston Water (southern Scotland). Both of these pilot sites are subject to prolonged flooding in winter and flash flooding in summer. This exacerbates sediment, debris and water quality issues downstream. Examples of NFRM features include ponds, woody debris and a log feature inspired by the children's game 'Kerplunk'. They have been tested and monitored over the 2015-2016 winter storms using low-cost techniques by both researchers and members of the community ('citizen scientists'). Results show that monitoring techniques such as regular consumer specification time-lapse cameras, photographs, videos and 'kite-cams' are suitable for long-term and low-cost monitoring of a variety of NFRM features. 
These techniques have been compared against traditional hydrometric monitoring equipment. It is clear that traditional techniques are expensive, require specialist skills, and produce outputs that are difficult for the untrained eye to interpret. The alternative methods tested are visually more meaningful, can be interpreted by all stakeholders, and can be easily utilised by citizen scientists, land owners or flood groups. Such techniques therefore offer a before-, during- and after-NFRM monitoring solution that can be implemented more realistically and readily, and that supports engagement and the subsequent uptake and maintenance of NFRM features at a local level. Although the monitoring techniques presented are relatively simple, they are regarded as essential given that many schemes are not monitored at all.
Amihai, Ido; Kozhevnikov, Maria
2014-01-01
Based on evidence of parasympathetic activation, early studies defined meditation as a relaxation response. Later research attempted to categorize meditation as either involving focused or distributed attentional systems. Neither of these hypotheses received strong empirical support, and most of the studies investigated Theravada style meditative practices. In this study, we compared neurophysiological (EEG, EKG) and cognitive correlates of meditative practices that are thought to utilize either focused or distributed attention, from both Theravada and Vajrayana traditions. The results of Study 1 show that both focused (Shamatha) and distributed (Vipassana) attention meditations of the Theravada tradition produced enhanced parasympathetic activation indicative of a relaxation response. In contrast, both focused (Deity) and distributed (Rig-pa) meditations of the Vajrayana tradition produced sympathetic activation, indicative of arousal. Additionally, the results of Study 2 demonstrated an immediate dramatic increase in performance on cognitive tasks following only Vajrayana styles of meditation, indicating enhanced phasic alertness due to arousal. Furthermore, our EEG results showed qualitatively different patterns of activation between Theravada and Vajrayana meditations, albeit highly similar activity between meditations within the same tradition. In conclusion, consistent with Tibetan scriptures that described Shamatha and Vipassana techniques as those that calm and relax the mind, and Vajrayana techniques as those that require ‘an awake quality’ of the mind, we show that Theravada and Vajrayana meditations are based on different neurophysiological mechanisms, which give rise to either a relaxation or arousal response. Hence, it may be more appropriate to categorize meditations in terms of relaxation vs. arousal, whereas classification methods that rely on the focused vs. distributed attention dichotomy may need to be reexamined. PMID:25051268
Haith-Cooper, Melanie
2003-01-01
The use of problem-based learning (PBL) in Health Professional curricula is becoming more widespread. Although the way in which the tutor facilitates PBL can have a major impact on students' learning (Andrews and Jones 1996), the literature provides little consistency as to how the tutor can effectively facilitate PBL (Haith-Cooper 2000). It is therefore important to examine the facilitation role to promote effective learning through the use of PBL. This article is the first of two parts exploring a study that was undertaken to investigate tutors' experiences of facilitating PBL. This part focuses on the methodology and the combining of innovative processes with traditional philosophical traditions to develop a systematic educational research methodology. The study was undertaken respecting the philosophy of hermeneutic phenomenology but utilised alternative data collection and analysis techniques. Video conferencing and e-mail were used in conjunction with more traditional processes to access a worldwide sample. This paper explores some of the issues that arose when undertaking such a study. The second article then focuses on exploring the findings of the study and their implications for the facilitation of PBL.
Hypopharyngeal perforation near-miss during transesophageal echocardiography.
Aviv, Jonathan E; Di Tullio, Marco R; Homma, Shunichi; Storper, Ian S; Zschommler, Anne; Ma, Guoguang; Petkova, Eva; Murphy, Mark; Desloge, Rosemary; Shaw, Gary; Benjamin, Stanley; Corwin, Steven
2004-05-01
The traditional blind passage of a transesophageal echocardiography probe transorally through the hypopharynx is considered safe. Yet, severe hypopharyngeal complications during transesophageal echocardiography at several institutions led the authors to investigate whether traditional probe passage results in a greater incidence of hypopharyngeal injuries when compared with probe passage under direct visualization. Randomized, prospective clinical study. In 159 consciously sedated adults referred for transesophageal echocardiography, the authors performed transesophageal echocardiography with concomitant transnasal videoendoscopic monitoring of the hypopharynx. Subjects were randomly assigned to receive traditional (blind) or experimental (optical) transesophageal echocardiography. The primary outcome measure was frequency of hypopharyngeal injuries (hypopharyngeal lacerations or hematomas), and the secondary outcome measure was number of hypopharyngeal contacts. No perforation occurred with either technique. However, hypopharyngeal lacerations or hematomas occurred in 19 of 80 (23.8%) patients with the traditional technique (11 superficial lacerations of pyriform sinus, 1 laceration of pharynx, 12 arytenoid hematomas, 2 vocal fold hematomas, and 1 pyriform hematoma) and in 1 of 79 patients (1.3%) with the optical technique (superficial pyriform laceration) (P =.001). All traumatized patients underwent flexible laryngoscopy, but none required additional intervention. Respectively, hypopharyngeal contacts were more frequent with the traditional than with the optical technique at the pyriform sinus (70.0% vs. 10.1% [P =.001]), arytenoid (55.0% vs. 3.8% [P =.001]), and vocal fold (15.0% vs. 3.86% [P =.016]). Optically guided transesophageal echocardiography results in significantly fewer hypopharyngeal injuries and fewer contacts than traditional, blind transesophageal echocardiography.
The optically guided technique may result in decreased frequency of potentially significant complications and therefore in improved patient safety.
Real time automatic detection of bearing fault in induction machine using kurtogram analysis.
Tafinine, Farid; Mokrani, Karim
2012-11-01
A proposed signal processing technique for incipient real time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts by investigating the resonance signatures over selected frequency bands to extract representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective method for automatic bearing fault detection and provides a good basis for an integrated induction machine condition monitor.
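The fourth-order quantity the kurtogram searches over bands, spectral kurtosis, can be illustrated with a minimal sketch. The STFT-based estimator below is a common simplification of the full kurtogram band/bandwidth search, not the paper's implementation, and all parameter values are illustrative:

```python
import numpy as np

def spectral_kurtosis(x, nperseg=256):
    """Kurtosis of the STFT magnitude in each frequency bin -- the
    quantity a kurtogram maximises over band centre and bandwidth.
    For stationary Gaussian noise it is near 0; repetitive impacts
    (e.g. a bearing fault) raise it in the excited resonance band."""
    hop = nperseg // 2
    win = np.hanning(nperseg)
    frames = [x[i:i + nperseg] * win
              for i in range(0, len(x) - nperseg + 1, hop)]
    S = np.abs(np.fft.rfft(frames, axis=1))   # time x frequency
    m2 = np.mean(S ** 2, axis=0)
    m4 = np.mean(S ** 4, axis=0)
    return m4 / m2 ** 2 - 2.0                 # SK of a complex spectrum

# Pure noise: SK should hover around zero across all bins
rng = np.random.default_rng(0)
noise = rng.standard_normal(8192)
print(spectral_kurtosis(noise).mean())        # near 0
```

Bands where the SK estimate rises well above zero are the candidates a kurtogram-based detector would select for envelope analysis.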
Using AVIRIS data and multiple-masking techniques to map urban forest trees species
Q. Xiao; S.L. Ustin; E.G. McPherson
2004-01-01
Tree type and species information are critical parameters for urban forest management, benefit cost analysis and urban planning. However, traditionally, these parameters have been derived based on limited field samples in urban forest management practice. In this study we used high-resolution Airborne Visible Infrared Imaging Spectrometer (AVIRIS) data and multiple-...
ERIC Educational Resources Information Center
Abeysekera, Indra
2011-01-01
This study examines three instructional methods (traditional, interactive, and group case-based study), and student opinions on their preference for learning financial accounting in large classes at a metropolitan university in Sri Lanka. It analyses the results of a survey questionnaire of students, using quantitative techniques to determine the…
[Development of operation patient security detection system].
Geng, Shu-Qin; Tao, Ren-Hai; Zhao, Chao; Wei, Qun
2008-11-01
This paper describes a patient security detection system developed with two-dimensional bar codes, wireless communication and removable storage techniques. Based on the system, nurses and related personnel check the code of a patient awaiting operation to prevent errors. Tests show the system is effective. Its objectivity and timeliness make it more scientific and reliable than the traditional methods currently used in domestic hospitals.
Application of a Flexible, Clinically Driven Approach for Anger Reduction in the Case of Mr. P
ERIC Educational Resources Information Center
Kassinove, Howard; Tafrate, Raymond Chip
2011-01-01
We treat maladaptive anger in adults with a program based on traditional behavior therapy and cognitive behavior therapy. To these, we add client-centered motivational interviewing techniques. With the goal of modifying maladaptive stimulus-response relationships, our specific aim is to reduce anger reactivity to aversive triggers. Thus, in daily…
ERIC Educational Resources Information Center
Whitworth, David E.
2016-01-01
Laboratory-based practical classes are a common feature of life science teaching, during which students learn how to perform experiments and generate/interpret data. Practical classes are typically instructional, concentrating on providing topic- and technique-specific skills, however to produce research-capable graduates it is also important to…
Teaching with Dogs: Learning about Learning through Hands-on Experience in Dog Training
ERIC Educational Resources Information Center
McConnell, Bridget L.
2016-01-01
This paper summarizes a pilot study of an experiential learning technique that was designed to give undergraduate students a greater understanding of the principles and theories of learning and behavior, which is traditionally taught only in a lecture-based format. Students were assigned the role of a dog trainer, and they were responsible for…
Planning and Scheduling of Software Manufacturing Projects
1991-03-01
based on the previous results in social analysis of computing, operations research in manufacturing, artificial intelligence in manufacturing planning and scheduling, and the traditional approaches to planning in artificial intelligence, and extends the techniques that have been developed by them...
ERIC Educational Resources Information Center
Srinivasan, Deepa
2013-01-01
Recent rapid malware growth has exposed the limitations of traditional in-host malware-defense systems and motivated the development of secure virtualization-based solutions. By running vulnerable systems as virtual machines (VMs) and moving security software from inside VMs to the outside, the out-of-VM solutions securely isolate the anti-malware…
NASA Astrophysics Data System (ADS)
Jawak, Shridhar D.; Jadhav, Ajay; Luis, Alvarinho J.
2016-05-01
Supraglacial debris was mapped in the Schirmacher Oasis, east Antarctica, by using WorldView-2 (WV-2) high resolution optical remote sensing data consisting of 8-band calibrated Gram Schmidt (GS)-sharpened and atmospherically corrected WV-2 imagery. This study is a preliminary attempt to develop an object-oriented rule set to extract supraglacial debris for the Antarctic region using 8-spectral band imagery. Supraglacial debris was manually digitized from the satellite imagery to generate the ground reference data. Several trials were performed using a few existing traditional pixel-based classification techniques and color-texture based object-oriented classification methods to extract supraglacial debris over a small domain of the study area. Multi-level segmentation and attributes such as scale, shape, size and compactness along with spectral information from the data were used for developing the rule set. The quantitative analysis of error was carried out against the manually digitized reference data to test the practicability of our approach over the traditional pixel-based methods. Our results indicate that the OBIA-based approach (overall accuracy: 93%) for extracting supraglacial debris performed better than all the traditional pixel-based methods (overall accuracy: 80-85%). The present attempt provides a comprehensive, improved method for semiautomatic feature extraction in the supraglacial environment and a new direction in cryospheric research.
NASA Technical Reports Server (NTRS)
Olds, John Robert; Walberg, Gerald D.
1993-01-01
Multidisciplinary design optimization (MDO) is an emerging discipline within aerospace engineering. Its goal is to bring structure and efficiency to the complex design process associated with advanced aerospace launch vehicles. Aerospace vehicles generally require input from a variety of traditional aerospace disciplines - aerodynamics, structures, performance, etc. As such, traditional optimization methods cannot always be applied. Several multidisciplinary techniques and methods were proposed as potentially applicable to this class of design problem. Among the candidate options are calculus-based (or gradient-based) optimization schemes and parametric schemes based on design of experiments theory. A brief overview of several applicable multidisciplinary design optimization methods is included. Methods from the calculus-based class and the parametric class are reviewed, but the research application reported focuses on methods from the parametric class. A vehicle of current interest was chosen as a test application for this research. The rocket-based combined-cycle (RBCC) single-stage-to-orbit (SSTO) launch vehicle combines elements of rocket and airbreathing propulsion in an attempt to produce an attractive option for launching medium sized payloads into low earth orbit. The RBCC SSTO presents a particularly difficult problem for traditional one-variable-at-a-time optimization methods because of the lack of an adequate experience base and the highly coupled nature of the design variables. MDO, however, with its structured approach to design, is well suited to this problem. The results of the application of Taguchi methods, central composite designs, and response surface methods to the design optimization of the RBCC SSTO are presented. Attention is given to the aspect of Taguchi methods that attempts to locate a 'robust' design - that is, a design that is least sensitive to uncontrollable influences on the design.
Near-optimum minimum dry weight solutions are determined for the vehicle. A summary and evaluation of the various parametric MDO methods employed in the research are included. Recommendations for additional research are provided.
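The parametric-class workflow described above (sample the design space with a central composite design, fit a quadratic response surface, then solve for its stationary point) can be sketched as follows. The face-centred design and the stand-in dry-weight function are hypothetical illustrations, not the study's vehicle-sizing model:

```python
import numpy as np

# Face-centred central composite design in two coded design variables
# (4 corners, 4 axial points, 1 centre) -- a common RSM sampling plan.
pts = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], float)

def dry_weight(x):
    """Hypothetical smooth response standing in for a sizing code's
    dry-weight output at coded design point x = (a, b)."""
    a, b = x
    return 5.0 + (a - 0.3)**2 + 0.5 * (b + 0.2)**2 + 0.2 * a * b

y = np.array([dry_weight(p) for p in pts])

# Quadratic response surface: y ~ c0 + c1*a + c2*b + c3*a^2 + c4*b^2 + c5*a*b
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0]**2, pts[:, 1]**2, pts[:, 0] * pts[:, 1]])
c, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point of the fitted surface: solve grad = 0
# (candidate minimum-dry-weight design in coded variables)
H = np.array([[2 * c[3], c[5]], [c[5], 2 * c[4]]])
g = -np.array([c[1], c[2]])
print(np.linalg.solve(H, g))   # ≈ [0.3265, -0.2653]
```

Because the stand-in response is itself quadratic, the nine-run fit recovers it exactly; on a real sizing code the surface is only a local approximation and the stationary point seeds further refinement.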
Denoising in digital speckle pattern interferometry using wave atoms.
Federico, Alejandro; Kaufmann, Guillermo H
2007-05-15
We present an effective method for speckle noise removal in digital speckle pattern interferometry, which is based on a wave-atom thresholding technique. Wave atoms are a variant of 2D wavelet packets with a parabolic scaling relation and improve the sparse representation of fringe patterns when compared with traditional expansions. The performance of the denoising method is analyzed by using computer-simulated fringes, and the results are compared with those produced by wavelet and curvelet thresholding techniques. An application of the proposed method to reduce speckle noise in experimental data is also presented.
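The thresholding step at the heart of such methods can be illustrated generically; a true wave-atom transform requires a specialized library, so the sketch below substitutes the DFT as the sparsifying basis, and all signal parameters are hypothetical:

```python
import numpy as np

def soft_threshold_fft(x, t):
    """Transform-domain denoising sketch: soft-threshold coefficient
    magnitudes in an orthogonal transform. The DFT stands in here for
    the wave-atom frame used in the paper."""
    C = np.fft.fft(x)
    shrink = np.maximum(1.0 - t / np.maximum(np.abs(C), 1e-12), 0.0)
    return np.fft.ifft(C * shrink).real

rng = np.random.default_rng(0)
n = 1024
z = np.arange(n)
clean = np.sin(2 * np.pi * 5 * z / n)          # stand-in fringe signal
noisy = clean + 0.5 * rng.standard_normal(n)   # additive noise
denoised = soft_threshold_fft(noisy, t=3 * 0.5 * np.sqrt(n))

rmse = lambda a: np.sqrt(np.mean((a - clean) ** 2))
print(rmse(noisy), rmse(denoised))   # denoising should reduce the error
```

The same shrink-small-coefficients logic applies whatever the basis; wave atoms improve on wavelets here because they represent oscillatory fringe patterns more sparsely, so fewer, larger coefficients survive the threshold.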
NASA Technical Reports Server (NTRS)
Wells, Jeffrey M.; Jones, Thomas W.; Danehy, Paul M.
2005-01-01
Techniques for enhancing photogrammetric measurement of reflective surfaces by reducing noise were developed utilizing principles of light polarization. Signal selectivity with polarized light was also compared to signal selectivity using chromatic filters. Combining principles of linear cross polarization and color selectivity enhanced signal-to-noise ratios by as much as 800 fold. More typical improvements with combining polarization and color selectivity were about 100 fold. We review polarization-based techniques and present experimental results comparing the performance of traditional retroreflective targeting materials, cornercube targets returning depolarized light, and color selectivity.
Hit discovery and hit-to-lead approaches.
Keseru, György M; Makara, Gergely M
2006-08-01
Hit discovery technologies range from traditional high-throughput screening to affinity selection of large libraries, fragment-based techniques and computer-aided de novo design, many of which have been extensively reviewed. Development of quality leads using hit confirmation and hit-to-lead approaches presents its own challenges, depending on the hit discovery method used to identify the initial hits. In this paper, we summarize common industry practices adopted to tackle hit-to-lead challenges and review how the advantages and drawbacks of different hit discovery techniques could affect the various issues hit-to-lead groups face.
Splatterplots: overcoming overdraw in scatter plots.
Mayorga, Adrian; Gleicher, Michael
2013-09-01
We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the data set as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen.
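The density-bounded abstraction step (group dense points into a filled region, subsample and draw the rest individually) can be sketched without the GPU rendering pipeline. The function name, grid resolution, and thresholds below are illustrative choices, not the paper's implementation:

```python
import numpy as np

def splatter_abstract(pts, bins=64, density_cut=5, max_outliers=100):
    """Splatterplot-style abstraction sketch: points falling in grid
    cells whose count exceeds density_cut form the 'dense' region
    (which a renderer would draw as a filled contour); the remaining
    points are subsampled and drawn as explicit outliers."""
    H, xe, ye = np.histogram2d(pts[:, 0], pts[:, 1], bins=bins)
    ix = np.clip(np.searchsorted(xe, pts[:, 0], side='right') - 1, 0, bins - 1)
    iy = np.clip(np.searchsorted(ye, pts[:, 1], side='right') - 1, 0, bins - 1)
    dense = H[ix, iy] > density_cut
    outliers = pts[~dense]
    if len(outliers) > max_outliers:       # bound points drawn per screen area
        keep = np.random.default_rng(0).choice(len(outliers),
                                               max_outliers, replace=False)
        outliers = outliers[keep]
    return dense, outliers

# A tight cluster plus scattered stragglers: the cluster is abstracted
# into the dense region, the stragglers survive as explicit outliers.
rng = np.random.default_rng(1)
cluster = rng.normal(0, 0.5, (5000, 2))
stragglers = rng.uniform(-4, 4, (50, 2))
pts = np.vstack([cluster, stragglers])
dense, outliers = splatter_abstract(pts)
print(int(dense.sum()), len(outliers))
```

The key property, as in the paper, is that the number of individually drawn glyphs stays bounded no matter how many points land in the dense region.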
Splatterplots: Overcoming Overdraw in Scatter Plots
Mayorga, Adrian; Gleicher, Michael
2014-01-01
We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the dataset as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen. PMID:23846097
Splatterplots: Overcoming Overdraw in Scatter Plots.
Mayorga, Adrian; Gleicher, Michael
2013-03-20
We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the data set as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen.
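The abstraction described in this abstract (bounded screen-space density, dense-region contours, subsampled outliers) can be sketched in a few lines. The following is an illustrative approximation, not the paper's GPU implementation; the parameter names (`grid`, `threshold`, `max_outliers`) are hypothetical, and a plain 2D histogram stands in for the kernel density estimate.

```python
import numpy as np

def splatterplot_sketch(points, grid=64, threshold=0.3, max_outliers=100, seed=0):
    """Illustrative sketch of the Splatterplot abstraction: bound the
    displayed density by thresholding a density estimate, keep the dense
    region as a closed shape, and subsample the remaining points as
    representative outliers."""
    lo, hi = points.min(0), points.max(0)
    p = (points - lo) / np.where(hi > lo, hi - lo, 1.0)   # unit square
    # Histogram density as a cheap stand-in for the paper's GPU-based KDE.
    density, _, _ = np.histogram2d(p[:, 0], p[:, 1], bins=grid,
                                   range=[[0, 1], [0, 1]])
    density /= density.max()
    dense_mask = density > threshold          # cells drawn as filled contours
    ix = np.clip((p * grid).astype(int), 0, grid - 1)
    in_dense = dense_mask[ix[:, 0], ix[:, 1]]
    outliers = p[~in_dense]                   # points drawn individually
    if len(outliers) > max_outliers:          # subsample to bound clutter
        keep = np.random.default_rng(seed).choice(len(outliers),
                                                  max_outliers, replace=False)
        outliers = outliers[keep]
    return dense_mask, outliers
```

Zooming in would simply re-run the abstraction on the visible subset, so detail is revealed continuously while the on-screen density bound is preserved.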
Digital differential confocal microscopy based on spatial shift transformation.
Liu, J; Wang, Y; Liu, C; Wilson, T; Wang, H; Tan, J
2014-11-01
Differential confocal microscopy is a particularly powerful surface profilometry technique in industrial metrology due to its high axial sensitivity and insensitivity to noise. However, the practical implementation of the technique requires the accurate positioning of point detectors in three dimensions. We describe a simple alternative based on spatial transformation of a through-focus series of images obtained from a homemade beam-scanning confocal microscope. This digital differential confocal microscopy approach is described and compared with the traditional differential confocal microscopy approach. The ease of use of the digital differential confocal microscopy system is illustrated by performing measurements on a 3D standard specimen.
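The differential signal at the heart of this approach can be illustrated with a toy through-focus stack: subtracting axially shifted copies of the stack yields a response that crosses zero at best focus, which is what gives differential confocal detection its steep axial sensitivity. This is a minimal sketch of the assumed formulation, not the authors' algorithm.

```python
import numpy as np

def digital_differential_confocal(stack, shift=1):
    """Assumed digital differential signal: subtract axially shifted
    copies of a through-focus series. The result crosses zero at best
    focus and is nearly linear nearby, mimicking the differential
    confocal axial response without physically offset detectors."""
    return stack[2 * shift:] - stack[:-2 * shift]

# Toy axial response: a symmetric peak focused at z = 0.
z = np.linspace(-2.0, 2.0, 81)
axial = np.exp(-z ** 2)
diff_signal = digital_differential_confocal(axial, shift=1)
```

In the sketch, `diff_signal[i]` is centered between the two subtracted planes, so the zero crossing lands at the focal plane of the symmetric response.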
A review of automated image understanding within 3D baggage computed tomography security screening.
Mouton, Andre; Breckon, Toby P
2015-01-01
Baggage inspection is the principal safeguard against the transportation of prohibited and potentially dangerous materials at airport security checkpoints. Although traditionally performed by 2D X-ray based scanning, increasingly stringent security regulations have led to a growing demand for more advanced imaging technologies. The role of X-ray Computed Tomography is thus rapidly expanding beyond the traditional materials-based detection of explosives. The development of computer vision and image processing techniques for the automated understanding of 3D baggage-CT imagery is, however, complicated by poor image resolutions, image clutter and high levels of noise and artefacts. We discuss the recent and most pertinent advancements and identify topics for future research within the challenging domain of automated image understanding for baggage security screening CT.
NASA Astrophysics Data System (ADS)
Zhan, Jinliang; Lu, Pei
2006-11-01
The quality of traditional Chinese medicine products is affected by raw materials, processing, and many other factors, so it is difficult for the production process, especially the extraction process, to ensure steady and homogeneous quality. At the same time, there are quality-control blind spots due to the lack of on-line quality detection. If infrared spectrum analysis technology were applied during production, on the basis of off-line analysis, to detect the quality of semi-finished goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality could be achieved. On-line detection of the extraction process therefore plays an important role in the development of the Chinese patent medicine industry. This paper introduces the design and implementation of a monitoring experiment system for the traditional Chinese medicine extraction process based on PROFIBUS-DP field bus, OPC, and Internet technology. The system integrates intelligent data-gathering nodes with a supervisory sub-system that provides graphical configuration and remote supervision; during production it monitors temperature, pressure, quality, and other parameters, and it can be controlled from remote nodes over a VPN (Virtual Private Network). Experiments and applications have shown that the system fully achieves the anticipated effect, with the merits of operational stability, real-time response, reliability, and simple, convenient operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mossine, Andrew V.; Brooks, Allen F.; Ichiishi, Naoko
In a relatively short period of time, transition metal-mediated radiofluorination reactions have changed the PET radiochemistry landscape. These reactions have enabled the radiofluorination of a wide range of substrates, facilitating access to radiopharmaceuticals that were challenging to synthesize using traditional fluorine-18 radiochemistry. However, the process of adapting these new reactions for automated radiopharmaceutical production has revealed limitations in fitting them into the confines of traditional radiochemistry systems. In particular, the presence of bases (e.g. K2CO3) and/or phase transfer catalysts (PTC) (e.g. Kryptofix 2.2.2) associated with fluorine-18 preparation has been found to be detrimental to reaction yields. We hypothesized that these limitations could be addressed through the development of alternate techniques for preparing [18F]fluoride. This approach also opens the possibility that an eluent can be individually tailored to meet the specific needs of a metal-catalyzed reaction of interest. In this communication, we demonstrate that various solutions of copper salts, bases, and ancillary ligands can be utilized to elute [18F]fluoride from ion exchange cartridges. The new procedures we present here are effective for fluorine-18 radiochemistry and, as proof of concept, have been used to optimize an otherwise base-sensitive copper-mediated radiofluorination reaction.
Mossine, Andrew V.; Brooks, Allen F.; Ichiishi, Naoko; ...
2017-03-22
Sheppard, Sean C; Hickling, Edward J; Earleywine, Mitch; Hoyt, Tim; Russo, Amanda R; Donati, Matthew R; Kip, Kevin E
2015-11-01
Stigma associated with disclosing military sexual trauma (MST) makes estimating an accurate base rate difficult. Anonymous assessment may help alleviate stigma. Although anonymous research has found higher rates of male MST, no study has evaluated whether providing anonymity sufficiently mitigates the impact of stigma on accurate reporting. This study used the unmatched count technique (UCT), a form of randomized response technique, to gain information about the accuracy of base rate estimates of male MST derived via anonymous assessment of Operation Enduring Freedom (OEF)/Operation Iraqi Freedom (OIF) combat veterans. A cross-sectional convenience sample of 180 OEF/OIF male combat veterans, recruited via online websites for military populations, provided data about history of MST via traditional anonymous self-report and the UCT. The UCT revealed a rate of male MST more than 15 times higher than the rate derived via traditional anonymous assessment (1.1% vs. 17.2%). These data suggest that anonymity does not adequately mitigate the impact of stigma on disclosure of male MST. Results, though preliminary, suggest that published rates of male MST may substantially underestimate the true rate of this problem. The UCT has significant potential to improve base rate estimation of sensitive behaviors in the military.
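The unmatched count technique estimator itself is simple: respondents in the control condition see a list of neutral items and report only how many apply to them; the treatment condition adds the sensitive item to the same list, so the prevalence estimate is the difference in mean counts between conditions. A minimal sketch with fabricated counts (not the study's data):

```python
import numpy as np

def uct_estimate(control_counts, treatment_counts):
    """Unmatched count technique: each respondent reports only the COUNT
    of items that apply. The treatment list contains one extra (sensitive)
    item, so the difference in mean counts estimates its prevalence."""
    return float(np.mean(treatment_counts) - np.mean(control_counts))

# Fabricated toy counts over 4 neutral items per list:
control = [2, 1, 3, 2, 2, 1, 3, 2]
treatment = [2, 2, 3, 3, 2, 1, 3, 2]   # same 4 items + the sensitive item
rate = uct_estimate(control, treatment)   # 0.25 for these toy counts
```

Because no individual count reveals whether the sensitive item was endorsed, respondents never directly disclose the behavior, which is why the technique can outperform even anonymous direct self-report.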
Ahirwal, M K; Kumar, Anil; Singh, G K
2013-01-01
This paper explores the migration of adaptive filtering with swarm intelligence/evolutionary techniques employed in the field of electroencephalogram (EEG)/event-related potential (ERP) noise cancellation and extraction. A new approach is proposed in the form of a controlled search space to stabilize the randomness of swarm intelligence techniques, especially for the EEG signal. Swarm-based algorithms such as Particle Swarm Optimization, Artificial Bee Colony, and the Cuckoo Optimization Algorithm, with their variants, are implemented to design an optimized adaptive noise canceler. The proposed controlled search space technique is tested on each of the swarm intelligence techniques and is found to be more accurate and powerful. Adaptive noise cancelers with traditional algorithms such as the least-mean-square (LMS), normalized least-mean-square (NLMS), and recursive least-squares (RLS) algorithms are also implemented for comparison. ERP signals such as a simulated visual evoked potential, a real visual evoked potential, and a real sensorimotor evoked potential are used, due to their physiological importance in various EEG studies. The average computational time and shape measure of the evolutionary techniques were observed to be 8.21E-01 s and 1.73E-01, respectively. The traditional algorithms take negligible time but are unable to preserve the shape of the ERP as well, with an average computational time of 1.41E-02 s and a shape measure of 2.60E+00.
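A traditional adaptive noise canceler of the kind used as the baseline here can be sketched with the LMS algorithm: the filter adapts on a noise reference so that the error signal, primary input minus filtered reference, approximates the clean signal. This is a generic textbook sketch with illustrative parameter values, not the paper's configuration.

```python
import numpy as np

def lms_noise_canceler(primary, reference, mu=0.01, order=8):
    """Classic LMS adaptive noise canceler. The filter learns to predict
    the noise in `primary` from the correlated noise `reference`; the
    error signal is the cleaned estimate (the ERP in this application)."""
    n = len(primary)
    w = np.zeros(order)
    cleaned = np.zeros(n)
    for i in range(order - 1, n):
        x = reference[i - order + 1:i + 1][::-1]   # most recent sample first
        y = w @ x                                  # current noise estimate
        cleaned[i] = primary[i] - y                # error = cleaned signal
        w += 2 * mu * cleaned[i] * x               # LMS weight update
    return cleaned, w

# Toy demo: a sinusoidal "ERP" corrupted by filtered reference noise.
rng = np.random.default_rng(0)
n = 4000
signal = np.sin(2 * np.pi * np.arange(n) / 100)
ref = rng.standard_normal(n)
noise = np.empty(n)
noise[0] = 0.9 * ref[0]
noise[1:] = 0.9 * ref[1:] - 0.4 * ref[:-1]        # noise correlated with ref
cleaned, w = lms_noise_canceler(signal + noise, ref)
```

The swarm-based designs in the paper replace this gradient update with a population search over the filter weights, trading computation time for better ERP shape preservation.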
Security in MANETs using reputation-adjusted routing
NASA Astrophysics Data System (ADS)
Ondi, Attila; Hoffman, Katherine; Perez, Carlos; Ford, Richard; Carvalho, Marco; Allen, William
2009-04-01
Mobile Ad-Hoc Networks enable communication in various dynamic environments, including military combat operations. Their open and shared communication medium enables new forms of attack that are not applicable to traditional wired networks. Traditional security mechanisms and defense techniques are not prepared to cope with the new attacks, and the lack of central authorities makes identity verification difficult. This work extends our previous work on the Biologically Inspired Tactical Security Infrastructure to provide a reputation-based weighting mechanism for link-state routing protocols to protect the network from attackers that are corrupting legitimate network traffic. Our results indicate that the approach is successful in routing network traffic around compromised computers.
Traditional Agriculture and Permaculture.
ERIC Educational Resources Information Center
Pierce, Dick
1997-01-01
Discusses benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate…
Digital education and dynamic assessment of tongue diagnosis based on Mashup technique.
Tsai, Chin-Chuan; Lo, Yen-Cheng; Chiang, John Y; Sainbuyan, Natsagdorj
2017-01-24
To assess digital education and dynamic assessment of tongue diagnosis based on the Mashup technique (DEDATD) according to a specific user's answering pattern, and to provide pertinent information tailored to the user's specific needs, supplemented by teaching materials constantly updated through the Mashup technique. Fifty-four undergraduate students were tested with the DEDATD developed. The efficacy of the DEDATD was evaluated based on pre- and post-test performance, with interleaving training sessions targeting the weaknesses of the student under test. The t-test demonstrated a significant difference in scores gained between the pre- and post-test sessions, and a positive correlation between scores gained and length of time spent on learning, while there were no significant differences between gender and post-test score, or between the students' year in school and the progress in scores gained. The DEDATD, coupled with the Mashup technique, could provide updated materials filtered from diverse sources located across the network. The dynamic assessment could address each individual learner's needs by offering custom-made learning materials. The DEDATD represents a great improvement over traditional teaching methods.
Techniques for characterization and eradication of potato cyst nematode: a review.
Bairwa, Aarti; Venkatasalam, E P; Sudha, R; Umamaheswari, R; Singh, B P
2017-09-01
Correct identification of species and pathotypes is essential for eradication of potato cyst nematodes (PCN). Identifying PCN species after the life cycle is complete is very difficult because it relies on morphological and morphometrical characteristics: genetically different populations of PCN are morphologically identical and are differentiated based on host differential studies. These traditional techniques have since been supplemented by biochemical techniques, viz. one- and two-dimensional gel electrophoresis, capillary gel electrophoresis, isozymes, dot blot hybridization and isoelectric focusing, to distinguish the two species. One- and two-dimensional gel electrophoresis has been used to examine inter- and intra-specific differences in proteins of Globodera rostochiensis and G. pallida. Now, PCR and DNA-based characterization techniques such as RAPD, AFLP and RFLP are important tools for differentiating inter- and intra-specific variation in PCN and enable its accurate identification. For managing PCN, integrated pest management (IPM) strategies have been followed to date; however, these strategies are not sufficient to eradicate PCN. To eradicate PCN, we therefore need novel management practices such as RNAi (RNA interference) or gene silencing.
Application of the "see one, do one, teach one" concept in surgical training.
Kotsis, Sandra V; Chung, Kevin C
2013-05-01
The traditional method of teaching in surgery is known as "see one, do one, teach one." However, many have argued that this method is no longer applicable, mainly because of concerns for patient safety. The purpose of this article is to show that the basis of the traditional teaching method is still valid in surgical training if it is combined with various adult learning principles. The authors reviewed literature regarding the history of the formation of the surgical residency program, adult learning principles, mentoring, and medical simulation. The authors provide examples for how these learning techniques can be incorporated into a surgical resident training program. The surgical residency program created by Dr. William Halsted remained virtually unchanged until recently with reductions in resident work hours and changes to a competency-based training system. Such changes have reduced the teaching time between attending physicians and residents. Learning principles such as experience, observation, thinking, and action and deliberate practice can be used to train residents. Mentoring is also an important aspect in teaching surgical technique. The authors review the different types of simulators (standardized patients, virtual reality applications, and high-fidelity mannequin simulators) and the advantages and disadvantages of using them. The traditional teaching method of "see one, do one, teach one" in surgical residency programs is simple but still applicable. It needs to evolve with current changes in the medical system to adequately train surgical residents and also provide patients with safe, evidence-based care.
A Biomechanical Modeling Guided CBCT Estimation Technique
Zhang, You; Tehrani, Joubin Nasehi; Wang, Jing
2017-01-01
Two-dimensional-to-three-dimensional (2D-3D) deformation has emerged as a new technique to estimate cone-beam computed tomography (CBCT) images. The technique is based on deforming a prior high-quality 3D CT/CBCT image to form a new CBCT image, guided by limited-view 2D projections. The accuracy of this intensity-based technique, however, is often limited in low-contrast image regions with subtle intensity differences. The solved deformation vector fields (DVFs) can also be biomechanically unrealistic. To address these problems, we have developed a biomechanical modeling guided CBCT estimation technique (Bio-CBCT-est) by combining 2D-3D deformation with finite element analysis (FEA)-based biomechanical modeling of anatomical structures. Specifically, Bio-CBCT-est first extracts the 2D-3D deformation-generated displacement vectors at the high-contrast anatomical structure boundaries. The extracted surface deformation fields are subsequently used as the boundary conditions to drive structure-based FEA to correct and fine-tune the overall deformation fields, especially those at low-contrast regions within the structure. The resulting FEA-corrected deformation fields are then fed back into 2D-3D deformation to form an iterative loop, combining the benefits of intensity-based deformation and biomechanical modeling for CBCT estimation. Using eleven lung cancer patient cases, the accuracy of the Bio-CBCT-est technique has been compared to that of the 2D-3D deformation technique and the traditional CBCT reconstruction techniques. The accuracy was evaluated in the image domain, and also in the DVF domain through clinician-tracked lung landmarks. PMID:27831866
Tehrani, Jamshid J.; Collard, Mark; Shennan, Stephen J.
2010-01-01
Phylogenetic approaches to culture have shed new light on the role played by population dispersals in the spread and diversification of cultural traditions. However, the fact that cultural inheritance is based on separate mechanisms from genetic inheritance means that socially transmitted traditions have the potential to diverge from population histories. Here, we suggest that associations between these two systems can be reconstructed using techniques developed to study cospeciation between hosts and parasites and related problems in biology. Relationships among the latter are patterned by four main processes: co-divergence, intra-host speciation (duplication), intra-host extinction (sorting) and horizontal transfers. We show that patterns of cultural inheritance are structured by analogous processes, and then demonstrate the applicability of the host–parasite model to culture using empirical data on Iranian tribal populations. PMID:21041211
Haemodialysis plastic cannulae - a possible alternative to traditional metal needles?
Parisotto, Maria Teresa; Pelliccia, Francesco; Bedenbender-Stoll, Eva; Gallieni, Maurizio
2016-09-21
Haemodialysis plastic cannulae for arteriovenous fistulae (AVF) have been used for many years in Japan, and recently this technique was introduced in Australia. This review seeks answers to the following questions: What are the pros and cons of plastic cannulae versus traditional metal needles for AVF and arteriovenous grafts (AVG)? Is the use of plastic cannulae instead of traditional metal needles an option for European dialysis units as well? If it is an option, for which patients should plastic cannulae be used? Literature was searched via PubMed and Google. Due to their characteristics, plastic cannulae seem to be well suited for restless patients, patients with unpredictable behaviour, children, and patients who are allergic to metal. However, the evidence base provided by studies on the use of cannulae is currently weak. More randomised controlled studies are needed.
ERIC Educational Resources Information Center
Holbeck, Rick; Bergquist, Emily; Lees, Sheila
2014-01-01
Classroom Assessment Techniques (CATs) have been used in traditional university classrooms as a strategy to check for student understanding (Angelo & Cross, 1993). With the emergence of online learning and its popularity for non-traditional students, it is equally important that instructors in the online environment check for student…
Lentz, Robert J; Argento, A Christine; Colby, Thomas V; Rickman, Otis B; Maldonado, Fabien
2017-07-01
Transbronchial lung biopsy with a cryoprobe, or cryobiopsy, is a promising new bronchoscopic biopsy technique capable of obtaining larger and better-preserved samples than previously possible using traditional biopsy forceps. Over two dozen case series and several small randomized trials are now available describing experiences with this technique, largely for the diagnosis of diffuse parenchymal lung disease (DPLD), in which the reported diagnostic yield is typically 70% to 80%. Cryobiopsy technique varies widely between centers, and this predominantly single-center, retrospective literature heterogeneously defines diagnostic yield and complications, limiting the degree to which this technique can be compared between centers or to surgical lung biopsy (SLB). This review explores the broad range of cryobiopsy techniques currently in use, their rationale, the current state of the literature, and suggestions for the direction of future study into this promising but unproven procedure.
Diagnosis of toxoplasmosis and typing of Toxoplasma gondii.
Liu, Quan; Wang, Ze-Dong; Huang, Si-Yang; Zhu, Xing-Quan
2015-05-28
Toxoplasmosis, caused by the obligate intracellular protozoan Toxoplasma gondii, is an important zoonosis of medical and veterinary significance worldwide. The disease is mainly contracted by ingesting undercooked or raw meat containing viable tissue cysts, or by ingesting food or water contaminated with oocysts. The diagnosis and genetic characterization of T. gondii infection is crucial for the surveillance, prevention and control of toxoplasmosis. Traditional approaches for the diagnosis of toxoplasmosis include etiological, immunological and imaging techniques. Diagnosis of toxoplasmosis has been improved by the emergence of molecular technologies to amplify parasite nucleic acids. Among these, polymerase chain reaction (PCR)-based molecular techniques have been useful for the genetic characterization of T. gondii. Serotyping methods based on polymorphic polypeptides have the potential to become the method of choice for typing T. gondii in humans and animals. In this review, we summarize conventional non-DNA-based diagnostic methods, and the DNA-based molecular techniques for the diagnosis and genetic characterization of T. gondii. These techniques have provided foundations for further development of more effective and accurate detection of T. gondii infection. These advances will contribute to an improved understanding of the epidemiology, prevention and control of toxoplasmosis.
Yu, Bin-Sheng; Yang, Zhan-Kun; Li, Ze-Min; Zeng, Li-Wen; Wang, Li-Bing; Lu, William Weijia
2011-08-01
An in vitro biomechanical cadaver study. To evaluate the pull-out strength after 5000 cyclic loadings among 4 revision techniques for the loosened iliac screw using corticocancellous bone, a longer screw, traditional cement augmentation, and boring cement augmentation. Iliac screw loosening is still a clinical problem for lumbo-iliac fusion. Although many revision techniques using corticocancellous bone, larger screws, and polymethylmethacrylate (PMMA) augmentation have been applied in repairing pedicle screw loosening, their biomechanical effects on the loosened iliac screw remain undetermined. Eight fresh human cadaver pelvises with bone mineral density values ranging from 0.83 to 0.97 g/cm² were adopted in this study. After testing the primary screw of 7.5 mm diameter and 70 mm length, 4 revision techniques were sequentially established and tested on the same pelvis as follows: corticocancellous bone, a longer screw of 100 mm length, traditional PMMA augmentation, and boring PMMA augmentation. The boring technique differs from traditional PMMA augmentation in that PMMA was injected into the screw tract through 3 boring holes in the outer cortical shell without removing the screw. On an MTS machine, after 5000 cyclic compressive loadings of −200 to −500 N applied to the screw head, the axial maximum pull-out strengths of the 5 screws were measured and analyzed. The pull-out strengths of the primary screw and the 4 revised screws with corticocancellous bone, longer screw, and traditional and boring PMMA augmentation were 1167 N, 361 N, 854 N, 1954 N, and 1820 N, respectively. Although the longer screw method obtained significantly higher pull-out strength than corticocancellous bone (P<0.05), the revised screws using these 2 techniques exhibited notably lower pull-out strength than the primary screw and the 2 PMMA-augmented screws (P<0.05). 
Both the traditional and boring PMMA screws showed markedly higher pull-out strength than the primary screw (P<0.05); however, no significant difference in pull-out strength was detected between the 2 PMMA screws (P>0.05). Wadding corticocancellous bone and increasing screw length failed to provide sufficient anchoring strength for a loosened iliac screw; however, both traditional and boring PMMA-augmented techniques could effectively increase the fixation strength. From the viewpoint of minimal invasiveness, boring PMMA augmentation may serve as a suitable salvage technique for iliac screw loosening.
Microbial Growth and Metabolism in Soil - Refining the Interpretation of Carbon Use Efficiency
NASA Astrophysics Data System (ADS)
Geyer, K.; Frey, S. D.
2016-12-01
Carbon use efficiency (CUE) describes a critical step in the terrestrial carbon cycle where microorganisms partition organic carbon (C) between stabilized organic forms and CO2. Application of this concept, however, begins with accurate measurements of CUE. Both traditional and developing approaches still depend on numerous assumptions that render them difficult to interpret and potentially incompatible with one another. Here we explore the soil processes inherent to traditional (e.g., substrate-based, biomass-based) and emerging (e.g., growth rate-based, calorimetry) CUE techniques in order to better understand the information they provide. Soil from the Harvard Forest Long Term Ecological Research (LTER) site in Massachusetts, USA, was amended with both 13C-glucose and 18O-water and monitored over 72 h for changes in dissolved organic carbon (DOC), respiration (R), microbial biomass (MB), DNA synthesis, and heat flux (Q). Four different CUE estimates were calculated: 1) (ΔDOC - R)/ΔDOC (substrate-based), 2) Δ13C-MB/(Δ13C-MB + R) (biomass-based), 3) Δ18O-DNA/(Δ18O-DNA + R) (growth rate-based), 4) Q/R (energy-based). Our results indicate that microbial growth (estimated by both 13C and 18O techniques) was delayed for 40 h after amendment even though DOC had declined to pre-amendment levels within 48 h. Respiration and heat flux also peaked after 40 h. Although these soils have a relatively high organic C content (5% C), respired CO2 was greater than 88% glucose-derived throughout the experiment. All estimates of microbial growth (Spearman's ρ >0.83, p<0.01) and efficiency (Spearman's ρ >0.65, p<0.05) were positively correlated, but strong differences in the magnitude of CUE suggest incomplete C accounting. This work increases the transparency of CUE techniques for researchers looking to choose the most appropriate measure for their scale of inquiry or to use CUE estimates in modeling applications.
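The four CUE formulations enumerated in this abstract are direct arithmetic on the measured quantities. A minimal sketch (the variable names are illustrative stand-ins for the measured fluxes, all assumed to be in consistent carbon or energy units):

```python
def cue_estimates(dDOC, R, dMB13C, dDNA18O, Q):
    """The four carbon-use-efficiency formulations quoted in the abstract:
    substrate-based, biomass-based, growth-rate-based, and energy-based."""
    return {
        "substrate": (dDOC - R) / dDOC,          # (dDOC - R) / dDOC
        "biomass": dMB13C / (dMB13C + R),        # d13C-MB / (d13C-MB + R)
        "growth_rate": dDNA18O / (dDNA18O + R),  # d18O-DNA / (d18O-DNA + R)
        "energy": Q / R,                         # heat flux over respiration
    }

# Example with made-up values in consistent units:
est = cue_estimates(dDOC=10.0, R=4.0, dMB13C=6.0, dDNA18O=3.0, Q=2.0)
```

Comparing the dictionary entries for the same incubation makes the abstract's point concrete: each formulation partitions carbon between growth proxies and respiration differently, so correlated rankings can still coexist with large differences in magnitude.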
Traditional Chinese food technology and cuisine.
Li, Jian-rong; Hsieh, Yun-Hwa P
2004-01-01
From ancient wisdom to modern science and technology, Chinese cuisine has been established over the country's long history and has gained a global reputation for its sophistication. Traditional Chinese foods and cuisine that exhibit Chinese culture, art and reality play an essential role in Chinese people's everyday lives. Recently, traditional Chinese foods have drawn a great degree of attention from food scientists and technologists, the food industry, and health promotion institutions worldwide due to the extensive values they offer beyond being merely another ethnic food. These traditional foods comprise a wide variety of products, such as pickled vegetables, salted fish and jellyfish, tofu and tofu-derived products, rice and rice snack foods, fermented sauces, fish balls and thousand-year-old eggs. An overview of selected popular traditional Chinese foods and their processing techniques is included in this paper. Further development of the traditional techniques for formulation and production of these foods is expected to produce economic, social and health benefits.
NASA Astrophysics Data System (ADS)
Burton, Dallas Jonathan
The field of laser-based diagnostics has been a topic of research in various fields, more specifically for applications in environmental studies, military defense technologies, and medicine, among many others. In this dissertation, a novel laser-based optical diagnostic method, differential laser-induced perturbation spectroscopy (DLIPS), has been implemented in a spectroscopy mode and expanded into an imaging mode in combination with fluorescence techniques. The DLIPS method takes advantage of deep ultraviolet (UV) laser perturbation at sub-ablative energy fluences to photochemically cleave bonds and alter fluorescence signal response before and after perturbation. The resulting difference spectrum or differential image adds more information about the target specimen, and can be used in combination with traditional fluorescence techniques for detection of certain materials, characterization of many materials and biological specimen, and diagnosis of various human skin conditions. The differential aspect allows for mitigation of patient or sample variation, and has the potential to develop into a powerful, noninvasive optical sensing tool. The studies in this dissertation encompass efforts to continue the fundamental research on DLIPS including expansion of the method to an imaging mode. Five primary studies have been carried out and presented. These include the use of DLIPS in a spectroscopy mode for analysis of nitrogen-based explosives on various substrates, classification of Caribbean fruit flies versus Caribbean fruit flies that have been irradiated with gamma rays, and diagnosis of human skin cancer lesions. The nitrogen-based explosives and Caribbean fruit flies have been analyzed with the DLIPS scheme using the imaging modality, providing complementary information to the spectroscopic scheme. 
In each study, a comparison between absolute fluorescence signals and DLIPS responses showed that DLIPS statistically outperformed traditional fluorescence techniques with regard to regression error and classification.
Wang, Yucheng; Chen, Kangwu; Chen, Hao; Zhang, Kai; Lu, Jian; Mao, Haiqing; Yang, Huilin
2018-06-06
This retrospective cohort study aims to evaluate the effects of introducing the O-arm-based navigation technique into the traditional posterior lumbar interbody fusion (PLIF) procedure for treating elderly patients with three-level lumbar degenerative diseases. Forty-one consecutive elderly patients were enrolled according to the criteria: 21 patients in the free-hand group and 20 patients in the O-arm group. Both groups underwent PLIF with or without the O-arm-based navigation technique. The demographic features, clinical data and outcomes, and radiological information were collected for further analysis. The average follow-up time was 18.3 (range, 12-28) months in the free-hand group and 16.7 (range, 12-24) months in the O-arm group. Comparison between the two groups revealed no significant difference in demographic features. The operation time in the navigation group was significantly shorter than that in the free-hand group (222.55 ± 38.00 min versus 255.19 ± 40.26 min, P < 0.05). Both VAS and ODI improved post-operatively in the two groups, while comparison between groups showed no difference. The accuracy rate of pedicle screw positioning was 88.7% in the free-hand group versus 96.9% in the O-arm group (P < 0.05). The O-arm-based navigation is an efficacious auxiliary technique that can significantly improve the accuracy of pedicle screw insertion, especially in patients with complex anatomic degenerative diseases, without sacrificing the feasibility and reliable outcome of traditional PLIF.
He, Guo-qing; Liu, Tong-jie; Sadiq, Faizan A.; Gu, Jing-si; Zhang, Guo-hua
2017-01-01
Chinese traditional fermented foods have a very long history dating back thousands of years and have become an indispensable part of Chinese dietary culture. A plethora of research has been conducted to unravel the composition and dynamics of microbial consortia associated with Chinese traditional fermented foods using culture-dependent as well as culture-independent methods, like different high-throughput sequencing (HTS) techniques. These HTS techniques enable us to understand the relationship between a food product and its microbes to a greater extent than ever before. Considering the importance of Chinese traditional fermented products, the objective of this paper is to review the diversity and dynamics of microbiota in Chinese traditional fermented foods revealed by HTS approaches. PMID:28378567
NASA Astrophysics Data System (ADS)
Crowell, Paul A.; Liu, Changjiang; Patel, Sahil; Peterson, Tim; Geppert, Chad C.; Christie, Kevin; Stecklein, Gordon; Palmstrøm, Chris J.
2016-10-01
A distinguishing feature of spin accumulation in ferromagnet-semiconductor devices is its precession in a magnetic field. This is the basis for detection techniques such as the Hanle effect, but these approaches become ineffective as the spin lifetime in the semiconductor decreases. For this reason, no electrical Hanle measurement has been demonstrated in GaAs at room temperature. We show here that by forcing the magnetization in the ferromagnet to precess at resonance instead of relying only on the Larmor precession of the spin accumulation in the semiconductor, an electrically generated spin accumulation can be detected up to 300 K. The injection bias and temperature dependence of the measured spin signal agree with those obtained using traditional methods. We further show that this new approach enables a measurement of short spin lifetimes (< 100 psec), a regime that is not accessible in semiconductors using traditional Hanle techniques. The measurements were carried out on epitaxial Heusler alloy (Co2FeSi or Co2MnSi)/n-GaAs heterostructures. Lateral spin valve devices were fabricated by electron beam and photolithography. We compare measurements carried out by the new FMR-based technique with traditional non-local and three-terminal Hanle measurements. A full model appropriate for the measurements will be introduced, and a broader discussion in the context of spin pumping experiments will be included in the talk. The new technique provides a simple and powerful means for detecting spin accumulation at high temperatures. Reference: C. Liu, S. J. Patel, T. A. Peterson, C. C. Geppert, K. D. Christie, C. J. Palmstrøm, and P. A. Crowell, "Dynamic detection of electron spin accumulation in ferromagnet-semiconductor devices by ferromagnetic resonance," Nature Communications 7, 10296 (2016). http://dx.doi.org/10.1038/ncomms10296
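The lifetime limit described above follows from the standard Hanle lineshape, S(B) = S0 / (1 + (ω_L τ)²) with Larmor frequency ω_L = gμ_B B/ħ. A minimal sketch (the g-factor of 2 below is an illustrative assumption, not the actual GaAs value) shows how the half-width field blows up as the spin lifetime τ shrinks:

```python
import numpy as np

# Standard Hanle lineshape: S(B) = S0 / (1 + (omega_L * tau)^2),
# with Larmor frequency omega_L = g * mu_B * B / hbar.
G = 2.0                  # illustrative g-factor (an assumption, not the GaAs value)
MU_B = 9.274e-24         # Bohr magneton, J/T
HBAR = 1.055e-34         # reduced Planck constant, J*s

def hanle_signal(B, tau, S0=1.0):
    """Hanle-suppressed spin signal at field B for spin lifetime tau."""
    omega = G * MU_B * B / HBAR
    return S0 / (1.0 + (omega * tau) ** 2)

def hanle_half_width(tau):
    """Field at which the signal falls to half its zero-field value:
    B_1/2 = hbar / (g * mu_B * tau). Short lifetimes give very broad
    curves, which is why traditional Hanle detection becomes ineffective."""
    return HBAR / (G * MU_B * tau)
```

With these numbers a 1 ns lifetime gives a half-width of a few millitesla, while a 10 ps lifetime pushes it toward a tesla, illustrating why sub-100 ps lifetimes are inaccessible to the traditional technique.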
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
Automatic Methods and Tools for the Verification of Real Time Systems
1997-11-30
We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques, based on automata...embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous... real-time systems, and we identified the exact boundary between decidability and undecidability of real-time reasoning.
A Survival Kit for the Elementary/Middle School Art Teacher.
ERIC Educational Resources Information Center
Hume, Helen D.
This book is for art teachers looking for a new approach to the traditional lesson. The projects can be used at most grade levels. While the book's organization is content-centered, it is also strongly student-centered. The lessons are based on the elements and principles of design. New skills and techniques are introduced, and most of the lessons…
ERIC Educational Resources Information Center
Umek, Lan; Aristovnik, Aleksander; Tomaževic, Nina; Keržic, Damijana
2015-01-01
The use of e-learning techniques in higher education is becoming ever more frequent. In some institutions, e-learning has completely replaced the traditional teaching methods, while in others it supplements classical courses. The paper presents a study conducted in a member institution of the University of Ljubljana that provides public…
ERIC Educational Resources Information Center
Luan, Jing; Zhao, Chun-Mei; Hayek, John C.
2009-01-01
Data mining provides both systematic and systemic ways to detect patterns of student engagement among students at hundreds of institutions. Using traditional statistical techniques alone, the task would be significantly difficult--if not impossible--considering the size and complexity in both data and analytical approaches necessary for this…
Monitoring urban tree cover using object-based image analysis and public domain remotely sensed data
L. Monika Moskal; Diane M. Styers; Meghan Halabisky
2011-01-01
Urban forest ecosystems provide a range of social and ecological services, but due to the heterogeneity of these canopies their spatial extent is difficult to quantify and monitor. Traditional per-pixel classification methods have been used to map urban canopies, however, such techniques are not generally appropriate for assessing these highly variable landscapes....
ERIC Educational Resources Information Center
Johnson, Adam R.
2013-01-01
A molecular orbital (MO) diagram, especially its frontier orbitals, explains the bonding and reactivity for a chemical compound. It is therefore important for students to learn how to construct one. The traditional methods used to derive these diagrams rely on linear algebra techniques to combine ligand orbitals into symmetry-adapted linear…
Blended Learning: A Mixed-Methods Study on Successful Schools and Effective Practices
ERIC Educational Resources Information Center
Mathews, Anne
2017-01-01
Blended learning is a teaching technique that utilizes face-to-face teaching and online or technology-based practice in which the learner has the ability to exert some level of control over the pace, place, path, or time of learning. Schools that employ this method of teaching often demonstrate larger gains than traditional face-to-face programs…
ERIC Educational Resources Information Center
Jeffries, Pamela R.; Woolf, Shirley; Linde, Beverly
2003-01-01
Electrocardiogram technique was taught to 32 nursing students using a self-study module, lecture-demonstration, and hands-on learning laboratories and to 45 students using interactive multimedia CD-ROM with self-study module. Pre/postprogram data show satisfaction and score improvement was high for both, with no significant differences. (Contains…
Traditional Predictors of Academic Performance in a Medical School's Independent Study Program.
ERIC Educational Resources Information Center
Meleca, C. Benjamin
1995-01-01
As an initial screening device for admission to the Independent Study Program at the Ohio State University College of Medicine, a numeric value was developed for 596 first-year students. The value was based on a combination of undergraduate grade point average and Medical College Admission Test scores. The predictive value of the technique was…
Baritugo, Kei-Anne; Kim, Hee Taek; David, Yokimiko; Choi, Jong-Il; Hong, Soon Ho; Jeong, Ki Jun; Choi, Jong Hyun; Joo, Jeong Chan; Park, Si Jae
2018-05-01
Bio-based production of industrially important chemicals provides an eco-friendly alternative to current petrochemical-based processes. Because of the limited supply of fossil fuel reserves, various technologies utilizing microbial host strains for the sustainable production of platform chemicals from renewable biomass have been developed. Corynebacterium glutamicum is a non-pathogenic industrial microbial species traditionally used for L-glutamate and L-lysine production. It is a promising species for industrial production of bio-based chemicals because of its flexible metabolism, which allows the utilization of a broad spectrum of carbon sources and the production of various amino acids. Classical breeding, systems biology, synthetic biology, and metabolic engineering approaches have been used to improve its applications, ranging from traditional amino-acid production to modern biorefinery systems for production of value-added platform chemicals. This review describes recent advances in the development of genetic engineering tools and techniques for the establishment and optimization of metabolic pathways for bio-based production of major C2-C6 platform chemicals using recombinant C. glutamicum.
Line identification studies using traditional techniques and wavelength coincidence statistics
NASA Technical Reports Server (NTRS)
Cowley, Charles R.; Adelman, Saul J.
1990-01-01
Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
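The WCS idea can be sketched as counting coincidences between an observed line list and a species' laboratory wavelengths, then asking how often that many coincidences would arise by chance. A minimal Monte Carlo sketch (wavelengths, tolerance, and the uniform null model below are illustrative assumptions, not the paper's actual procedure):

```python
import numpy as np

def coincidence_count(observed, species, tol):
    """Count observed lines lying within +/- tol of any laboratory line."""
    return sum(1 for w in observed if np.any(np.abs(species - w) <= tol))

def wcs_significance(observed, species, tol, wmin, wmax, trials=2000, seed=0):
    """Monte Carlo estimate of how unlikely the observed coincidence count
    is, if the observed lines were instead placed uniformly at random in
    [wmin, wmax] (a simple null model for 'spurious' coincidences)."""
    rng = np.random.default_rng(seed)
    real = coincidence_count(observed, species, tol)
    random_hits = np.array([
        coincidence_count(rng.uniform(wmin, wmax, len(observed)), species, tol)
        for _ in range(trials)
    ])
    # p-value: fraction of random trials with at least as many coincidences
    p = np.mean(random_hits >= real)
    return real, random_hits.mean(), p
```

A strongly present species yields far more coincidences than the random expectation (small p); the random baseline also quantifies the predictable number of spurious matches the abstract warns about.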
Single-pass memory system evaluation for multiprogramming workloads
NASA Technical Reports Server (NTRS)
Conte, Thomas M.; Hwu, Wen-Mei W.
1990-01-01
Modern memory systems are composed of levels of cache memories, a virtual memory system, and a backing store. Varying more than a few design parameters and measuring the performance of such systems has traditionally been constrained by the high cost of simulation. Recently introduced models of cache performance reduce the cost of simulation, but at the expense of accuracy of performance prediction. Stack-based methods predict performance accurately using one pass over the trace for all cache sizes, but these techniques have been limited to fully-associative organizations. This paper presents a stack-based method of evaluating the performance of cache memories using a recurrence/conflict model for the miss ratio. Unlike previous work, the method predicts the performance of realistic cache designs, such as direct-mapped caches. The method also includes a new approach to the problem of the effects of multiprogramming. This new technique separates the characteristics of the individual program from those of the workload. The recurrence/conflict method is shown to be practical, general, and powerful by comparing its performance to that of a popular traditional cache simulator. The authors expect that the availability of such a tool will have a large impact on future architectural studies of memory systems.
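The one-pass property that makes stack-based methods attractive comes from Mattson's classic stack algorithm: a single traversal of the address trace yields a stack-distance profile, and the miss ratio of every fully-associative LRU cache size can then be read off that profile. The sketch below (with a made-up trace) shows only this fully-associative baseline, not the paper's recurrence/conflict extension to direct-mapped designs:

```python
def stack_distances(trace):
    """One pass over an address trace: for each reference, record its LRU
    stack distance (depth of the address in an LRU-ordered stack).
    First-time references (compulsory misses) get distance infinity."""
    stack = []          # stack[0] = most recently used
    dists = []
    for addr in trace:
        if addr in stack:
            d = stack.index(addr) + 1   # 1-based depth
            stack.remove(addr)
        else:
            d = float('inf')            # cold miss
        stack.insert(0, addr)
        dists.append(d)
    return dists

def miss_ratio(dists, cache_lines):
    """Miss ratio of a fully-associative LRU cache with `cache_lines` lines.
    The same single-pass profile serves every cache size."""
    misses = sum(1 for d in dists if d > cache_lines)
    return misses / len(dists)
```

For example, the trace a b c a b c gives distances [inf, inf, inf, 3, 3, 3]: a 2-line cache misses every time, while a 3-line cache misses only on the cold start.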
Josephson frequency meter for millimeter and submillimeter wavelengths
NASA Technical Reports Server (NTRS)
Anischenko, S. E.; Larkin, S. Y.; Chaikovsky, V. I.; Kabayev, P. V.; Kamyshin, V. V.
1995-01-01
Frequency measurements of electromagnetic oscillations in the millimeter and submillimeter wavebands become increasingly difficult as frequency grows, for several reasons. First, these frequencies are considered cutoffs for semiconductor converting devices, so optical measurement methods must be used instead of traditional methods based on frequency transfer. Second, resonance measurement methods use relatively narrow bands, and optical methods are limited in frequency and time resolution by the limited range and velocity of movement of their mechanical elements; moreover, the efficiency of these optical techniques decreases with increasing wavelength due to diffraction losses. This requires a priori information on the radiation frequency band of the source involved. A method of measuring the frequency of harmonic microwave signals in the millimeter and submillimeter wavebands based on the ac Josephson effect in superconducting contacts is free of all the above drawbacks. This approach offers a number of major advantages over the more traditional measurement methods, namely those based on frequency conversion, resonance, and interferometric techniques. It can be characterized by high potential accuracy, a wide range of measurable frequencies, prompt measurement, the opportunity to obtain a panoramic display of the results, and full automation of the measuring process.
Training and certification in endobronchial ultrasound-guided transbronchial needle aspiration
Konge, Lars; Nayahangan, Leizl Joy; Clementsen, Paul Frost
2017-01-01
Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) plays a key role in the staging of lung cancer, which is crucial for allocation to surgical treatment. EBUS-TBNA is a complicated procedure, and simulation-based training is helpful in the first part of the long learning curve prior to performing the procedure on actual patients. New trainees should follow a structured training programme consisting of training on simulators to proficiency, as assessed with a validated test, followed by supervised practice on patients. Simulation-based training is superior to the traditional apprenticeship model and is recommended in the newest guidelines. EBUS-TBNA and oesophageal ultrasound-guided fine needle aspiration (EUS-FNA or EUS-B-FNA) are complementary to each other, and the combined techniques are superior to either technique alone. It is logical to learn and perform the two techniques in combination; however, for lung cancer staging only EBUS-TBNA simulators currently exist. Hopefully, simulation-based training in EUS will become possible in the future. PMID:28840013
Three-dimensional circumferential liposuction of the overweight or obese upper arm.
Hong, Yoon Gi; Sim, Hyung Bo; Lee, Mu Young; Seo, Sang Won; Chang, Choong Hyun; Yeo, Kwan Koo; Kim, June-kyu
2012-06-01
Due to recent trends in liposuction, anatomic consideration of the body's fatty layers is essential. Based on this knowledge, a circumferential approach to achieving maximal aesthetic results is highlighted. In the upper arm, aspiration of fat from only the posterolateral region can result in skin flaccidity and disharmony of the overall balance of the upper arm contour. Different suction techniques were applied depending on the degree of fat accumulation. If necessary, the operation area was extended around the axillary and scapular regions to overcome the limitations of the traditional method and to achieve optimal effects. To maximize skin contracture and redraping, the authors developed three-dimensional circumferential liposuction (3D-CL) based on two concepts: circumferential aspiration of the upper arm, applying different fluid infiltration and liposuction techniques in three anatomic compartments (anteromedial, anterolateral, and posterolateral), and extension of liposuction to the periaxillary and parascapular areas. A total of 57 female patients underwent liposuction of their excess arm fat using this technique. The authors achieved their aesthetic goals of a straightened inferior brachial border and a more slender body contour. Complications occurred in five patients, including irregularity, incision-site scarring, and transient pigmentation. Through 3D-CL, the limitations of traditional upper arm liposuction were overcome, and a slender arm contour with a straightened inferior brachial border was produced. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at http://www.springer.com/00266.
Survey Of Lossless Image Coding Techniques
NASA Astrophysics Data System (ADS)
Melnychuck, Paul W.; Rabbani, Majid
1989-04-01
Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original. Lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original. Several lossless coding techniques are modifications of well-known lossy schemes, whereas others are new. Traditional Markov-based models and newer arithmetic coding techniques are applied to predictive coding, bit plane processing, and lossy plus residual coding. Generally speaking, the compression ratios offered by these techniques are in the range of 1.6:1 to 3:1 for 8-bit pictorial images. Compression ratios for 12-bit radiological images approach 3:1, as these images have less detailed structure, and hence, their higher pel correlation leads to a greater removal of image redundancy.
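The predictive-coding family mentioned above can be sketched with the simplest possible predictor, the previous pixel in the row: only residuals are stored, and the original image is reconstructed bit-exactly, which is the defining property of lossless coding. This is a minimal numpy sketch of the idea, not any specific published codec:

```python
import numpy as np

def encode_predictive(image):
    """Row-wise previous-pixel predictor: keep the residual
    e[i, j] = x[i, j] - x[i, j-1], with the first column stored verbatim.
    Residuals cluster near zero, so a downstream entropy coder
    (e.g. arithmetic coding) compresses them well."""
    img = image.astype(np.int32)        # widen so differences cannot overflow
    resid = img.copy()
    resid[:, 1:] = img[:, 1:] - img[:, :-1]
    return resid

def decode_predictive(resid):
    """Invert the predictor exactly: cumulative sum along each row."""
    return np.cumsum(resid, axis=1)
```

The round trip decode(encode(x)) reproduces x exactly; compression comes entirely from entropy-coding the peaked residual distribution, which this sketch leaves out.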
Microfluidic desalination techniques and their potential applications.
Roelofs, S H; van den Berg, A; Odijk, M
2015-09-07
In this review we discuss recent developments in the emerging research field of miniaturized desalination. Traditionally, desalination is performed to convert salt water into potable water, and research is focused on improving the performance of large-scale desalination plants. Microfluidic desalination offers several new opportunities in comparison to macro-scale desalination, such as providing a platform to increase fundamental knowledge of ion transport on the nano- and microfluidic scale and new microfluidic sample preparation methods. This approach has also led to the development of new desalination techniques, based on micro/nanofluidic ion-transport phenomena, which are potential candidates for up-scaling to (portable) drinking water devices. This review assesses microfluidic desalination techniques on their applications and is meant to contribute to further implementation of microfluidic desalination techniques in the lab-on-chip community.
Joseph, Rodney P.; Cherrington, Andrea; Cuffee, Yendelela; Knight, BernNadette; Lewis, Dwight; Allison, Jeroan J.
2014-01-01
Introduction: Innovative approaches are needed to promote physical activity among young adult overweight and obese African American women. We sought to describe key elements that African American women desire in a culturally relevant Internet-based tool to promote physical activity among overweight and obese young adult African American women. Methods: A mixed-method approach combining nominal group technique and traditional focus groups was used to elicit recommendations for the development of an Internet-based physical activity promotion tool. Participants, ages 19 to 30 years, were enrolled in a major university. Nominal group technique sessions were conducted to identify themes viewed as key features for inclusion in a culturally relevant Internet-based tool. Confirmatory focus groups were conducted to verify and elicit more in-depth information on the themes. Results: Twenty-nine women participated in nominal group (n = 13) and traditional focus group sessions (n = 16). Features that emerged to be included in a culturally relevant Internet-based physical activity promotion tool were personalized website pages, diverse body images on websites and in videos, motivational stories about physical activity and women similar to themselves in size and body shape, tips on hair care maintenance during physical activity, and online social support through social media (eg, Facebook, Twitter). Conclusion: Incorporating existing social media tools and motivational stories from young adult African American women in Internet-based tools may increase the feasibility, acceptability, and success of Internet-based physical activity programs in this high-risk, understudied population. PMID:24433625
Systematic cloning of an ORFeome using the Gateway system.
Matsuyama, Akihisa; Yoshida, Minoru
2009-01-01
With the completion of the genome projects, there are increasing demands on experimental systems that enable exploitation of the entire set of protein-coding open reading frames (ORFs), viz. the ORFeome, en masse. Systematic proteomic studies based on cloned ORFeomes are called "reverse proteomics" and have been launched in many organisms in recent years. Cloning an ORFeome is an attractive way to gain a comprehensive understanding of biological phenomena, but it is a challenging and daunting task. However, recent advances in techniques for DNA cloning using site-specific recombination and for high-throughput experimentation have made it feasible to clone an ORFeome with a minimum of exertion. The Gateway system is one such approach, employing the recombination reaction of bacteriophage lambda. By combining traditional DNA manipulation methods with the modern recombination-based cloning technique, it is possible to clone the ORFeome of an organism on an individual level.
Terminology model discovery using natural language processing and visualization techniques.
Zhou, Li; Tao, Ying; Cimino, James J; Chen, Elizabeth S; Liu, Hongfang; Lussier, Yves A; Hripcsak, George; Friedman, Carol
2006-12-01
Medical terminologies are important for unambiguous encoding and exchange of clinical information. The traditional manual method of developing terminology models is time-consuming and limited in the number of phrases that a human developer can examine. In this paper, we present an automated method for developing medical terminology models based on natural language processing (NLP) and information visualization techniques. Surgical pathology reports were selected as the testing corpus for developing a pathology procedure terminology model. The use of a general NLP processor for the medical domain, MedLEE, provides an automated method for acquiring semantic structures from a free text corpus and sheds light on a new high-throughput method of medical terminology model development. The use of an information visualization technique supports the summarization and visualization of the large quantity of semantic structures generated from medical documents. We believe that a general method based on NLP and information visualization will facilitate the modeling of medical terminologies.
Jiang, Baofeng; Jia, Pengjiao; Zhao, Wen; Wang, Wentao
2018-01-01
This paper explores a new method for rapid structural damage inspection of steel tube slab (STS) structures along randomly measured paths, based on a combination of compressive sampling (CS) and ultrasonic computerized tomography (UCT). In the measurement stage, using fewer randomly selected paths rather than the whole measurement net is proposed to detect the underlying damage of a concrete-filled steel tube. In the imaging stage, the ℓ1-minimization algorithm is employed to recover the information of the microstructures based on the measurement data related to the internal state of the STS structure. A numerical concrete tube model with various levels of damage was studied to demonstrate the performance of the rapid UCT technique. Real-world concrete-filled steel tubes in the Shenyang Metro stations were inspected using the proposed UCT technique in a CS framework. Both the numerical and experimental results show that the rapid UCT technique can detect damage in an STS structure with a high level of accuracy and with fewer required measurements, making it more convenient and efficient than the traditional UCT technique.
[Nursing curriculum: where to start and restart].
Vendrúscolo, D M; Manzolli, M C
1996-01-01
This study relies on the premise that all educational practices are based on presuppositions of a philosophical and pedagogical nature, representing the curriculum and the vision of the world perceived by the school and its professors. The leading question in this research can be formulated as: what is the professors' perception of the guiding concepts of the nursing degree curriculum? Thirty-six professors from nine nursing schools in the State of São Paulo participated in the research, answering enquiries based on the Semantic Differential technique. The traditional, cognitive, behaviourist, self-realization and social reconstruction concepts guided this analysis. The results expressed the subjects' preference for postures of a humanistic and social character, demonstrating an orientation toward more renovating postures as well as more traditional ones, enabling the authors to conclude that there is no predominant guiding conception in the nursing curriculum.
A Darwinian approach to control-structure design
NASA Technical Reports Server (NTRS)
Zimmerman, David C.
1993-01-01
Genetic algorithms (GA's), as introduced by Holland (1975), are one form of directed random search. The form of direction is based on Darwin's 'survival of the fittest' theories. GA's are radically different from the more traditional design optimization techniques. GA's work with a coding of the design variables, as opposed to working with the design variables directly. The search is conducted from a population of designs (i.e., from a large number of points in the design space), unlike the traditional algorithms which search from a single design point. The GA requires only objective function information, as opposed to gradient or other auxiliary information. Finally, the GA is based on probabilistic transition rules, as opposed to deterministic rules. These features allow the GA to attack problems with local-global minima, discontinuous design spaces and mixed variable problems, all in a single, consistent framework.
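The four distinguishing features listed above (coded design variables, a population of designs, fitness-only evaluation, probabilistic transition rules) can be sketched in a few lines. The bit-string encoding, fitness function, and parameters below are illustrative, not from any actual control-structure design problem:

```python
import random

def decode(bits, lo=0.0, hi=1.0):
    """Map a bit-string chromosome to a real design variable in [lo, hi]
    (the GA works on the coding, never on the variable directly)."""
    return lo + (hi - lo) * int(bits, 2) / (2 ** len(bits) - 1)

def ga_maximize(fitness, n_bits=16, pop_size=40, gens=60,
                p_cross=0.9, p_mut=0.02, seed=0):
    """Minimal genetic algorithm: tournament selection, one-point
    crossover, bit-flip mutation. Only objective-function values are
    used -- no gradients -- mirroring 'survival of the fittest'."""
    rng = random.Random(seed)
    pop = [''.join(rng.choice('01') for _ in range(n_bits))
           for _ in range(pop_size)]
    best = max(pop, key=lambda b: fitness(decode(b)))
    for _ in range(gens):
        new_pop = []
        while len(new_pop) < pop_size:
            # probabilistic selection: fitter of two random individuals
            p1 = max(rng.sample(pop, 2), key=lambda b: fitness(decode(b)))
            p2 = max(rng.sample(pop, 2), key=lambda b: fitness(decode(b)))
            c1, c2 = p1, p2
            if rng.random() < p_cross:          # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for c in (c1, c2):                  # bit-flip mutation
                c = ''.join(b if rng.random() > p_mut else
                            ('1' if b == '0' else '0') for b in c)
                new_pop.append(c)
        pop = new_pop[:pop_size]
        best = max(pop + [best], key=lambda b: fitness(decode(b)))
    return decode(best)
```

Because the search uses a whole population and probabilistic rules, it tolerates discontinuous and multimodal design spaces where gradient-based optimizers stall; here it locates the maximizer of a simple 1-D fitness without ever seeing a derivative.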
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
1997-01-01
The NASA Lewis Research Center is developing analytical methods and software tools to create a bridge between the controls and computational fluid dynamics (CFD) disciplines. Traditionally, control design engineers have used coarse nonlinear simulations to generate information for the design of new propulsion system controls. However, such traditional methods are not adequate for modeling the propulsion systems of complex, high-speed vehicles like the High Speed Civil Transport. To properly model the relevant flow physics of high-speed propulsion systems, one must use simulations based on CFD methods. Such CFD simulations have become useful tools for engineers that are designing propulsion system components. The analysis techniques and software being developed as part of this effort are an attempt to evolve CFD into a useful tool for control design as well. One major aspect of this research is the generation of linear models from steady-state CFD results. CFD simulations, often used during the design of high-speed inlets, yield high resolution operating point data. Under a NASA grant, the University of Akron has developed analytical techniques and software tools that use these data to generate linear models for control design. The resulting linear models have the same number of states as the original CFD simulation, so they are still very large and computationally cumbersome. Model reduction techniques have been successfully applied to reduce these large linear models by several orders of magnitude without significantly changing the dynamic response. The result is an accurate, easy to use, low-order linear model that takes less time to generate than those generated by traditional means. The development of methods for generating low-order linear models from steady-state CFD is most complete at the one-dimensional level, where software is available to generate models with different kinds of input and output variables. 
One-dimensional methods have been extended somewhat so that linear models can also be generated from two- and three-dimensional steady-state results. Standard techniques are adequate for reducing the order of one-dimensional CFD-based linear models. However, reduction of linear models based on two- and three-dimensional CFD results is complicated by very sparse, ill-conditioned matrices. Some novel approaches are being investigated to solve this problem.
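The abstract gives no implementation detail, but the model-reduction step it describes can be illustrated with textbook balanced truncation, which discards the states with the smallest Hankel singular values. This is a generic sketch (the function and the small test system are illustrative, not the University of Akron tools):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def balanced_truncation(A, B, C, order):
    """Reduce a stable LTI model (A, B, C) to `order` states by balanced
    truncation, keeping the states with the largest Hankel singular values."""
    # Controllability and observability Gramians of the stable system
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    Wc = (Wc + Wc.T) / 2.0   # symmetrize against round-off
    Wo = (Wo + Wo.T) / 2.0
    Lc = np.linalg.cholesky(Wc)
    Lo = np.linalg.cholesky(Wo)
    U, s, Vt = np.linalg.svd(Lo.T @ Lc)   # s = Hankel singular values
    S = np.diag(s ** -0.5)
    T = Lc @ Vt.T @ S                     # balancing transformation
    Tinv = S @ U.T @ Lo.T
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    k = order
    return Ab[:k, :k], Bb[:k, :], Cb[:, :k], s
```

The classical error bound guarantees the frequency response of the reduced model differs from the original by at most twice the sum of the discarded Hankel singular values.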
ERIC Educational Resources Information Center
Travis, Holly; Lord, Thomas
2004-01-01
Constructivist teaching techniques work well in various instructional settings, but many teachers remain skeptical because there is a lack of quantitative data supporting this model. This study compared an undergraduate nonmajors biology lab section taught in a traditional teacher-centered style to a similar section taught as a constructivist…
[Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].
Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia
2008-07-01
Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirements of rapid and on-line analysis because they involve complex procedures and demand considerable experience. In recent years, the near-infrared spectroscopy technique has been used for rapid determination of active components, on-line quality control, identification of counterfeits and discrimination of the geographical origins of herbal medicines, owing to its advantages of simple sample pretreatment, high efficiency, and the convenience of solid diffuse-reflectance measurement and fiber optics. In the present paper, the principles and methods of the near-infrared spectroscopy technique are introduced concisely. In particular, the applications of this technique in quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.
Ortu, Eleonora; Pietropaoli, Davide; Adib, Fray; Masci, Chiara; Giannoni, Mario; Monaco, Annalisa
2017-11-16
Objective To compare the clinical efficacy of two techniques for fabricating a Bimler device by assessing the patient's surface electromyography (sEMG) activity at rest before treatment and six months after treatment. Methods Twenty-four patients undergoing orthodontic treatment were enrolled in the study; 12 formed the test group and wore a Bimler device fabricated with a Myoprint impression using neuromuscular orthodontic technique and 12 formed the control group and were treated by traditional orthodontic technique with a wax bite in protrusion. The "rest" sEMG of each patient was recorded prior to treatment and six months after treatment. Results The neuromuscular-designed Bimler device was more comfortable and provided better treatment results than the traditional Bimler device. Conclusion This study suggests that the patient group subjected to neuromuscular orthodontic treatment had a treatment outcome with more relaxed masticatory muscles and better function versus the traditional orthodontic treatment.
Information visualization of the minority game
NASA Astrophysics Data System (ADS)
Jiang, W.; Herbert, R. D.; Webber, R.
2008-02-01
Many dynamical systems produce large quantities of data. How can the system be understood from the output data? Often people are simply overwhelmed by the data. Traditional tools such as tables and plots are often not adequate, and new techniques are needed to help people analyze the system. In this paper, we propose the use of two space-filling visualization tools to examine the output from a complex agent-based financial model. We measure the effectiveness and performance of these tools through usability experiments. Based on the experimental results, we develop two new visualization techniques that combine the advantages and discard the disadvantages of the information visualization tools. The model we use is an evolutionary version of the Minority Game, which simulates a financial market.
A Comparison of FPGA and GPGPU Designs for Bayesian Occupancy Filters.
Medina, Luis; Diez-Ochoa, Miguel; Correal, Raul; Cuenca-Asensi, Sergio; Serrano, Alejandro; Godoy, Jorge; Martínez-Álvarez, Antonio; Villagra, Jorge
2017-11-11
Grid-based perception techniques in the automotive sector, which fuse information from different sensors to build a robust perception of the environment, are proliferating in the industry. However, one of the main drawbacks of these techniques is the traditionally prohibitive computing performance they demand from embedded automotive systems. In this work, the capabilities of new computing architectures to embed these algorithms are assessed in a real car. The paper compares two ad hoc optimized designs of the Bayesian Occupancy Filter: one for a General Purpose Graphics Processing Unit (GPGPU) and the other for a Field-Programmable Gate Array (FPGA). The resulting implementations are compared in terms of development effort, accuracy and performance, using datasets from a realistic simulator and from a real automated vehicle.
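The record does not detail the filter itself, but the Bayesian update at the heart of any occupancy grid can be sketched generically in log-odds form (an illustrative formulation, not the authors' BOF implementation; the probabilities and mask convention below are assumptions):

```python
import numpy as np

def logodds(p):
    """Convert a probability to log-odds."""
    return np.log(p / (1.0 - p))

def update_grid(logodds_grid, hit_mask, p_hit=0.7, p_miss=0.4):
    """Bayesian log-odds update of an occupancy grid: cells observed as
    occupied (mask == 1) gain evidence, cells observed as free (mask == 0)
    lose it, and unobserved cells (mask == -1) keep their state."""
    out = logodds_grid.copy()
    out[hit_mask == 1] += logodds(p_hit)
    out[hit_mask == 0] += logodds(p_miss)
    return out

def prob(logodds_grid):
    """Recover occupancy probabilities from log-odds."""
    return 1.0 - 1.0 / (1.0 + np.exp(logodds_grid))
```

Because the update is a per-cell addition, it parallelizes trivially, which is exactly what makes GPGPU and FPGA implementations of such filters attractive.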
NASA Technical Reports Server (NTRS)
Dorrington, Adrian A.; Jones, Thomas W.; Danehy, Paul M.; Pappa, Richard S.
2003-01-01
Photogrammetry has proven to be a valuable tool for static and dynamic profiling of membrane based inflatable and ultra-lightweight space structures. However, the traditional photogrammetric targeting techniques used for solid structures, such as attached retro-reflective targets and white-light dot projection, have some disadvantages and are not ideally suited for measuring highly transparent or reflective membrane structures. In this paper, we describe a new laser-induced fluorescence based target generation technique that is more suitable for these types of structures. We also present several examples of non-contact non-invasive photogrammetric measurements of laser-dye doped polymers, including the dynamic measurement and modal analysis of a 1m-by-1m aluminized solar sail style membrane.
Privacy Enhancements for Inexact Biometric Templates
NASA Astrophysics Data System (ADS)
Ratha, Nalini; Chikkerur, Sharat; Connell, Jonathan; Bolle, Ruud
Traditional authentication schemes utilize tokens or depend on some secret knowledge possessed by the user for verifying his or her identity. Although these techniques are widely used, they have several limitations. Both token- and knowledge-based approaches cannot differentiate between an authorized user and an impersonator having access to the tokens or passwords. Biometrics-based authentication schemes overcome these limitations while offering usability advantages in the area of password management. However, despite its obvious advantages, the use of biometrics raises several security and privacy concerns.
Brusati, R; Giannì, A B
2005-12-01
The authors describe a surgical technique that is an alternative to traditional pre-surgical orthodontics for increasing the apical base in mandibular retrusion (class II, division I). This subapical osteotomy optimizes the lower incisal axis without dental extractions or lengthy orthodontic treatment and, combined with genioplasty, makes it possible to obtain an ideal labio-dento-mental morphology. In some cases this procedure avoids the need for a mandibular advancement and, when advancement is still necessary, reduces its extent, with obvious advantages.
Instrumentation for air quality measurements.
NASA Technical Reports Server (NTRS)
Loewenstein, M.
1973-01-01
Comparison of the new generation of air quality monitoring instruments with some more traditional methods. The first generation of air quality measurement instruments, based on the use of oxidant coulometric cells, nitrogen oxide colorimetry, carbon monoxide infrared analyzers, and other types of detectors, is compared with new techniques now coming into wide use in the air monitoring field and involving the use of chemiluminescent reactions, optical absorption detectors, a refinement of the carbon monoxide infrared analyzer, electrochemical cells based on solid electrolytes, and laser detectors.
Knowledge-based simulation for aerospace systems
NASA Technical Reports Server (NTRS)
Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.
1988-01-01
Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic processing/numeric processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally-symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and capable of ensuring that all possible behaviors are considered.
Optimization of segmented thermoelectric generator using Taguchi and ANOVA techniques.
Kishore, Ravi Anant; Sanghadasa, Mohan; Priya, Shashank
2017-12-01
Recent studies have demonstrated that segmented thermoelectric generators (TEGs) can operate over a large thermal gradient and thus provide better performance (reported efficiency up to 11%) than traditional TEGs comprising a single thermoelectric (TE) material. However, segmented TEGs are still in the early stages of development due to the inherent complexity of their design optimization and manufacturability. In this study, we demonstrate physics-based numerical techniques along with analysis of variance (ANOVA) and the Taguchi optimization method for optimizing the performance of segmented TEGs. We have considered a comprehensive set of design parameters, such as the geometrical dimensions of the p-n legs, height of segmentation, hot-side temperature, and load resistance, in order to optimize the output power and efficiency of segmented TEGs. Using state-of-the-art TE material properties and appropriate statistical tools, we provide a near-optimum TEG configuration with only 25 experiments, compared to the 3125 experiments needed by conventional optimization methods. The effect of environmental factors on the optimization of segmented TEGs is also studied. The Taguchi results are validated against results obtained using the traditional full-factorial optimization technique, and a TEG configuration for simultaneous optimization of power and efficiency is obtained.
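As an illustration of why 25 runs can stand in for 3125, a five-factor, five-level L25 orthogonal array can be constructed in a few lines and checked for orthogonality, and the Taguchi main-effects table is then just a per-level mean. This is a generic sketch of the method, not the paper's actual design variables or responses:

```python
import numpy as np

def l25_array():
    """Construct an L25(5^5) orthogonal array: 25 runs of five 5-level
    factors.  Columns are a, b, a+b, a+2b, a+3b (mod 5); any two columns
    contain every pair of levels exactly once."""
    runs = []
    for a in range(5):
        for b in range(5):
            runs.append([a, b, (a + b) % 5, (a + 2 * b) % 5, (a + 3 * b) % 5])
    return np.array(runs)

def main_effects(design, response):
    """Mean response at each level of each factor (Taguchi main-effects
    table); the best level of each factor maximizes this mean."""
    return np.array([[response[design[:, j] == lv].mean() for lv in range(5)]
                     for j in range(design.shape[1])])
```

Because every level pairing occurs equally often, the level averages for one factor are not biased by the settings of the others, which is what lets 25 runs estimate all five main effects.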
West Java Snack Mapping based on Snack Types, Main Ingredients, and Processing Techniques
NASA Astrophysics Data System (ADS)
Nurani, A. S.; Subekti, S.; Ana
2016-04-01
The research was motivated by a lack of literature on archipelago snacks, especially those from West Java. It aims to explore the snack types, the processing techniques, and the main ingredients in order to plan learning material on archipelago cakes, especially from West Java. The research methods used are descriptive observation and interviews. The samples were randomly chosen from all regions in West Java. The findings identify traditional snacks from West Java including: 1. snack types that are similar across all sampled regions, namely: opak, rangginang, nagasari, aliagrem, cuhcur, keripik, semprong, wajit, dodol, kecimpring, combro, tape ketan, and surabi; region-specific snack types include burayot (Garut), simping kaum (Purwakarta), surabi hejo (Karawang), papais cisaat (Subang), papais moyong, opak bakar (Kuningan), opak oded, ranggesing (Sumedang), gapit, tapel (Cirebon), gulampo, kue aci (Tasikmalaya), wajit cililin, gurilem (West Bandung), and borondong (Bandung District); 2. various processing techniques, namely: steaming, boiling, frying, caramelizing, baking, grilling, roasting, and sugaring; 3. various main ingredients, namely: rice, local glutinous rice, rice flour, glutinous rice flour, starch, wheat flour, hunkue flour, cassava, sweet potato, banana, nuts, and corn; 4. a snack classification for West Java, namely: (1) traditional snacks, (2) creation snacks, (3) modification snacks, (4) outside-influence snacks.
Pen-based Interfaces for Engineering and Education
NASA Astrophysics Data System (ADS)
Stahovich, Thomas F.
Sketches are an important problem-solving tool in many fields. This is particularly true of engineering design, where sketches facilitate creativity by providing an efficient medium for expressing ideas. However, despite the importance of sketches in engineering practice, current engineering software still relies on traditional mouse and keyboard interfaces, with little or no capabilities to handle free-form sketch input. With recent advances in machine-interpretation techniques, it is now becoming possible to create practical interpretation-based interfaces for such software. In this chapter, we report on our efforts to create interpretation techniques to enable pen-based engineering applications. We describe work on two fundamental sketch understanding problems. The first is sketch parsing, the task of clustering pen strokes or geometric primitives into individual symbols. The second is symbol recognition, the task of classifying symbols once they have been located by a parser. We have used the techniques that we have developed to construct several pen-based engineering analysis tools. These are used here as examples to illustrate our methods. We have also begun to use our techniques to create pen-based tutoring systems that scaffold students in solving problems in the same way they would ordinarily solve them with paper and pencil. The chapter concludes with a brief discussion of these systems.
NASA Astrophysics Data System (ADS)
Yousefian Jazi, Nima
Spatial filtering and directional discrimination have been shown to be an effective pre-processing approach for noise reduction in microphone array systems. In dual-microphone hearing aids, fixed and adaptive beamforming techniques are the most common solutions for enhancing the desired speech and rejecting unwanted signals captured by the microphones. In fact, beamformers are widely utilized in systems where the spatial properties of the target source (usually in front of the listener) are assumed to be known. In this dissertation, some dual-microphone coherence-based speech enhancement techniques applicable to hearing aids are proposed. All proposed algorithms operate in the frequency domain and (like traditional beamforming techniques) are based purely on the spatial properties of the desired speech source; they do not require any knowledge of noise statistics for calculating the noise reduction filter. This benefit gives our algorithms the ability to address adverse noise conditions, such as situations where interfering talkers speak simultaneously with the target speaker. In such cases, (adaptive) beamformers lose their effectiveness in suppressing interference, since the noise (reference) channel cannot be built and updated accordingly. This difference is the main advantage of the proposed techniques over traditional adaptive beamformers. Furthermore, since the suggested algorithms are independent of noise estimation, they offer significant improvement in scenarios where the power level of the interfering sources is much higher than that of the target speech. The dissertation also shows that the premise behind the proposed algorithms can be extended and employed in binaural hearing aids. The main purpose of the investigated techniques is to enhance the intelligibility of speech, measured through subjective listening tests with normal-hearing and cochlear implant listeners.
However, the improvement in the quality of the output speech achieved by the algorithms is also presented, to show that the proposed methods are potential candidates for future use in commercial hearing aids and cochlear implant devices.
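As a rough illustration of the coherence-based family of methods described above (not the dissertation's actual algorithms), the magnitude-squared coherence (MSC) between the two microphones, estimated with recursive smoothing, can be used directly as a time-frequency gain: near 1 for a coherent target, small for diffuse or uncorrelated noise. The STFT parameters and smoothing constant here are assumptions:

```python
import numpy as np
from scipy.signal import stft, istft

def coherence_mask_enhance(x1, x2, fs, nperseg=256, alpha=0.85):
    """Dual-microphone enhancement: attenuate time-frequency bins where
    the two channels are incoherent.  Auto- and cross-spectra are smoothed
    recursively over frames; the MSC (clipped to [0, 1]) is the gain."""
    _, _, X1 = stft(x1, fs=fs, nperseg=nperseg)
    _, _, X2 = stft(x2, fs=fs, nperseg=nperseg)
    nf, nt = X1.shape
    p11 = np.zeros(nf)
    p22 = np.zeros(nf)
    p12 = np.zeros(nf, complex)
    gain = np.zeros((nf, nt))
    for t in range(nt):
        p11 = alpha * p11 + (1 - alpha) * np.abs(X1[:, t]) ** 2
        p22 = alpha * p22 + (1 - alpha) * np.abs(X2[:, t]) ** 2
        p12 = alpha * p12 + (1 - alpha) * X1[:, t] * np.conj(X2[:, t])
        gain[:, t] = np.abs(p12) ** 2 / (p11 * p22 + 1e-12)
    gain = np.clip(gain, 0.0, 1.0)
    _, out = istft(gain * 0.5 * (X1 + X2), fs=fs, nperseg=nperseg)
    return out, gain
```

Note that no noise estimate appears anywhere: the mask depends only on the spatial (inter-channel) statistics, which is the property the abstract emphasizes.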
Elemental investigation of Syrian medicinal plants using PIXE analysis
NASA Astrophysics Data System (ADS)
Rihawy, M. S.; Bakraji, E. H.; Aref, S.; Shaban, R.
2010-09-01
The particle induced X-ray emission (PIXE) technique has been employed to perform elemental analysis of K, Ca, Mn, Fe, Cu, Zn, Br and Sr in Syrian medicinal plants traditionally used to enhance the body's immunity. Plant samples were prepared in a simple dried form. The results were verified by comparison with those obtained from both IAEA-359 and IAEA-V10 reference materials. Relative standard deviations, mostly within ±5-10%, suggest good precision. A correlation between the elemental content of each medicinal plant and its traditional remedial usage has been proposed. Both K and Ca are found to be the major elements in the samples. Fe, Mn and Zn have been detected at good levels in most of these plants, clarifying their possible contribution to keeping the body's immune system in good condition. The contribution of the elements in these plants to the dietary recommended intakes (DRI) has been evaluated. The advantages and limitations of the PIXE analytical technique in this investigation are reviewed.
Investigation of Laser Welding of Ti Alloys for Cognitive Process Parameters Selection.
Caiazzo, Fabrizia; Caggiano, Alessandra
2018-04-20
Laser welding of titanium alloys is attracting increasing interest as an alternative to traditional joining techniques for industrial applications, with particular reference to the aerospace sector, where welded assemblies allow for the reduction of the buy-to-fly ratio, compared to other traditional mechanical joining techniques. In this research work, an investigation on laser welding of Ti⁻6Al⁻4V alloy plates is carried out through an experimental testing campaign, under different process conditions, in order to perform a characterization of the produced weld bead geometry, with the final aim of developing a cognitive methodology able to support decision-making about the selection of the suitable laser welding process parameters. The methodology is based on the employment of artificial neural networks able to identify correlations between the laser welding process parameters, with particular reference to the laser power, welding speed and defocusing distance, and the weld bead geometric features, on the basis of the collected experimental data.
Investigation of Laser Welding of Ti Alloys for Cognitive Process Parameters Selection
2018-01-01
Laser welding of titanium alloys is attracting increasing interest as an alternative to traditional joining techniques for industrial applications, with particular reference to the aerospace sector, where welded assemblies allow for the reduction of the buy-to-fly ratio, compared to other traditional mechanical joining techniques. In this research work, an investigation on laser welding of Ti–6Al–4V alloy plates is carried out through an experimental testing campaign, under different process conditions, in order to perform a characterization of the produced weld bead geometry, with the final aim of developing a cognitive methodology able to support decision-making about the selection of the suitable laser welding process parameters. The methodology is based on the employment of artificial neural networks able to identify correlations between the laser welding process parameters, with particular reference to the laser power, welding speed and defocusing distance, and the weld bead geometric features, on the basis of the collected experimental data. PMID:29677114
Canaloplasty: A Minimally Invasive and Maximally Effective Glaucoma Treatment
Khaimi, Mahmoud A.
2015-01-01
Canaloplasty is a highly effective, minimally invasive, surgical technique indicated for the treatment of open-angle glaucoma that works by restoring the function of the eye's natural outflow system. The procedure's excellent safety profile and long-term efficacy make it a viable option for the majority of glaucoma patient types. It can be used in conjunction with existing drug based glaucoma treatments, after laser or other types of incisional surgery, and does not preclude or affect the outcome of future surgery. Numerous scientific studies have shown Canaloplasty to be safe and effective in lowering IOP whilst reducing medication dependence. A recent refinement of Canaloplasty, known as ab-interno Canaloplasty (ABiC), maintains the IOP-lowering and safety benefits of traditional (ab-externo) Canaloplasty using a more efficient, simplified surgical approach. This paper presents a review of Canaloplasty indications, clinical data, and complications, as well as comparisons with traditional incisional glaucoma techniques. It also addresses the early clinical evidence for ABiC. PMID:26495135
Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K
2017-12-01
Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. This study introduces a new technique for extraction of the maxillary third molar, the Joedds technique, and compares it with the conventional technique. One hundred people were included in the study and divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used; in the second, the Joedds technique was used. Statistical analysis was carried out with Student's t test. Analysis of the 100 patients showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was less than 2 min compared with the other group. This novel technique proved better than the conventional third molar extraction technique, with minimal complications, provided cases are properly selected and the technique is applied correctly.
Combustion Synthesis of Glass-Ceramic Composites Under Terrestrial and Reduced Gravity Conditions
NASA Technical Reports Server (NTRS)
Manerbino, Anthony; Yi, H. C.; Guigne, J. Y.; Moore, J. J.; Gokoglu, S. (Technical Monitor)
2001-01-01
Glasses based on B2O3-Al2O3-BaO and B2O3-Al2O3-MgO have been produced by the combustion synthesis technique. The combustion temperatures and wave velocities for selected compositions are presented. Combustion reactions of these materials were typically weakly exothermic, resulting in unstable combustion waves. Microstructural characterization indicated that the glass-formation region was similar to that of glasses produced by the traditional technique. Results on the effect of gravity on glass formation (or devitrification), studied onboard the KC-135, are also presented.
NASA Technical Reports Server (NTRS)
Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William
2015-01-01
Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g-1); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).
Photobiomolecular deposition of metallic particles and films
Hu, Zhong-Cheng
2005-02-08
The method of the invention is based on the unique electron-carrying function of a photocatalytic unit such as the photosynthesis system I (PSI) reaction center of the protein-chlorophyll complex isolated from chloroplasts. The method employs a photo-biomolecular metal deposition technique for precisely controlled nucleation and growth of metallic clusters/particles, e.g., platinum, palladium, and their alloys, etc., as well as for thin-film formation above the surface of a solid substrate. The photochemically mediated technique offers numerous advantages over traditional deposition methods including quantitative atom deposition control, high energy efficiency, and mild operating condition requirements.
Photobiomolecular metallic particles and films
Hu, Zhong-Cheng
2003-05-06
The method of the invention is based on the unique electron-carrying function of a photocatalytic unit such as the photosynthesis system I (PSI) reaction center of the protein-chlorophyll complex isolated from chloroplasts. The method employs a photo-biomolecular metal deposition technique for precisely controlled nucleation and growth of metallic clusters/particles, e.g., platinum, palladium, and their alloys, etc., as well as for thin-film formation above the surface of a solid substrate. The photochemically mediated technique offers numerous advantages over traditional deposition methods including quantitative atom deposition control, high energy efficiency, and mild operating condition requirements.
[Advancements of computer chemistry in separation of Chinese medicine].
Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei
2011-12-01
Separation techniques for Chinese medicine are not only a key technology in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models of, and find regularities in, the complicated data of Chinese medicine systems. This paper analyzes the applicability, key technologies, basic modes and common algorithms of computer chemistry applied to the separation of Chinese medicine, introduces the mathematical models of extraction kinetics and methods for setting their parameters, examines several problems in membrane processing of traditional Chinese medicine, and forecasts the prospects for application.
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1996-07-01
This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
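The histogram equalization that the abstract identifies as a special case of the proposed algorithm can be written compactly: each gray level is mapped through the normalized cumulative histogram. This is the standard formulation, independent of the Heinemann-based method itself:

```python
import numpy as np

def equalize_hist(img):
    """Classic histogram equalization of an 8-bit grayscale image: build
    the cumulative histogram (CDF) and use it as a lookup table so that
    output gray levels are approximately uniformly distributed."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]               # CDF at the lowest occupied level
    denom = max(int(cdf[-1] - cdf_min), 1)  # guard against constant images
    lut = np.round((cdf - cdf_min) / denom * 255.0).clip(0, 255).astype(np.uint8)
    return lut[img]
```

The generalized algorithm in the paper replaces this purely statistical mapping with one shaped by a perceptual (contrast discrimination) model.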
Distillation Designs for the Lunar Surface
NASA Technical Reports Server (NTRS)
Boul, Peter J.; Lange,Kevin E.; Conger, Bruce; Anderson, Molly
2010-01-01
Gravity-based distillation methods may be applied to the purification of wastewater at a lunar base. These solutions to water processing are robust physical separation techniques, which may be more advantageous than many other techniques for their simplicity in design and operation. The two techniques can be used in conjunction with each other to obtain high-purity water. The components and feed compositions for modeling wastewater streams are presented in conjunction with the Aspen property system for traditional stage distillation. While the individual components of each waste stream will vary naturally within certain bounds, an analog model for wastewater processing is suggested based on typical concentration ranges for these components. Target purity levels for recycled water are determined for each individual component based on NASA's required maximum contaminant levels for potable water. Optimum parameters such as reflux ratio, feed stage location, and processing rates are determined with respect to the power consumption of the process. Multistage distillation is evaluated for components in wastewater to determine the minimum number of stages necessary for each of 65 components in mixed humidity condensate and urine wastewater streams.
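The stage-count analysis described above is done with Aspen stage-by-stage models, but the kind of estimate involved can be illustrated with the classic Fenske shortcut for the minimum number of theoretical stages at total reflux. This is a binary, constant-relative-volatility idealization, not the paper's 65-component model:

```python
import math

def fenske_min_stages(x_dist, x_bot, rel_volatility):
    """Fenske equation: minimum number of theoretical stages at total
    reflux to separate a binary mixture from bottoms mole fraction x_bot
    to distillate mole fraction x_dist, with constant relative volatility."""
    separation = (x_dist / (1.0 - x_dist)) * ((1.0 - x_bot) / x_bot)
    return math.log(separation) / math.log(rel_volatility)
```

For example, purifying from 1% to 99% light component at a relative volatility of 2 requires a minimum of about 13 theoretical stages; actual columns at finite reflux need more, which is where the reflux-ratio/power trade-off studied in the paper comes in.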
NASA Astrophysics Data System (ADS)
Torteeka, Peerapong; Gao, Peng-Qi; Shen, Ming; Guo, Xiao-Zhang; Yang, Da-Tao; Yu, Huan-Huan; Zhou, Wei-Ping; Zhao, You
2017-02-01
Although tracking with a passive optical telescope is a powerful technique for space debris observation, it is limited by its sensitivity to dynamic background noise. Traditionally, in the field of astronomy, static background subtraction based on a median image technique has been used to extract moving space objects prior to the tracking operation, as this is computationally efficient. The main disadvantage of this technique is that it is not robust to variable illumination conditions. In this article, we propose an approach for tracking small and dim space debris in the context of a dynamic background via one of the optical telescopes that is part of the space surveillance network project, named the Asia-Pacific ground-based Optical Space Observation System or APOSOS. The approach combines a fuzzy running Gaussian average for robust moving-object extraction with dim-target tracking using a particle-filter-based track-before-detect method. The performance of the proposed algorithm is experimentally evaluated, and the results show that the scheme achieves a satisfactory level of accuracy for space debris tracking.
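The crisp (non-fuzzy) running Gaussian average, the baseline on which the paper's fuzzy variant builds, can be sketched per pixel: a pixel is foreground when it deviates from the running mean by more than k standard deviations, and the model adapts only where the scene looks like background. The parameters below are illustrative assumptions, not the APOSOS settings:

```python
import numpy as np

class RunningGaussianBackground:
    """Per-pixel running Gaussian background model for moving-object
    extraction, adaptive to slow illumination changes."""

    def __init__(self, alpha=0.05, k=2.5, sigma_floor=2.0):
        self.alpha, self.k, self.sigma_floor = alpha, k, sigma_floor
        self.mean = None
        self.var = None

    def apply(self, frame):
        """Return a boolean foreground mask and update the model."""
        frame = frame.astype(float)
        if self.mean is None:               # initialize from the first frame
            self.mean = frame.copy()
            self.var = np.full(frame.shape, self.sigma_floor ** 2)
            return np.zeros(frame.shape, bool)
        diff = np.abs(frame - self.mean)
        sigma = np.maximum(np.sqrt(self.var), self.sigma_floor)
        fg = diff > self.k * sigma
        bg = ~fg                            # adapt only on background pixels
        self.mean[bg] += self.alpha * (frame[bg] - self.mean[bg])
        self.var[bg] += self.alpha * (diff[bg] ** 2 - self.var[bg])
        return fg
```

Unlike a static median image, the mean and variance track gradual background change, which is the robustness-to-illumination property the paper's fuzzy extension pushes further.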
NASA Astrophysics Data System (ADS)
Bard, P. Y.; Laurendeau, A.; Hollender, F.; Perron, V.; Hernandez, B.; Foundotos, L.
2016-12-01
Assessment of local seismic hazard on hard rock sites (1000 < VS30 < 3000 m/s) is needed either for installations built on such hard rock, or as a reference motion for site response computation. Empirical ground motion prediction equations (GMPEs) are the traditional basis for estimating ground motion, but most of them are poorly constrained for VS30 larger than 1000 m/s. The presently used approach for estimating hard rock hazard consists of "host-to-target" adjustment techniques (HTTA) based on VS30 and κ0 values. Recent studies have investigated alternative methods to estimate reference motions on very hard rock through an original processing of the Japanese KiK-net recordings from stiff sites (500 < VS30 < 1350 m/s). The pairs of recordings at surface and depth, together with the knowledge of the velocity profile, made it possible to derive two sets of "virtual" outcropping, hard-rock motion data for sites having velocities in the range [1000 - 3000 m/s]. The corrections are based either on a transformation of deep ("within") motion to outcropping motion, or on a deconvolution of surface recordings using the velocity profile and 1D simulation, which has been performed both in the response spectrum and Fourier domains. Each of these virtual "outcropping hard-rock motion" data sets has then been used to derive GMPEs with simple functional forms, using as site condition proxy the S-wave velocity at depth (VSDH), ranging from 1000 to 3000 m/s. Both sets provide very similar predictions, which are much smaller at high frequencies (f > 10 Hz) than those estimated with the traditional HTTA technique - by a factor of up to 3-4. These differences decrease with decreasing frequency, and become negligible at low frequency (f < 1 Hz). The main focus will be to discuss the possible reasons for such differences, in relation to the implicit or explicit assumptions of either approach.
Our present interpretation is related to the existence of a significant, high-frequency amplification on stiff soils and standard rocks, due to thin, shallow, moderate-velocity layers. Not only is this resonant amplification not correctly accounted for by the quarter-wavelength approach used in traditional HTTA adjustment techniques, but it may also significantly impact and bias the κ measurements and the (VS30-κ0) relationships implicitly used in HTTA techniques.
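Deriving a GMPE "with a simple functional form," as the abstract describes, is at heart a regression problem. A minimal sketch with a hypothetical functional form and synthetic data (not the study's actual model, coefficients, or data) shows the mechanics:

```python
import numpy as np

def fit_gmpe(mag, dist, vs, log_sa):
    """Fit ln(SA) = c0 + c1*M + c2*ln(R) + c3*ln(VS) by ordinary least
    squares.  The functional form is a deliberately simple, hypothetical
    example; real GMPEs add terms for saturation, style of faulting, etc."""
    X = np.column_stack([np.ones_like(mag), mag, np.log(dist), np.log(vs)])
    coef, *_ = np.linalg.lstsq(X, log_sa, rcond=None)
    return coef
```

With the virtual outcropping data sets described above, the site term (here ln VS) would use VSDH as the proxy; the HTTA-versus-GMPE comparison then reduces to comparing predictions from two such fitted models.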
Hepatitis Diagnosis Using Facial Color Image
NASA Astrophysics Data System (ADS)
Liu, Mingjia; Guo, Zhenhua
Facial color diagnosis is an important diagnostic method in traditional Chinese medicine (TCM). However, due to its qualitative, subjective and experience-based nature, traditional facial color diagnosis has very limited application in clinical medicine. To circumvent the subjective and qualitative problems of facial color diagnosis in traditional Chinese medicine, in this paper we present a novel computer-aided facial color diagnosis method (CAFCDM). The method has three parts: a face image database, an image preprocessing module and a diagnosis engine. The face image database was built from a group of 116 patients affected by two kinds of liver disease and 29 healthy volunteers. The quantitative color feature is extracted from facial images using popular digital image processing techniques. Then, a KNN classifier is employed to model the relationship between the quantitative color feature and the diseases. The results show that the method can properly identify three groups: healthy, severe hepatitis with jaundice and severe hepatitis without jaundice, with accuracy higher than 73%.
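The classifier stage described above can be sketched as a minimal k-nearest-neighbors vote over color features. The mean-RGB feature values and class labels below are hypothetical illustrations, not data from the study:

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify a color-feature vector by majority vote among its
    k nearest training samples (Euclidean distance)."""
    dists = sorted(
        (math.dist(feat, query), label) for feat, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy mean-RGB facial color features (hypothetical values)
train = [
    ((200, 170, 150), "healthy"),
    ((205, 175, 155), "healthy"),
    ((210, 180, 100), "hepatitis_jaundice"),   # yellowish cast
    ((215, 185, 95),  "hepatitis_jaundice"),
    ((170, 140, 130), "hepatitis_no_jaundice"),
    ((165, 135, 125), "hepatitis_no_jaundice"),
]
print(knn_classify(train, (208, 178, 102)))  # → hepatitis_jaundice
```

A real pipeline would first normalize illumination and extract features from segmented facial regions, as the preprocessing module suggests.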
NASA Astrophysics Data System (ADS)
Choi, Young-In; Ahn, Jaemyung
2018-04-01
Earned value management (EVM) is a methodology for monitoring and controlling the performance of a project based on a comparison between planned and actual cost/schedule. This study proposes a concept of hybrid earned value management (H-EVM) that integrates the traditional EVM metrics with information on the technology readiness level. The proposed concept can reflect the progress of a project sensitively and provides a short-term perspective complementary to the traditional EVM metrics. A two-dimensional visualization of the cost/schedule status of a project, reflecting both the traditional EVM (long-term) and the proposed H-EVM (short-term) indices, is introduced. A case study on the management of a new space launch vehicle development program is conducted to demonstrate the effectiveness of the proposed H-EVM concept, its associated metrics, and the visualization technique.
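The traditional EVM indices that H-EVM builds on follow directly from planned value (PV), earned value (EV), and actual cost (AC); a minimal sketch of that standard arithmetic (the paper's TRL-based hybrid weighting is not reproduced here):

```python
def evm_indices(pv, ev, ac):
    """Traditional earned value metrics from planned value (PV),
    earned value (EV), and actual cost (AC)."""
    return {
        "SV": ev - pv,    # schedule variance (negative = behind)
        "CV": ev - ac,    # cost variance (negative = over budget)
        "SPI": ev / pv,   # schedule performance index
        "CPI": ev / ac,   # cost performance index
    }

# Hypothetical status: 90 units of work earned against a 100-unit
# plan, at an actual cost of 120 units.
m = evm_indices(pv=100.0, ev=90.0, ac=120.0)
print(m["SPI"], m["CPI"])  # → 0.9 0.75 (behind schedule, over budget)
```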
A resource-sharing model based on a repeated game in fog computing.
Sun, Yan; Zhang, Nan
2017-03-01
With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing has been proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadically distributed resources that are more flexible and movable than those of a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.
de Almeida, Patrícia Maria Duarte
2006-02-01
Considering the loss of function of body structures and systems after a spinal cord injury, with its respective activity limitations and restrictions on social participation, the goal of the rehabilitation process is to achieve the maximal functional independence and quality of life allowed by the clinical lesion. This requires a rehabilitation period with a rehabilitation team, including the physiotherapist, whose interventions depend on factors such as the degree of completeness or incompleteness of the lesion and the patient's clinical stage. The physiotherapy approach includes several procedures and techniques related either to a traditional model or to the recent perspective of neuronal regeneration. Following the traditional model, intervention in complete (grade A) and incomplete (grade B) lesions is based on a compensatory method of functional rehabilitation using the unaffected muscles. In incomplete (grade C and D) lesions, motor re-education below the lesion, using key points to facilitate normal and selective patterns of movement, is preferable. Alternatively, if neuronal regeneration with corresponding functional improvement is possible, the goals of physiotherapy are to maintain muscular trophism and improve the recruitment of motor units using intensive techniques. In both cases there is no scientific evidence to support the procedures; investigation is lacking and most of the research is methodologically poor. © 2006 Sociedade Portuguesa de Pneumologia/SPP.
NASA Astrophysics Data System (ADS)
Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng
This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems in a multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time-lead and time-lag portions of the signal causes tracking bias or instability problems in traditional correlating tracking loops such as the delay lock loop (DLL) or the modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to the DLL, which requires an extensive search algorithm to compensate for the noise imbalance and may still introduce a small tracking bias at low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias over the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean time to lose lock, and near-far resistance than the other tracking schemes, including the traditional DLL (T-DLL), traditional MCTL (T-MCTL) and modified de-correlated DLL (MD-DLL).
Haverkort, J J Mark; Leenen, Luke P H
2017-10-01
Presently used evaluation techniques rely on three traditional dimensions: reports from observers, registration-system data, and observational cameras. Some of these techniques are observer-dependent and are not reproducible for a second review. This proof-of-concept study aimed to test the feasibility of extending evaluation to a fourth dimension, the patient's perspective. Footage was obtained during a large, full-scale hospital trauma drill. Two mock victims were equipped with point-of-view cameras filming from the patient's head. Based on the Major Incident Hospital's first experience during the drill, a protocol was developed for a prospective, standardized method of evaluating a hospital's major incident response from the patient's perspective. The protocol was then tested in a second drill for its feasibility. New insights were gained after review of the footage. The traditional observer missed some of the evaluation points, which were seen on the point-of-view cameras. The information gained from the patient's perspective could be incorporated into the designed protocol. Use of point-of-view camera recordings from a mock patient's perspective is a valuable addition to traditional evaluation of trauma drills and trauma care. Protocols should be designed to optimize and objectify judgement of such footage. (Disaster Med Public Health Preparedness. 2017;11:594-599).
The Progression of Podcasting/Vodcasting in a Technical Physics Class
NASA Astrophysics Data System (ADS)
Glanville, Y. J.
2010-11-01
Technology such as Microsoft PowerPoint presentations, clickers, podcasting, and learning management suites is becoming prevalent in classrooms. Instructors are using these media in both large lecture hall settings and small classrooms with just a handful of students. Traditionally, each of these media is instructor driven. For instance, podcasting (audio recordings) provided my technical physics course with supplemental notes to accompany a traditional algebra-based physics lecture. Podcasting is an ideal tool for this mode of instruction, but podcasting/vodcasting is also an ideal technique for student projects and student-driven learning. I present here the various podcasting/vodcasting projects my students and I have undertaken over the last few years.
Non-linear eigensolver-based alternative to traditional SCF methods
NASA Astrophysics Data System (ADS)
Gavin, B.; Polizzi, E.
2013-05-01
The self-consistent procedure in electronic structure calculations is revisited using a highly efficient and robust algorithm for solving the non-linear eigenvector problem, i.e., H({ψ})ψ = Eψ. This new scheme is derived from a generalization of the FEAST eigenvalue algorithm to account for the non-linearity of the Hamiltonian with the occupied eigenvectors. Using a series of numerical examples and the Kohn-Sham density functional theory model, it will be shown that our approach can outperform the traditional SCF mixing-scheme techniques by providing a higher convergence rate, convergence to the correct solution regardless of the choice of the initial guess, and a significant reduction of the eigenvalue solve time in simulations.
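The traditional SCF mixing scheme that the FEAST-based approach is compared against can be illustrated on a toy fixed-point problem. The self-consistency condition below (n = cos n) is a deliberately simple stand-in, not a Kohn-Sham Hamiltonian:

```python
import math

def scf_fixed_point(g, n0, alpha=0.5, tol=1e-10, max_iter=200):
    """Generic SCF loop with linear mixing:
    n_{k+1} = (1 - alpha) * n_k + alpha * g(n_k).
    A smaller alpha is more stable but slower to converge."""
    n = n0
    for i in range(max_iter):
        n_new = (1 - alpha) * n + alpha * g(n)
        if abs(n_new - n) < tol:
            return n_new, i + 1
        n = n_new
    raise RuntimeError("SCF did not converge")

# Toy self-consistency condition n = cos(n); the fixed point is ~0.739085.
n_star, iters = scf_fixed_point(math.cos, n0=0.0)
print(round(n_star, 6))  # → 0.739085
```

The dependence on `alpha` mirrors the practical weakness of SCF mixing noted above: a poorly chosen mixing parameter or initial guess can stall or misdirect convergence.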
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Spiers, Gary D.; Lobl, Elena S.; Rothermel, Jeff; Keller, Vernon W.
1996-01-01
Innovative designs of a space-based laser remote sensing 'wind machine' are presented. These designs seek compatibility with the traditionally conflicting constraints of high scientific value and low total mission cost. Mission cost is reduced by moving to smaller, lighter, more off-the-shelf instrument designs which can be accommodated on smaller launch vehicles.
Wright, Cameron H G; Barrett, Steven F; Pack, Daniel J
2005-01-01
We describe a new approach to attacking the problem of robust computer vision for mobile robots. The overall strategy is to mimic the biological evolution of animal vision systems. Our basic imaging sensor is based upon the eye of the common house fly, Musca domestica. The computational algorithms are a mix of traditional image processing, subspace techniques, and multilayer neural networks.
Non-surgical treatment of esophageal achalasia
Annese, Vito; Bassotti, Gabrio
2006-01-01
Esophageal achalasia is an infrequent motility disorder characterized by progressive stasis and dilation of the esophagus, with subsequent risk of aspiration, weight loss, and malnutrition. Although the treatment of achalasia has traditionally been based on a surgical approach, especially with the introduction of laparoscopic techniques, there is still some room for a medical approach. The present article reviews the non-surgical therapeutic options for achalasia. PMID:17007039
Characterization and Biomimcry of Avian Nanostructured Tissues
2016-01-19
…keratin cortex (Maia et al. 2011) at the outer edge of barbs from TEM images. […] Geometric morphometrics of barb shape: digitized images of the barb thin […] morphological measurements (all P > 0.05; Figure 4C; Table S2). […] Gloss and barb geometric morphometrics: matte and glossy barbs differed significantly in […] barbs and the lack of multiple, clear, anatomically homologous features, traditional landmark-based morphometric techniques (Bookstein, 1982) would be […]
Microfabricated Nickel Based Sensors for Hostile and High Pressure Environments
NASA Astrophysics Data System (ADS)
Holt, Christopher Michael Bjustrom
This thesis outlines the development of two platforms for integrating microfabricated sensors with high-pressure feedthroughs for application in hostile, high-temperature, high-pressure environments. An application in oil well production logging is explored, and two sensors were implemented with these platforms for application in an oil well. The first platform involved microfabrication directly onto a cut and polished high-pressure feedthrough. This technique enables a system that is more robust than the wire-bonded silicon die technique used for MEMS integration in pressure sensors. Removing wire bonds from the traditional MEMS package allows direct interfacing of a microfabricated sensor with a hostile high-pressure fluid environment, which is not currently possible. During the development of this platform, key performance metrics included pressure testing to 70 MPa and temperature cycling from 20°C to 200°C. This platform enables electronics integration with a variety of microfabricated electrical and thermal sensors that can be immersed in the oil well environment. The second platform enabled free-space fabrication of nickel microfabricated devices onto an array of pins using a thick tin sacrificial layer. This technique allowed microfabrication of metal MEMS that are released by distances of 1 cm from their substrate. The method is quite flexible and allows fabrication on any pin-array substrate regardless of surface quality. Being able to place released MEMS sensors directly onto traditional-style circuit boards, ceramic circuit boards, electrical connectors, ribbon cables, pin headers, or high-pressure feedthroughs greatly broadens the range of possible applications and reduces fabrication costs. These two platforms were then used to fabricate thermal conductivity sensors that showed excellent performance in distinguishing between oil, water, and gas phases.
Testing was conducted at various flow rates and performance of the released platform was shown to be better than the performance seen in the anchored sensors while both platforms were significantly better than a simply fabricated wrapped wire sensor. The anchored platform was also used to demonstrate a traditional capacitance based fluid dielectric sensor which was found to work similarly to conventional commercial capacitance probes while being significantly smaller in size.
2005-01-01
Students are most motivated and learn best when they are immersed in an environment that causes them to realize why they should learn. Perhaps nowhere is this truer than when teaching the biological sciences to engineers. Transitioning from a traditionally mathematics-based to a traditionally knowledge-based pedagogical style can challenge student learning and engagement. To address this, human pathologies were used as a problem-based context for teaching knowledge-based cell biological mechanisms. Lectures were divided into four modules. First, a disease was presented from clinical, economic, and etiological standpoints. Second, fundamental concepts of cell and molecular biology were taught that were directly relevant to that disease. Finally, we discussed the cellular and molecular basis of the disease based on these fundamental concepts, together with current clinical approaches to the disease. The basic science is thus presented within a “shrink wrap” of disease application. Evaluation of this contextual technique suggests that it is very useful in improving undergraduate student focus and motivation, and offers many advantages to the instructor as well. PMID:15917872
Vision-based system for the control and measurement of wastewater flow rate in sewer systems.
Nguyen, L S; Schaeli, B; Sage, D; Kayal, S; Jeanbourquin, D; Barry, D A; Rossi, L
2009-01-01
Combined sewer overflows and stormwater discharges represent an important source of contamination to the environment. However, the harsh environment inside sewers and particular hydraulic conditions during rain events reduce the reliability of traditional flow measurement probes. In the following, we present and evaluate an in situ system for the monitoring of water flow in sewers based on video images. This paper focuses on the measurement of the water level based on image-processing techniques. The developed image-based water level algorithms identify the wall/water interface from sewer images and measure its position with respect to real world coordinates. A web-based user interface and a 3-tier system architecture enable the remote configuration of the cameras and the image-processing algorithms. Images acquired and processed by our system were found to reliably measure water levels and thereby to provide crucial information leading to better understand particular hydraulic behaviors. In terms of robustness and accuracy, the water level algorithm provided equal or better results compared to traditional water level probes in three different in situ configurations.
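The wall/water interface detection can be illustrated in miniature: in a single image column, the interface shows up as the row with the largest brightness jump. This is a deliberately crude stand-in for the paper's image-processing pipeline, which also maps pixel positions to real-world coordinates:

```python
def water_line_row(column):
    """Locate the wall/water interface in one image column as the row
    with the largest brightness jump (a crude gradient edge detector)."""
    grads = [abs(column[i + 1] - column[i]) for i in range(len(column) - 1)]
    return max(range(len(grads)), key=grads.__getitem__)

# Synthetic grayscale column: bright wall pixels above dark water pixels
col = [200, 198, 201, 199, 60, 55, 58, 57]
print(water_line_row(col))  # → 3 (interface between rows 3 and 4)
```

A production system would aggregate such estimates over many columns and frames, and reject outliers caused by foam, reflections, or debris.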
NASA Astrophysics Data System (ADS)
Muhammad, Umar B.; Ezugwu, Absalom E.; Ofem, Paulinus O.; Rajamäki, Jyri; Aderemi, Adewumi O.
2017-06-01
Recently, researchers in the field of wireless sensor networks have resorted to energy-harvesting techniques that allow energy to be harvested from the ambient environment to power sensor nodes. Using such energy-harvesting techniques together with proper routing protocols, an energy-neutral state can be achieved so that sensor nodes can run perpetually. In this paper, we propose an Energy Neutral LEACH routing protocol, an extension of the traditional LEACH protocol. The goal of the proposed protocol is to use a gateway node in each cluster so as to reduce the data transmission ranges of cluster-head nodes. Simulation results show that the proposed routing protocol achieves a higher throughput and ensures the energy-neutral status of the entire network.
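For context, the cluster-head election rule of the traditional LEACH protocol that the proposal extends uses a per-round probability threshold; a minimal sketch of that standard rule (the gateway-node extension itself is not reproduced here, and eligibility bookkeeping across the epoch is omitted):

```python
import random

def leach_threshold(p, r):
    """Classic LEACH cluster-head election threshold T(n) for round r,
    where p is the desired fraction of cluster heads per round."""
    period = int(round(1 / p))          # rounds per election epoch
    return p / (1 - p * (r % period))

def elect_cluster_heads(node_ids, p, r, seed=1):
    """A node becomes a cluster head when its uniform draw falls
    below the threshold."""
    rng = random.Random(seed)
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

print(leach_threshold(0.1, 0))  # → 0.1 (round 0: threshold equals p)
heads = elect_cluster_heads(range(20), p=0.1, r=0)
```

The threshold grows toward 1 late in each epoch, so every node eventually serves as a cluster head, spreading the energy cost of long-range transmission.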
[Screening for atherosclerosis to prevent cardiovascular risk : a pro-contra debate].
Nanchen, David; Genest, Jacques
2018-02-28
Detecting atherosclerosis using imaging techniques is the subject of intense debate in the scientific community. Among the arguments in favor of screening, better identification or stratification of cardiovascular risk is mentioned, compared to cardiovascular risk scores based solely on traditional risk factors such as blood pressure or cholesterol levels. Imaging techniques are also used to monitor the progression of atherosclerosis among patients using lipid-lowering or antihypertensive drugs in primary prevention. However, several experts in recent years have challenged the clinical utility of these imaging techniques in asymptomatic adults. This article proposes a "for or against" debate describing the main arguments for and against the use of imaging to screen for atherosclerosis.
Exaggerated heart rate oscillations during two meditation techniques.
Peng, C K; Mietus, J E; Liu, Y; Khalsa, G; Douglas, P S; Benson, H; Goldberger, A L
1999-07-31
We report extremely prominent heart rate oscillations associated with slow breathing during specific traditional forms of Chinese Chi and Kundalini Yoga meditation techniques in healthy young adults. We applied both spectral analysis and a novel analytic technique based on the Hilbert transform to quantify these heart rate dynamics. The amplitude of these oscillations during meditation was significantly greater than in the pre-meditation control state and also in three non-meditation control groups: i) elite athletes during sleep, ii) healthy young adults during metronomic breathing, and iii) healthy young adults during spontaneous nocturnal breathing. This finding, along with the marked variability of the beat-to-beat heart rate dynamics during such profound meditative states, challenges the notion of meditation as only an autonomically quiescent state.
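The Hilbert-transform step used to quantify oscillation amplitude can be sketched via the discrete analytic signal, whose magnitude gives the instantaneous envelope. This uses a direct O(n²) DFT for clarity and is only an illustration of the technique, not the study's actual processing:

```python
import cmath, math

def analytic_signal(x):
    """Discrete analytic signal via the DFT: keep DC and Nyquist,
    double positive frequencies, zero negative ones."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n // 2:
            X[k] *= 2
        elif k > n // 2:
            X[k] = 0
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

# The envelope of a pure oscillation recovers its constant amplitude.
n = 64
x = [1.5 * math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
env = [abs(z) for z in analytic_signal(x)]
print(round(sum(env) / n, 3))  # → 1.5
```

Applied to a heart rate time series, the envelope tracks how the amplitude of the breathing-driven oscillation changes over time, which is what makes the meditation epochs stand out.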
NASA Astrophysics Data System (ADS)
Kabir, Salman; Smith, Craig; Armstrong, Frank; Barnard, Gerrit; Schneider, Alex; Guidash, Michael; Vogelsang, Thomas; Endsley, Jay
2018-03-01
Differential binary pixel technology is a threshold-based timing, readout, and image reconstruction method that utilizes the subframe partial charge transfer technique in a standard four-transistor (4T) pixel CMOS image sensor to achieve a high dynamic range video with stop motion. This technology improves low light signal-to-noise ratio (SNR) by up to 21 dB. The method is verified in silicon using a Taiwan Semiconductor Manufacturing Company's 65 nm 1.1 μm pixel technology 1 megapixel test chip array and is compared with a traditional 4 × oversampling technique using full charge transfer to show low light SNR superiority of the presented technology.
Application of See One, Do One, Teach One Concept in Surgical Training
Kotsis, Sandra V.; Chung, Kevin C.
2016-01-01
Background The traditional method of teaching in Surgery is known as “See One, Do One, Teach One.” However, many have argued that this method is no longer applicable mainly because of concerns for patient safety. The purpose of this paper is to show that the basis of the traditional teaching method is still valid in surgical training if it is combined with various adult learning principles. Methods We reviewed literature regarding the history of the formation of the surgical residency program, adult learning principles, mentoring, and medical simulation. We provide examples for how these learning techniques can be incorporated into a surgical resident training program. Results The surgical residency program created by Dr. William Halsted remained virtually unchanged until recently with reductions in resident work hours and changes to a competency-based training system. Such changes have reduced the teaching time between attending physicians and residents. Learning principles such as “Experience, Observation, Thinking and Action” as well as deliberate practice can be used to train residents. Mentoring is also an important aspect in teaching surgical technique. We review the different types of simulators: standardized patients, virtual reality applications, and high-fidelity mannequin simulators and the advantages and disadvantages of using them. Conclusions The traditional teaching method of “see one, do one, teach one” in surgical residency programs is simple but still applicable. It needs to evolve with current changes in the medical system to adequately train surgical residents and also provide patients with safe, evidence-based care. PMID:23629100
Hemi-transseptal Approach for Pituitary Surgery: A Follow-Up Study
Fnais, Naif; Maio, Salvatore Di; Edionwe, Susan; Zeitouni, Anthony; Sirhan, Denis; Valdes, Constanza J.; Tewfik, Marc A.
2016-01-01
Objectives The hemi-transseptal (Hemi-T) approach was developed to overcome the potential drawbacks of the nasoseptal flap (NSF) in endoscopic endonasal transsphenoidal skull base surgery. In this study, we describe further refinements on the Hemi-T approach, and report long-term outcomes as compared with traditional methods of skull base reconstruction. Design A retrospective case-control study. Setting Montreal Neurological Institute and Jewish General Hospital, Montreal, Canada. Participants Patients who underwent endoscopic endonasal transsphenoidal approach to skull base pathology. Main Outcome Measures Operative time, CSF rhinorrhea, and postoperative nasal morbidity. Results A total of 105 patients underwent the Hemi-T approach versus 40 controls. Operative time was shorter using the Hemi-T technique (180.51 ± 56.9 vs. 202.9 ± 62 minutes; p = 0.048). The rates of nasal morbidity (septal perforation [5/102 vs. 6/37; p = 0.029] and mucosal adhesion [11/102 vs. 10/39 p = 0.027]), fascia lata harvest (21/100 vs. 18/39; p = 0.0028), and postoperative CSF leak rates (7/100 vs. 9/38; p = 0.006) were lower in the Hemi-T group. Conclusion Advantages of the Hemi-T approach over traditional exposure techniques include preservation of the nasal vascular pedicle, shorter operative time, reduced fascia lata harvest rates, and decreased nasal morbidity. PMID:28321378
Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi
2018-01-01
Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two kinds of independent techniques, i.e., ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs, respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed and fluorophore-labeled DNAzyme, we develop an integrated and facile method in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to the advantage of being two-in-one, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in as few as 100 cells. Moreover, the method is nondestructive: cells retain their physiological activity after analysis and can be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive method over some traditional techniques for the analysis of TMPs, with potential application as a toolbox in the future.
Zolfaghari, Z; Rezaee, N; Shakiba, M; Navidian, A
2018-07-01
Cervical cancer, a major health issue affecting women, is preventable and can be successfully treated. It is essential that measures are taken to improve the uptake of screening for this cancer. The aim of this study was to compare the effects of motivational interviewing (MI)-based training and traditional training on the frequency of cervical cancer screening tests in a group of working female teachers. This is a quasi-experimental study. The research was conducted in 2017 among 134 teachers (aged 30-60 years) working in southeastern Iran. The participants were selected from among the eligible individuals and subsequently divided into MI-based training and traditional training groups (n = 67 for each group). Each group received a three-session training program, and 20 weeks after the end of the last training session, the information obtained from cervical cancer screening tests was documented. To analyze the data, the independent t-test and Chi-squared test were run in SPSS, version 21. There was no significant difference between the two groups in terms of demographic characteristics such as age, age at first pregnancy, age at marriage, number of parities, and educational level. Twenty weeks after the intervention, 20.9% of the MI-based training group underwent the Pap smear screening test, while 9% of the women in the traditional training group took the test, indicating a statistically significant difference between the two groups (P < 0.05). MI-based training has a significant positive effect on women's compliance with cervical cancer screening tests. Therefore, it is recommended that this technique be adopted in women's health centers. IRCT2017100729954N4. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Protecting against cyber threats in networked information systems
NASA Astrophysics Data System (ADS)
Ertoz, Levent; Lazarevic, Aleksandar; Eilertson, Eric; Tan, Pang-Ning; Dokas, Paul; Kumar, Vipin; Srivastava, Jaideep
2003-07-01
This paper provides an overview of our efforts in detecting cyber attacks in networked information systems. Traditional signature based techniques for detecting cyber attacks can only detect previously known intrusions and are useless against novel attacks and emerging threats. Our current research at the University of Minnesota is focused on developing data mining techniques to automatically detect attacks against computer networks and systems. This research is being conducted as a part of MINDS (Minnesota Intrusion Detection System) project at the University of Minnesota. Experimental results on live network traffic at the University of Minnesota show that the new techniques show great promise in detecting novel intrusions. In particular, during the past few months our techniques have been successful in automatically identifying several novel intrusions that could not be detected using state-of-the-art tools such as SNORT.
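A minimal flavor of non-signature-based detection is flagging statistical outliers in traffic counts. This z-score sketch is far simpler than the MINDS data mining techniques and is only illustrative; the traffic numbers are made up:

```python
def zscore_anomalies(counts, threshold=2.5):
    """Flag values whose z-score exceeds a threshold — a toy
    anomaly detector in the spirit of non-signature-based
    intrusion detection (not the MINDS algorithms themselves)."""
    n = len(counts)
    mean = sum(counts) / n
    std = (sum((c - mean) ** 2 for c in counts) / n) ** 0.5
    return [i for i, c in enumerate(counts)
            if std and abs(c - mean) / std > threshold]

# Connections per minute from one host; the spike is flagged.
traffic = [12, 15, 11, 14, 13, 12, 300, 13, 14, 12]
print(zscore_anomalies(traffic))  # → [6]
```

Unlike a signature match, nothing here encodes what the attack looks like, which is why anomaly-based approaches can surface previously unseen intrusions (at the cost of false positives).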
Niamul Islam, Naz; Hannan, M A; Mohamed, Azah; Shareef, Hussain
2016-01-01
Power system oscillation is a serious threat to the stability of multimachine power systems. The coordinated control of power system stabilizers (PSS) and thyristor-controlled series compensation (TCSC) damping controllers is a commonly used technique to provide the required damping over different modes of growing oscillations. However, their coordinated design is a complex multimodal optimization problem that is very hard to solve using traditional tuning techniques. In addition, several limitations of traditionally used techniques prevent the optimum design of coordinated controllers. In this paper, an alternate technique for robust damping of oscillations is presented using the backtracking search algorithm (BSA). A 5-area, 16-machine benchmark power system is considered to evaluate the design efficiency. The complete design process is conducted on a linear time-invariant (LTI) model of a power system. It includes formulating the design as a multi-objective function from the system eigenvalues. Later on, nonlinear time-domain simulations are used to compare the damping performance for different local and inter-area modes of power system oscillations. The performance of the BSA technique is compared against that of the popular particle swarm optimization (PSO) for coordinated design efficiency. Damping performance using the different design techniques is compared in terms of settling time and overshoot of oscillations. The results obtained verify that the BSA-based design improves the system stability significantly. The stability of the multimachine power system is improved by up to 74.47% and 79.93% for an inter-area mode and a local mode of oscillation, respectively. Thus, the proposed technique for coordinated design has great potential to improve power system stability and to maintain its secure operation.
Highly oriented carbon fiber–polymer composites via additive manufacturing
Tekinalp, Halil L.; Kunc, Vlastimil; Velez-Garcia, Gregorio M.; ...
2014-10-16
Additive manufacturing, diverging from traditional manufacturing techniques such as casting and machining, can handle complex shapes with great design flexibility without the typical waste. Although this technique has mainly been used for rapid prototyping, interest is growing in using it to directly manufacture actual parts of complex shape. For 3D-printing additive manufacturing to see widespread application, the technique and the feedstock materials require improvements to meet the mechanical requirements of load-bearing components. Thus, we investigated short-fiber (0.2 mm to 0.4 mm) reinforced acrylonitrile-butadiene-styrene composites as a feedstock for 3D printing in terms of their processibility, microstructure, and mechanical performance, and also provided a comparison with traditional compression-molded composites. The tensile strength and modulus of 3D-printed samples increased ~115% and ~700%, respectively. The 3D printer yielded samples with very high fiber orientation in the printing direction (up to 91.5%), whereas the compression molding process yielded samples with significantly less fiber orientation. Microstructure-property relationships revealed that although relatively high porosity is observed in the 3D-printed composites compared to those produced by the conventional compression molding technique, both exhibited comparable tensile strength and modulus. This phenomenon is explained based on the changes in fiber orientation, dispersion, and void formation.
NASA Astrophysics Data System (ADS)
Rauscher, Bernard J.; Arendt, Richard G.; Fixsen, D. J.; Greenhouse, Matthew A.; Lander, Matthew; Lindler, Don; Loose, Markus; Moseley, S. H.; Mott, D. Brent; Wen, Yiting; Wilson, Donna V.; Xenophontos, Christos
2017-10-01
Near-infrared array detectors, like the Teledyne H2RG arrays used in the James Webb Space Telescope (JWST) NIRSpec, often provide reference pixels and a reference output. These are used to remove correlated noise. Improved reference sampling and subtraction (IRS2) is a statistical technique for using this reference information optimally in a least-squares sense. Compared with the traditional H2RG readout, IRS2 uses a different clocking pattern to interleave many more reference pixels into the data than is otherwise possible. Compared with standard reference correction techniques, IRS2 subtracts the reference pixels and reference output using a statistically optimized set of frequency-dependent weights. The benefits include somewhat lower noise variance and much less obvious correlated noise. NIRSpec's IRS2 images are cosmetically clean, with less 1/f banding than in traditional data from the same system. This article describes the IRS2 clocking pattern and presents the equations needed to use IRS2 in systems other than NIRSpec. For NIRSpec, applying these equations is already an option in the calibration pipeline. As an aid to instrument builders, we provide our prototype IRS2 calibration software and sample JWST NIRSpec data. The same techniques are applicable to other detector systems, including those based on Teledyne's H4RG arrays. The H4RG's interleaved reference pixel readout mode is effectively one IRS2 pattern.
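The idea of subtracting a reference output with frequency-dependent weights can be sketched as below. This is a simplified per-frequency least-squares fit on synthetic data, not the actual IRS2 weighting (which is derived from a noise model of the detector system); the signal model and bin count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
common = rng.normal(size=n)                # correlated (common-mode) noise
sci = common + 0.05 * rng.normal(size=n)   # science output: common noise + its own noise
ref = common + 0.05 * rng.normal(size=n)   # reference output sees the same common noise

# Per-frequency least-squares weight w(f) = <S conj(R)> / <|R|^2>, smoothed in bins
S, R = np.fft.rfft(sci), np.fft.rfft(ref)
nbin = 64
edges = np.linspace(0, len(S), nbin + 1).astype(int)
w = np.zeros(len(S), dtype=complex)
for a, b in zip(edges[:-1], edges[1:]):
    w[a:b] = np.sum(S[a:b] * np.conj(R[a:b])) / np.sum(np.abs(R[a:b]) ** 2)

cleaned = np.fft.irfft(S - w * R, n=n)     # reference-subtracted science stream
```

Because the correlated component dominates, the weighted subtraction removes most of the variance that a plain (unweighted) subtraction would only partially cancel.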
Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum
2012-01-01
Biometric verification can be used efficiently for intrusion detection and intruder identification in video surveillance systems. Biometric techniques can be broadly divided into traditional and so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as facial features, the eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to acquire traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be used much more freely. Recently, many studies have been conducted on human identification using soft biometric data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing. PMID:22919273
2017-01-01
Traditional techniques of active thermography require an external source of energy for excitation, usually in the form of high-power lamps or ultrasonic devices. In this paper, the author presents an alternative approach based on the self-heating effect observable in polymer-based structures during cyclic loading. The presented approach is based, firstly, on determination of the bending resonance frequencies of a tested structure and, secondly, on excitation of the structure with a multi-harmonic signal constructed from harmonics at the determined resonance frequencies. Following this, the tested structure heats up at locations of stress concentration and mechanical energy dissipation due to its viscoelastic response. Applying a multi-harmonic signal ensures coverage of the structure by such heated regions. The concept is verified experimentally on artificially damaged composite specimens. The results demonstrate the presented approach and indicate its potential, especially when traditional methods of excitation with an external source for thermographic inspection cannot be applied. PMID:29283430
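Constructing a multi-harmonic excitation signal from measured resonance frequencies, as described in the abstract above, can be sketched as follows; the resonance frequencies, equal amplitudes, and zero phases are illustrative assumptions, not values from the paper.

```python
import numpy as np

def multiharmonic(resonances_hz, fs=10000.0, n=10000):
    """Sum of equal-amplitude sinusoids at the measured bending resonance
    frequencies (amplitudes and phases would be tuned in practice)."""
    t = np.arange(n) / fs
    x = sum(np.sin(2 * np.pi * f * t) for f in resonances_hz)
    return t, x / len(resonances_hz)       # normalize the summed amplitude

# Hypothetical bending resonance frequencies of a composite specimen, in Hz
t, x = multiharmonic([120.0, 315.0, 640.0])
```

Driving the structure with this signal excites all listed resonances simultaneously, so the self-heating regions associated with each mode appear in a single loading run.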
Song, Guanli; Wang, Yinghui; Zhang, Runshun; Liu, Baoyan; Zhou, Xuezhong; Zhou, Xiaji; Zhang, Hong; Guo, Yufeng; Xue, Yanxing; Xu, Lili
2014-09-01
The current modes of experience inheritance from famous specialists in traditional Chinese medicine (TCM) include master-disciple transmission, literature review, clinical-epidemiology-based clinical research observation, and analysis and data mining via computer and database technologies. Each mode has its advantages and disadvantages. However, a scientific and instructive mode of experience inheritance has not yet been developed. The advent of the big data era, together with the formation and accumulated practice of the real-world TCM clinical research paradigm, has provided new perspectives, techniques, and methods for inheriting the experience of famous TCM specialists. Through continuous exploration and practice, the research group proposes an innovative research mode based on the real-world TCM clinical research paradigm, which inherits from and innovates upon the existing modes. This mode is formulated in line with the inherent developmental patterns of TCM and is expected to become the main mode of experience inheritance in the clinical field.
Web information retrieval based on ontology
NASA Astrophysics Data System (ADS)
Zhang, Jian
2013-03-01
The purpose of Information Retrieval (IR) is to find a set of documents that are relevant to a specific information need of a user. The traditional information retrieval model commonly used in commercial search engines is based on keyword indexing and Boolean logic queries. One big drawback of traditional information retrieval is that it typically retrieves information without an explicitly defined domain of interest to the user, so that a large amount of irrelevant information is returned, burdening the user with picking useful answers out of irrelevant results. To tackle this issue, many semantic web information retrieval models have been proposed recently. The main advantage of the Semantic Web is to enhance search mechanisms through the use of ontology mechanisms. In this paper, we present our approach to personalizing a web search engine based on ontology, and key techniques are also discussed. Compared to previous research, our work concentrates on semantic similarity and on the whole process, including query submission and information annotation.
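Ontology-based semantic similarity of the kind discussed above can be sketched with a toy taxonomy and a Wu-Palmer-style measure; the terms, the parent map, and the measure itself are illustrative assumptions, not the paper's ontology or formula.

```python
# Toy ontology as a child -> parent map, rooted at "thing"
parents = {
    "laptop": "computer", "desktop": "computer",
    "computer": "device", "phone": "device", "device": "thing",
}

def ancestors(term):
    """Path from a term up to the root, term included."""
    path = [term]
    while term in parents:
        term = parents[term]
        path.append(term)
    return path

def similarity(a, b):
    """Wu-Palmer-style similarity: 2*depth(lcs) / (depth(a) + depth(b)),
    where lcs is the lowest common subsumer and the root has depth 1."""
    pa, pb = ancestors(a), ancestors(b)
    lcs = next(x for x in pa if x in pb)
    depth = lambda t: len(ancestors(t))
    return 2.0 * depth(lcs) / (depth(a) + depth(b))
```

Terms that share a more specific common ancestor ("laptop" and "desktop" under "computer") score higher than terms related only through a general one ("laptop" and "phone" under "device"), which is the behavior a semantic search engine exploits to rank results within the user's domain of interest.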
Katunin, Andrzej
2017-12-28
Traditional techniques of active thermography require an external source of energy used for excitation, usually in the form of high power lamps or ultrasonic devices. In this paper, the author presents an alternative approach based on the self-heating effect observable in polymer-based structures during cyclic loading. The presented approach is based on, firstly, determination of bending resonance frequencies of a tested structure, and then, on excitation of a structure with a multi-harmonic signal constructed from the harmonics with frequencies of determined resonances. Following this, heating-up of a tested structure occurs in the location of stress concentration and mechanical energy dissipation due to the viscoelastic response of a structure. By applying multi-harmonic signal, one ensures coverage of the structure by such heated regions. The concept is verified experimentally on artificially damaged composite specimens. The results demonstrate the presented approach and indicate its potential, especially when traditional methods of excitation with an external structure for thermographic inspection cannot be applied.
Neural Net Gains Estimation Based on an Equivalent Model
Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Fernández Muñoz, José Luis; Esmeralda Vigueras Velázquez, Midory
2016-01-01
A model of an Equivalent Artificial Neural Net (EANN) describes the set of gains, viewed as parameters in a layer, and this consideration is a reproducible process applicable to a neuron in a neural net (NN). The EANN helps to estimate the NN gains or parameters, so we propose two methods to determine them. The first considers a fuzzy inference combined with the traditional Kalman filter, obtaining the equivalent model and estimating, in a fuzzy sense, the gains matrix A and the proper gain K of the traditional filter identification. The second develops a direct estimation in state space, describing an EANN using the expected value and a recursive description of the gains estimation. Finally, a comparison of both descriptions is performed, highlighting that the analytical method describes the neural net coefficients in a direct form, whereas the other technique requires selecting from the Knowledge Base (KB) the factors based on the functional error and the reference signal built from the past information of the system. PMID:27366146
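The Kalman-filter side of the gain estimation described above can be sketched in its simplest form: a scalar filter tracking a single constant gain from noisy observations. This is a stand-in illustration, not the paper's fuzzy-Kalman combination; the noise variances and signal model are assumptions.

```python
import numpy as np

def estimate_gain(measurements, q=1e-5, r=0.1):
    """Scalar Kalman filter for y_k = theta + v_k, estimating the
    constant unknown gain theta (process noise q, measurement noise r)."""
    theta, p = 0.0, 1.0              # initial estimate and its variance
    for y in measurements:
        p += q                       # predict: variance grows by process noise
        k = p / (p + r)              # Kalman gain
        theta += k * (y - theta)     # update estimate toward the measurement
        p *= (1.0 - k)               # update variance
    return theta

rng = np.random.default_rng(1)
ys = 0.8 + np.sqrt(0.1) * rng.normal(size=500)   # noisy observations of theta = 0.8
theta_hat = estimate_gain(ys)
```

The recursive state-space method in the abstract generalizes this idea to a matrix of layer gains updated from the expected value of past information.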
A model-based 3D template matching technique for pose acquisition of an uncooperative space object.
Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele
2015-03-16
This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced aimed at further accelerating the pose acquisition time and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios, relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable to initialize the tracking algorithm is demonstrated, as well as their robustness against highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the presented techniques is proposed.
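The core of template matching against a point cloud can be sketched in a toy one-degree-of-freedom form: rotate the stored model through candidate angles and keep the pose minimizing a point-to-template distance. The model points, noise level, and single rotation axis are illustrative assumptions; the paper's technique handles the full pose and builds its template database on-line.

```python
import numpy as np

def rotate_z(pts, a):
    c, s = np.cos(a), np.sin(a)
    return pts @ np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]).T

def mean_nn_distance(cloud, template):
    # mean distance from each sensed point to its nearest template point
    d = np.linalg.norm(cloud[:, None, :] - template[None, :, :], axis=2)
    return d.min(axis=1).mean()

def acquire_pose(cloud, model, angles):
    """Pick the candidate rotation whose template best matches the cloud."""
    return min(angles, key=lambda a: mean_nn_distance(cloud, rotate_z(model, a)))

rng = np.random.default_rng(2)
model = rng.uniform(-1.0, 1.0, size=(60, 3))    # hypothetical target model points
cloud = rotate_z(model, np.deg2rad(40.0)) + 0.01 * rng.normal(size=model.shape)
est = acquire_pose(cloud, model, np.deg2rad(np.arange(0.0, 360.0, 10.0)))
```

The estimate from this coarse search is what would seed the finer pose tracking algorithm.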
Assessing FRET using Spectral Techniques
Leavesley, Silas J.; Britain, Andrea L.; Cichon, Lauren K.; Nikolaev, Viacheslav O.; Rich, Thomas C.
2015-01-01
Förster resonance energy transfer (FRET) techniques have proven invaluable for probing the complex nature of protein–protein interactions, protein folding, and intracellular signaling events. These techniques have traditionally been implemented with the use of one or more fluorescence band-pass filters, either as fluorescence microscopy filter cubes, or as dichroic mirrors and band-pass filters in flow cytometry. In addition, new approaches for measuring FRET, such as fluorescence lifetime and acceptor photobleaching, have been developed. Hyperspectral techniques for imaging and flow cytometry have also been shown to be promising for performing FRET measurements. In this study, we have compared traditional (filter-based) FRET approaches to three spectral-based approaches: the ratio of acceptor-to-donor peak emission, linear spectral unmixing, and linear spectral unmixing with a correction for direct acceptor excitation. All methods are estimates of FRET efficiency, except for one-filter set and three-filter set FRET indices, which are included for consistency with prior literature. In the first part of this study, spectrofluorimetric data were collected from a CFP–Epac–YFP FRET probe that has been used for intracellular cAMP measurements. All comparisons were performed using the same spectrofluorimetric datasets as input data, to provide a relevant comparison. Linear spectral unmixing resulted in measurements with the lowest coefficient of variation (0.10) as well as accurate fits using the Hill equation. FRET efficiency methods produced coefficients of variation of less than 0.20, while FRET indices produced coefficients of variation greater than 8.00. These results demonstrate that spectral FRET measurements provide improved response over standard, filter-based measurements. Using spectral approaches, single-cell measurements were conducted through hyperspectral confocal microscopy, linear unmixing, and cell segmentation with quantitative image analysis.
Results from these studies confirmed that spectral imaging is effective for measuring subcellular, time-dependent FRET dynamics and that additional fluorescent signals can be readily separated from FRET signals, enabling multilabel studies of molecular interactions. PMID:23929684
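The linear spectral unmixing step described in the abstract above can be sketched as a least-squares fit of a measured spectrum to known donor and acceptor emission spectra. The spectra and abundances below are synthetic placeholders, not measured CFP/YFP data, and the simple acceptor fraction shown is an apparent ratio rather than a calibrated FRET efficiency.

```python
import numpy as np

# Hypothetical normalized emission spectra over 8 wavelength bands
donor    = np.array([0.05, 0.30, 0.45, 0.15, 0.04, 0.01, 0.00, 0.00])  # CFP-like
acceptor = np.array([0.00, 0.02, 0.10, 0.30, 0.35, 0.15, 0.06, 0.02])  # YFP-like
measured = 0.6 * donor + 0.4 * acceptor   # synthetic mixed measurement

# Linear unmixing: least-squares abundances of each fluorophore
A = np.column_stack([donor, acceptor])
coef, *_ = np.linalg.lstsq(A, measured, rcond=None)
a_hat, b_hat = coef
fret_ratio = b_hat / (a_hat + b_hat)      # apparent acceptor fraction
```

Additional fluorophores are handled by simply appending their reference spectra as columns of A, which is how spectral unmixing separates extra labels from the FRET signals in multilabel studies.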
Assessing FRET using spectral techniques.
Leavesley, Silas J; Britain, Andrea L; Cichon, Lauren K; Nikolaev, Viacheslav O; Rich, Thomas C
2013-10-01
Förster resonance energy transfer (FRET) techniques have proven invaluable for probing the complex nature of protein-protein interactions, protein folding, and intracellular signaling events. These techniques have traditionally been implemented with the use of one or more fluorescence band-pass filters, either as fluorescence microscopy filter cubes, or as dichroic mirrors and band-pass filters in flow cytometry. In addition, new approaches for measuring FRET, such as fluorescence lifetime and acceptor photobleaching, have been developed. Hyperspectral techniques for imaging and flow cytometry have also been shown to be promising for performing FRET measurements. In this study, we have compared traditional (filter-based) FRET approaches to three spectral-based approaches: the ratio of acceptor-to-donor peak emission, linear spectral unmixing, and linear spectral unmixing with a correction for direct acceptor excitation. All methods are estimates of FRET efficiency, except for one-filter set and three-filter set FRET indices, which are included for consistency with prior literature. In the first part of this study, spectrofluorimetric data were collected from a CFP-Epac-YFP FRET probe that has been used for intracellular cAMP measurements. All comparisons were performed using the same spectrofluorimetric datasets as input data, to provide a relevant comparison. Linear spectral unmixing resulted in measurements with the lowest coefficient of variation (0.10) as well as accurate fits using the Hill equation. FRET efficiency methods produced coefficients of variation of less than 0.20, while FRET indices produced coefficients of variation greater than 8.00. These results demonstrate that spectral FRET measurements provide improved response over standard, filter-based measurements. Using spectral approaches, single-cell measurements were conducted through hyperspectral confocal microscopy, linear unmixing, and cell segmentation with quantitative image analysis.
Results from these studies confirmed that spectral imaging is effective for measuring subcellular, time-dependent FRET dynamics and that additional fluorescent signals can be readily separated from FRET signals, enabling multilabel studies of molecular interactions. © 2013 International Society for Advancement of Cytometry.
Dipnall, Joanna F; Pasco, Julie A; Berk, Michael; Williams, Lana J; Dodd, Seetal; Jacka, Felice N; Meyer, Denny
2016-01-01
Depression is commonly comorbid with many other somatic diseases and symptoms. Identification of individuals in clusters with comorbid symptoms may reveal new pathophysiological mechanisms and treatment targets. The aim of this research was to combine machine-learning (ML) algorithms with traditional regression techniques by utilising self-reported medical symptoms to identify and describe clusters of individuals with increased rates of depression from a large cross-sectional community-based population epidemiological study. A multi-staged methodology utilising ML and traditional statistical techniques was performed using the community-based population National Health and Nutrition Examination Study (2009-2010) (N = 3,922). A Self-Organizing Map (SOM) ML algorithm, combined with hierarchical clustering, was performed to create participant clusters based on 68 medical symptoms. Binary logistic regression, controlling for sociodemographic confounders, was then used to identify the key clusters of participants with higher levels of depression (PHQ-9≥10, n = 377). Finally, a Multiple Additive Regression Tree boosted ML algorithm was run to identify the important medical symptoms for each key cluster within 17 broad categories: heart, liver, thyroid, respiratory, diabetes, arthritis, fractures and osteoporosis, skeletal pain, blood pressure, blood transfusion, cholesterol, vision, hearing, psoriasis, weight, bowels and urinary. Five clusters of participants, based on medical symptoms, were identified to have significantly increased rates of depression compared to the cluster with the lowest rate: odds ratios ranged from 2.24 (95% CI 1.56, 3.24) to 6.33 (95% CI 1.67, 24.02). The ML boosted regression algorithm identified three key medical condition categories as being significantly more common in these clusters: bowel, pain and urinary symptoms. Bowel-related symptoms were found to dominate the relative importance of symptoms within the five key clusters.
This methodology shows promise for the identification of conditions in general populations and supports the current focus on the potential importance of bowel symptoms and the gut in mental health research.
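The cluster-versus-reference odds ratios with 95% confidence intervals reported above can be computed as follows; the counts are hypothetical and the Wald interval shown is the standard textbook form, not necessarily the exact adjustment the study used.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = cases in cluster, b = non-cases in cluster,
    c = cases in reference cluster, d = non-cases in reference cluster."""
    orat = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(orat) - z * se)
    hi = math.exp(math.log(orat) + z * se)
    return orat, lo, hi

# Hypothetical counts: 40/200 depressed in a cluster vs 20/200 in the reference
orat, lo, hi = odds_ratio_ci(40, 160, 20, 180)
```

In the study itself, the logistic regression produces such ratios after adjusting for sociodemographic confounders; this sketch shows only the unadjusted arithmetic.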
Ultrasound image edge detection based on a novel multiplicative gradient and Canny operator.
Zheng, Yinfei; Zhou, Yali; Zhou, Hao; Gong, Xiaohong
2015-07-01
To achieve fast and accurate segmentation of ultrasound images, a novel edge detection method for speckle-noised ultrasound images was proposed, based on the traditional Canny operator and a novel multiplicative gradient operator. The proposed technique combines a new multiplicative gradient operator of non-Newtonian type with the traditional Canny operator to generate the initial edge map, which is subsequently optimized by an edge tracing step. To verify the proposed method, we compared it with several other edge detection methods with good robustness to noise, in experiments on simulated and in vivo medical ultrasound images. Experimental results showed that the proposed algorithm is fast enough for real-time processing, with an edge detection accuracy of 75% or more. Thus, the proposed method is very suitable for fast and accurate edge detection of medical ultrasound images. © The Author(s) 2014.
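The appeal of a ratio-based gradient for speckle can be sketched as below. This is a generic illustration of a multiplicative gradient, not the paper's non-Newtonian operator: because speckle is approximately multiplicative (I·n), the noise factor largely cancels in intensity ratios, whereas it scales an additive difference.

```python
import numpy as np

def multiplicative_gradient(img, eps=1e-6):
    """Ratio-based gradient sketch: per axis, take max(I_next/I, I/I_next);
    flat regions give ratio 1, edges give a large ratio."""
    img = img.astype(float) + eps                  # avoid division by zero
    gx = np.maximum(img[:, 1:] / img[:, :-1], img[:, :-1] / img[:, 1:])
    gy = np.maximum(img[1:, :] / img[:-1, :], img[:-1, :] / img[1:, :])
    g = np.ones_like(img)
    g[:, :-1] = np.maximum(g[:, :-1], gx)
    g[:-1, :] = np.maximum(g[:-1, :], gy)
    return g - 1.0                                 # 0 in flat regions

# A vertical step edge: left half intensity 50, right half 200
img = np.full((8, 8), 50.0)
img[:, 4:] = 200.0
g = multiplicative_gradient(img)
```

In the paper's pipeline, such a gradient map would replace the Canny operator's additive gradient stage before non-maximum suppression and edge tracing.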
Xu, Xiaoli; Tang, LiLing
2017-01-01
The living environment of cancer cells is complicated and information-rich, so traditional 2D culture models in vitro cannot exactly mimic the microenvironment of cancer cells. Currently, bioengineered 3D scaffolds have been developed which can better simulate the microenvironment of tumors and fill the gap between 2D culture and clinical application. In this review, we discuss the scaffold materials and fabrication techniques used, the biological behaviors of cancer cells in 3D scaffolds, and scaffold-based drug screening. A major emphasis is placed on the description of scaffold-based epithelial-to-mesenchymal transition and drug screening in 3D culture. By overcoming the defects of traditional 2D culture, 3D scaffold culture can provide a simpler, safer and more reliable approach for cancer research. Copyright © Bentham Science Publishers.
Modified McCash Technique for Management of Dupuytren Contracture.
Lesiak, Alex C; Jarrett, Nicole J; Imbriglia, Joseph E
2017-05-01
Despite recent advancements in the nonsurgical treatment for Dupuytren contracture, a number of patients remain poor nonsurgical candidates or elect for surgical management. The traditional McCash technique releases contractures while leaving open palmar wounds. Although successful in alleviating contractures, these wounds are traditionally large, transverse incisions across the palm. A modification of this technique has been performed that permits the surgeon to utilize smaller wounds while eliminating debilitating contractures. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Expanded Endoscopic Endonasal Approaches to Skull Base Meningiomas
Prosser, J. Drew; Vender, John R.; Alleyne, Cargill H.; Solares, C. Arturo
2012-01-01
Anterior cranial base meningiomas have traditionally been addressed via frontal or frontolateral approaches. However, with the advances in endoscopic endonasal treatment of pituitary lesions, the transsphenoidal approach is being expanded to address lesions of the petrous ridge, anterior clinoid, clivus, sella, parasellar region, tuberculum, planum, olfactory groove, and crista galli regions. The expanded endoscopic endonasal approach (EEEA) has the advantage of limiting brain retraction and resultant brain edema, as well as minimizing manipulation of neural structures. Herein, we describe the techniques of transclival, transsphenoidal, transplanum, and transcribriform resections of anterior skull base meningiomas. Selected cases are presented. PMID:23730542
Application of optical coherence tomography based microangiography for cerebral imaging
NASA Astrophysics Data System (ADS)
Baran, Utku; Wang, Ruikang K.
2016-03-01
Requirements of in vivo rodent brain imaging are hard to satisfy with traditional technologies such as magnetic resonance imaging and two-photon microscopy. Optical coherence tomography (OCT) is an emerging tool that can image at high speeds and provide high-resolution volumetric images with a relatively large field of view for rodent brain imaging. Here, we provide an overview of recent developments in functional OCT-based imaging techniques for neuroscience applications in rodents. Moreover, a summary of OCT-based microangiography (OMAG) studies of stroke and traumatic brain injury in rodents is provided.
Heart Sound Biometric System Based on Marginal Spectrum Analysis
Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin
2013-01-01
This work presents a heart sound biometric system based on marginal spectrum analysis, a new feature extraction technique for identification purposes. The heart sound identification system comprises signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients yield a significant increase in the recognition rate, to 94.40%, compared with that of the traditional Fourier spectrum (84.32%), based on a database of 280 heart sounds from 40 participants. PMID:23429515
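A crude form of the marginal spectrum can be sketched by accumulating instantaneous amplitude over instantaneous frequency, computed from the analytic signal. This is a simplified illustration on a synthetic tone: the full Hilbert marginal spectrum is built from empirical mode decomposition components, which is omitted here, and the sampling rate and bin count are assumptions.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (the construction scipy.signal.hilbert uses)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def marginal_spectrum(x, fs, nbins=64):
    """Accumulate instantaneous amplitude into instantaneous-frequency bins."""
    z = analytic_signal(x)
    amp = np.abs(z)[:-1]
    inst_f = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
    hist, edges = np.histogram(inst_f, bins=nbins, range=(0, fs / 2), weights=amp)
    return hist, edges

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)          # synthetic stand-in for a heart sound band
hist, edges = marginal_spectrum(x, fs)
```

Coefficients derived from such a spectrum, rather than from the Fourier spectrum, are what the identification stage would use as features.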
NASA Astrophysics Data System (ADS)
Coughlan, Michael R.
2016-05-01
Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.
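The space-time clustering idea behind the analysis above can be sketched with a Knox-style permutation test: count event pairs close in both space and time, then permute the dates to build a null distribution. This is a simplified stand-in for the scan statistic used in the paper, and the coordinates, dates, and thresholds below are synthetic.

```python
import numpy as np

def knox_test(xy, days, ds=1.0, dt=7.0, nperm=999, seed=0):
    """Knox-style test: observed count of pairs within ds in space AND dt in
    time, with a permutation p-value from shuffling the event dates."""
    rng = np.random.default_rng(seed)
    sp = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    iu = np.triu_indices(len(xy), k=1)
    close_space = sp[iu] < ds

    def count(d):
        return int(np.sum(close_space & (np.abs(d[:, None] - d[None, :])[iu] < dt)))

    observed = count(days)
    null = [count(rng.permutation(days)) for _ in range(nperm)]
    p = (1 + sum(n >= observed for n in null)) / (nperm + 1)
    return observed, p

# Synthetic data: 20 fires at one spot within a week, 30 scattered in space and time
rng = np.random.default_rng(3)
xy = np.vstack([rng.normal(0, 0.1, size=(20, 2)),
                rng.uniform(-50, 50, size=(30, 2))])
days = np.concatenate([rng.uniform(0, 5, size=20), rng.uniform(0, 365, size=30)])
obs, p = knox_test(xy, days)
```

A significant result flags fire locations that recur in tight spatial and temporal windows, which is the signature the paper interprets as an informal management regime rather than arbitrary fire-setting.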
Cheng, Xu-Dong; Feng, Liang; Gu, Jun-Fei; Zhang, Ming-Hua; Jia, Xiao-Bin
2014-11-01
Chinese medicine prescriptions are the wisdom outcomes of traditional Chinese medicine (TCM) clinical treatment determinations which based on differentiation of symptoms and signs. Chinese medicine prescriptions are also the basis of secondary exploitation of TCM. The study on prescription helps to understand the material basis of its efficacy, pharmacological mechanism, which is an important guarantee for the modernization of traditional Chinese medicine. Currently, there is not yet dissertation n the method and technology system of basic research on the prescription of Chinese medicine. This paper focuses on how to build an effective system of prescription research technology. Based on "component structure" theory, a technology system contained four-step method that "prescription analysis, the material basis screening, the material basis of analysis and optimization and verify" was proposed. The technology system analyzes the material basis of the three levels such as Chinese medicine pieces, constituents and the compounds which could respect the overall efficacy of Chinese medicine. Ideas of prescription optimization, remodeling are introduced into the system. The technology system is the combination of the existing research and associates with new techniques and methods, which used for explore the research thought suitable for material basis research and prescription remodeling. The system provides a reference for the secondary development of traditional Chinese medicine, and industrial upgrading.
A Comparison of FPGA and GPGPU Designs for Bayesian Occupancy Filters
Medina, Luis; Diez-Ochoa, Miguel; Correal, Raul; Cuenca-Asensi, Sergio; Godoy, Jorge; Martínez-Álvarez, Antonio
2017-01-01
Grid-based perception techniques in the automotive sector, based on fusing information from different sensors to obtain robust perceptions of the environment, are proliferating in the industry. However, one of their main drawbacks is the prohibitively high computing performance they traditionally require from embedded automotive systems. In this work, the capabilities of new computing architectures that embed these algorithms are assessed in a real car. The paper compares two ad hoc optimized designs of the Bayesian Occupancy Filter: one for General Purpose Graphics Processing Unit (GPGPU) and the other for Field-Programmable Gate Array (FPGA). The resulting implementations are compared in terms of development effort, accuracy and performance, using datasets from a realistic simulator and from a real automated vehicle. PMID:29137137
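The grid-based fusion at the heart of the filter can be sketched as a log-odds occupancy update over a small 1D grid. This is a minimal sketch of Bayesian occupancy grids only; the full Bayesian Occupancy Filter additionally carries a velocity distribution per cell, and the evidence values below are assumptions.

```python
import numpy as np

def update_grid(logodds, hits, misses, l_hit=0.85, l_miss=-0.4):
    """Add log-odds evidence for cells a sensor reported occupied (hits)
    or observed free (misses); fusion of scans is just repeated addition."""
    logodds = logodds.copy()
    logodds[hits] += l_hit
    logodds[misses] += l_miss
    return logodds

def probability(logodds):
    return 1.0 - 1.0 / (1.0 + np.exp(logodds))   # logistic transform

grid = np.zeros(10)                              # prior p = 0.5 everywhere
for _ in range(3):                               # fuse three consistent scans
    grid = update_grid(grid, hits=[7], misses=[0, 1, 2, 3, 4, 5, 6])
p = probability(grid)
```

Because the per-cell update is an independent addition, this formulation maps naturally onto the massively parallel GPGPU and FPGA implementations the paper compares.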
Precision assessment of model-based RSA for a total knee prosthesis in a biplanar set-up.
Trozzi, C; Kaptein, B L; Garling, E H; Shelyakova, T; Russo, A; Bragonzoni, L; Martelli, S
2008-10-01
Model-based Roentgen Stereophotogrammetric Analysis (RSA) was recently developed for the measurement of prosthesis micromotion. Its main advantage is that markers do not need to be attached to the implants, as traditional marker-based RSA requires. Model-based RSA has only been tested in uniplanar radiographic set-ups. A biplanar set-up would theoretically facilitate the pose estimation algorithm, since the radiographic projections would show more distinct shape features of the implants than uniplanar images. We tested the precision of model-based RSA and compared it with that of the traditional marker-based method in a biplanar set-up. Micromotions of both tibial and femoral components were measured with both techniques from double examinations of patients participating in a clinical study. The results showed that in the biplanar set-up, model-based RSA presents a homogeneous distribution of precision across all translation directions but an inhomogeneous error for rotations: internal-external rotation in particular presented higher errors than rotations about the transverse and sagittal axes. Model-based RSA was less precise than the marker-based method, although the differences were not significant for the translations and rotations of the tibial component, with the exception of internal-external rotation. For both prosthesis components the precision of model-based RSA was below 0.2 mm for all translations and below 0.3 degrees for rotations about the transverse and sagittal axes. These values are still acceptable for clinical studies aimed at evaluating total knee prosthesis micromotion. In a biplanar set-up, model-based RSA is a valid alternative to traditional marker-based RSA, for which marking of the prosthesis is an enormous disadvantage.
Gotink, Rinske A; Meijboom, Rozanna; Vernooij, Meike W; Smits, Marion; Hunink, M G Myriam
2016-10-01
The objective of the current study was to systematically review the evidence of the effect of secular mindfulness techniques on function and structure of the brain. Based on areas known from traditional meditation neuroimaging results, we aimed to explore a neuronal explanation of the stress-reducing effects of the 8-week Mindfulness Based Stress Reduction (MBSR) and Mindfulness Based Cognitive Therapy (MBCT) program. We assessed the effect of MBSR and MBCT (N=11, all MBSR), components of the programs (N=15), and dispositional mindfulness (N=4) on brain function and/or structure as assessed by (functional) magnetic resonance imaging. 21 fMRI studies and seven MRI studies were included (two studies performed both). The prefrontal cortex, the cingulate cortex, the insula and the hippocampus showed increased activity, connectivity and volume in stressed, anxious and healthy participants. Additionally, the amygdala showed decreased functional activity, improved functional connectivity with the prefrontal cortex, and earlier deactivation after exposure to emotional stimuli. Demonstrable functional and structural changes in the prefrontal cortex, cingulate cortex, insula and hippocampus are similar to changes described in studies on traditional meditation practice. In addition, MBSR led to changes in the amygdala consistent with improved emotion regulation. These findings indicate that MBSR-induced emotional and behavioral changes are related to functional and structural changes in the brain. Copyright © 2016 Elsevier Inc. All rights reserved.
Virtual shelves in a digital library: a framework for access to networked information sources.
Patrick, T B; Springer, G K; Mitchell, J A; Sievert, M E
1995-01-01
Develop a framework for collections-based access to networked information sources that addresses the problem of location-dependent access to information sources. This framework uses the metaphor of a virtual shelf: a general-purpose server that is dedicated to a particular information subject class, with the server's identifier identifying its subject class. Location-independent call numbers, based on standard vocabulary codes, are assigned to information sources. The call numbers are first mapped to the location-independent identifiers of virtual shelves. When access to an information resource is required, a location directory provides a second mapping of these location-independent server identifiers to actual network locations. The framework has been implemented in two different systems, one based on the Open System Foundation/Distributed Computing Environment and the other based on the World Wide Web. This framework applies traditional methods of library classification and cataloging in new ways. It is compatible with the two traditional styles of information seeking, searching and browsing, and traditional methods may be combined with new paradigms of information searching that can take advantage of the special properties of digital information. Cooperation between the library and information science community and the informatics community can provide a means for the continuing application of the knowledge and techniques of library science to the new problems of networked information sources.
3D surface pressure measurement with single light-field camera and pressure-sensitive paint
NASA Astrophysics Data System (ADS)
Shi, Shengxian; Xu, Shengming; Zhao, Zhou; Niu, Xiaofu; Quinn, Mark Kenneth
2018-05-01
A novel technique that simultaneously measures three-dimensional model geometry, as well as surface pressure distribution, with a single camera is demonstrated in this study. The technique takes advantage of light-field photography, which can capture three-dimensional information with a single light-field camera, and combines it with the intensity-based pressure-sensitive paint method. The proposed single-camera light-field three-dimensional pressure measurement technique (LF-3DPSP) utilises a similar hardware setup to the traditional two-dimensional pressure measurement technique, with the exception that the wind-on, wind-off and model geometry images are captured via an in-house-constructed light-field camera. The proposed LF-3DPSP technique was validated with a Mach 5 flared cone model test. Results show that the technique is capable of measuring three-dimensional geometry with high accuracy for models with relatively large curvature, and the pressure results compare well with Schlieren tests, analytical calculations, and numerical simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norgaard, J.V.; Olsen, D.; Springer, N.
1995-12-31
A new technique for obtaining water-oil capillary pressure curves, based on NMR imaging of the saturation distribution in flooded cores, is presented. In this technique, a steady-state fluid saturation profile is developed by flooding the core at a constant flow rate. At the steady state, where the saturation distribution no longer changes, the local pressure difference between the wetting and non-wetting phases represents the capillary pressure. The saturation profile is measured using an NMR technique and, for a drainage case, the pressure in the non-wetting phase is calculated numerically. The paper presents the NMR technique and the procedure for calculating the pressure distribution in the sample. Inhomogeneous samples produce irregular saturation profiles, which may be interpreted in terms of variation in permeability, porosity, and capillary pressure. Capillary pressure curves for North Sea chalk obtained by the new technique show good agreement with capillary pressure curves obtained by traditional techniques.
Nanomaterial-Enabled Neural Stimulation
Wang, Yongchen; Guo, Liang
2016-01-01
Neural stimulation is a critical technique in treating neurological diseases and investigating brain functions. Traditional electrical stimulation uses electrodes to directly create intervening electric fields in the immediate vicinity of neural tissues. Second-generation stimulation techniques directly use light, magnetic fields or ultrasound in a non-contact manner. An emerging generation of non- or minimally invasive neural stimulation techniques is enabled by nanotechnology to achieve a high spatial resolution and cell-type specificity. In these techniques, a nanomaterial converts a remotely transmitted primary stimulus such as a light, magnetic or ultrasonic signal to a localized secondary stimulus such as an electric field or heat to stimulate neurons. The ease of surface modification and bio-conjugation of nanomaterials facilitates cell-type-specific targeting, designated placement and highly localized membrane activation. This review focuses on nanomaterial-enabled neural stimulation techniques primarily involving opto-electric, opto-thermal, magneto-electric, magneto-thermal and acousto-electric transduction mechanisms. Stimulation techniques based on other possible transduction schemes and general consideration for these emerging neurotechnologies are also discussed. PMID:27013938
How "Flipping" the Classroom Can Improve the Traditional Lecture
ERIC Educational Resources Information Center
Berrett, Dan
2012-01-01
In this article, the author discusses a teaching technique called "flipping" and describes how "flipping" the classroom can improve the traditional lecture. As its name suggests, flipping describes the inversion of expectations in the traditional college lecture. It takes many forms, including interactive engagement, just-in-time teaching (in…
Significance of clustering and classification applications in digital and physical libraries
NASA Astrophysics Data System (ADS)
Triantafyllou, Ioannis; Koulouris, Alexandros; Zervos, Spiros; Dendrinos, Markos; Giannakopoulos, Georgios
2015-02-01
Applications of clustering and classification techniques can prove very significant in both digital and physical (paper-based) libraries. The most essential application, document classification and clustering, is crucial for the content that is produced and maintained in digital libraries, repositories, databases, social media, blogs etc., based on various tags and ontology elements, transcending the traditional library-oriented classification schemes. Other applications with a very useful and beneficial role in the new digital library environment include document routing, summarization and query expansion. Paper-based libraries can benefit as well, since classification combined with advanced material characterization techniques such as FTIR (Fourier Transform InfraRed spectroscopy) can be vital for the study and prevention of material deterioration. An improved two-level self-organizing clustering architecture is proposed in order to enhance the discrimination capacity of the learning space prior to classification, yielding promising results when applied to the above-mentioned library tasks.
A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis
NASA Astrophysics Data System (ADS)
Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui
2015-07-01
Auscultation of heart sound (HS) signals has served for centuries as an important primary approach to diagnosing cardiovascular diseases (CVDs). Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet most existing HS feature extraction methods adopt acoustic or time-frequency features that exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences at the heart valves. Adopting the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five various abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
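The Shannon envelope the abstract mentions is commonly computed as a smoothed Shannon energy of the normalized signal. The following is an illustrative sketch only; the window length and the small epsilon are assumed values, not parameters taken from the paper.

```python
import numpy as np

def shannon_envelope(x, win=32):
    """Smoothed Shannon-energy envelope of a 1-D signal."""
    x = x / (np.max(np.abs(x)) + 1e-12)           # normalize to [-1, 1]
    energy = -x**2 * np.log(x**2 + 1e-12)         # Shannon energy per sample
    kernel = np.ones(win) / win
    return np.convolve(energy, kernel, mode="same")  # moving-average smoothing

# toy heart-sound-like signal: silence with a short oscillatory burst
t = np.linspace(0.0, 1.0, 1000)
burst = np.where((t > 0.4) & (t < 0.6), np.sin(2 * np.pi * 50 * t), 0.0)
env = shannon_envelope(burst)
```

The Shannon energy emphasizes medium-intensity components over both weak noise and strong peaks, which is why it is a popular envelope choice for murmur analysis.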
Monte Carlo-based Reconstruction in Water Cherenkov Detectors using Chroma
NASA Astrophysics Data System (ADS)
Seibert, Stanley; Latorre, Anthony
2012-03-01
We demonstrate the feasibility of event reconstruction---including position, direction, energy and particle identification---in water Cherenkov detectors with a purely Monte Carlo-based method. Using a fast optical Monte Carlo package we have written, called Chroma, in combination with several variance reduction techniques, we can estimate the value of a likelihood function for an arbitrary event hypothesis. The likelihood can then be maximized over the parameter space of interest using a form of gradient descent designed for stochastic functions. Although slower than more traditional reconstruction algorithms, this completely Monte Carlo-based technique is universal and can be applied to a detector of any size or shape, which is a major advantage during the design phase of an experiment. As a specific example, we focus on reconstruction results from a simulation of the 200 kiloton water Cherenkov far detector option for LBNE.
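The "gradient descent designed for stochastic functions" can be illustrated by averaging finite-difference slope estimates of a noisy likelihood before each step. The toy likelihood below is a stand-in for a Monte Carlo estimate (the true peak at 3.0, the noise level, and all step sizes are assumptions for the example, not values from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_loglike(theta):
    # stand-in for a Monte Carlo likelihood estimate: a smooth peak at 3.0
    # corrupted by simulation noise
    return -(theta - 3.0) ** 2 + 0.05 * rng.normal()

def maximize(loglike, theta0, step=0.1, h=0.5, n_avg=20, iters=200):
    """Gradient ascent on a stochastic objective: average several
    central-difference slope estimates to tame the noise, then step."""
    theta = theta0
    for _ in range(iters):
        g = np.mean([(loglike(theta + h) - loglike(theta - h)) / (2 * h)
                     for _ in range(n_avg)])
        theta += step * g
    return theta

theta_hat = maximize(noisy_loglike, theta0=0.0)
```

In practice each likelihood evaluation would itself be an optical Monte Carlo run, so the averaging trades extra simulations for a usable gradient signal.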
A Course in Information Techniques for Dental Students
Dannenberg, Dena
1972-01-01
A course plan is presented for introducing literature searching and critical skills to dental students. Topics include the “life cycle of information,” reference sources available, search procedure, abstracting and indexing, and personal information systems. Teaching is structured around planned seminars and student projects. The course design is compatible with traditional dental curricula and is based on students' interest in dentistry rather than in information/library science. PMID:5024320
Is it time for brushless scrubbing with an alcohol-based agent?
Gruendemann, B J; Bjerke, N B
2001-12-01
The practice of surgical scrubbing in perioperative settings is changing rapidly. This article presents information about eliminating the traditional scrub brush technique and using an alcohol formulation for surgical hand scrubs. Also covered are antimicrobial agents, relevant US Food and Drug Administration classifications, skin and fingernail care, and implementation of changes. The article challenges surgical team members to evaluate a new and different approach to surgical hand scrubbing.
A Multistage Approach for Image Registration.
Bowen, Francis; Hu, Jianghai; Du, Eliza Yingzi
2016-09-01
Successful image registration is an important step for object recognition, target detection, remote sensing, multimodal content fusion, scene blending, and disaster assessment and management. The geometric and photometric variations between images adversely affect the ability of an algorithm to estimate the transformation parameters that relate the two images. Local deformations, lighting conditions, object obstructions, and perspective differences all contribute to the challenges faced by traditional registration techniques. In this paper, a novel multistage registration approach is proposed that is resilient to viewpoint differences, image content variations, and lighting conditions. Robust registration is realized through a novel region descriptor which couples the spatial and texture characteristics of invariant feature points. The proposed region descriptor is exploited in a multistage process, which allows the graph-based descriptor to be used in many scenarios and the algorithm to be applied to a broader set of images. Each successive stage of the registration technique is evaluated through an effective similarity metric which determines the subsequent action. The registration of aerial and street-view images from pre- and post-disaster scenes provides strong evidence that the proposed method estimates more accurate global transformation parameters than traditional feature-based methods. Experimental results show the robustness and accuracy of the proposed multistage image registration methodology.
Josephson frequency meter for millimeter and submillimeter wavelengths
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anischenko, S.E.; Larkin, S.Y.; Chaikovsky, V.I.
1994-12-31
Frequency measurement of electromagnetic oscillations in the millimeter and submillimeter wavebands becomes increasingly difficult as frequency grows, for several reasons. First, these frequencies are considered cutoff frequencies for semiconductor converting devices, so optical measurement methods must be used instead of traditional frequency-transfer methods. Second, resonance measurement methods are limited to relatively narrow bands, while optical methods are limited in frequency and time resolution by the restricted range and velocity of movement of their mechanical elements; moreover, the efficiency of these optical techniques decreases with increasing wavelength due to diffraction losses. All of this requires a priori information on the radiation frequency band of the source involved. A method of measuring the frequency of harmonic microwave signals in the millimeter and submillimeter wavebands based on the ac Josephson effect in superconducting contacts is devoid of all the above drawbacks. This approach offers a number of major advantages over the more traditional measurement methods, that is, those based on frequency conversion, resonance and interferometric techniques. It can be characterized by high potential accuracy, a wide range of measured frequencies, prompt measurement, the opportunity to obtain a panoramic display of the results, and full automation of the measuring process.
NASA Astrophysics Data System (ADS)
Podesta, John J.
2017-12-01
Over the last decade it has become popular to analyze turbulent solar wind fluctuations with respect to a coordinate system aligned with the local mean magnetic field. This useful analysis technique has provided new information and new insights about the nature of solar wind fluctuations and provided some support for phenomenological theories of MHD turbulence based on the ideas of Goldreich and Sridhar. At the same time it has drawn criticism suggesting that the use of a scale-dependent local mean field is somehow inconsistent or irreconcilable with traditional analysis techniques based on second-order structure functions and power spectra that, for stationary time series, are defined with respect to the constant (scale-independent) ensemble average magnetic field. Here it is shown that for fluctuations with power law spectra, such as those observed in solar wind turbulence, it is possible to define the local mean magnetic field in a special way such that the total mean square amplitude (trace amplitude) of turbulent fluctuations is approximately the same, scale by scale, as that obtained using traditional second-order structure functions or power spectra. This fact should dispel criticism concerning the physical validity or practical usefulness of the local mean magnetic field in these applications.
NASA Astrophysics Data System (ADS)
Lara, J. L.; Cowen, E. A.; Sou, I. M.
2002-06-01
Boundary layer flows are ubiquitous in the environment, but their study is often complicated by their thinness, geometric irregularity and boundary porosity. In this paper, we present an approach to making laboratory-based particle image velocimetry (PIV) measurements in these complex flow environments. Clear polycarbonate spheres were used to model a porous and rough bed. The strong curvature of the spheres results in a diffuse volume illuminated region instead of the more traditional finite and thin light sheet illuminated region, resulting in the imaging of both in-focus and significantly out-of-focus particles. Results of a traditional cross-correlation-based PIV-type analysis of these images demonstrate that the mean and turbulent features of an oscillatory boundary layer driven by a free-surface wave over an irregular-shaped porous bed can be robustly measured. Measurements of the mean flow, turbulent intensities, viscous and turbulent stresses are presented and discussed. Velocity spectra have been calculated showing an inertial subrange confirming that the PIV analysis is sufficiently robust to extract turbulence. The presented technique is particularly well suited for the study of highly dynamic free-surface flows that prevent the delivery of the light sheet from above the bed, such as swash flows.
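The cross-correlation step at the heart of a PIV analysis can be sketched compactly: the displacement of one interrogation window between two frames is the location of the cross-correlation peak. This is an illustrative integer-pixel version, not the authors' processing chain (real PIV adds sub-pixel peak fitting and window overlap).

```python
import numpy as np

def piv_displacement(frame_a, frame_b):
    """Estimate the integer-pixel displacement of frame_b relative to
    frame_a via FFT-based circular cross-correlation of one window."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the window to negative displacements
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shift)

rng = np.random.default_rng(1)
a = rng.random((64, 64))                      # synthetic particle image
b = np.roll(a, shift=(3, -2), axis=(0, 1))    # particles moved by (3, -2)
```

Subtracting the window means before correlating suppresses the background bias, which matters when out-of-focus particles raise the overall image intensity as in the diffuse volume illumination described above.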
Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.
Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai
2018-03-01
With the growing population and the reducing arable land, breeding has been considered as an effective way to solve the food crisis. As an important part in breeding, high-throughput phenotyping can accelerate the breeding process effectively. Light detection and ranging (LiDAR) is an active remote sensing technology that is capable of acquiring three-dimensional (3D) data accurately, and has a great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrated LiDAR sensor, high-resolution camera, thermal camera and hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data in the whole crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle and other parameters for plant biology and genomics analysis. In this paper, we described the designs, functions and testing results of the Crop 3D platform, and briefly discussed the potential applications and future development of the platform in phenotyping. We concluded that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of crop high-throughput phenotyping.
Ficklin, Travis; Lund, Robin; Schipper, Megan
2014-01-01
The purpose of this study was to compare traditional and swing blocking techniques on center of mass (COM) projectile motion and effective blocking area in nine healthy Division I female volleyball players. Two high-definition (1080 p) video cameras (60 Hz) were used to collect two-dimensional variables from two separate views. One was placed perpendicular to the plane of the net and the other was directed along the top of the net, and were used to estimate COM locations and blocking area in a plane parallel to the net and hand penetration through the plane of the net respectively. Video of both the traditional and swing techniques were digitized and kinematic variables were calculated. Paired samples t-tests indicated that the swing technique resulted in greater (p < 0.05) vertical and horizontal takeoff velocities (vy and vx), jump height (H), duration of the block (tBLOCK), blocking coverage during the block (C) as well as hand penetration above and through the net’s plane (YPEN, ZPEN). The traditional technique had significantly greater approach time (tAPP). The results of this study suggest that the swing technique results in both greater jump height and effective blocking area. However, the shorter tAPP that occurs with swing is associated with longer times in the air during the block which may reduce the ability of the athlete to make adjustments to attacks designed to misdirect the defense. Key Points Swing blocking technique has greater jump height, effective blocking area, hand penetration, horizontal and vertical takeoff velocity, and has a shorter time of approach. Despite these advantages, there may be more potential for mistiming blocks and having erratic deflections of the ball after contact when using the swing technique. Coaches should take more than simple jump height and hand penetration into account when deciding which technique to employ. PMID:24570609
Zhou, Minwen; Wang, Wei; Huang, Wenbin; Zhang, Xiulan
2014-09-06
To evaluate the surgical outcome of Ahmed glaucoma valve (AGV) implantation with a new technique of mitomycin C (MMC) application. This is a retrospective study. All patients with refractory glaucoma underwent FP-7 AGV implantation. Two methods of MMC application were used. In the traditional technique, a 6 × 4 mm piece of cotton soaked with MMC (0.25-0.33 mg/ml) was placed in the implantation area for 2-5 minutes; in the new technique, the valve plate was first encompassed with a thin layer of cotton soaked with MMC and then inserted into the same area. A 200 ml balanced salt solution was applied for irrigation of the MMC. The surgical success rate, intraocular pressure (IOP), number of anti-glaucoma medications used, and postoperative complications were compared between the groups. The new technique group had only one case (2.6%) of encapsulated cyst formation out of 38 eyes, while there were eight cases (19.5%) out of 41 eyes in the traditional group; the difference was statistically significant (P = 0.030). According to the definition of success rate, success was 89.5% in the new technique group and 70.7% in the traditional group at the follow-up end point, a significant difference between the two groups (P = 0.035). Mean IOP in the new technique group was significantly lower than that in the traditional group at 3 and 6 months (P < 0.05). By using a thin layer of cotton soaked with MMC to encompass the valve plate, the new MMC application technique could greatly decrease the incidence of encapsulated cysts and increase the success rate following AGV implantation.
Perea Palazón, R J; Solé Arqués, M; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Ortiz Pérez, J T
2015-01-01
Cardiac magnetic resonance imaging is considered the reference technique for characterizing myocardial tissue; for example, T2-weighted sequences make it possible to evaluate areas of edema or myocardial inflammation. However, traditional sequences have many limitations and provide only qualitative information. Moreover, traditional sequences depend on the reference to remote myocardium or skeletal muscle, which limits their ability to detect and quantify diffuse myocardial damage. Recently developed magnetic resonance myocardial mapping techniques enable quantitative assessment of parameters indicative of edema. These techniques have proven better than traditional sequences both in acute cardiomyopathy and in acute ischemic heart disease. This article synthesizes current developments in T2 mapping as well as their clinical applications and limitations. Copyright © 2014 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Higgs, Gary
2006-04-01
Despite recent U.K. Government commitments to encourage public participation in environmental decision making, the exercises conducted to date have been largely confined to 'traditional' modes of participation, such as disseminating information and encouraging feedback on proposals through, for example, questionnaires or surveys. It is the premise of this paper that participative approaches using IT-based methods, built on combined geographical information systems (GIS) and multi-criteria evaluation techniques that could involve the public in the decision-making process, have the potential to build consensus and reduce disputes and conflicts such as those arising from the siting of different types of waste facilities. The potential of these techniques is documented through a review of the existing literature in order to highlight the opportunities and challenges facing decision makers in increasing the involvement of the public at different stages of the waste facility management process. It is concluded that there are important lessons to be learned by researchers, consultants, managers and decision makers if the barriers hindering wider use of such techniques are to be overcome.
Recombinant antibodies and their use in biosensors.
Zeng, Xiangqun; Shen, Zhihong; Mernaugh, Ray
2012-04-01
Inexpensive, noninvasive immunoassays can be used to quickly detect disease in humans. Immunoassay sensitivity and specificity are decidedly dependent upon high-affinity, antigen-specific antibodies. Antibodies are produced biologically. As such, antibody quality and suitability for use in immunoassays cannot be readily determined or controlled by human intervention. However, the process through which high-quality antibodies can be obtained has been shortened and streamlined by use of genetic engineering and recombinant antibody techniques. Antibodies that traditionally take several months or more to produce when animals are used can now be developed in a few weeks as recombinant antibodies produced in bacteria, yeast, or other cell types. Typically most immunoassays use two or more antibodies or antibody fragments to detect antigens that are indicators of disease. However, a label-free biosensor, for example, a quartz-crystal microbalance (QCM), needs only one antibody. As such, the cost and time needed to design and develop an immunoassay can be substantially reduced if recombinant antibodies and biosensors are used rather than traditional antibody and assay (e.g. enzyme-linked immunosorbent assay, ELISA) methods. Unlike traditional antibodies, recombinant antibodies can be genetically engineered to self-assemble on biosensor surfaces, at high density, and correctly oriented to enhance antigen-binding activity and to increase assay sensitivity, specificity, and stability. Additionally, biosensor surface chemistry and physical and electronic properties can be modified to further increase immunoassay performance above and beyond that obtained by use of traditional methods. This review describes some of the techniques investigators have used to develop highly specific and sensitive, recombinant antibody-based biosensors for detection of antigens in simple or complex biological samples.
Liquid chromatography tandem-mass spectrometry (LC-MS/MS)-based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide overall better quantification accuracy and reproducibility than other LC-MS/MS techniques. However, large-scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capability of traditional iTRAQ-based quantification.
A Third-Party E-payment Protocol Based on Quantum Multi-proxy Blind Signature
NASA Astrophysics Data System (ADS)
Niu, Xu-Feng; Zhang, Jian-Zhong; Xie, Shu-Cui; Chen, Bu-Qing
2018-05-01
A third-party E-payment protocol is presented in this paper. It is based on a quantum multi-proxy blind signature. Adopting the techniques of quantum key distribution, one-time pad and quantum multi-proxy blind signature, our third-party E-payment system can protect user anonymity, as traditional E-payment systems do, while also providing the unconditional security that classical E-payment systems cannot. Furthermore, compared with existing quantum E-payment systems, the proposed system supports E-payment through third-party platforms.
DLP NIRscan Nano: an ultra-mobile DLP-based near-infrared Bluetooth spectrometer
NASA Astrophysics Data System (ADS)
Gelabert, Pedro; Pruett, Eric; Perrella, Gavin; Subramanian, Sreeram; Lakshminarayanan, Aravind
2016-02-01
The DLP NIRscan Nano is an ultra-portable spectrometer evaluation module utilizing DLP technology to meet lower cost, smaller size, and higher performance than traditional architectures. The replacement of a linear array detector with DLP digital micromirror device (DMD) in conjunction with a single point detector adds the functionality of programmable spectral filters and sampling techniques that were not previously available on NIR spectrometers. This paper presents the hardware, software, and optical systems of the DLP NIRscan Nano and its design considerations on the implementation of a DLP-based spectrometer.
3D-model building of the jaw impression
NASA Astrophysics Data System (ADS)
Ahmed, Moumen T.; Yamany, Sameh M.; Hemayed, Elsayed E.; Farag, Aly A.
1997-03-01
A novel approach is proposed to obtain a record of the patient's occlusion using computer vision. Data acquisition is obtained using intra-oral video cameras. The technique utilizes shape from shading to extract 3D information from 2D views of the jaw, and a novel technique for 3D data registration using genetic algorithms. The resulting 3D model can be used for diagnosis, treatment planning, and implant purposes. The overall purpose of this research is to develop a model-based vision system for orthodontics to replace traditional approaches. This system will be flexible, accurate, and will reduce the cost of orthodontic treatments.
Probst, Yasmine; Nguyen, Duc Thanh; Tran, Minh Khoi; Li, Wanqing
2015-07-27
Dietary assessment, while traditionally based on pen-and-paper, is rapidly moving towards automatic approaches. This study describes an Australian automatic food record method and its prototype for dietary assessment via the use of a mobile phone and techniques of image processing and pattern recognition. Common visual features including scale invariant feature transformation (SIFT), local binary patterns (LBP), and colour are used for describing food images. The popular bag-of-words (BoW) model is employed for recognizing the images taken by a mobile phone for dietary assessment. Technical details are provided together with discussions on the issues and future work.
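As a rough illustration of the bag-of-words encoding described above, the sketch below clusters synthetic local descriptors into a small visual vocabulary and encodes an "image" as a normalized histogram of visual-word counts. The 128-D random descriptors, vocabulary size, and cluster centres are placeholders, not the study's SIFT/LBP/colour features.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic stand-ins for local descriptors (SIFT, for example, yields 128-D vectors).
def fake_descriptors(n, centre):
    return rng.normal(loc=centre, scale=0.5, size=(n, 128))

# "Training" images: each contributes a set of local descriptors.
train_desc = np.vstack([fake_descriptors(50, c) for c in (0.0, 2.0, 4.0)])

# Step 1: build the visual vocabulary by clustering all training descriptors.
vocab_size = 8
kmeans = KMeans(n_clusters=vocab_size, n_init=10, random_state=0).fit(train_desc)

# Step 2: encode an image as a normalized histogram of visual-word assignments.
def bow_histogram(descriptors):
    words = kmeans.predict(descriptors)
    hist = np.bincount(words, minlength=vocab_size).astype(float)
    return hist / hist.sum()

h = bow_histogram(fake_descriptors(40, 2.0))
print(h.shape)
```

The resulting fixed-length histogram is what a downstream classifier would consume, regardless of how many raw descriptors each photo produced.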
Optimal focal-plane restoration
NASA Technical Reports Server (NTRS)
Reichenbach, Stephen E.; Park, Stephen K.
1989-01-01
Image restoration can be implemented efficiently by calculating the convolution of the digital image and a small kernel during image acquisition. Processing the image in the focal-plane in this way requires less computation than traditional Fourier-transform-based techniques such as the Wiener filter and constrained least-squares filter. Here, the values of the convolution kernel that yield the restoration with minimum expected mean-square error are determined using a frequency analysis of the end-to-end imaging system. This development accounts for constraints on the size and shape of the spatial kernel and all the components of the imaging system. Simulation results indicate the technique is effective and efficient.
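The idea of restoring with a small spatial kernel rather than a Fourier-domain filter can be sketched in a few lines: a 3 × 3 kernel is fitted by linear least squares to undo a known blur on a synthetic image. This is an illustrative spatial-domain stand-in, not the paper's frequency-analysis derivation of the minimum-MSE kernel.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2_same(img, ker):
    # Direct 'same'-size 2-D filtering with zero padding.
    kh, kw = ker.shape
    ph, pw = kh // 2, kw // 2
    pad = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += ker[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

sharp = rng.normal(size=(32, 32))
psf = np.array([[0.0, 0.15, 0.0], [0.15, 0.4, 0.15], [0.0, 0.15, 0.0]])
blurred = conv2_same(sharp, psf)

# Fit the 3x3 restoration kernel by least squares: each restored pixel is a
# linear combination of the 9 neighbouring blurred pixels.
pad = np.pad(blurred, 1)
cols = [pad[i:i + 32, j:j + 32].ravel() for i in range(3) for j in range(3)]
A = np.stack(cols, axis=1)                     # (1024, 9) design matrix
k, *_ = np.linalg.lstsq(A, sharp.ravel(), rcond=None)
kernel = k.reshape(3, 3)

restored = conv2_same(blurred, kernel)
err_before = np.mean((blurred - sharp) ** 2)
err_after = np.mean((restored - sharp) ** 2)
print(err_after < err_before)
```

The fitted kernel cannot invert the blur exactly (the true inverse is not 3 × 3), but it is guaranteed not to do worse than leaving the image alone, which mirrors the paper's point that a constrained small kernel still yields a useful minimum-MSE restoration at a fraction of the cost of an FFT-based filter.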
Social marketing: application to medical education.
David, S P; Greer, D S
2001-01-16
Medical education is often a frustrating endeavor, particularly when it attempts to change practice behavior. Traditional lecture-based educational methods are limited in their ability to sustain concentration and interest and to promote learner adherence to best-practice guidelines. Marketing techniques have been very effective in changing consumer behavior and physician behavior. However, the techniques of social marketing (goal identification, audience segmentation, and market research) have not been harnessed and applied to medical education. Social marketing can be applied to medical education in the effort to go beyond inoculation of learners with information and actually change behaviors. The tremendous potential of social marketing for medical education should be pilot-tested and systematically evaluated.
Using Movies to Analyse Gene Circuit Dynamics in Single Cells
Locke, James CW; Elowitz, Michael B
2010-01-01
Many bacterial systems rely on dynamic genetic circuits to control critical processes. A major goal of systems biology is to understand these behaviours in terms of individual genes and their interactions. However, traditional techniques based on population averages wash out critical dynamics that are either unsynchronized between cells or driven by fluctuations, or ‘noise,’ in cellular components. Recently, the combination of time-lapse microscopy, quantitative image analysis, and fluorescent protein reporters has enabled direct observation of multiple cellular components over time in individual cells. In conjunction with mathematical modelling, these techniques are now providing powerful insights into genetic circuit behaviour in diverse microbial systems. PMID:19369953
A new technique to prepare hard fruits and seeds for anatomical studies
Benedict, John C.
2015-01-01
Premise of the study: A novel preparation technique was developed to examine fruits and seeds of plants with exceptionally hard or brittle tissues that are very difficult to prepare using standard histological techniques. Methods and Results: The method introduced here was modified from a technique employed on fossil material and has been adapted for use on fruits and seeds of extant plants. A variety of fruits and seeds have been prepared with great success, and the technique will be useful for any excessively hard fruits or seeds that cannot be prepared using traditional embedding or sectioning methods. Conclusions: Compared to existing techniques for obtaining anatomical features of fruits and seeds, the protocol described here can quickly produce high-quality thin sections, without the need for harmful chemicals, from materials that cannot be sectioned using traditional histological techniques. PMID:26504684
Abdul Jalil, Muhammad Fahmi; Story, Rowan D; Rogers, Myron
2017-05-01
Minimally invasive approaches to the central skull base have been popularized over the last decade and have to a large extent displaced 'open' procedures. However, traditional skull base surgery still has its role, especially when dealing with a large clival chordoma where maximal surgical resection is the principal goal to maximize patient survival. In this paper, we present the case of a 25-year-old male patient with chordoma of the inferior clivus which was initially debulked via a transnasal endoscopic approach. He unfortunately had a large recurrence of tumor requiring redo resection. With the aim of achieving maximal surgical resection, we then chose the technique of a transoral approach with Le Fort 1 maxillotomy and midline palatal split. The patient's post-operative course was uneventful, and post-operative MRI confirmed significant debulking of the clival lesion. The technique employed for the surgical procedure is presented here in detail, as is our experience over two decades using this technique for tumors, inflammatory lesions and congenital abnormalities at the cranio-cervical junction. Copyright © 2017 Elsevier Ltd. All rights reserved.
Jiang, Baofeng; Jia, Pengjiao; Zhao, Wen; Wang, Wentao
2018-01-01
This paper explores a new method for rapid structural damage inspection of steel tube slab (STS) structures along randomly measured paths based on a combination of compressive sampling (CS) and ultrasonic computerized tomography (UCT). In the measurement stage, using fewer randomly selected paths rather than the whole measurement net is proposed to detect the underlying damage of a concrete-filled steel tube. In the imaging stage, the ℓ1-minimization algorithm is employed to recover the information of the microstructures based on the measurement data related to the internal situation of the STS structure. A numerical concrete tube model, with various levels of damage, was studied to demonstrate the performance of the rapid UCT technique. Real-world concrete-filled steel tubes in the Shenyang Metro stations were inspected using the proposed UCT technique in a CS framework. Both the numerical and experimental results show that the rapid UCT technique is capable of damage detection in an STS structure with a high level of accuracy and with fewer required measurements, making it more convenient and efficient than the traditional UCT technique. PMID:29293593
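The ℓ1-minimization recovery step can be illustrated with a tiny basis-pursuit problem: a sparse vector is recovered from far fewer random linear measurements than unknowns by solving a linear program. The measurement matrix and sparse "damage" signal below are synthetic stand-ins, not UCT path data.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)

# Sparse "damage" signal: 40 unknowns, only 3 nonzero.
n, m, k = 40, 20, 3
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k) + 3.0

# Random measurement paths: far fewer measurements than unknowns.
A = rng.normal(size=(m, n)) / np.sqrt(m)
b = A @ x_true

# Basis pursuit: min ||x||_1 subject to Ax = b, written as a linear program
# via the split x = u - v with u >= 0, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]

print(res.status)  # 0 means the LP solved to optimality
```

Because the true sparse vector is itself feasible, the recovered solution always satisfies the measurements and has an ℓ1 norm no larger than the truth; for sufficiently sparse signals and random measurements, basis pursuit typically recovers the signal exactly.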
NASA Technical Reports Server (NTRS)
Tzvi, G. C.
1986-01-01
A technique to deduce the virtual temperature from the combined use of the equations of fluid dynamics, observed wind, and observed radiances is described. The wind information could come from ground-based very high frequency (VHF) Doppler radars and/or from space-borne Doppler lidars. The radiometers are likewise assumed to be either space-borne and/or ground-based. From traditional radiometric techniques the vertical structure of the temperature can be estimated only crudely. While it has been known for quite some time that the virtual temperature could be deduced from wind information alone, such techniques had to assume the infallibility of certain diagnostic relations. The proposed technique is an extension of the Gal-Chen technique. It is assumed that, due to modeling uncertainties, the equations of fluid dynamics are satisfied only in the least squares sense. The retrieved temperature, however, is constrained to reproduce the observed radiances. It is shown that the combined use of the three sources of information (wind, radiances, and fluid dynamical equations) can result in a unique determination of the vertical temperature structure with spatial and temporal resolution comparable to that of the observed wind.
Analyzing locomotion synthesis with feature-based motion graphs.
Mahmudi, Mentar; Kallmann, Marcelo
2013-05-01
We propose feature-based motion graphs for realistic locomotion synthesis among obstacles. Among several advantages, feature-based motion graphs achieve improved results in search queries, eliminate the need of postprocessing for foot skating removal, and reduce the computational requirements in comparison to traditional motion graphs. Our contributions are threefold. First, we show that choosing transitions based on relevant features significantly reduces graph construction time and leads to improved search performances. Second, we employ a fast channel search method that confines the motion graph search to a free channel with guaranteed clearance among obstacles, achieving faster and improved results that avoid expensive collision checking. Lastly, we present a motion deformation model based on Inverse Kinematics applied over the transitions of a solution branch. Each transition is assigned a continuous deformation range that does not exceed the original transition cost threshold specified by the user for the graph construction. The obtained deformation improves the reachability of the feature-based motion graph and in turn also reduces the time spent during search. The results obtained by the proposed methods are evaluated and quantified, and they demonstrate significant improvements in comparison to traditional motion graph techniques.
Online selective kernel-based temporal difference learning.
Chen, Xingguo; Gao, Yang; Wang, Ruili
2013-12-01
In this paper, an online selective kernel-based temporal difference (OSKTD) learning algorithm is proposed to deal with large scale and/or continuous reinforcement learning problems. OSKTD includes two online procedures: online sparsification and parameter updating for the selective kernel-based value function. A new sparsification method (i.e., a kernel distance-based online sparsification method) is proposed based on selective ensemble learning, which is computationally less complex than other sparsification methods. With the proposed sparsification method, the sparsified dictionary of samples is constructed online by checking whether a sample needs to be added to the sparsified dictionary. In addition, based on local validity, a selective kernel-based value function is proposed to select the best samples from the sample dictionary for the selective kernel-based value function approximator. The parameters of the selective kernel-based value function are iteratively updated using the temporal difference (TD) learning algorithm combined with the gradient descent technique. The complexity of the online sparsification procedure in the OSKTD algorithm is O(n). In addition, two typical experiments (Maze and Mountain Car) are used for comparison with both traditional and up-to-date O(n) algorithms (GTD, GTD2, and TDC using the kernel-based value function), and the results demonstrate the effectiveness of our proposed algorithm. In the Maze problem, OSKTD converges to an optimal policy and converges faster than both traditional and up-to-date algorithms. In the Mountain Car problem, OSKTD converges, requires less computation time than other sparsification methods, reaches a better local optimum than the traditional algorithms, and converges much faster than the up-to-date algorithms. In addition, OSKTD can reach a competitive ultimate optimum compared with the up-to-date algorithms.
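A stripped-down version of kernel-based TD learning (omitting the paper's online sparsification and selective ensemble machinery) can be sketched on a five-state random walk, using fixed Gaussian RBF features as a stand-in for the sample dictionary:

```python
import numpy as np

rng = np.random.default_rng(3)

# 5-state random walk: start in the middle, reward 1 on exiting right, 0 on left.
n_states = 5
centers = np.arange(n_states, dtype=float)

def phi(s):
    # Gaussian RBF features centred on each state (a fixed, tiny "dictionary").
    return np.exp(-0.5 * ((s - centers) / 0.5) ** 2)

w = np.zeros(n_states)
alpha, gamma = 0.1, 1.0
for _ in range(2000):
    s = 2
    while 0 <= s < n_states:
        s2 = s + (1 if rng.random() < 0.5 else -1)
        if s2 < 0:
            r, v_next = 0.0, 0.0
        elif s2 >= n_states:
            r, v_next = 1.0, 0.0
        else:
            r, v_next = 0.0, w @ phi(s2)
        # TD(0) update: gradient-descent step on the linear value function.
        td_error = r + gamma * v_next - w @ phi(s)
        w += alpha * td_error * phi(s)
        s = s2

values = [w @ phi(s) for s in range(n_states)]
print([round(v, 2) for v in values])  # true values: 1/6, 2/6, ..., 5/6
```

The full OSKTD algorithm additionally decides online which observed samples enter the dictionary and which kernels participate in each prediction; the TD(0) update above is the core that those mechanisms wrap.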
Laser versus traditional techniques in cerebral and brain stem gliomas
NASA Astrophysics Data System (ADS)
Lombard, Gian F.
1996-01-01
In medical literature no significant studies have been published on the effectiveness of laser compared with traditional procedures in two series of cerebral gliomas; for this reason we have studied 220 tumors (200 supratentorial -- 20 brain stem gliomas), 110 operated upon with laser, 100 with conventional techniques. Four surgical protocols have been carried out: (1) traditional techniques; (2) carbon dioxide laser free hand; (3) carbon dioxide laser plus microscope; (4) multiple laser sources plus microscope plus neurosector plus CUSA. Two laser sources have been used alone or in combination (carbon dioxide -- Nd:YAG 1.06 or 1.32). Patients have been monitored on the Karnofsky scale before and after operation, at 12, 24 and 36 months later, and for survival rate. Tumors were classified by histological examination, dimensions, vascularization, and topography (critical or non-critical areas). Results for supratentorial gliomas: survival time is the same in both series (laser and traditional). Post-op morbidity is significantly improved in the laser group (high grade sub-group); long term follow-up shows an improvement of quality of life until 36 months in the low grade sub-group.
Schroer, William C; Diesfeld, Paul J; Reedy, Mary E; Lemarr, Angela R
2008-06-01
A total of 50 total knee arthroplasty (TKA) patients, 25 traditional and 25 minimally invasive surgical (MIS), underwent computed tomography scans to determine if a loss of accuracy in implant alignment occurred when a surgeon switched from a traditional medial parapatellar arthrotomy to a mini-subvastus surgical technique. Surgical accuracy was determined by comparing the computed tomography measured implant alignment with the surgical alignment goals. There was no loss in accuracy in the implantation of the tibial component with the mini-subvastus technique. The mean variance for the tibial coronal alignment was 1.03 degrees for the traditional TKA and 1.00 degrees for the MIS TKA (P = .183). Similarly, there was no difference in the mean variance for the posterior tibial slope (P = .054). Femoral coronal alignment was less accurate with the MIS procedure, mean variance of 1.04 degrees and 1.71 degrees for the traditional and MIS TKA, respectively (P = .045). Instrumentation and surgical technique concerns that led to this loss in accuracy were determined.
Enhanced orbit determination filter: Inclusion of ground system errors as filter parameters
NASA Technical Reports Server (NTRS)
Masters, W. C.; Scheeres, D. J.; Thurman, S. W.
1994-01-01
The theoretical aspects of an orbit determination filter that incorporates ground-system error sources as model parameters for use in interplanetary navigation are presented in this article. This filter, which is derived from sequential filtering theory, allows a systematic treatment of errors in calibrations of transmission media, station locations, and earth orientation models associated with ground-based radio metric data, in addition to the modeling of the spacecraft dynamics. The discussion includes a mathematical description of the filter and an analytical comparison of its characteristics with more traditional filtering techniques used in this application. The analysis in this article shows that this filter has the potential to generate navigation products of substantially greater accuracy than more traditional filtering procedures.
NASA Astrophysics Data System (ADS)
Gao, Xiangdong; Chen, Yuquan; You, Deyong; Xiao, Zhenlin; Chen, Xiaohui
2017-02-01
An approach for seam tracking of micro-gap welds, whose width is less than 0.1 mm, based on a magneto-optical (MO) imaging technique during butt-joint laser welding of steel plates is investigated. Kalman filtering (KF) with a radial basis function (RBF) neural network for weld detection by an MO sensor was applied to track the weld center position. Because the laser welding process noises and the MO sensor measurement noises were colored noises, the estimation accuracy of traditional KF for seam tracking was degraded by the extreme nonlinearities of the system model and could not be recovered with a linear state-space model. Also, the statistical characteristics of the noises could not be accurately obtained in actual welding. Thus, an RBF neural network was combined with the KF technique to compensate for the weld tracking errors. The neural network restrains filter divergence and improves system robustness. Compared with the traditional KF algorithm, the RBF-augmented KF not only improved weld tracking accuracy more effectively but also reduced noise disturbance. Experimental results showed that the magneto-optical imaging technique can detect micro-gap welds accurately, providing a novel approach for micro-gap seam tracking.
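For readers unfamiliar with the baseline being augmented, a minimal scalar Kalman filter tracking a constant weld-centre offset looks as follows; the offset and noise values are made up, and the RBF compensation the abstract describes is omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

# Scalar Kalman filter tracking a constant weld-centre offset from noisy readings.
true_offset = 0.05          # mm (hypothetical)
R = 0.02 ** 2               # measurement-noise variance
x_hat, P = 0.0, 1.0         # initial estimate and its variance
variances = []
for _ in range(50):
    z = true_offset + rng.normal(scale=0.02)   # simulated noisy MO-sensor reading
    K = P / (P + R)                            # Kalman gain
    x_hat = x_hat + K * (z - x_hat)            # measurement update
    P = (1 - K) * P                            # posterior variance shrinks
    variances.append(P)

print(round(x_hat, 3))
```

With no process noise the posterior variance decreases monotonically; the paper's point is that when the noises are colored and the model is nonlinear, this ideal behaviour breaks down, which is what the RBF network is added to compensate.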
Elderly quality of life impacted by traditional chinese medicine techniques
Figueira, Helena A; Figueira, Olivia A; Figueira, Alan A; Figueira, Joana A; Giani, Tania S; Dantas, Estélio HM
2010-01-01
Background: The shift in age structure is having a profound impact, suggesting that the aged should be consulted as reporters on the quality of their own lives. Objectives: The aim of this research was to establish the possible impact of traditional Chinese medicine (TCM) techniques on the quality of life (QOL) of the elderly. Sample: Two non-selected, volunteer groups of Rio de Janeiro municipality inhabitants: a control group (36 individuals), not using TCM, and an experimental group (28 individuals), using TCM at ABACO/Sohaku-in Institute, Brazil. Methods: A questionnaire on elderly QOL devised by the World Health Organization, the WHOQOL-Old, was adopted and descriptive statistical techniques were used: mean and standard deviation. The Shapiro–Wilk test checked the normality of the distribution. Furthermore, based on its normality distribution for the intergroup comparison, the Student t test was applied to facets 2, 4, 5, 6, and total score, and the Mann–Whitney U rank test to facets 1 and 3, both tests aiming to analyze the P value between experimental and control groups. The significance level utilized was 95% (P < 0.05). Results: The experimental group reported the highest QOL for every facet and the total score. Conclusions: The results suggest that TCM raises the level of QOL. PMID:21103400
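The intergroup comparison can be reproduced in outline with scipy: check normality with Shapiro-Wilk, then apply Student's t test or the Mann-Whitney U test accordingly. The facet scores below are synthetic placeholders, not the WHOQOL-Old data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic WHOQOL-Old facet scores for two groups (illustrative only).
control = rng.normal(loc=60.0, scale=10.0, size=36)
experimental = rng.normal(loc=68.0, scale=10.0, size=28)

# Check normality of each sample first, as in the study (Shapiro-Wilk).
_, p_control = stats.shapiro(control)
_, p_experimental = stats.shapiro(experimental)
looks_normal = p_control > 0.05 and p_experimental > 0.05

# Parametric test if both samples look normal, otherwise Mann-Whitney U.
if looks_normal:
    stat, p = stats.ttest_ind(experimental, control)
else:
    stat, p = stats.mannwhitneyu(experimental, control,
                                 alternative="two-sided")

print("significant at 5%:", p < 0.05)
```

This mirrors the study's split: the t test for the facets whose distributions passed the normality check, the rank test for the rest, with a two-sided significance level of 5%.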
Characterization of tabique walls nails of the Alto Douro Wine Region
NASA Astrophysics Data System (ADS)
Cardoso, Rui; Pinto, Jorge; Paiva, Anabela; Lanzinha, João Carlos
2016-11-01
Tabique is one of the main Portuguese traditional building techniques, which uses raw materials such as stone, earth and wood. In general, a tabique building component such as a wall consists of a wooden structure made up of vertical boards connected to laths by metal nails and covered on both sides by an earth-based material. This traditional building technology has an expressive incidence in the Alto Douro Wine Region, located in the interior of Northern Portugal and added to UNESCO's World Heritage Sites List in December 2001 as an `evolved continuing cultural landscape'. Furthermore, previous research has shown that the existing tabique construction in this region reveals a certain lack of maintenance, partially explained by the loss of knowledge of the technique; consequently, this built heritage presents an advanced stage of deterioration. This aspect, together with the lack of scientific studies in this field, motivated the writing of this paper, whose main objectives are to identify and characterize the nails used in the timber connections. The nail samples were collected from tabique walls of tabique buildings located in Lamego Municipality, near the Douro River, in the Alto Douro Wine Region. This work also intends to give guidelines for the rehabilitation and preservation of this important legacy.
Maljovec, D.; Liu, S.; Wang, B.; ...
2015-07-14
Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
Abdur-Rashid, Khalil; Furber, Steven Woodward; Abdul-Basser, Taha
2013-04-01
We survey the meta-ethical tools and institutional processes that traditional Islamic ethicists apply when deliberating on bioethical issues. We present a typology of these methodological elements, giving particular attention to the meta-ethical techniques and devices that traditional Islamic ethicists employ in the absence of decisive or univocal authoritative texts or in the absence of established transmitted cases. In describing how traditional Islamic ethicists work, we demonstrate that these experts possess a variety of discursive tools. We find that the ethical responsa (i.e., the products of applying the tools that we describe) are generally characterized by internal consistency. We also conclude that Islamic ethical reasoning on bioethical issues, while clearly scripture-based, is also characterized by strong consequentialist elements and possesses clear principles-based characteristics. The paper contributes to the study of bioethics by familiarizing non-specialists in Islamic ethics with the role, scope, and applicability of key Islamic ethical concepts, such as "aims" (maqāṣid), "universals" (kulliyyāt), "interest" (maṣlaḥa), "maxims" (qawā`id), "controls" (ḍawābit), "differentiators" (furūq), "preponderization" (tarjīḥ), and "extension" (tafrī`).
NASA Astrophysics Data System (ADS)
Rath, S.; Sengupta, P. P.; Singh, A. P.; Marik, A. K.; Talukdar, P.
2013-07-01
Accurate prediction of roll force during hot strip rolling is essential for model-based operation of hot strip mills. Traditionally, mathematical models based on the theory of plastic deformation have been used for prediction of roll force. In the last decade, data-driven models such as artificial neural networks have been tried for prediction of roll force. Pure mathematical models have accuracy limitations, whereas data-driven models have difficulty converging when applied to industrial conditions. Hybrid models that integrate traditional mathematical formulations and data-driven methods are being developed in different parts of the world. This paper discusses the methodology of development of an innovative hybrid mathematical-artificial neural network model. In the mathematical model, the most important factor influencing accuracy is the flow stress of steel. Coefficients of a standard flow stress equation, calculated by a parameter estimation technique, have been used in the model. The hybrid model has been trained and validated with input and output data collected from the finishing stands of the Hot Strip Mill, Bokaro Steel Plant, India. It has been found that model accuracy is improved with the hybrid model over the traditional mathematical model.
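The hybrid idea (a physics-based prediction plus a data-driven correction on its residuals) can be sketched with a linear least-squares residual model standing in for the neural network; the "flow stress" formula, input ranges, and bias terms below are invented for illustration, not the plant model.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic rolling passes: strain, strain rate, temperature (illustrative ranges).
n = 200
strain = rng.uniform(0.1, 0.5, n)
rate = rng.uniform(1.0, 50.0, n)
temp = rng.uniform(1100.0, 1250.0, n)

def physical_model(strain, rate, temp):
    # Stand-in for a flow-stress-based roll-force formula (not the plant model).
    return 1500.0 * strain ** 0.2 * rate ** 0.1 * np.exp(2000.0 / temp)

# "Measured" force: the physical model is biased and misses a temperature effect.
measured = (1.1 * physical_model(strain, rate, temp)
            - 0.8 * (temp - 1175.0)
            + rng.normal(scale=5.0, size=n))

base = physical_model(strain, rate, temp)
residual = measured - base

# Data-driven correction: least-squares fit of the residual on the inputs.
X = np.column_stack([np.ones(n), base, strain, rate, temp])
coef, *_ = np.linalg.lstsq(X, residual, rcond=None)
hybrid = base + X @ coef

rmse_base = np.sqrt(np.mean((base - measured) ** 2))
rmse_hybrid = np.sqrt(np.mean((hybrid - measured) ** 2))
print(rmse_hybrid < rmse_base)
```

Because the correction is fitted by least squares and the zero correction is always admissible, the hybrid's training error can never exceed the physical model's; the paper's neural network plays the same corrective role with a more flexible function class.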
Deflection-Based Aircraft Structural Loads Estimation with Comparison to Flight
NASA Technical Reports Server (NTRS)
Lizotte, Andrew M.; Lokos, William A.
2005-01-01
Traditional techniques in structural load measurement entail the correlation of a known load with strain-gage output from the individual components of a structure or machine. The use of strain gages has proved successful and is considered the standard approach for load measurement. However, remotely measuring aerodynamic loads using deflection measurement systems to determine aeroelastic deformation as a substitute for strain gages may yield lower testing costs while improving aircraft performance through reduced instrumentation weight. With a reliable strain and structural deformation measurement system this technique was examined. The objective of this study was to explore the utility of a deflection-based load estimation, using the active aeroelastic wing F/A-18 aircraft. Calibration data from ground tests performed on the aircraft were used to derive left wing-root and wing-fold bending-moment and torque load equations based on strain gages; for this study, however, point deflections were used to derive deflection-based load equations. Comparisons between the strain-gage and deflection-based methods are presented. Flight data from the phase-1 active aeroelastic wing flight program were used to validate the deflection-based load estimation method. Flight validation revealed a strong bending-moment correlation and slightly weaker torque correlation. Development of current techniques, and future studies are discussed.
Deflection-Based Structural Loads Estimation From the Active Aeroelastic Wing F/A-18 Aircraft
NASA Technical Reports Server (NTRS)
Lizotte, Andrew M.; Lokos, William A.
2005-01-01
Traditional techniques in structural load measurement entail the correlation of a known load with strain-gage output from the individual components of a structure or machine. The use of strain gages has proved successful and is considered the standard approach for load measurement. However, remotely measuring aerodynamic loads using deflection measurement systems to determine aeroelastic deformation as a substitute for strain gages may yield lower testing costs while improving aircraft performance through reduced instrumentation weight. This technique was examined using a reliable strain and structural deformation measurement system. The objective of this study was to explore the utility of a deflection-based load estimation, using the active aeroelastic wing F/A-18 aircraft. Calibration data from ground tests performed on the aircraft were used to derive left wing-root and wing-fold bending-moment and torque load equations based on strain gages; for this study, however, point deflections were used to derive deflection-based load equations. Comparisons between the strain-gage and deflection-based methods are presented. Flight data from the phase-1 active aeroelastic wing flight program were used to validate the deflection-based load estimation method. Flight validation revealed a strong bending-moment correlation and slightly weaker torque correlation. Development of current techniques, and future studies are discussed.
Contact thermal shock test of ceramics
NASA Technical Reports Server (NTRS)
Rogers, W. P.; Emery, A. F.
1992-01-01
A novel quantitative thermal shock test of ceramics is described. The technique employs contact between a metal-cooling rod and hot disk-shaped specimen. In contrast with traditional techniques, the well-defined thermal boundary condition allows for accurate analyses of heat transfer, stress, and fracture. Uniform equibiaxial tensile stresses are induced in the center of the test specimen. Transient specimen temperature and acoustic emission are monitored continuously during the thermal stress cycle. The technique is demonstrated with soda-lime glass specimens. Experimental results are compared with theoretical predictions based on a finite-element method thermal stress analysis combined with a statistical model of fracture. Material strength parameters are determined using concentric ring flexure tests. Good agreement is found between experimental results and theoretical predictions of failure probability as a function of time and initial specimen temperature.
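The statistical fracture step can be illustrated with a two-parameter Weibull weakest-link model, which gives failure probability as a function of tensile stress and stressed area; the modulus and characteristic strength below are illustrative values, not the paper's fitted soda-lime glass parameters.

```python
import numpy as np

# Two-parameter Weibull model of brittle failure probability (illustrative values).
m = 7.0            # Weibull modulus
sigma_0 = 70.0     # characteristic strength, MPa

def failure_probability(stress, area, ref_area=1.0):
    """Weakest-link failure probability for a uniformly stressed area."""
    stress = np.maximum(stress, 0.0)   # compressive stress does not drive fracture
    return 1.0 - np.exp(-(area / ref_area) * (stress / sigma_0) ** m)

# Uniform equibiaxial tension in the disk centre, as in the contact test.
p_low = failure_probability(stress=40.0, area=1.0)
p_high = failure_probability(stress=80.0, area=1.0)
print(round(p_low, 4), round(p_high, 4))
```

In the paper's analysis the stress field comes from a transient finite-element thermal-stress solution and the Weibull parameters from the concentric-ring flexure tests; the model above is the closed-form link between the two that yields failure probability as a function of time and initial temperature.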
Formal methods for modeling and analysis of hybrid systems
NASA Technical Reports Server (NTRS)
Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)
2009-01-01
A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.
Fast and precise technique for magnet lattice correction via sine-wave excitation of fast correctors
Yang, X.; Smaluk, V.; Yu, L. H.; ...
2017-05-02
A novel technique has been developed to improve the precision and shorten the measurement time of the LOCO (linear optics from closed orbits) method. This technique, named AC LOCO, is based on sine-wave (ac) beam excitation via fast correctors. Such fast correctors are typically installed at synchrotron light sources for the fast orbit feedback. The beam oscillations are measured by beam position monitors. The narrow band used for the beam excitation and measurement not only allows us to suppress effectively the beam position noise but also opens the opportunity for simultaneously exciting multiple correctors at different frequencies (multifrequency mode). We demonstrated at NSLS-II that AC LOCO provides better lattice corrections and works much faster than the traditional LOCO method.
Automated quantitative micro-mineralogical characterization for environmental applications
Smith, Kathleen S.; Hoal, K.O.; Walton-Day, Katherine; Stammer, J.G.; Pietersen, K.
2013-01-01
Characterization of ore and waste-rock material using automated quantitative micro-mineralogical techniques (e.g., QEMSCAN® and MLA) has the potential to complement traditional acid-base accounting and humidity cell techniques when predicting acid generation and metal release. These characterization techniques, which most commonly are used for metallurgical, mineral-processing, and geometallurgical applications, can be broadly applied throughout the mine-life cycle to include numerous environmental applications. Critical insights into mineral liberation, mineral associations, particle size, particle texture, and mineralogical residence phase(s) of environmentally important elements can be used to anticipate potential environmental challenges. Resources spent on initial characterization result in lower uncertainties of potential environmental impacts and possible cost savings associated with remediation and closure. Examples illustrate mineralogical and textural characterization of fluvial tailings material from the upper Arkansas River in Colorado.
Comparison of normal and phase stepping shearographic NDE
NASA Astrophysics Data System (ADS)
Andhee, A.; Gryzagoridis, J.; Findeis, D.
2005-05-01
The paper presents results of non-destructive testing of composite main-rotor helicopter blade calibration specimens using the laser-based optical NDE technique known as shearography. The tests were performed initially using the already well-established near-real-time technique of shearography, with the specimens perturbed during testing for a few seconds using hot air from a domestic hair dryer. After modification of the shearing device used in the shearographic setup, phase stepping of one of the sheared images captured by the CCD camera was enabled, and identical tests were performed on the specimens. Considerable enhancement of the images depicting the defects was noted, suggesting that phase stepping is a desirable addition to the traditional shearographic setup.
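Phase stepping recovers a wrapped phase map instead of a plain fringe intensity image, which is why the defect indications become so much clearer. A minimal sketch of the standard four-step algorithm, assuming pi/2 steps of the form I_k = A + B*cos(phi + k*pi/2) (the paper does not specify its stepping scheme, so this is an illustrative choice):

```python
import math

def four_step_phase(i0, i1, i2, i3):
    """Recover the wrapped phase from four interferograms captured with
    pi/2 phase steps. With I_k = A + B*cos(phi + k*pi/2):
    i3 - i1 = 2B*sin(phi) and i0 - i2 = 2B*cos(phi)."""
    return math.atan2(i3 - i1, i0 - i2)
```

Applied pixel by pixel to the four stepped shearograms, this yields the phase map whose discontinuities mark the defects.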
NASA Astrophysics Data System (ADS)
Xia, Huihui; Kan, Ruifeng; Xu, Zhenyu; He, Yabai; Liu, Jianguo; Chen, Bing; Yang, Chenguang; Yao, Lu; Wei, Min; Zhang, Guangle
2017-03-01
We present a system for accurate tomographic reconstruction of the combustion temperature and H2O vapor concentration of a flame based on laser absorption measurements, in combination with an innovative two-step algebraic reconstruction technique. A total of 11 collimated laser beams generated from the outputs of fiber-coupled diode lasers formed a two-dimensional 5 × 6 orthogonal beam grid and probed two H2O absorption transitions (7154.354/7154.353 cm-1 and 7467.769 cm-1). The measurement system was mounted on a rotation platform to achieve a twofold improvement in spatial resolution. Numerical simulation showed that the proposed two-step algebraic reconstruction technique, which reconstructs temperature and concentration separately, greatly improved the reconstruction accuracy of the species concentration compared with a traditional calculation. Experimental results demonstrated the good performance of the measurement system and the two-step reconstruction technique for applications such as flame monitoring and combustion diagnosis.
Continuous welding of unidirectional fiber reinforced thermoplastic tape material
NASA Astrophysics Data System (ADS)
Schledjewski, Ralf
2017-10-01
Continuous welding techniques like thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes like autoclave consolidation, thermoforming, etc. However, several important processing issues must still be solved before it becomes an economically viable process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition, and simulation development. Today, process simulation can predict the resulting consolidation quality, and the effects of material imperfections or process parameter variations are well known. But using this knowledge to control the process, based on online process monitoring and corresponding adaptation of the process parameters, is still challenging. It requires solving inverse problems and using automated code generation methods that allow fast implementation of algorithms on target hardware. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques, and how to use them for a model-based process control system, are presented.
NASA Astrophysics Data System (ADS)
Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.
1991-03-01
To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have developed an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it allows the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes, and thus makes process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and integrated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).
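The traditional SPC component mentioned above is, at its simplest, a Shewhart-style control chart: estimate the process mean and spread, then flag observations outside k-sigma limits. A minimal sketch (the real packages use subgroup statistics and run rules; this illustration does not):

```python
def control_limits(samples, k=3.0):
    """Classical SPC screening: compute mean and sample standard deviation,
    then flag the indices of points outside the k-sigma control limits."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    sigma = var ** 0.5
    ucl, lcl = mean + k * sigma, mean - k * sigma
    flagged = [i for i, x in enumerate(samples) if x > ucl or x < lcl]
    return mean, lcl, ucl, flagged
```

In an integrated system, such out-of-control signals are what trigger the expert system's diagnostic rules and the adaptive-control response.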
Non-uniform refractive index field measurement based on light field imaging technique
NASA Astrophysics Data System (ADS)
Du, Xiaokun; Zhang, Yumin; Zhou, Mengjie; Xu, Dong
2018-02-01
In this paper, a method for measuring a non-uniform refractive index field based on the light field imaging technique is proposed. First, a light field camera is used to collect four-dimensional light field data; the data are then decoded according to the light field imaging principle to obtain image sequences of the refractive index field at different acquisition angles. Subsequently, a PIV (particle image velocimetry) technique is used to extract the ray offset in each image. Finally, the distribution of the non-uniform refractive index field can be calculated by inverting the deflection of the light rays. Compared with traditional optical methods, which require multiple optical detectors at multiple angles to collect data synchronously, the proposed method needs only a single light field camera and a single shot. The effectiveness of the method has been verified by an experiment that quantitatively measured the distribution of the refractive index field above the flame of an alcohol lamp.
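The deflection-to-index inversion rests on the paraxial relation between a transverse index gradient and the ray bending it produces. A toy version for a constant gradient, under small-angle assumptions of our own (the paper's full inversion handles spatially varying fields):

```python
def deflection_angle(dn_dy, path_length, n0=1.0):
    """Paraxial deflection of a ray crossing a field of thickness L with a
    constant transverse index gradient: eps ~ (L / n0) * dn/dy."""
    return path_length * dn_dy / n0

def gradient_from_offset(offset, distance, path_length, n0=1.0):
    """Invert a measured ray offset on the detector back to the index
    gradient, using offset ~ distance * eps for small angles."""
    eps = offset / distance
    return n0 * eps / path_length
```

The PIV-extracted offsets play the role of `offset` here; integrating the recovered gradients then yields the refractive index distribution itself.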
Acoustic emission source localization based on distance domain signal representation
NASA Astrophysics Data System (ADS)
Gawronski, M.; Grabowski, K.; Russek, P.; Staszewski, W. J.; Uhl, T.; Packo, P.
2016-04-01
Acoustic emission (AE) is a vital non-destructive testing technique and is widely used in industry for damage detection, localisation and characterisation. The latter two aspects are particularly challenging, as AE data are typically noisy. What is more, elastic waves generated by an AE event propagate through a structural path and are significantly distorted. This effect is particularly prominent in thin elastic plates, where the dispersion phenomenon results in severe localisation and characterisation issues. Traditional time-difference-of-arrival localisation methods typically fail when signals are highly dispersive, so algorithms capable of dispersion compensation are sought. This paper presents a method based on the Time-Distance Domain Transform for accurate AE event localisation. The source location is found through a minimization problem: the proposed technique transforms the time signal into the distance-domain response that would be recorded at the source. Only basic elastic material properties and the plate thickness are used in the approach, avoiding arbitrary parameter tuning.
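The localisation-as-minimization idea can be illustrated in its simplest nondispersive form: search candidate source positions for the one whose predicted pairwise arrival-time differences best match the measured ones. This sketch deliberately omits the dispersion compensation that is the paper's actual contribution, and all names are ours:

```python
import math

def locate_ae_source(sensors, arrivals, velocity, candidates):
    """Grid-search TDOA localisation in a plate (nondispersive toy model).
    sensors: list of (x, y); arrivals: measured arrival times (any common
    offset cancels in the pairwise differences); candidates: trial points."""
    def misfit(p):
        d = [math.hypot(p[0] - sx, p[1] - sy) for sx, sy in sensors]
        m = 0.0
        for i in range(len(sensors)):
            for j in range(i + 1, len(sensors)):
                m += ((arrivals[i] - arrivals[j]) - (d[i] - d[j]) / velocity) ** 2
        return m
    return min(candidates, key=misfit)
```

In the dispersive case, the constant `velocity` is what breaks down, which is why the paper maps signals to the distance domain before forming the misfit.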
Recent patents of nanopore DNA sequencing technology: progress and challenges.
Zhou, Jianfeng; Xu, Bingqian
2010-11-01
DNA sequencing techniques have developed rapidly in recent decades, driven primarily by the Human Genome Project. Among the proposed new techniques, the nanopore was considered a suitable candidate for single-molecule DNA sequencing with ultrahigh speed and very low cost. Several fabrication and modification techniques have been developed to produce robust and well-defined nanopore devices. Many efforts have also been made to apply nanopores to analyzing the properties of DNA molecules. Compared with traditional sequencing techniques, the nanopore has demonstrated distinctive advantages in the main practical issues, such as sample preparation, sequencing speed, cost-effectiveness, and read length. Although challenges remain, recent research into improving the capabilities of nanopores has shed light on the path toward the ultimate goal: sequencing an individual DNA strand at the single-nucleotide level. This patent review briefly highlights recent developments and technological achievements for DNA analysis and sequencing at the single-molecule level, focusing on nanopore-based methods.
Luo, Wei; Chen, Sheng; Chen, Lei; Li, Hualong; Miao, Pengcheng; Gao, Huiyi; Hu, Zelin; Li, Miao
2017-05-29
We describe a theoretical model to analyze temperature effects on the Kretschmann surface plasmon resonance (SPR) sensor, and a new double-incident-angle technique to simultaneously measure changes in refractive index (RI) and temperature. The method uses the observation that the output signals obtained at two different incident angles each depend linearly on RI and temperature, and are independent. A proof-of-concept experiment using solutions of different NaCl concentrations as analytes demonstrates the ability of the technique. The optical design is as simple and robust as conventional SPR detection, but provides a way to discriminate between RI-induced and temperature-induced SPR changes. This technique gives traditional SPR sensors a way to detect RI in varying temperature environments, and may lead to better design and fabrication of SPR sensors that are robust against temperature variation.
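Because each angle's signal is linear and independent in RI and temperature, recovering both unknowns is a 2x2 linear solve. A sketch with illustrative calibration sensitivities (the actual coefficients come from the sensor model or a calibration run, not from us):

```python
def demodulate_ri_and_temp(s1, s2, k):
    """Solve the two sensor equations s_i = a_i*dRI + b_i*dT by Cramer's
    rule. k holds the calibration sensitivities ((a1, b1), (a2, b2)) for
    the two incident angles; independence means the matrix is non-singular."""
    (a1, b1), (a2, b2) = k
    det = a1 * b2 - a2 * b1
    if det == 0.0:
        raise ValueError("calibration matrix singular: angles not independent")
    d_ri = (s1 * b2 - s2 * b1) / det
    d_t = (a1 * s2 - a2 * s1) / det
    return d_ri, d_t
```

This is exactly how the double-angle readout separates an RI shift from a temperature drift in a single measurement.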
Component Pin Recognition Using Algorithms Based on Machine Learning
NASA Astrophysics Data System (ADS)
Xiao, Yang; Hu, Hong; Liu, Ze; Xu, Jiangchang
2018-04-01
The purpose of machine vision for a plug-in machine is to improve the machine's stability and accuracy, and recognition of the component pin is an important part of that vision system. This paper focuses on component pin recognition using three different techniques. The first involves traditional image processing using the core algorithm for binary large object (BLOB) analysis. The second uses the histogram of oriented gradients (HOG) to experimentally compare support vector machine (SVM) and adaptive boosting (AdaBoost) classifiers. The third is a deep learning method, the convolutional neural network (CNN), which identifies the pin by comparing a sample against its training data. The main purpose of the research presented in this paper is to increase the knowledge of learning methods used in the plug-in machine industry in order to achieve better results.
Lindsay, Kaitlin E; Rühli, Frank J; Deleon, Valerie Burke
2015-06-01
The technique of forensic facial approximation, or reconstruction, is one of many facets of the field of mummy studies. Although far from a rigorous scientific technique, evidence-based visualization of antemortem appearance may supplement radiological, chemical, histological, and epidemiological studies of ancient remains. Published guidelines exist for creating facial approximations, but few approximations are published with documentation of the specific process and references used. Additionally, significant new research has taken place in recent years which helps define best practices in the field. This case study records the facial approximation of a 3,000-year-old ancient Egyptian woman using medical imaging data and the digital sculpting program, ZBrush. It represents a synthesis of current published techniques based on the most solid anatomical and/or statistical evidence. Through this study, it was found that although certain improvements have been made in developing repeatable, evidence-based guidelines for facial approximation, there are many proposed methods still awaiting confirmation from comprehensive studies. This study attempts to assist artists, anthropologists, and forensic investigators working in facial approximation by presenting the recommended methods in a chronological and usable format. © 2015 Wiley Periodicals, Inc.
Ravi, Logesh; Vairavasundaram, Subramaniyaswamy
2016-01-01
The rapid growth of the web and its applications has made recommender systems enormously important. Applied in various domains, recommender systems are designed to generate suggestions, such as items or services, based on user interests. However, recommender systems suffer from many issues that reduce their effectiveness; integrating powerful data management techniques into them can address such issues and significantly increase recommendation quality. Recent research on recommender systems reveals the idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper surveys recommender systems based on social network data, considering the recommendation algorithms used, system functionalities, types of interfaces, filtering techniques, and artificial intelligence techniques. After examining the objectives, methodologies, and data sources of the existing models, the paper assists anyone interested in the development of travel recommendation systems and suggests future research directions. We have also proposed a location recommendation system based on a social pertinent trust walker (SPTW) and compared the results with existing baseline random walk models. We then enhanced the SPTW model for recommendations to groups of users. The results obtained from the experiments are presented.
Estimating Crop Growth Stage by Combining Meteorological and Remote Sensing Based Techniques
NASA Astrophysics Data System (ADS)
Champagne, C.; Alavi-Shoushtari, N.; Davidson, A. M.; Chipanshi, A.; Zhang, Y.; Shang, J.
2016-12-01
Estimations of seeding, harvest and phenological growth stage of crops are important sources of information for monitoring crop progress and crop yield forecasting. Growth stage has traditionally been estimated at the regional level through surveys, which rely on field staff to collect the information. Automated techniques to estimate growth stage have included agrometeorological approaches that use temperature and day-length information to estimate accumulated heat and photoperiod, with thresholds used to determine when these stages are most likely. These approaches, however, are crop- and hybrid-dependent, and can give widely varying results depending on the method used, particularly if the seeding date is unknown. Methods to estimate growth stage from remote sensing have progressed greatly in the past decade, with time series information from the Normalized Difference Vegetation Index (NDVI) the most common approach. Time series NDVI provides information on growth stage through a variety of techniques, including fitting functions to a series of measured NDVI values, or smoothing these values and using thresholds to detect changes in slope that are indicative of rapidly increasing or decreasing 'greenness' in the vegetation cover. The key limitations of these techniques for agriculture are frequent cloud cover in optical data, which leads to errors in estimating local features in the time series function, and the incongruity between changes in greenness and traditional agricultural growth stages. There is great potential to combine both meteorological approaches and remote sensing to overcome the limitations of each technique. This research will examine the accuracy of both meteorological and remote sensing approaches over several agricultural sites in Canada, and look at the potential to integrate these techniques to provide improved estimates of crop growth stage for common field crops.
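The accumulated-heat part of the agrometeorological approach is the classic growing degree day (GDD) calculation: average the daily temperature extremes, subtract a crop-specific base temperature, and accumulate. The base temperature and stage thresholds below are illustrative placeholders, not values from the study:

```python
def growing_degree_days(tmin_tmax_series, t_base=10.0):
    """Accumulate growing degree days from daily (tmin, tmax) pairs,
    using the simple average method with a crop-specific base temperature."""
    gdd = 0.0
    for tmin, tmax in tmin_tmax_series:
        gdd += max(0.0, (tmin + tmax) / 2.0 - t_base)
    return gdd

def stage_from_gdd(gdd, thresholds):
    """Map accumulated GDD onto ordered (threshold, stage-name) pairs;
    the last threshold crossed determines the estimated stage."""
    stage = "pre-emergence"
    for thr, name in thresholds:
        if gdd >= thr:
            stage = name
    return stage
```

The crop- and hybrid-dependence noted above enters through `t_base` and the threshold table, which is precisely why an unknown seeding date (the starting point of the accumulation) degrades the estimate so badly.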
Kim, Hae Ri; Jang, Seong-Ho; Kim, Young Kyung; Son, Jun Sik; Min, Bong Ki; Kim, Kyo-Han; Kwon, Tae-Yub
2016-01-01
The microstructures and mechanical properties of cobalt-chromium (Co-Cr) alloys produced by three CAD/CAM-based processing techniques were investigated in comparison with those produced by the traditional casting technique. Four groups of disc-shaped (microstructure) or dumbbell-shaped (mechanical property) specimens made of Co-Cr alloys were prepared using casting (CS), milling (ML), selective laser melting (SLM), and milling/post-sintering (ML/PS). For each technique, the corresponding commercial alloy material was used. The microstructures of the specimens were evaluated via X-ray diffractometry, optical and scanning electron microscopy with energy-dispersive X-ray spectroscopy, and electron backscattered diffraction pattern analysis. The mechanical properties were evaluated using a tensile test according to ISO 22674 (n = 6). The microstructure of the alloys was strongly influenced by the manufacturing process. Overall, the SLM group showed superior mechanical properties, with the ML/PS group nearly comparable, while the mechanical properties of the ML group were inferior to those of the CS group. The microstructures and mechanical properties of the Co-Cr alloys were greatly dependent on the manufacturing technique as well as the chemical composition. The SLM and ML/PS techniques may be considered promising alternatives to the Co-Cr alloy casting process. PMID:28773718
Network-based high level data classification.
Silva, Thiago Christiano; Zhao, Liang
2012-06-01
Traditional supervised data classification considers only physical features (e.g., distance or similarity) of the input data; here, this type of learning is called low level classification. The human (animal) brain, on the other hand, performs both low and high orders of learning and readily identifies patterns according to the semantic meaning of the input data. Data classification that considers not only physical attributes but also pattern formation is here referred to as high level classification. In this paper, we propose a hybrid classification technique that combines both types of learning. The low level term can be implemented by any classification technique, while the high level term is realized by extracting features of the underlying network constructed from the input data. Thus, the former classifies test instances by their physical features or class topologies, while the latter measures the compliance of test instances with the pattern formation of the data. Our study shows that the proposed technique not only can realize classification according to pattern formation, but can also improve the performance of traditional classification techniques. Furthermore, as the complexity of the class configuration increases, such as greater mixture among different classes, a larger weight on the high level term is required for correct classification. This confirms that high level classification has special importance in complex classification settings. Finally, we show how the proposed technique can be employed in a real-world application, where it is capable of identifying variations and distortions of handwritten digit images; as a result, it improves the overall pattern recognition rate.
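The "larger portion of the high level term" can be pictured as a convex combination of the two memberships per class. This is a schematic reading of the abstract under our own assumption of a simple weighting F = (1 - lambda)*L + lambda*H; the paper's exact formulation and its network-based high level features are richer than this:

```python
def hybrid_score(low_level, high_level, lam):
    """Combine low-level (physical-feature) and high-level (pattern
    conformance) class memberships with mixing weight lam in [0, 1].
    Both dicts are assumed to share the same class keys."""
    return {c: (1 - lam) * low_level[c] + lam * high_level[c] for c in low_level}

def classify(low_level, high_level, lam):
    """Pick the class with the largest combined membership."""
    scores = hybrid_score(low_level, high_level, lam)
    return max(scores, key=scores.get)
```

Raising `lam` lets pattern conformance overrule raw distance, which is the behavior the paper reports for heavily mixed classes.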
Risk Management of NASA Projects
NASA Technical Reports Server (NTRS)
Sarper, Hueseyin
1997-01-01
Various NASA Langley Research Center and other center projects were initially examined to obtain historical data comparing the pre-phase A study and the final outcome for each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA, and their value and usage were reviewed for large projects. It was found that both the civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed with the end of the Cold War and the Challenger disaster, the ingrained anti-PRA culture has proven hard to change. Examples of skepticism about risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability of risk management and risk assessment techniques is much broader than the traditional safety-related areas of application, and the time has come to apply these techniques uniformly. A risk-based approach can maximize the 'return on investment' that the public demands. It would also be very useful if all project documents of NASA Langley Research Center, from pre-phase A through final report, were carefully stored in a central repository, preferably in electronic format.
Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L
2017-07-01
Historically, marine ecologists have lacked efficient tools capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas in the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m² of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling and to better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than has been collected in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.
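The question "how many transects are enough?" can be explored by resampling a dense cover map, as the study does with its DPV imagery. A toy Monte-Carlo sketch with an artificial, strongly patchy cover grid (all names and the grid itself are our assumptions):

```python
import random

def transect_survey_error(cover_map, n_transects, n_trials=200, seed=1):
    """Estimate how well a diver survey of n random 'transects' (rows of a
    dense cover map) recovers the true mean cover: returns the mean
    absolute error of the survey mean over repeated simulated surveys."""
    rng = random.Random(seed)
    rows = list(range(len(cover_map)))
    true_mean = sum(sum(r) / len(r) for r in cover_map) / len(cover_map)
    err = 0.0
    for _ in range(n_trials):
        chosen = rng.sample(rows, n_transects)
        est = sum(sum(cover_map[i]) / len(cover_map[i]) for i in chosen) / n_transects
        err += abs(est - true_mean)
    return err / n_trials
```

On a strongly structured map, the error drops sharply with transect count, mirroring the study's finding that spatial structuring drives the required replication.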
eLearning techniques supporting problem based learning in clinical simulation.
Docherty, Charles; Hoy, Derek; Topp, Helena; Trinder, Kathryn
2005-08-01
This paper details the results of the first phase of a project using eLearning to support students' learning within a simulated environment. The locus was a purpose-built clinical simulation laboratory (CSL), where the School's philosophy of problem-based learning (PBL) was being challenged by lecturers using traditional teaching methods. In response, the project developed a student-centred, problem-based approach to the acquisition of clinical skills, using high-quality learning objects embedded within web pages to substitute for lecturers providing instruction and demonstration. This encouraged student nurses to explore, analyse and make decisions within the safety of a clinical simulation. Learning was facilitated through network communications and reflection on video performances of self and others. Evaluations were positive, with students demonstrating increased satisfaction with PBL, improved performance in exams, and increased self-efficacy in the performance of nursing activities. These results indicate that eLearning techniques can help students acquire clinical skills in the safety of a simulated environment within the context of a problem-based learning curriculum.
Conceptual Design of a Communication-Based Deep Space Navigation Network
NASA Technical Reports Server (NTRS)
Anzalone, Evan J.; Chuang, C. H.
2012-01-01
As the need grows for increased autonomy and position-knowledge accuracy to support missions beyond Earth orbit, engineers must develop more advanced navigation sensors and systems that operate independently of Earth-based analysis and processing. Several spacecraft are approaching this problem using inter-spacecraft radiometric tracking and onboard autonomous optical navigation methods. This paper proposes an alternative implementation to aid in spacecraft position fixing. The proposed Network-Based Navigation technique takes advantage of the communication data sent between spacecraft, and between spacecraft and ground control, to embed navigation information. The navigation system uses these packets to provide estimates to an onboard navigation filter, augmenting traditional ground-based radiometric tracking techniques. As opposed to using digital signal measurements to capture information inherent in the transmitted signal itself, this method relies on the embedded navigation packet headers to calculate a navigation estimate. The method is heavily dependent on clock accuracy, and initial results show promising performance for a notional system.
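The clock-accuracy dependence can be seen in the basic timestamp-based ranging that packet headers enable: a one-way light-time measurement is biased by any clock offset, while a two-way exchange can separate range from offset. This is a generic illustration (NTP-style symmetric-path algebra), not the paper's specific filter formulation:

```python
def one_way_range(t_transmit, t_receive, clock_bias=0.0, c=299792458.0):
    """Pseudorange from a transmit timestamp carried in a packet header:
    any uncorrected receiver clock bias maps directly into range error."""
    return c * (t_receive - t_transmit - clock_bias)

def two_way_range_and_bias(t1, t2, t3, t4, c=299792458.0):
    """Two-way exchange: local sends at t1, remote receives at t2 and
    replies at t3 (remote clock), local receives at t4. Assuming a
    symmetric path, both the range and the remote clock offset follow."""
    rtt = (t4 - t1) - (t3 - t2)           # round-trip light time
    rng = c * rtt / 2.0
    bias = ((t2 - t1) + (t3 - t4)) / 2.0  # remote-minus-local clock offset
    return rng, bias
```

Feeding such pseudoranges from routine communication traffic into an onboard filter is the essence of piggybacking navigation on the network.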
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (the accident). It has traditionally been a powerful technique for identifying hazards in nuclear installations and the power industry. As the systematic articulation of the fault tree involves assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
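The probabilistic core of classical FTA is a bottom-up evaluation of the tree: AND gates multiply child probabilities, OR gates combine them as the complement of joint non-occurrence. A minimal sketch assuming independent base events (the tree encoding is our own, and real tools also handle common-cause and non-coherent logic):

```python
def gate_probability(gate):
    """Evaluate a fault tree bottom-up. A node is either a float (a base
    event probability) or a tuple ('AND' | 'OR', [children]).
    Base events are assumed mutually independent."""
    if isinstance(gate, float):
        return gate
    kind, children = gate
    probs = [gate_probability(ch) for ch in children]
    if kind == "AND":
        p = 1.0
        for q in probs:
            p *= q
        return p
    # OR gate: complement of every child failing to occur
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p
```

For example, a top event fed by an OR of (an AND of two 0.1/0.2 faults) and a 0.05 fault evaluates to 1 - (1 - 0.02)(1 - 0.05) = 0.069.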
Verification of Autonomous Systems for Space Applications
NASA Technical Reports Server (NTRS)
Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.
2006-01-01
Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution with minimal or no human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems, since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and verification.
Effect of friction stir welding and post-weld heat treatment on a nanostructured ferritic alloy
Mazumder, Baishakhi; Yu, Xinghua; Edmondson, Philip D.; ...
2015-12-08
Nanostructured ferritic alloys (NFAs) are new-generation materials for use in high-temperature energy systems, such as nuclear fission or fusion reactors. However, joining these materials is a concern, as their unique microstructure is destroyed by traditional liquid-state welding methods. The microstructural evolution of a friction-stir-welded 14YWT NFA was investigated by atom probe tomography, before and after a post-weld heat treatment (PWHT) at 1123 K. The particle size, number density, elemental composition, and morphology of the titanium-yttrium-oxygen-enriched nanoclusters (NCs) in the stir and thermally affected zones were studied and compared with the base metal. No statistical difference in the size of the NCs was observed in any of these conditions. After the PWHT, increases in the number density and the oxygen enrichment of the NCs were observed. These new results therefore provide additional supporting evidence that friction stir welding appears to be a viable joining technique for NFAs, as the microstructural parameters of the NCs are not strongly affected, in contrast to traditional welding techniques.