Sample records for traditional methods require

  1. Towards an Optimized Method of Olive Tree Crown Volume Measurement

    PubMed Central

    Miranda-Fuentes, Antonio; Llorens, Jordi; Gamarra-Diezma, Juan L.; Gil-Ribes, Jesús A.; Gil, Emilio

    2015-01-01

Accurate crown characterization of large isolated olive trees is vital for adjusting spray doses in three-dimensional crop agriculture. Among the many methodologies available, laser sensors have proved to be the most reliable and accurate. However, their operation is time-consuming and requires specialist knowledge, and so a simpler crown characterization method is required. To this end, three methods were evaluated and compared with LiDAR measurements to determine their accuracy: Vertical Crown Projected Area method (VCPA), Ellipsoid Volume method (VE) and Tree Silhouette Volume method (VTS). Trials were performed in three different kinds of olive tree plantations: intensive, adapted one-trunked traditional and traditional. In total, 55 trees were characterized. Results show that all three methods are appropriate to estimate the crown volume, reaching high coefficients of determination: R² = 0.783, 0.843 and 0.824 for VCPA, VE and VTS, respectively. However, discrepancies arise when evaluating tree plantations separately, especially for traditional trees. Here, correlations between LiDAR volume and other parameters revealed that the Mean Vector calculated for the VCPA method had the highest correlation for traditional trees; thus, its use in traditional plantations is highly recommended. PMID:25658396
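
    The abstract names the Ellipsoid Volume (VE) estimator without restating its formula. As a minimal sketch, the standard ellipsoid approximation used for manual crown volume estimates (not necessarily the exact variant in the paper) is shown below; the crown dimensions are hypothetical field measurements.

    ```python
    import math

    def ellipsoid_crown_volume(d1, d2, crown_height):
        """Approximate crown volume (m^3) as an ellipsoid from two perpendicular
        crown diameters and the crown height (all in metres):
        V = (pi/6) * d1 * d2 * h."""
        return math.pi / 6.0 * d1 * d2 * crown_height

    # Hypothetical measurements for one olive tree (metres)
    d1, d2, h = 5.2, 4.8, 3.5
    print(f"VE estimate: {ellipsoid_crown_volume(d1, d2, h):.1f} m^3")
    ```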

  2. Genetic improvement of olive (Olea europaea L.) by conventional and in vitro biotechnology methods.

    PubMed

    Rugini, E; Cristofori, V; Silvestri, C

    2016-01-01

    In olive (Olea europaea L.) traditional methods of genetic improvement have up to now produced limited results. Intensification of olive growing requires appropriate new cultivars for fully mechanized groves, but among the large number of the traditional varieties very few are suitable. High-density and super high-density hedge row orchards require genotypes with reduced size, reduced apical dominance, a semi-erect growth habit, easy to propagate, resistant to abiotic and biotic stresses, with reliably high productivity and quality of both fruits and oil. Innovative strategies supported by molecular and biotechnological techniques are required to speed up novel hybridisation methods. Among traditional approaches the Gene Pool Method seems a reasonable option, but it requires availability of widely diverse germplasm from both cultivated and wild genotypes, supported by a detailed knowledge of their genetic relationships. The practice of "gene therapy" for the most important existing cultivars, combined with conventional methods, could accelerate achievement of the main goals, but efforts to overcome some technical and ideological obstacles are needed. The present review describes the benefits that olive and its products may obtain from genetic improvement using state of the art of conventional and unconventional methods, and includes progress made in the field of in vitro techniques. The uses of both traditional and modern technologies are discussed with recommendations. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Use of traditional contraceptive methods in India & its socio-demographic determinants.

    PubMed

    Ram, Faujdar; Shekhar, Chander; Chowdhury, Biswabandita

    2014-11-01

The high use of traditional contraceptive methods may have health repercussions for both partners; a high failure rate and lack of protection from sexually transmitted diseases are some examples of these repercussions. The aim of this study was to understand the level, trends, pattern, volume and socio-demographic determinants of the use of traditional contraceptive methods in the Indian context. Percentages, per cent distribution, cross-tabulation and multinomial logistic regression analyses were carried out. The data from the three rounds of the National Family Health Survey (NFHS) were used. The unit-level data from the District Level Household Survey (2007-2008) were mainly used to carry out the analysis in this paper. Marriage rates for States and Union Territories (UTs) were projected for the period of 2001-2011 to estimate the volume of traditional contraceptive users. These rates are required to obtain the number of eligible couples as of 2011 in the respective State/UT. The latest round of the District Level Household Survey (2007-2008) revealed that 6.7 per cent of currently married women were using traditional contraceptive methods in India. More than half of the currently married women (56%) have ever used these methods. In terms of socio-demographic determinants, the odds ratios of using these methods were significantly higher for women aged 35 years and above, rural, Hindu, other than Scheduled Castes/Tribes (SCs/STs), secondary and above educated, non-poor, having two plus living children, and at least one surviving son in most of the states as well as at the national level. The northeastern region showed higher odds ratios (5 times) of women using traditional contraceptive methods than the southern region. A large number of currently married women have ever used the traditional contraceptive methods in India. On the basis of the findings from this study, the total number of women who were using traditional methods or who had an unmet need, and who would be required to use modern spacing methods of family planning to achieve the reproductive goals, is around 53 million. Women from a set of specific socio-demographic backgrounds are more likely to use these methods. A regional pattern has also emerged in the use of traditional contraceptive methods in India.

  4. TRADITIONAL PRACTICES ADOPTED BY JORDANIAN MOTHERS WHEN CARING FOR THEIR INFANTS IN RURAL AREAS

    PubMed Central

    Al-Sagarat, Ahmad Yahya; Al-Kharabsheh, Amani

    2017-01-01

Background: Traditional practices are commonly present within Jordanian society, especially those concerned with infant care. Some of these practices might be harmful, and thus health professionals are required to substitute these practices with safe and healthy ones. The goal of this study is to determine the traditional practices adopted by Jordanian mothers when caring for their infants in rural areas. Materials and Methods: A descriptive study design using a qualitative method was utilized in this study. A purposive sample of 30 mothers was recruited from four rural regions on the outskirts of Amman, the capital city of Jordan. Results: Mothers had traditional infant care practices pertinent to bathing of babies, including salting, swaddling, care of the umbilical cord and jaundice. Conclusion: Traditional practices are still common in Jordan; some of these behaviors can cause health risks. While the health consequences of some of the traditional practices are still not clear, health professionals, especially nurses, are required to intervene by changing policies and education. PMID:28331910

  5. Student Diversity Requires Different Approaches to College Teaching, Even in Math and Science.

    ERIC Educational Resources Information Center

    Nelson, Craig E.

    1996-01-01

    Asserts that traditional teaching methods are unintentionally biased towards the elite and against many non-traditional students. Outlines several easily accessible changes in teaching methods that have fostered dramatic changes in student performance with no change in standards. These approaches have proven effective even in the fields of…

  6. Online Estimation of Allan Variance Coefficients Based on a Neural-Extended Kalman Filter

    PubMed Central

    Miao, Zhiyong; Shen, Feng; Xu, Dingjie; He, Kunpeng; Tian, Chunmiao

    2015-01-01

    As a noise analysis method for inertial sensors, the traditional Allan variance method requires the storage of a large amount of data and manual analysis for an Allan variance graph. Although the existing online estimation methods avoid the storage of data and the painful procedure of drawing slope lines for estimation, they require complex transformations and even cause errors during the modeling of dynamic Allan variance. To solve these problems, first, a new state-space model that directly models the stochastic errors to obtain a nonlinear state-space model was established for inertial sensors. Then, a neural-extended Kalman filter algorithm was used to estimate the Allan variance coefficients. The real noises of an ADIS16405 IMU and fiber optic gyro-sensors were analyzed by the proposed method and traditional methods. The experimental results show that the proposed method is more suitable to estimate the Allan variance coefficients than the traditional methods. Moreover, the proposed method effectively avoids the storage of data and can be easily implemented using an online processor. PMID:25625903
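
    The abstract contrasts online estimation with the traditional Allan variance procedure, which requires the whole data record. As a hedged sketch of what that traditional computation looks like, the non-overlapping Allan variance of a uniformly sampled rate signal is shown below; the simulated noise, sampling rate and averaging times are illustrative only.

    ```python
    import numpy as np

    def allan_variance(y, fs, taus):
        """Non-overlapping Allan variance of rate samples y (e.g. gyro output)
        sampled at fs Hz, evaluated at each averaging time in taus (seconds):
        sigma^2(tau) = 0.5 * mean( (ybar_{k+1} - ybar_k)^2 )."""
        y = np.asarray(y, dtype=float)
        out = []
        for tau in taus:
            m = int(round(tau * fs))           # samples per cluster
            n_clusters = len(y) // m
            if n_clusters < 2:
                out.append(np.nan)
                continue
            ybar = y[: n_clusters * m].reshape(n_clusters, m).mean(axis=1)
            out.append(0.5 * np.mean(np.diff(ybar) ** 2))
        return np.array(out)

    # Illustrative data: white rate noise only (angle random walk term)
    fs = 100.0
    y = 0.01 * np.random.randn(200_000)
    taus = np.logspace(-1, 2, 20)
    adev = np.sqrt(allan_variance(y, fs, taus))
    print(adev)  # slope of -1/2 on a log-log plot indicates angle random walk
    ```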

  7. Achieving femoral artery hemostasis after cardiac catheterization: a comparison of methods.

    PubMed

    Schickel, S I; Adkisson, P; Miracle, V; Cronin, S N

    1999-11-01

    Cardiac catheterization is a common procedure that involves the introduction of a small sheath (5F-8F) into the femoral artery for insertion of other diagnostic catheters. After cardiac catheterization, local compression of the femoral artery is required to prevent bleeding and to achieve hemostasis. Traditional methods of achieving hemostasis require significant time and close supervision by medical personnel and can contribute to patients' discomfort. VasoSeal is a recently developed device that delivers absorbable collagen into the supra-arterial space to promote hemostasis. To compare outcomes between patients receiving a collagen plug and patients in whom a traditional method of achieving hemostasis was used after diagnostic cardiac catheterization. An outcomes tracking tool was used to analyze the medical records of 95 patients in whom a traditional method was used (traditional group) and 81 patients in whom VasoSeal was used (device group) to achieve hemostasis. Complications at the femoral access site, patients' satisfaction, and times to hemostasis, ambulation, and discharge were compared. Hematomas of 6-cm diameter occurred in 5.3% of the traditional group; no complications occurred in the device group. The device group also achieved hemostasis faster and had earlier ambulation (P < .001). Patients in the device group were discharged a mean of 5 hours sooner than patients in the traditional group (P < .05). No significant differences were found in patients' satisfaction. VasoSeal is a safe and effective method of achieving hemostasis after cardiac catheterization that can hasten time to hemostasis, ambulation, and discharge.

  8. Probabilistic Design of a Mars Sample Return Earth Entry Vehicle Thermal Protection System

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Mitcheltree, Robert A.

    2002-01-01

    The driving requirement for design of a Mars Sample Return mission is to assure containment of the returned samples. Designing to, and demonstrating compliance with, such a requirement requires physics based tools that establish the relationship between engineer's sizing margins and probabilities of failure. The traditional method of determining margins on ablative thermal protection systems, while conservative, provides little insight into the actual probability of an over-temperature during flight. The objective of this paper is to describe a new methodology for establishing margins on sizing the thermal protection system (TPS). Results of this Monte Carlo approach are compared with traditional methods.
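
    The abstract describes replacing fixed sizing margins with a Monte Carlo estimate of failure probability but gives no equations. The toy sketch below shows only the general idea: sample uncertain inputs, propagate them through a response model, and count over-temperature cases. The linear response model, distributions and allowable temperature are invented for illustration and are not the paper's physics-based TPS sizing tools.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Illustrative uncertain inputs (NOT the paper's models or values)
    heat_load    = rng.normal(1.0, 0.08, n)   # normalized aeroheating multiplier
    tps_thick    = rng.normal(1.0, 0.05, n)   # normalized ablator thickness
    conductivity = rng.normal(1.0, 0.10, n)   # normalized thermal conductivity

    # Toy bondline-temperature response: hotter with more load/conductivity,
    # cooler with more thickness (a stand-in for a physics-based TPS code)
    t_bondline = 200.0 + 60.0 * heat_load + 40.0 * conductivity - 80.0 * (tps_thick - 1.0)

    t_allowable = 320.0  # illustrative allowable bondline temperature (deg C)
    p_fail = np.mean(t_bondline > t_allowable)
    print(f"Estimated probability of over-temperature: {p_fail:.2e}")
    ```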

  9. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A Comparative Study on Electronic versus Traditional Data Collection in a Special Education Setting

    ERIC Educational Resources Information Center

    Ruf, Hernan Dennis

    2012-01-01

    The purpose of the current study was to determine the efficiency of an electronic data collection method compared to a traditional paper-based method in the educational field, in terms of the accuracy of data collected and the time required to do it. In addition, data were collected to assess users' preference and system usability. The study…

  11. Addressing the Barriers to Agile Development in the Department of Defense: Program Structure, Requirements, and Contracting

    DTIC Science & Technology

    2015-04-30

approach directly contrasts with the traditional DoD acquisition model designed for a single big-bang waterfall approach (Broadus, 2013). Currently...progress, reduce technical and programmatic risk, and respond to feedback and changes more quickly than traditional waterfall methods (Modigliani...requirements, and contracting. The DoD can address these barriers by utilizing a proactively tailored Agile acquisition model, implementing an IT Box

  12. New Laboratory Methods for Characterizing the Immersion Factors for Irradiance

    NASA Technical Reports Server (NTRS)

Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Zibordi, Giuseppe; D'Alimonte, Davide; van der Linde, Dirk; Brown, James W.

    2003-01-01

The experimental determination of the immersion factor, I_f(λ), of irradiance collectors is a requirement of any in-water radiometer. The eighth SeaWiFS Intercalibration Round-Robin Experiment (SIRREX-8) showed different implementations, at different laboratories, of the same I_f(λ) measurement protocol. The different implementations make use of different setups, volumes, and water types. Consequently, they exhibit different accuracies and require different execution times for characterizing an irradiance sensor. In view of standardizing the characterization of I_f(λ) values for in-water radiometers, together with an increase in the accuracy of methods and a decrease in the execution time, alternative methods are presented, and assessed versus the traditional method. The proposed new laboratory methods include: a) the continuous method, in which optical measurements taken at discrete water depths are substituted by continuous profiles created by removing the water from the water vessel at a constant flow rate (which significantly reduces the time required for the characterization of a single radiometer); and b) the Compact Portable Advanced Characterization Tank (ComPACT) method, in which the commonly used large tanks are replaced by a small water vessel, thereby allowing the determination of I_f(λ) values with a small water volume, and more importantly, permitting I_f(λ) characterizations with pure water. Intercomparisons between the continuous and the traditional method showed results within the variance of I_f(λ) determinations. The use of the continuous method, however, showed a much shorter realization time. Intercomparisons between the ComPACT and the traditional method showed generally higher I_f(λ) values for the former. This is in agreement with the generalized expectations of a reduction in scattering effects, because of the use of pure water with the ComPACT method versus the use of tap water with the traditional method.
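
    The abstract does not restate the arithmetic behind an I_f(λ) determination. As a rough sketch (not the SIRREX-8 protocol itself), the immersion factor can be obtained by extrapolating the in-water sensor signal to null depth and taking the ratio with the in-air signal; the exponential-attenuation assumption, signal values and depths below are illustrative.

    ```python
    import numpy as np

    def immersion_factor(e_air, depths_m, e_water):
        """Estimate I_f = E_air / E_w(0-), where E_w(0-) is the in-water signal
        extrapolated to null depth from readings at several depths, assuming
        exponential attenuation (ln E linear in depth)."""
        depths_m = np.asarray(depths_m, float)
        e_water = np.asarray(e_water, float)
        slope, intercept = np.polyfit(depths_m, np.log(e_water), 1)
        e_w0 = np.exp(intercept)   # signal extrapolated to just below the surface
        return e_air / e_w0

    # Hypothetical readings (arbitrary digital counts)
    e_air = 1000.0
    depths = [0.02, 0.04, 0.06, 0.08, 0.10]          # metres
    signals = [720.0, 705.0, 690.0, 676.0, 662.0]    # decaying with depth
    print(f"I_f ~ {immersion_factor(e_air, depths, signals):.3f}")
    ```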

  13. A novel asynchronous access method with binary interfaces

    PubMed Central

    2008-01-01

    Background Traditionally synchronous access strategies require users to comply with one or more time constraints in order to communicate intent with a binary human-machine interface (e.g., mechanical, gestural or neural switches). Asynchronous access methods are preferable, but have not been used with binary interfaces in the control of devices that require more than two commands to be successfully operated. Methods We present the mathematical development and evaluation of a novel asynchronous access method that may be used to translate sporadic activations of binary interfaces into distinct outcomes for the control of devices requiring an arbitrary number of commands to be controlled. With this method, users are required to activate their interfaces only when the device under control behaves erroneously. Then, a recursive algorithm, incorporating contextual assumptions relevant to all possible outcomes, is used to obtain an informed estimate of user intention. We evaluate this method by simulating a control task requiring a series of target commands to be tracked by a model user. Results When compared to a random selection, the proposed asynchronous access method offers a significant reduction in the number of interface activations required from the user. Conclusion This novel access method offers a variety of advantages over traditionally synchronous access strategies and may be adapted to a wide variety of contexts, with primary relevance to applications involving direct object manipulation. PMID:18959797

  14. Comparison of traditional gas chromatography (GC), headspace GC, and the microbial identification library GC system for the identification of Clostridium difficile.

    PubMed Central

    Cundy, K V; Willard, K E; Valeri, L J; Shanholtzer, C J; Singh, J; Peterson, L R

    1991-01-01

    Three gas chromatography (GC) methods were compared for the identification of 52 clinical Clostridium difficile isolates, as well as 17 non-C. difficile Clostridium isolates. Headspace GC and Microbial Identification System (MIS) GC, an automated system which utilizes a software library developed at the Virginia Polytechnic Institute to identify organisms based on the fatty acids extracted from the bacterial cell wall, were compared against the reference method of traditional GC. Headspace GC and MIS were of approximately equivalent accuracy in identifying the 52 C. difficile isolates (52 of 52 versus 51 of 52, respectively). However, 7 of 52 organisms required repeated sample preparation before an identification was achieved by the MIS method. Both systems effectively differentiated C. difficile from non-C. difficile clostridia, although the MIS method correctly identified only 9 of 17. We conclude that the headspace GC system is an accurate method of C. difficile identification, which requires only one-fifth of the sample preparation time of MIS GC and one-half of the sample preparation time of traditional GC. PMID:2007632

  15. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    USDA-ARS?s Scientific Manuscript database

    Recently, an instrument (TEMPOTM) has been developed to automate the Most Probable Number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique to traditional microbiological plating methods or PetrifilmTM for estimating the t...

  16. Method of electric powertrain matching for battery-powered electric cars

    NASA Astrophysics Data System (ADS)

    Ning, Guobao; Xiong, Lu; Zhang, Lijun; Yu, Zhuoping

    2013-05-01

The current method of matching an electric powertrain still relies on longitudinal dynamics alone, which neither realizes the maximum capacity of the on-board energy storage unit nor reaches the lowest equivalent fuel consumption. Another match method focuses on improving the available space, considering a reasonable vehicle layout, to enlarge the rated energy capacity of the on-board energy storage unit; this keeps the longitudinal dynamic performance almost unchanged but still cannot reach the lowest fuel consumption. Considering the characteristics of the driving motor, a method of electric powertrain matching that uses conventional longitudinal dynamics for the driving system and a cut-and-try method for the energy storage system is proposed for passenger cars converted from traditional ones. By combining the utilization of vehicle space (which determines the on-board energy amount), vehicle longitudinal performance requirements, vehicle equivalent fuel consumption level, passive safety requirements and the maximum driving range requirement, a comprehensive optimal match method of the electric powertrain for battery-powered electric vehicles is proposed. In simulation, the vehicle model and match method are built in Matlab/Simulink, and the Environmental Protection Agency (EPA) Urban Dynamometer Driving Schedule (UDDS) is chosen as the test condition. The simulation results show that regenerative energy increases by 2.62% and energy storage efficiency by 2% relative to the traditional method. These conclusions provide theoretical and practical solutions for electric powertrain matching for modern battery-powered electric vehicles, especially those converted from traditional ones, and further enhance the dynamics of electric vehicles.
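
    The "conventional longitudinal dynamics" sizing check the abstract refers to is typically a road-load calculation: the motor must supply enough tractive power at a target top speed and gradeability. The sketch below shows that generic calculation under assumed (not the paper's) vehicle parameters.

    ```python
    import math

    def required_motor_power(mass_kg, v_kmh, grade_pct=0.0,
                             c_rr=0.012, cd=0.30, frontal_area_m2=2.2,
                             air_density=1.2, driveline_eff=0.92):
        """Road-load power (kW) at steady speed v on a grade:
        P = (rolling + grade + aerodynamic resistance) * v / driveline efficiency."""
        g = 9.81
        v = v_kmh / 3.6
        theta = math.atan(grade_pct / 100.0)
        f_roll = mass_kg * g * c_rr * math.cos(theta)
        f_grade = mass_kg * g * math.sin(theta)
        f_aero = 0.5 * air_density * cd * frontal_area_m2 * v ** 2
        return (f_roll + f_grade + f_aero) * v / driveline_eff / 1000.0

    # Illustrative checks for a converted passenger car (all values assumed)
    print(f"Top speed (150 km/h, flat): {required_motor_power(1600, 150):.1f} kW")
    print(f"Gradeability (30 km/h, 20% grade): {required_motor_power(1600, 30, 20):.1f} kW")
    ```

    The rated motor power would then be chosen to cover the larger of such requirements, while the battery sizing (the cut-and-try part) trades pack energy against available space and range.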

  17. A Modified Magnetic Gradient Contraction Based Method for Ferromagnetic Target Localization

    PubMed Central

    Wang, Chen; Zhang, Xiaojuan; Qu, Xiaodong; Pan, Xiao; Fang, Guangyou; Chen, Luzhao

    2016-01-01

    The Scalar Triangulation and Ranging (STAR) method, which is based upon the unique properties of magnetic gradient contraction, is a high real-time ferromagnetic target localization method. Only one measurement point is required in the STAR method and it is not sensitive to changes in sensing platform orientation. However, the localization accuracy of the method is limited by the asphericity errors and the inaccurate value of position leads to larger errors in the estimation of magnetic moment. To improve the localization accuracy, a modified STAR method is proposed. In the proposed method, the asphericity errors of the traditional STAR method are compensated with an iterative algorithm. The proposed method has a fast convergence rate which meets the requirement of high real-time localization. Simulations and field experiments have been done to evaluate the performance of the proposed method. The results indicate that target parameters estimated by the modified STAR method are more accurate than the traditional STAR method. PMID:27999322

  18. Sociometric Indicators of Leadership: An Exploratory Analysis

    DTIC Science & Technology

    2018-01-01

streamline existing observational protocols and assessment methods. This research provides an initial test of sociometric badges in the context of the U.S...understand, the requirements of the mission. Traditional research and assessment methods focusing on leader and follower interactions require direct...based methods of social network analysis. Novel Measures of Leadership Building on these findings and earlier research, it is apparent that

  19. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  20. Sediment-generated noise (SGN): Comparison with physical bedload measurements in a small semi-arid watershed

    USDA-ARS?s Scientific Manuscript database

    Passive acoustic techniques for the measurement of Sediment-Generated Noise (SGN) in gravel-bed rivers present a promising alternative to traditional bedload measurement techniques. Where traditional methods are often prohibitively costly, particularly in labor requirements, and produce point-scale ...

  1. Oxygen production using solid-state zirconia electrolyte technology

    NASA Technical Reports Server (NTRS)

    Suitor, Jerry W.; Clark, Douglas J.

    1991-01-01

    High purity oxygen is required for a number of scientific, medical, and industrial applications. Traditionally, these needs have been met by cryogenic distillation or pressure swing adsorption systems designed to separate oxygen from air. Oxygen separation from air via solid-state zirconia electrolyte technology offers an alternative to these methods. The technology has several advantages over the traditional methods, including reliability, compactness, quiet operation, high purity output, and low power consumption.
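
    The abstract notes that zirconia electrolyte cells separate oxygen electrochemically. A quick Faraday's-law calculation (not taken from the paper) shows how cell current maps to oxygen output, since each O2 molecule transported through the electrolyte carries four electrons; the stack size and current are illustrative.

    ```python
    # Ideal oxygen production of a zirconia electrolyte stack from Faraday's law.
    FARADAY = 96485.0          # C per mol of electrons
    MOLAR_VOLUME_STP = 22.414  # L per mol at 0 degC, 1 atm

    def o2_production_lph(current_a, n_cells=1, current_efficiency=1.0):
        """Oxygen output in litres (STP) per hour for n_cells each carrying
        current_a amperes; 4 electrons are required per O2 molecule."""
        mol_o2_per_s = current_efficiency * n_cells * current_a / (4.0 * FARADAY)
        return mol_o2_per_s * MOLAR_VOLUME_STP * 3600.0

    # Illustrative: a 50-cell stack at 2 A per cell
    print(f"{o2_production_lph(2.0, n_cells=50):.1f} L/h of O2")
    ```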

  2. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

Formal Verification of Complex Systems based on SysML Functional Requirements Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou...requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements...methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems Traditional methods and tools

  3. A methodology for highway asset valuation in Indiana.

    DOT National Transportation Integrated Search

    2012-11-01

    The Government Accounting Standards Board (GASB) requires transportation agencies to report the values of their tangible assets. : Numerous valuation methods exist which use different underlying concepts and data items. These traditional methods have...

  4. A Conflict of Cultures: Planning vs. Tradition in Public Libraries.

    ERIC Educational Resources Information Center

    Raber, Douglas

    1995-01-01

    Strategic planning for public libraries as advocated by the PLA (Public Library Association) in the Public Library Development Program is likely to be met with resistance due to changes it requires in traditional public library planning and services. Conflicts that may arise are discussed, as well as methods for preventing, minimizing, and…

  5. Teaching Bovine Abdominal Anatomy: Use of a Haptic Simulator

    ERIC Educational Resources Information Center

    Kinnison, Tierney; Forrest, Neil David; Frean, Stephen Philip; Baillie, Sarah

    2009-01-01

    Traditional methods of teaching anatomy to undergraduate medical and veterinary students are being challenged and need to adapt to modern concerns and requirements. There is a move away from the use of cadavers to new technologies as a way of complementing the traditional approaches and addressing resource and ethical problems. Haptic (touch)…

  6. [Essential procedure and key methods for survey of traditional knowledge related to Chinese materia medica resources].

    PubMed

    Cheng, Gong; Huang, Lu-qi; Xue, Da-yuan; Zhang, Xiao-bo

    2014-12-01

The survey of traditional knowledge related to Chinese materia medica resources is an important component and one of the innovative aspects of the fourth national survey of Chinese materia medica resources. China has rich traditional knowledge of traditional Chinese medicine (TCM), and the comprehensive investigation of TCM traditional knowledge aims to promote conservation and sustainable use of Chinese materia medica resources. Building upon the field work of pilot investigations, this paper introduces the essential procedures and key methods for conducting the survey of traditional knowledge related to Chinese materia medica resources. The essential procedures are as follows. First is the preparation phase. It is important to review all relevant literature and provide training to the survey teams so that they have a clear understanding of the concept of traditional knowledge and master key survey methods. Second is the field investigation phase. When conducting field investigations, survey teams should identify traditional knowledge holders by using the 'snowball method' and record the traditional knowledge after obtaining prior informed consent from the traditional knowledge holders. Researchers should fill out the survey forms provided by the Technical Specification of the Fourth National Survey of Chinese Materia Medica Resources. Researchers should pay particular attention to the scope of traditional knowledge and the method of inheriting the knowledge, which are the key information for traditional knowledge holders and potential users to reach mutually agreed terms and achieve benefit sharing. Third is the data compilation and analysis phase. Researchers should try to compile and edit the TCM traditional knowledge in accordance with intellectual property rights requirements so that the information collected through the national survey can serve as the basic data for the TCM traditional knowledge database. The key methods of the survey include regional division of Chinese materia medica resources, interviews with key information holders and standardization of information. In particular, using the 'snowball method' can effectively identify traditional knowledge holders in the targeted regions, and ensuring that traditional knowledge holders give prior informed consent before sharing information with researchers makes sure that their rights are protected. Employing the right survey methods is not only the key to obtaining traditional knowledge related to Chinese materia medica resources, but also the pathway to fulfilling the objectives of access and benefit sharing stipulated in the Convention on Biological Diversity. It will promote the legal protection of TCM traditional knowledge and the conservation of TCM intangible cultural heritage.

  7. Evaluation of a Mobile Application for Multiplier Method Growth and Epiphysiodesis Timing Predictions.

    PubMed

    Wagner, Pablo; Standard, Shawn C; Herzenberg, John E

    The multiplier method (MM) is frequently used to predict limb-length discrepancy and timing of epiphysiodesis. The traditional MM uses complex formulae and requires a calculator. A mobile application was developed in an attempt to simplify and streamline these calculations. We compared the accuracy and speed of using the traditional pencil and paper technique with that using the Multiplier App (MA). After attending a training lecture and a hands-on workshop on the MM and MA, 30 resident surgeons were asked to apply the traditional MM and the MA at different weeks of their rotations. They were randomized as to the method they applied first. Subjects performed calculations for 5 clinical exercises that involved congenital and developmental limb-length discrepancies and timing of epiphysiodesis. The amount of time required to complete the exercises and the accuracy of the answers were evaluated for each subject. The test subjects answered 60% of the questions correctly using the traditional MM and 80% of the questions correctly using the MA (P=0.001). The average amount of time to complete the 5 exercises with the MM and MA was 22 and 8 minutes, respectively (P<0.0001). Several reports state that the traditional MM is quick and easy to use. Nevertheless, even in the most experienced hands, performing the calculations in clinical practice can be time-consuming. Errors may result from choosing the wrong formulae and from performing the calculations by hand. Our data show that the MA is simpler, more accurate, and faster than the traditional MM from a practical standpoint. Level II.
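
    The abstract does not restate the multiplier arithmetic that both the pencil-and-paper method and the app automate. As a rough sketch of the standard Paley-style relations, predicted length at skeletal maturity is the current length times an age- and sex-dependent multiplier, and a congenital discrepancy scales by the same multiplier. The multiplier value below is made up for illustration and is not taken from the published tables.

    ```python
    def predicted_length_at_maturity(current_length_cm, multiplier):
        """Multiplier-method prediction: L_maturity ~= L_current * M,
        where M comes from published age/sex multiplier tables."""
        return current_length_cm * multiplier

    def congenital_lld_at_maturity(current_discrepancy_cm, multiplier):
        """For congenital discrepancies the percentage inhibition is constant,
        so the discrepancy scales by the same multiplier."""
        return current_discrepancy_cm * multiplier

    # Illustrative only -- 1.43 is a made-up multiplier, not a table value
    m = 1.43
    print(predicted_length_at_maturity(32.0, m))  # femur length at maturity (cm)
    print(congenital_lld_at_maturity(2.5, m))     # discrepancy at maturity (cm)
    ```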

  8. Comprehensive reliability allocation method for CNC lathes based on cubic transformed functions of failure mode and effects analysis

    NASA Astrophysics Data System (ADS)

    Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin

    2015-03-01

Reliability allocation of computerized numerical control (CNC) lathes is very important in industry. Traditional allocation methods only focus on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. Aiming at solving the problem of CNC lathe reliability allocation, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponential transformed FMEA method are investigated. Subsequently, a cubic transformed function is established in order to overcome these limitations. Properties of the new transformed functions are discussed by considering the failure severity and the failure occurrence. Designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as an example to verify the new allocation method. Seven criteria are considered to compare the results of the new method with traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.

  9. The Graduate Program in Pharmacology at the Ohio State University College of Pharmacy

    ERIC Educational Resources Information Center

    Burkman, Allan M.

    1976-01-01

    Ohio State's traditional graduate program is discussed in terms of student requirements, including competence in research strategy and experimental design, manipulative technique, and oral and written communication. Methods for meeting these requirements are reviewed briefly. (LBH)

  10. Determination of Carbonyl Groups in Pyrolysis Bio-oils Using Potentiometric Titration: Review and Comparison of Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Black, Stuart; Ferrell, Jack R.

Carbonyl compounds present in bio-oils are known to be responsible for bio-oil property changes upon storage and during upgrading. As such, carbonyl content has previously been used as a method of tracking bio-oil aging and condensation reactions with less variability than viscosity measurements. Given the importance of carbonyls in bio-oils, accurate analytical methods for their quantification are very important for the bio-oil community. Potentiometric titration methods based on carbonyl oximation have long been used for the determination of carbonyl content in pyrolysis bio-oils. Here in this study, we present a modification of the traditional carbonyl oximation procedures that results in less reaction time, smaller sample size, higher precision, and more accurate carbonyl determinations. Some compounds such as carbohydrates are not measured by the traditional method (modified Nicolaides method), resulting in low estimations of the carbonyl content. Furthermore, we have shown that reaction completion for the traditional method can take up to 300 hours. The new method presented here (the modified Faix method) reduces the reaction time to 2 hours, uses triethanolamine (TEA) in the place of pyridine, and requires a smaller sample size for the analysis. Carbonyl contents determined using this new method are consistently higher than when using the traditional titration methods.
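
    As a hedged sketch of the arithmetic that underlies oximation-based titrations in general (not the specific modified Faix procedure), carbonyl content follows from the difference between sample and blank titrant volumes, the titrant concentration, and the sample mass, because each carbonyl group that forms an oxime with hydroxylamine hydrochloride liberates one equivalent of acid. All numbers below are hypothetical.

    ```python
    def carbonyl_content_mol_per_kg(v_sample_ml, v_blank_ml, titrant_molarity, sample_mass_g):
        """Carbonyl content from an oximation titration: each reacting carbonyl
        liberates one equivalent of HCl, so
        mol carbonyl = (V_sample - V_blank) * c_titrant."""
        mol_carbonyl = (v_sample_ml - v_blank_ml) / 1000.0 * titrant_molarity
        return mol_carbonyl / (sample_mass_g / 1000.0)

    # Hypothetical numbers, for illustration only
    print(f"{carbonyl_content_mol_per_kg(12.4, 2.1, 0.5, 1.0):.2f} mol/kg")
    ```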

  11. Determination of Carbonyl Groups in Pyrolysis Bio-oils Using Potentiometric Titration: Review and Comparison of Methods

    DOE PAGES

    Black, Stuart; Ferrell, Jack R.

    2016-01-06

Carbonyl compounds present in bio-oils are known to be responsible for bio-oil property changes upon storage and during upgrading. As such, carbonyl content has previously been used as a method of tracking bio-oil aging and condensation reactions with less variability than viscosity measurements. Given the importance of carbonyls in bio-oils, accurate analytical methods for their quantification are very important for the bio-oil community. Potentiometric titration methods based on carbonyl oximation have long been used for the determination of carbonyl content in pyrolysis bio-oils. Here in this study, we present a modification of the traditional carbonyl oximation procedures that results in less reaction time, smaller sample size, higher precision, and more accurate carbonyl determinations. Some compounds such as carbohydrates are not measured by the traditional method (modified Nicolaides method), resulting in low estimations of the carbonyl content. Furthermore, we have shown that reaction completion for the traditional method can take up to 300 hours. The new method presented here (the modified Faix method) reduces the reaction time to 2 hours, uses triethanolamine (TEA) in the place of pyridine, and requires a smaller sample size for the analysis. Carbonyl contents determined using this new method are consistently higher than when using the traditional titration methods.

  12. Aerobic conditioning for team sport athletes.

    PubMed

    Stone, Nicholas M; Kilding, Andrew E

    2009-01-01

    Team sport athletes require a high level of aerobic fitness in order to generate and maintain power output during repeated high-intensity efforts and to recover. Research to date suggests that these components can be increased by regularly performing aerobic conditioning. Traditional aerobic conditioning, with minimal changes of direction and no skill component, has been demonstrated to effectively increase aerobic function within a 4- to 10-week period in team sport players. More importantly, traditional aerobic conditioning methods have been shown to increase team sport performance substantially. Many team sports require the upkeep of both aerobic fitness and sport-specific skills during a lengthy competitive season. Classic team sport trainings have been shown to evoke marginal increases/decreases in aerobic fitness. In recent years, aerobic conditioning methods have been designed to allow adequate intensities to be achieved to induce improvements in aerobic fitness whilst incorporating movement-specific and skill-specific tasks, e.g. small-sided games and dribbling circuits. Such 'sport-specific' conditioning methods have been demonstrated to promote increases in aerobic fitness, though careful consideration of player skill levels, current fitness, player numbers, field dimensions, game rules and availability of player encouragement is required. Whilst different conditioning methods appear equivalent in their ability to improve fitness, whether sport-specific conditioning is superior to other methods at improving actual game performance statistics requires further research.

  13. Assessing Smart Phones for Generating Life-space Indicators.

    PubMed

    Wan, Neng; Qu, Wenyu; Whittington, Jackie; Witbrodt, Bradley C; Henderson, Mary Pearl; Goulding, Evan H; Schenk, A Katrin; Bonasera, Stephen J; Lin, Ge

    2013-04-01

Life-space is a promising method for estimating older adults' functional status. However, traditional life-space measures are costly and time-consuming because they often rely on active subject participation. This study assesses the feasibility of using the global positioning system (GPS) function of smart phones to generate life-space indicators. We first evaluated the location accuracy of smart phone-collected GPS points versus those acquired by a commercial GPS unit. We then assessed the specificity of the life-space information derived from the smart phone against the traditional diary method. Our results suggested comparable location accuracy between the smart phone and the standard GPS unit in most outdoor situations. In addition, the smart phone method revealed more comprehensive life-space information than the diary method, which leads to higher and more consistent life-space scores. We conclude that the smart phone method is more reliable than traditional methods for measuring life-space. Further improvements will be required to develop a robust application of this method that is suitable for health-related practices.
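
    The abstract does not define which life-space indicators were derived from the GPS traces. As a minimal, hypothetical example of one common indicator (the maximum daily distance from home, computed with the standard haversine formula), the sketch below shows how raw smart-phone fixes can be reduced to a life-space measure; the coordinates are invented.

    ```python
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two lat/lon points."""
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def max_distance_from_home_km(home, fixes):
        """One simple life-space indicator: the farthest GPS fix from home."""
        return max(haversine_km(home[0], home[1], lat, lon) for lat, lon in fixes)

    # Hypothetical home location and a day's worth of GPS fixes
    home = (41.2565, -95.9345)
    fixes = [(41.2570, -95.9350), (41.2650, -95.9500), (41.3000, -96.0200)]
    print(f"{max_distance_from_home_km(home, fixes):.2f} km")
    ```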

  14. Non-codified traditional medicine practices from Belgaum Region in Southern India: present scenario

    PubMed Central

    2014-01-01

Background Traditional medicine in India can be classified into codified (Ayurveda, Unani, Siddha, Homeopathy) and non-codified (folk medicine) systems. Both systems contribute equally to primary healthcare in India. The present study aims to understand the current scenario of medicinal practices of the non-codified system of traditional medicine in the Belgaum region, India. Methods The study was conducted as a basic survey of identified non-codified traditional practitioners selected by convenience sampling, with semi-structured, open-ended interviews and discussions. Data on the learning process, disease diagnosis, treatment, remuneration, sharing of knowledge and socio-demographics were collected, analysed and discussed. Results One hundred and forty traditional practitioners were identified and interviewed for the present study. These practitioners are locally known as “Vaidya”. The study revealed that the non-codified healthcare tradition is practiced mainly by elderly persons in the age group of 61 years and above (40%). 73% of the practitioners learnt the tradition from their forefathers, and 19% of practitioners developed their own practices through experimentation, reading and learning. 20% of the practitioners follow the distinctive “Nadi Pariksha” (pulse examination) for disease diagnosis, while others rely on bodily symptoms and complaints. 29% of the traditional practitioners do not charge anything, while 59% of practitioners receive money as remuneration. Plant and animal materials are used as sources of medicines, with a variety of preparation methods. The preference ranking test revealed that higher education and migration from villages are the main reasons for decreasing interest amongst the younger generation, while deforestation emerged as the main cause of medicinal plant depletion. Conclusion Patrilineal transfer of the knowledge to the younger generation was observed in the Belgaum region. The observed resemblance in disease diagnosis, plant collection and processing between the non-codified traditional system of medicine and Ayurveda requires further methodical studies to establish the relationship between the two on a more objective basis. However, the practice appears to be at a crossroads, with a threat of extinction, because of non-inheritance of the knowledge and non-availability of medicinal plants. Hence, conservation strategies for both knowledge and resources at societal, scientific and legislative levels are urgently required to preserve the traditional wisdom. PMID:24934868

  15. Computer game-based and traditional learning method: a comparison regarding students’ knowledge retention

    PubMed Central

    2013-01-01

Background Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a need to change traditional passive learning methodologies to an active multisensory experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. Methods Students were randomized to participate in one of the learning methods and the data analyst was blinded to which method of learning the students had received. Students' prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple choice questionnaire. Students' performance was compared across the three moments of assessment, both for the mean total score and for separate mean scores for Anatomy questions and for Physiology questions. Results Students that received the game-based method performed better in the post-test assessment only when considering the Anatomy questions section. Students that received the traditional lecture performed better in both the post-test and the long-term post-test when considering the Anatomy and Physiology questions. Conclusions The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective to improve students' short and long-term knowledge retention. PMID:23442203

  16. Optimal methodologies for terahertz time-domain spectroscopic analysis of traditional pigments in powder form

    NASA Astrophysics Data System (ADS)

    Ha, Taewoo; Lee, Howon; Sim, Kyung Ik; Kim, Jonghyeon; Jo, Young Chan; Kim, Jae Hoon; Baek, Na Yeon; Kang, Dai-ill; Lee, Han Hyoung

    2017-05-01

    We have established optimal methods for terahertz time-domain spectroscopic analysis of highly absorbing pigments in powder form based on our investigation of representative traditional Chinese pigments, such as azurite [blue-based color pigment], Chinese vermilion [red-based color pigment], and arsenic yellow [yellow-based color pigment]. To accurately extract the optical constants in the terahertz region of 0.1 - 3 THz, we carried out transmission measurements in such a way that intense absorption peaks did not completely suppress the transmission level. This required preparation of pellet samples with optimized thicknesses and material densities. In some cases, mixing the pigments with polyethylene powder was required to minimize absorption due to certain peak features. The resulting distortion-free terahertz spectra of the investigated set of pigment species exhibited well-defined unique spectral fingerprints. Our study will be useful to future efforts to establish non-destructive analysis methods of traditional pigments, to construct their spectral databases, and to apply these tools to restoration of cultural heritage materials.
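
    The abstract refers to extracting optical constants from terahertz transmission spectra without giving the relations. Under the standard thick-slab approximation for a sample in air (not necessarily the exact analysis the authors used), the refractive index follows from the phase delay of the sample pulse relative to the reference, and the absorption coefficient from the amplitude ratio after a Fresnel-loss correction. The single-frequency check values below are illustrative.

    ```python
    import numpy as np

    C = 299792458.0  # speed of light, m/s

    def thz_optical_constants(freq_hz, t_complex, thickness_m):
        """Standard thick-slab THz-TDS extraction for a sample in air:
          n(w)     = 1 + phase(T) * c / (w * d)
          alpha(w) = -(2/d) * ln( |T| * (n+1)^2 / (4n) )
        where T(w) = E_sample(w) / E_reference(w)."""
        w = 2.0 * np.pi * np.asarray(freq_hz, float)
        phase = np.unwrap(np.angle(t_complex))
        n = 1.0 + phase * C / (w * thickness_m)
        alpha = -(2.0 / thickness_m) * np.log(np.abs(t_complex) * (n + 1.0) ** 2 / (4.0 * n))
        return n, alpha  # refractive index and absorption coefficient (1/m)

    # Hypothetical single-frequency check: 1 THz, 1 mm pellet, |T| = 0.5, phase = 12 rad
    n, a = thz_optical_constants([1.0e12], [0.5 * np.exp(1j * 12.0)], 1.0e-3)
    print(n, a)
    ```

    Keeping the transmission level measurable (as the pellet thickness and dilution optimization in the abstract aim to do) is what keeps the logarithm in the absorption formula well conditioned.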

  17. Economics of hardwood silviculture using skyline and conventional logging

    Treesearch

    John E. Baumgras; Gary W. Miller; Chris B. LeDoux

    1995-01-01

    Managing Appalachian hardwood forests to satisfy the growing and diverse demands on this resource will require alternatives to traditional silvicultural methods and harvesting systems. Determining the relative economic efficiency of these alternative methods and systems with respect to harvest cash flows is essential. The effects of silvicultural methods and roundwood...

  18. Traditional lecture versus jigsaw learning method for teaching Medication Therapy Management (MTM) core elements.

    PubMed

    Wilson, Jennifer A; Pegram, Angela H; Battise, Dawn M; Robinson, April M

    2017-11-01

    To determine if traditional didactic lecture or the jigsaw learning method is more effective to teach the medication therapy management (MTM) core elements in a first year pharmacy course. Traditional didactic lecture and a pre-class reading assignment were used in the fall semester cohort, and the jigsaw method was used in the spring semester cohort. Jigsaw is a cooperative learning strategy requiring students to assume responsibility for learning, and subsequently teaching peers. The students were responsible for reading specific sections of the pre-class reading, and then teaching other students in small groups about their specific reading assignments. To assess potential differences, identical pre- and post-tests were administered before and after the MTM section. Additionally, grade performance on an in-class project and final exam questions were compared, and students were surveyed on perceptions of teaching method used. A total of 45 and 43 students completed both the pre- and post-test in the fall and spring (96% and 93% response rate), respectively. Improvement in post-test scores favored the traditional method (p = 0.001). No statistical differences were noted between groups with grade performance on the in-class project and final exam questions. However, students favored the jigsaw method over traditional lecture and perceived improvements in problem solving skills, listening/communication skills and encouragement of cooperative learning (p = 0.018, 0.025 and 0.031). Although students favored the jigsaw learning method, traditional didactic lecture was more effective for the pre- and post-knowledge test performance. This may indicate that traditional didactic lecture is more effective for more foundational content. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Computer game-based and traditional learning method: a comparison regarding students' knowledge retention.

    PubMed

    Rondon, Silmara; Sassi, Fernanda Chiarion; Furquim de Andrade, Claudia Regina

    2013-02-25

Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a need to change traditional passive learning methodologies to an active multisensory experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. Students were randomized to participate in one of the learning methods and the data analyst was blinded to which method of learning the students had received. Students' prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple choice questionnaire. Students' performance was compared across the three moments of assessment, both for the mean total score and for separate mean scores for Anatomy questions and for Physiology questions. Students that received the game-based method performed better in the post-test assessment only when considering the Anatomy questions section. Students that received the traditional lecture performed better in both the post-test and the long-term post-test when considering the Anatomy and Physiology questions. The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective to improve students' short and long-term knowledge retention.

  20. Digital Assist: A Comparison of Two Note-Taking Methods (Traditional vs. Digital Pen) for Students with Emotional Behavioral Disorders

    ERIC Educational Resources Information Center

    Rody, Carlotta A.

    2013-01-01

    High school biology classes traditionally follow a lecture format to disseminate content and new terminology. With the inclusive practices of No Child Left Behind, the Common Core State Standards, and end-of-course exam requirement for high school diplomas, classes include a large range of achievement levels and abilities. Teachers assume, often…

Automated Cellient™ cytoblocks: better, stronger, faster?

    PubMed

    Prendeville, S; Brosnan, T; Browne, T J; McCarthy, J

    2014-12-01

Cytoblocks (CBs), or cell blocks, provide additional morphological detail and a platform for immunocytochemistry (ICC) in cytopathology. The Cellient™ system produces CBs in 45 minutes using methanol fixation, compared with traditional CBs, which require overnight formalin fixation. This study compares Cellient and traditional CB methods in terms of cellularity, morphology and immunoreactivity, evaluates the potential to add formalin fixation to the Cellient method for ICC studies and determines the optimal sectioning depth for maximal cellularity in Cellient CBs. One hundred and sixty CBs were prepared from 40 cytology samples (32 malignant, eight benign) using four processing methods: (A) traditional; (B) Cellient (methanol fixation); (C) Cellient using additional formalin fixation for 30 minutes; (D) Cellient using additional formalin fixation for 60 minutes. Haematoxylin and eosin-stained sections were assessed for cellularity and morphology. ICC was assessed on 14 cases with a panel of antibodies. Three additional Cellient samples were serially sectioned to determine the optimal sectioning depth. Scoring was performed by two independent, blinded reviewers. For malignant cases, morphology was superior with Cellient relative to traditional CBs (P < 0.001). Cellularity was comparable across all methods. ICC was excellent in all groups and the addition of formalin at any stage during the Cellient process did not influence the staining quality. Serial sectioning through Cellient CBs showed optimum cellularity at 30-40 μm with at least 27 sections obtainable. Cellient CBs provide superior morphology to traditional CBs and, if required, formalin fixation may be added to the Cellient process for ICC. Optimal Cellient CB cellularity is achieved at 30-40 μm, which will impact on the handling of cases in daily practice. © 2014 John Wiley & Sons Ltd.

  2. A Composite Model of Wound Segmentation Based on Traditional Methods and Deep Neural Networks

    PubMed Central

    Wang, Changjian; Liu, Xiaohui; Jin, Shiyao

    2018-01-01

Wound segmentation plays an important supporting role in wound observation and wound healing. Current methods of image segmentation include those based on traditional image processing and those based on deep neural networks. The traditional methods use manually designed image features to complete the task without large amounts of labeled data. Meanwhile, the methods based on deep neural networks can extract image features effectively without manual feature design, but large amounts of training data are required. Combining the advantages of both, this paper presents a composite model of wound segmentation. The model uses the skin-with-wound detection algorithm designed in the paper to highlight image features. Then, the preprocessed images are segmented by deep neural networks, and finally semantic corrections are applied to the segmentation results. The model shows a good performance in our experiment. PMID:29955227
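
    As a rough sketch of the kind of traditional, colour-based skin highlighting that can precede a neural segmentation network (not the authors' specific skin-with-wound detector), the example below masks non-skin background using classic YCbCr chroma thresholds before the image would be handed to a CNN; the thresholds and the downstream model are assumptions.

    ```python
    import numpy as np

    def skin_mask_ycbcr(image_rgb):
        """Very rough skin detector: threshold the Cb/Cr chroma channels.
        The threshold ranges are illustrative, not tuned values."""
        img = image_rgb.astype(np.float32)
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
        cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
        return (cb > 77) & (cb < 127) & (cr > 133) & (cr < 173)

    def preprocess_for_segmentation(image_rgb):
        """Suppress non-skin background so a downstream CNN (e.g. an
        encoder-decoder network, not shown here) sees mostly skin/wound pixels."""
        mask = skin_mask_ycbcr(image_rgb)
        return image_rgb * mask[..., None]

    # Demo on a random image stand-in
    img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
    print(preprocess_for_segmentation(img).shape)
    ```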

  3. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structures have become more sophisticated. The validity and rationality of protective relaying are vital to the security of power systems, so it is essential to verify relay setting values on line. Traditional verification methods mainly include comparison of the protection range and comparison of the calculated setting value. For on-line verification, verification speed is the key. Comparing the protection range gives accurate results, but the computational burden is heavy and verification is slow. Comparing the calculated setting value is much faster, but the result is conservative and less accurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods and proposes a hybrid on-line verification method that synthesizes the advantages of both. This hybrid method can meet the requirements of accurate on-line verification.
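
    To make the two verification styles concrete, the following minimal Python sketch (not from the paper) contrasts a fast check that recomputes the required setting value for an overcurrent relay with a slower check that sweeps fault currents along the protected line, approximating the protection-range comparison. The reliability coefficient, currents and coverage threshold are illustrative assumptions only.

      # Sketch of the two verification styles for an overcurrent relay setting.
      # All numerical values and coefficients are illustrative assumptions.

      def quick_check(setting_A, max_load_A, k_rel=1.3):
          """Fast check: compare the stored setting against a recalculated value."""
          required = k_rel * max_load_A
          return setting_A >= required

      def range_check(setting_A, fault_currents_A, min_coverage=0.8):
          """Slow check: fraction of simulated fault points the relay would trip for."""
          tripped = sum(1 for i_f in fault_currents_A if i_f >= setting_A)
          return tripped / len(fault_currents_A) >= min_coverage

      fault_sweep = [2400, 2100, 1800, 1500, 1200]   # amperes, faults farther down the line
      print(quick_check(600, 400), range_check(600, fault_sweep))

    A hybrid scheme in the spirit of the abstract would run the quick check first and fall back to the fault-current sweep only when the setting sits near the margin.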

  4. Comparison of a novel distillation method versus a traditional distillation method in a model gin system using liquid/liquid extraction.

    PubMed

    Greer, Derek; Pfahl, Les; Rieck, Jim; Daniels, Tim; Garza, Oscar

    2008-10-08

    This research studied a novel form of distillation (high-vacuum distillation) as a method for preserving volatile aroma chemicals important to the organoleptic attributes of a four-botanical model gin, as well as the degradation products generated during the heating required in traditional methods of gin distillation. A 2^5 factorial experiment was conducted in a partially confounded incomplete block design and analyzed using the PROC MIXED procedure from SAS. A model gin was made of dried juniper berries (Juniperus communis), coriander seed (Coriandrum sativum), angelica root (Angelica archangelica), and dry lemon peel (Citrus limonum). This was distilled on a traditional still utilizing atmospheric pressure and a heating mantle to initiate phase separation, as well as on a novel still (high vacuum) utilizing vacuum pressures below 0.1 mmHg and temperatures below -15 degrees C to initiate phase separation. The degradation products (alpha-pinene, alpha-phellandrene, E-caryophyllene, and beta-myrcene) were present at greater levels (approximately 10 times) in the traditional still-made gin as compared to the novel gin.
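
    As a side note on the experimental design, a 2^5 factorial layout simply enumerates every combination of five two-level factors (32 runs). A minimal sketch, with generic placeholder factor labels rather than the study's actual factors:

      # Enumerate the 32 treatment combinations of a 2**5 factorial design.
      # Factor names are generic placeholders, not those used in the study.
      from itertools import product

      factors = ["A", "B", "C", "D", "E"]
      runs = list(product((-1, +1), repeat=len(factors)))   # -1 = low level, +1 = high level

      print(len(runs), "runs")              # 32
      print(dict(zip(factors, runs[0])))    # first treatment combination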

  5. Failure mode and effects analysis: a comparison of two common risk prioritisation methods.

    PubMed

    McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L

    2016-05-01

    Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
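
    The congruence comparison described above reduces to the overlap between the sets of failures flagged as critical by each scoring scheme. The sketch below illustrates the idea; the failure modes and ratings are invented, and only the RPN >= 300 cut-off is taken from the study.

      # Compare traditional FMEA risk-priority scoring with a simplified
      # high/medium/low designation. Failure data are invented for illustration.

      def rpn(severity, occurrence, detectability):
          """Traditional risk priority number: product of the three 1-10 ratings."""
          return severity * occurrence * detectability

      failures = [
          # (name, severity, occurrence, detectability, simplified_rating)
          ("handoff report incomplete", 9, 7, 6, "high"),
          ("monitor alarms not set",    8, 5, 3, "medium"),
          ("wrong bed assignment",      4, 3, 5, "low"),
      ]

      critical_traditional = {f[0] for f in failures if rpn(f[1], f[2], f[3]) >= 300}
      critical_simplified  = {f[0] for f in failures if f[4] == "high"}

      overlap = critical_traditional & critical_simplified
      congruence = 100 * len(overlap) / len(critical_traditional) if critical_traditional else 0
      print(f"per cent congruence on critical failures: {congruence:.0f}%")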

  6. Heuristic Modeling for TRMM Lifetime Predictions

    NASA Technical Reports Server (NTRS)

    Jordan, P. S.; Sharer, P. J.; DeFazio, R. L.

    1996-01-01

    Analysis time for computing the expected mission lifetimes of proposed frequently maneuvering, tightly altitude-constrained, Earth-orbiting spacecraft has been significantly reduced by means of a heuristic modeling method implemented in a commercial off-the-shelf spreadsheet product (QuattroPro) running on a personal computer (PC). The method uses a look-up table to estimate the maneuver frequency per month as a function of the spacecraft ballistic coefficient and the solar flux index, then computes the associated fuel use by a simple engine model. Maneuver frequency data points are produced by means of a single 1-month run of traditional mission analysis software for each of the 12 to 25 data points required for the table. As the data point computations are required only at mission design start-up and on the occasion of significant mission redesigns, the dependence on time-consuming traditional modeling methods is dramatically reduced. Results to date have agreed with traditional methods to within 1 to 1.5 percent. The spreadsheet approach is applicable to a wide variety of Earth-orbiting spacecraft with tight altitude constraints. It will be particularly useful to such missions as the Tropical Rainfall Measuring Mission scheduled for launch in 1997, whose mission lifetime calculations are heavily dependent on frequently revised solar flux predictions.
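
    The heuristic model described above amounts to a table lookup followed by simple arithmetic. A minimal sketch of that structure follows; the table values and engine parameters are invented for illustration and are not the actual TRMM data.

      # Look-up-table lifetime estimate: maneuvers per month are read from a table
      # indexed by ballistic-coefficient and solar-flux bins, then converted to fuel
      # use by a simple engine model. All numbers are illustrative assumptions.

      MANEUVER_TABLE = {
          ("low",  "low"): 1, ("low",  "high"): 3,
          ("high", "low"): 2, ("high", "high"): 6,
      }

      FUEL_PER_MANEUVER_KG = 0.8    # simple engine model: fixed propellant per burn
      FUEL_LOAD_KG = 300.0

      def months_of_life(bc_bin, flux_bin):
          burns_per_month = MANEUVER_TABLE[(bc_bin, flux_bin)]
          return FUEL_LOAD_KG / (burns_per_month * FUEL_PER_MANEUVER_KG)

      print(f"estimated lifetime: {months_of_life('high', 'high'):.1f} months")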

  7. Micro versus macro solid phase extraction for monitoring water contaminants: a preliminary study using trihalomethanes.

    PubMed

    Alexandrou, Lydon D; Spencer, Michelle J S; Morrison, Paul D; Meehan, Barry J; Jones, Oliver A H

    2015-04-15

    Solid phase extraction is one of the most commonly used pre-concentration and cleanup steps in environmental science. However, traditional methods need electrically powered pumps, can use large volumes of solvent (if multiple samples are run), and require several hours to filter a sample. Additionally, if the cartridge is open to the air, volatile compounds may be lost and sample integrity compromised. In contrast, micro cartridge-based solid phase extraction can be completed in less than 2 min by hand, uses only microlitres of solvent and provides comparable concentration factors to established methods. It is also an enclosed system, so volatile components are not lost. The sample can also be eluted directly into a detector (e.g. a mass spectrometer) if required. However, the technology is new and has not been much used for environmental analysis. In this study we compare traditional (macro) and the new micro solid phase extraction for the analysis of four common volatile trihalomethanes (trichloromethane, bromodichloromethane, dibromochloromethane and tribromomethane). The results demonstrate that micro solid phase extraction is faster and cheaper than traditional methods, with similar recovery rates for the target compounds. This method shows potential for further development in a range of applications. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Chemical profiling approach to evaluate the influence of traditional and simplified decoction methods on the holistic quality of Da-Huang-Xiao-Shi decoction using high-performance liquid chromatography coupled with diode-array detection and time-of-flight mass spectrometry.

    PubMed

    Yan, Xuemei; Zhang, Qianying; Feng, Fang

    2016-04-01

    Da-Huang-Xiao-Shi decoction, consisting of Rheum officinale Baill, Mirabilitum, Phellodendron amurense Rupr. and Gardenia jasminoides Ellis, is a traditional Chinese medicine used for the treatment of jaundice. As described in "Jin Kui Yao Lue", a traditional multistep decoction of Da-Huang-Xiao-Shi decoction was required, whereas a simplified one-step decoction has been used in recent reports. To investigate the chemical differences between the decoctions obtained by the traditional and simplified preparations, a sensitive and reliable approach of high-performance liquid chromatography coupled with diode-array detection and electrospray ionization time-of-flight mass spectrometry was established. As a result, a total of 105 compounds were detected and identified. Analysis of the chromatogram profiles of the two decoctions showed that many compounds in the decoction of the simplified preparation had changed markedly compared with those in the traditional preparation. These changes in constituents are bound to cause differences in the therapeutic effects of the two decoctions. The present study demonstrated that the preparation method significantly affects the holistic quality of traditional Chinese medicines and that use of a suitable preparation method is crucial for these medicines to produce their intended clinical curative effect. These results elucidate the scientific basis of traditional preparation methods in Chinese medicines. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Using a novel flood prediction model and GIS automation to measure the valley and channel morphology of large river networks

    EPA Science Inventory

    Traditional methods for measuring river valley and channel morphology require intensive ground-based surveys which are often expensive, time consuming, and logistically difficult to implement. The number of surveys required to assess the hydrogeomorphic structure of large river n...

  10. Concurrent myotomy and tunneling after establishment of a half tunnel instead of myotomy after establishment of a full tunnel: a more efficient method of peroral endoscopic myotomy.

    PubMed

    Philips, George M; Dacha, Sunil; Keilin, Steve A; Willingham, Field F; Cai, Qiang

    2016-04-01

    Peroral endoscopic myotomy (POEM) is a time-consuming and challenging procedure. Traditionally, the myotomy is done after the submucosal tunnel has been completed. Starting the myotomy earlier, after submucosal tunneling is half completed (concurrent myotomy and tunneling), may be more efficient. This study aims to assess if the method of concurrent myotomy and tunneling may decrease the procedural time and be efficacious. This is a retrospective case series of patients who underwent modified POEM (concurrent myotomy and tunneling) or traditional POEM at a tertiary care medical center. Modified POEM or traditional POEM was performed at the discretion of the endoscopist in patients presenting with achalasia. The total procedural duration, myotomy duration, myotomy length, and time per unit length of myotomy were recorded for both modified and traditional POEM. Modified POEM was performed in 6 patients whose mean age (± standard deviation [SD]) was 58 ± 13.3 years. Of these, 5 patients had type II achalasia and 1 patient had esophageal dysmotility. The mean Eckardt score (± SD) before the procedure was 8.8 ± 1.3. The modified technique was performed in 47 ± 8 minutes, with 6 ± 1 minutes required per centimeter of myotomy and 3 ± 1 minutes required per centimeter of submucosal space. The Eckardt score was 3 ± 1.1 at 1 month and 3 ± 2.5 at 3 months. The procedure time for modified POEM was significantly shorter than that for traditional POEM. Modified POEM with short submucosal tunneling may be more efficient than traditional POEM with long submucosal tunneling, and outcomes may be equivalent over short-term follow-up. Long-term data and randomized controlled studies are needed to compare the clinical efficacy of modified POEM with that of the traditional method.

  11. Translating Radiometric Requirements for Satellite Sensors to Match International Standards.

    PubMed

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument.
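
    For uncorrelated error sources, the propagation-of-uncertainties formula referenced above combines the individual requirement uncertainties in quadrature. A small illustrative sketch; the component names and values are assumptions, not ABI requirements.

      # Combine several per-source performance requirements into one specification
      # by root-sum-square propagation of uncertainties, assuming the error
      # sources are independent. Component values are invented.
      import math

      component_uncertainties = {
          "calibration": 0.30,   # all in the same units, e.g. percent of radiance
          "linearity":   0.15,
          "striping":    0.10,
          "navigation":  0.20,
      }

      combined = math.sqrt(sum(u ** 2 for u in component_uncertainties.values()))
      print(f"combined standard uncertainty: {combined:.2f}")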

  12. Translating Radiometric Requirements for Satellite Sensors to Match International Standards

    PubMed Central

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument. PMID:26601032

  13. Flight Crew Training: Multi-Crew Pilot License Training versus Traditional Training and Its Relationship with Job Performance

    ERIC Educational Resources Information Center

    Cushing, Thomas S.

    2013-01-01

    In 2006, the International Civil Aviation Organization promulgated requirements for a Multi-Crew Pilot License for First Officers, in which the candidate attends approximately two years of ground school and trains as part of a two-person crew in a simulator of a Boeing 737 or an Airbus 320 airliner. In the traditional method, a candidate qualifies…

  14. Impact of abbreviated lecture with interactive mini-cases vs traditional lecture on student performance in the large classroom.

    PubMed

    Marshall, Leisa L; Nykamp, Diane L; Momary, Kathryn M

    2014-12-15

    To compare the impact of 2 different teaching and learning methods on student mastery of learning objectives in a pharmacotherapy module in the large classroom setting. Two teaching and learning methods were implemented and compared in a required pharmacotherapy module for 2 years. The first year, multiple interactive mini-cases with in-class individual assessment and an abbreviated lecture were used to teach osteoarthritis; a traditional lecture with 1 in-class case discussion was used to teach gout. In the second year, the same topics were used but the methods were flipped. Student performance on pre/post individual readiness assessment tests (iRATs), case questions, and subsequent examinations was compared each year by the teaching and learning method and then between years by topic for each method. Students also voluntarily completed a 20-item evaluation of the teaching and learning methods. Postpresentation iRATs were significantly higher than prepresentation iRATs for each topic each year with the interactive mini-cases; there was no significant difference in iRATs before and after traditional lecture. For osteoarthritis, postpresentation iRATs after interactive mini-cases in year 1 were significantly higher than postpresentation iRATs after traditional lecture in year 2; the difference in iRATs for gout per learning method was not significant. The difference between examination performance for osteoarthritis and gout was not significant when the teaching and learning methods were compared. On the student evaluations, 2 items were significant both years when answers were compared by teaching and learning method. Each year, students ranked their class participation higher with interactive cases than with traditional lecture, but both years they reported enjoying the traditional lecture format more. Multiple interactive mini-cases with an abbreviated lecture improved immediate mastery of learning objectives compared to a traditional lecture format, regardless of therapeutic topic, but did not improve student performance on subsequent examinations.

  15. [Preparation of panax notoginseng saponins-tanshinone II(A) composite particles for pulmonary delivery with spray-drying method and its characterization].

    PubMed

    Wang, Hua-Mei; Fu, Ting-Ming; Guo, Li-Wei

    2013-02-01

    To prepare panax notoginseng saponins-tanshinone II(A) composite particles for pulmonary delivery, in order to explore a dry-powder particle preparation method ensuring synchronized arrival of the multiple components of traditional Chinese medicine compounds at the absorption sites. Panax notoginseng saponins-tanshinone II(A) composite particles were prepared with the spray-drying method and characterized by scanning electron microscopy (SEM), confocal laser scanning microscopy (CLSM), X-ray diffraction (XRD), infrared analysis (IR), dry laser particle size analysis and high performance liquid chromatography (HPLC), and the aerodynamic behavior was evaluated with a Next Generation Impactor (NGI). The dry powder particles produced had a narrow particle size distribution and good aerodynamic behavior, and could realize synchronized administration of multiple components. The spray-drying method can be used to combine traditional Chinese medicine components with different physical and chemical properties in the same particle and to produce traditional Chinese medicine compound particles that meet the requirements for pulmonary delivery.

  16. Concurrent myotomy and tunneling after establishment of a half tunnel instead of myotomy after establishment of a full tunnel: a more efficient method of peroral endoscopic myotomy

    PubMed Central

    Philips, George M.; Dacha, Sunil; Keilin, Steve A.; Willingham, Field F.; Cai, Qiang

    2016-01-01

    Background and study aims: Peroral endoscopic myotomy (POEM) is a time-consuming and challenging procedure. Traditionally, the myotomy is done after the submucosal tunnel has been completed. Starting the myotomy earlier, after submucosal tunneling is half completed (concurrent myotomy and tunneling), may be more efficient. This study aims to assess if the method of concurrent myotomy and tunneling may decrease the procedural time and be efficacious. Patients and methods: This is a retrospective case series of patients who underwent modified POEM (concurrent myotomy and tunneling) or traditional POEM at a tertiary care medical center. Modified POEM or traditional POEM was performed at the discretion of the endoscopist in patients presenting with achalasia. The total procedural duration, myotomy duration, myotomy length, and time per unit length of myotomy were recorded for both modified and traditional POEM. Results: Modified POEM was performed in 6 patients whose mean age (± standard deviation [SD]) was 58 ± 13.3 years. Of these, 5 patients had type II achalasia and 1 patient had esophageal dysmotility. The mean Eckardt score (± SD) before the procedure was 8.8 ± 1.3. The modified technique was performed in 47 ± 8 minutes, with 6 ± 1 minutes required per centimeter of myotomy and 3 ± 1 minutes required per centimeter of submucosal space. The Eckardt score was 3 ± 1.1 at 1 month and 3 ± 2.5 at 3 months. The procedure time for modified POEM was significantly shorter than that for traditional POEM. Conclusions: Modified POEM with short submucosal tunneling may be more efficient than traditional POEM with long submucosal tunneling, and outcomes may be equivalent over short-term follow-up. Long-term data and randomized controlled studies are needed to compare the clinical efficacy of modified POEM with that of the traditional method. PMID:27092318

  17. [Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].

    PubMed

    Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia

    2008-07-01

    Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirements of rapid and on-line analysis because of their complex procedures and the experience they demand. In recent years, near-infrared spectroscopy has been used for rapid determination of active components, on-line quality control, identification of counterfeits and discrimination of geographical origins of herbal medicines, owing to its advantages of simple pretreatment, high efficiency, and the convenience of solid diffuse-reflectance measurement and fiber optics. In the present paper, the principles and methods of near-infrared spectroscopy are introduced concisely. In particular, the applications of this technique in quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.

  18. Spatial modelling of disease using data- and knowledge-driven approaches.

    PubMed

    Stevens, Kim B; Pfeiffer, Dirk U

    2011-09-01

    The purpose of spatial modelling in animal and public health is three-fold: describing existing spatial patterns of risk, attempting to understand the biological mechanisms that lead to disease occurrence and predicting what will happen in the medium to long-term future (temporal prediction) or in different geographical areas (spatial prediction). Traditional methods for temporal and spatial predictions include general and generalized linear models (GLM), generalized additive models (GAM) and Bayesian estimation methods. However, such models require both disease presence and absence data which are not always easy to obtain. Novel spatial modelling methods such as maximum entropy (MAXENT) and the genetic algorithm for rule set production (GARP) require only disease presence data and have been used extensively in the fields of ecology and conservation, to model species distribution and habitat suitability. Other methods, such as multicriteria decision analysis (MCDA), use knowledge of the causal factors of disease occurrence to identify areas potentially suitable for disease. In addition to their less restrictive data requirements, some of these novel methods have been shown to outperform traditional statistical methods in predictive ability (Elith et al., 2006). This review paper provides details of some of these novel methods for mapping disease distribution, highlights their advantages and limitations, and identifies studies which have used the methods to model various aspects of disease distribution. Copyright © 2011. Published by Elsevier Ltd.
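
    The practical difference highlighted above is the data requirement. A minimal sketch of a presence/absence logistic model of the GLM family, using scikit-learn with invented covariates, shows what the traditional approach needs and what presence-only methods such as MAXENT avoid.

      # Presence/absence logistic regression: the "traditional" setting requires
      # labelled absence (0) as well as presence (1) sites. Covariates and labels
      # are invented for illustration.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      X = np.array([[12.1, 0.3], [14.0, 0.5], [9.8, 0.1],    # e.g. temperature, wetness index
                    [16.2, 0.7], [8.5, 0.2], [15.1, 0.6]])
      y = np.array([0, 1, 0, 1, 0, 1])                        # disease absence/presence

      model = LogisticRegression().fit(X, y)
      print(model.predict_proba([[13.0, 0.4]])[0, 1])         # predicted risk at a new site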

  19. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

    Because the traditional entropy value method still has low evaluation accuracy when evaluating the performance of mining projects, a performance evaluation model for mining projects founded on an improved entropy value method is proposed. First, a new weight assignment model is established, founded on compatibility matrix analysis from the analytic hierarchy process (AHP) and the entropy value method; when the compatibility matrix analysis achieves the consistency requirement, any difference between the subjective and the objective weights is resolved by moderately adjusting their proportions, and on this basis a fuzzy evaluation matrix is constructed for the performance evaluation. Simulation experiments show that, compared with the traditional entropy value and compatibility matrix analysis methods, the proposed performance evaluation model of mining projects based on the improved entropy value method has higher assessment accuracy.
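
    For readers unfamiliar with the entropy value method, the objective weights come from the dispersion of each criterion across the alternatives, and a subjective (AHP-style) weight vector can then be blended in. The sketch below is illustrative only; the decision matrix, subjective weights and mixing proportion are assumptions, not the paper's model.

      # Entropy value method for objective criterion weights, blended with
      # subjective weights. All numbers are illustrative assumptions.
      import numpy as np

      X = np.array([[0.6, 0.8, 0.3],     # rows: mining projects, columns: criteria
                    [0.9, 0.4, 0.7],
                    [0.5, 0.6, 0.9]])

      P = X / X.sum(axis=0)                        # normalise each criterion column
      k = 1.0 / np.log(X.shape[0])
      entropy = -k * (P * np.log(P)).sum(axis=0)   # entropy of each criterion
      objective_w = (1 - entropy) / (1 - entropy).sum()

      subjective_w = np.array([0.5, 0.3, 0.2])     # e.g. from an AHP judgement matrix
      alpha = 0.5                                  # proportion given to the subjective weights
      combined_w = alpha * subjective_w + (1 - alpha) * objective_w
      print(combined_w.round(3))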

  20. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the fact that FLCs cannot be tested to that standard limits the applications of such technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. The main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and use of a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the analysis was inconclusive for Lyapunov stability.

  1. A rapid method for the sampling of atmospheric water vapour for isotopic analysis.

    PubMed

    Peters, Leon I; Yakir, Dan

    2010-01-01

    Analysis of the stable isotopic composition of atmospheric moisture is widely applied in the environmental sciences. Traditional methods for obtaining isotopic compositional data from ambient moisture have required complicated sampling procedures, expensive and sophisticated distillation lines, hazardous consumables, and lengthy treatments prior to analysis. Newer laser-based techniques are expensive and usually not suitable for large-scale field campaigns, especially in cases where access to mains power is not feasible or high spatial coverage is required. Here we outline the construction and usage of a novel vapour-sampling system based on a battery-operated Stirling cycle cooler, which is simple to operate, does not require any consumables, or post-collection distillation, and is light-weight and highly portable. We demonstrate the ability of this system to reproduce delta(18)O isotopic compositions of ambient water vapour, with samples taken simultaneously by a traditional cryogenic collection technique. Samples were collected over 1 h directly into autosampler vials and were analysed by mass spectrometry after pyrolysis of 1 microL aliquots to CO. This yielded an average error of < +/-0.5 per thousand, approximately equal to the signal-to-noise ratio of traditional approaches. This new system provides a rapid and reliable alternative to conventional cryogenic techniques, particularly in cases requiring high sample throughput or where access to distillation lines, slurry maintenance or mains power is not feasible. Copyright 2009 John Wiley & Sons, Ltd.
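
    The delta(18)O values reported above follow the standard per-mil notation, delta(18)O = (R_sample / R_standard - 1) x 1000, where R is the 18O/16O ratio and the reference standard is VSMOW. A one-line worked example; the sample ratio is invented.

      # Per-mil delta notation for oxygen isotopes. The VSMOW ratio is the accepted
      # reference value; the measured sample ratio is invented for illustration.
      R_VSMOW  = 2005.2e-6     # 18O/16O ratio of the VSMOW standard
      R_sample = 1965.0e-6     # 18O/16O ratio of a vapour sample (illustrative)

      delta18O = (R_sample / R_VSMOW - 1) * 1000
      print(f"delta(18)O = {delta18O:.1f} per mil")   # about -20 per mil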

  2. Development of a novel and highly efficient method of isolating bacteriophages from water.

    PubMed

    Liu, Weili; Li, Chao; Qiu, Zhi-Gang; Jin, Min; Wang, Jing-Feng; Yang, Dong; Xiao, Zhong-Hai; Yuan, Zhao-Kang; Li, Jun-Wen; Xu, Qun-Ying; Shen, Zhi-Qiang

    2017-08-01

    Bacteriophages are widely used in the treatment of drug-resistant bacteria and in the improvement of food safety through bacterial lysis. However, limited means of investigating bacteriophages restrict their further application. In this study, a novel and highly efficient method was developed for isolating bacteriophages from water, based on electropositive silica gel particles (ESPs). To optimize the ESPs method, we evaluated the eluent type, flow rate, pH, temperature, and inoculation concentration of bacteriophage using bacteriophage f2. Quantitative detection showed that the recovery of the ESPs method reached over 90%. Qualitative detection demonstrated that the ESPs method effectively isolated 70% of samples with extremely low bacteriophage concentrations (10^0 PFU/100 L). Based on host bacteria comprising 33 standard strains and 10 isolated strains, the bacteriophages in 18 water samples collected from three sites in the Tianjin Haihe River Basin were isolated by the ESPs and traditional methods. Results showed that the ESPs method was significantly superior to the traditional method. The ESPs method isolated 32 strains of bacteriophage, whereas the traditional method isolated 15 strains. The sample isolation efficiency and bacteriophage isolation efficiency of the ESPs method were 3.28 and 2.13 times higher than those of the traditional method. The developed ESPs method is characterized by high isolation efficiency, efficient handling of large water sample volumes and low requirements on water quality. Copyright © 2017. Published by Elsevier B.V.

  3. Multiplexed microsatellite recovery using massively parallel sequencing

    Treesearch

    T.N. Jennings; B.J. Knaus; T.D. Mullins; S.M. Haig; R.C. Cronn

    2011-01-01

    Conservation and management of natural populations requires accurate and inexpensive genotyping methods. Traditional microsatellite, or simple sequence repeat (SSR), marker analysis remains a popular genotyping method because of the comparatively low cost of marker development, ease of analysis and high power of genotype discrimination. With the availability of...

  4. A Novel Approach to Rotorcraft Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Everett, Richard A.; Newman, John A.

    2002-01-01

    Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage-tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in a HCF component will result in a design based on a traditional DT method that is either impractical because of frequent inspections, or because the design will be too heavy to operate efficiently. Furthermore, once a HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing a HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.

  5. Precision wildlife monitoring using unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Hodgson, Jarrod C.; Baylis, Shane M.; Mott, Rowan; Herrod, Ashley; Clarke, Rohan H.

    2016-03-01

    Unmanned aerial vehicles (UAVs) represent a new frontier in environmental research. Their use has the potential to revolutionise the field if they prove capable of improving data quality or the ease with which data are collected beyond traditional methods. We apply UAV technology to wildlife monitoring in tropical and polar environments and demonstrate that UAV-derived counts of colony nesting birds are an order of magnitude more precise than traditional ground counts. The increased count precision afforded by UAVs, along with their ability to survey hard-to-reach populations and places, will likely drive many wildlife monitoring projects that rely on population counts to transition from traditional methods to UAV technology. Careful consideration will be required to ensure the coherence of historic data sets with new UAV-derived data and we propose a method for determining the number of duplicated (concurrent UAV and ground counts) sampling points needed to achieve data compatibility.

  6. Explicating Metatheory for Mixed Methods Research in Educational Leadership: An Application of Habermas's "Theory of Communicative Action"

    ERIC Educational Resources Information Center

    Whiteman, Rodney S.

    2015-01-01

    Purpose: Mixed methods research can provide a fruitful line of inquiry for educational leadership, program evaluation, and policy analysis; however, mixed methods research requires a metatheory that allows for mixing what have traditionally been considered incompatible qualitative and quantitative inquiry. The purpose of this paper is to apply…

  7. The use of video conferencing to develop a community of practice for preceptors located in rural and non traditional placement settings: an evaluation study.

    PubMed

    Zournazis, Helen E; Marlow, Annette H

    2015-03-01

    Support for nursing students in rural and non-traditional health environments within Tasmania is predominately undertaken by preceptors. It is recognised that preceptors who work within these environments require support in their role and opportunities to communicate with academic staff within universities. Multiple methods of information distribution, support and networking opportunities provide preceptors with flexible options to keep them abreast of the student learning process. This paper presents survey findings from preceptors in rural and non-traditional professional experience placement environments, taken from a pilot project on the implementation of video conferencing forums for education and peer networking in Tasmania. The purpose of the evaluation was to establish whether video conferencing met preceptors' needs for understanding the learning and teaching requirements of students' professional experience placements. The findings reveal that preceptors' workload pressures and the need for organisational support were key barriers preventing preceptor participation. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  8. Infography use to requirements specification for the design of the building

    NASA Astrophysics Data System (ADS)

    Losev, Konstantin; Sinenko, Sergey

    2017-10-01

    The study contributes to a growing body of research on transport infrastructure in construction object life cycle management and presents areas in which further investigation is needed. The objects of study are railway buildings and structures and the Employer's information requirements (EIR) for the design of an individual residential building. The task of the study was to determine the necessary and sufficient scope of parameters contained in the infographic form of the EIR compared with the traditional text form of the EIR. A further task was to determine which categories of the traditional EIR are transferred to an infographic representation and which categories are ignored in the case of a relatively low-complexity building. The methods used in the study were infographic representation of text and subsequent expert evaluation. The conclusions of the study present the necessary and sufficient scope of parameters for the infographic form of the EIR, the relations between the infographic parameters and the categories of the traditional EIR form, and the subcategories of the traditional EIR that are ignored in the case of a relatively low-complexity building.

  9. Enzymatic preparation of nanocrystalline and microcrystalline cellulose

    Treesearch

    Sarah R. Anderson; Dominic Esposito; William Gillette; J.Y. Zhu; Ulrich Baxa; Scott E. Mcneil

    2014-01-01

    Traditional cellulose nanocrystal (CNC) production methods use harsh chemicals, are energetically expensive, and result in a hydrophilic sulfate surface chemistry with limited utility. Enzymatic production of CNCs is a less expensive alternative production method that eliminates the need for harsh chemicals and requires much less energy for mechanical fibrillation and...

  10. Calculating the Financial Impact of Population Growth on Education.

    ERIC Educational Resources Information Center

    Cline, Daniel H.

    It is particularly difficult to make accurate enrollment projections for areas that are experiencing a rapid expansion in their population. The traditional method of calculating cohort survival ratios must be modified and supplemented with additional information to ensure accuracy; cost projection methods require detailed analyses of current costs…

  11. Predictive models for Escherichia coli concentrations at inland lake beaches and relationship of model variables to pathogen detection

    EPA Science Inventory

    Methods are needed to improve the timeliness and accuracy of recreational water-quality assessments. Traditional culture methods require 18–24 h to obtain results and may not reflect current conditions. Predictive models, based on environmental and water quality variables, have been...

  12. Preparing Students for Flipped or Team-Based Learning Methods

    ERIC Educational Resources Information Center

    Balan, Peter; Clark, Michele; Restall, Gregory

    2015-01-01

    Purpose: Teaching methods such as Flipped Learning and Team-Based Learning require students to pre-learn course materials before a teaching session, because classroom exercises rely on students using self-gained knowledge. This is the reverse to "traditional" teaching when course materials are presented during a lecture, and students are…

  13. Simplified Microarray Technique for Identifying mRNA in Rare Samples

    NASA Technical Reports Server (NTRS)

    Almeida, Eduardo; Kadambi, Geeta

    2007-01-01

    Two simplified methods of identifying messenger ribonucleic acid (mRNA), and compact, low-power apparatuses to implement the methods, are at the proof-of-concept stage of development. These methods are related to traditional methods based on hybridization of nucleic acid, but whereas the traditional methods must be practiced in laboratory settings, these methods could be practiced in field settings. Hybridization of nucleic acid is a powerful technique for detection of specific complementary nucleic acid sequences, and is increasingly being used for detection of changes in gene expression in microarrays containing thousands of gene probes. A traditional microarray study entails at least the following six steps: 1. Purification of cellular RNA, 2. Amplification of complementary deoxyribonucleic acid [cDNA] by polymerase chain reaction (PCR), 3. Labeling of cDNA with fluorophores of Cy3 (a green cyanine dye) and Cy5 (a red cyanine dye), 4. Hybridization to a microarray chip, 5. Fluorescence scanning the array(s) with dual excitation wavelengths, and 6. Analysis of the resulting images. This six-step procedure must be performed in a laboratory because it requires bulky equipment.

  14. A novel calibration method for non-orthogonal shaft laser theodolite measurement system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Bin, E-mail: wubin@tju.edu.cn, E-mail: xueting@tju.edu.cn; Yang, Fengting; Ding, Wen

    2016-03-15

    Non-orthogonal shaft laser theodolite (N-theodolite) is a new kind of large-scale metrological instrument made up of two rotary tables and one collimated laser. An N-theodolite has three axes. Following the naming conventions of the traditional theodolite, the rotary axes of the two rotary tables are called the horizontal axis and the vertical axis, respectively, and the collimated laser beam is named the sight axis. The difference from a traditional theodolite is obvious, since the N-theodolite has no orthogonality or intersection accuracy requirements. The calibration method for the traditional theodolite is therefore no longer suitable for the N-theodolite, while the calibration method currently applied is quite complicated. Thus this paper introduces a novel calibration method for the non-orthogonal shaft laser theodolite measurement system to simplify the procedure and to improve the calibration accuracy. A simple two-step process, calibration of intrinsic parameters and of extrinsic parameters, is proposed by the novel method. Experiments have shown its efficiency and accuracy.

  15. Integrating Internal Standards into Disposable Capillary Electrophoresis Devices To Improve Quantification

    PubMed Central

    2017-01-01

    To improve point-of-care quantification using microchip capillary electrophoresis (MCE), the chip-to-chip variabilities inherent in disposable, single-use devices must be addressed. This work proposes to integrate an internal standard (ISTD) into the microchip by adding it to the background electrolyte (BGE) instead of the sample—thus eliminating the need for additional sample manipulation, microchip redesigns, and/or system expansions required for traditional ISTD usage. Cs and Li ions were added as integrated ISTDs to the BGE, and their effects on the reproducibility of Na quantification were explored. Results were then compared to the conclusions of our previous publication which used Cs and Li as traditional ISTDs. The in-house fabricated microchips, electrophoretic protocols, and solution matrixes were kept constant, allowing the proposed method to be reliably compared to the traditional method. Using the integrated ISTDs, both Cs and Li improved the Na peak area reproducibility approximately 2-fold, to final RSD values of 2.2–4.7% (n = 900). In contrast (to previous work), Cs as a traditional ISTD resulted in final RSDs of 2.5–8.8%, while the traditional Li ISTD performed poorly with RSDs of 6.3–14.2%. These findings suggest integrated ISTDs are a viable method to improve the precision of disposable MCE devices—giving matched or superior results to the traditional method in this study while neither increasing system cost nor complexity. PMID:28192985
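
    The precision gain from an internal standard comes from taking the ratio of the analyte peak area to the ISTD peak area, which cancels much of the run-to-run injection variability. A small sketch with invented peak areas (not the paper's data) illustrates the effect on the relative standard deviation.

      # Why an ISTD tightens chip-to-chip precision: the analyte/ISTD area ratio
      # removes shared run-to-run variation. Peak areas are invented.
      import numpy as np

      na_area   = np.array([102.0, 95.0, 110.0, 88.0, 104.0])   # Na peak areas over five chips
      istd_area = np.array([51.0, 47.0, 56.0, 45.0, 52.0])      # Cs ISTD peak areas, same runs

      def rsd(x):
          return 100 * x.std(ddof=1) / x.mean()

      print(f"raw Na RSD:          {rsd(na_area):.1f}%")
      print(f"ISTD-normalised RSD: {rsd(na_area / istd_area):.1f}%")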

  16. Value of the polymerase chain reaction method for detecting tuberculosis in the bronchial tissue involved by anthracosis.

    PubMed

    Mirsadraee, Majid; Shafahie, Ahmad; Reza Khakzad, Mohammad; Sankian, Mojtaba

    2014-04-01

    Anthracofibrosis is black discoloration of the bronchial mucosa with deformity and obstruction. The association of this disease with tuberculosis (TB) has been established. The objective of this study was to determine the additional benefit of assessing TB by the polymerase chain reaction (PCR) method. Bronchoscopy was performed on 103 subjects (54 with anthracofibrosis and 49 controls) who required bronchoscopy for their pulmonary problems. According to the bronchoscopic findings, participants were classified into anthracofibrosis and nonanthracotic groups. They were examined for TB with traditional methods, including direct smear (Ziehl-Neelsen staining), Löwenstein-Jensen culture, and histopathology, and with the PCR method for the Mycobacterium tuberculosis genome (IS6110). Age, sex, smoking, and clinical findings were not significantly different between the TB and non-TB groups. Acid-fast bacilli could be detected by direct smear in 12 (25%) of the anthracofibrosis subjects, and adding the results of culture and histopathology indicated TB in 27 (31%) of the cases. Mycobacterium tuberculosis was diagnosed by PCR in 18 (33%) patients, but the difference was not significant. Detection of acid-fast bacilli in the control nonanthracotic subjects was significantly lower (3, 6%), but PCR (20, 40%) and the combined results of all traditional methods (22, 44%) showed a nonsignificant difference. The PCR method showed results equal to those of the traditional methods combined (smear, culture, and histopathology).

  17. Traditional Chinese and Thai medicine in a comparative perspective.

    PubMed

    He, Ke

    2015-12-01

    The work presented in this paper compares traditional Chinese medicine and traditional Thai medicine, expounding on their origins, academic thinking, theoretical systems, diagnostic methods and modern development. Based on a secondary analysis of the available literature, the paper concentrates on two crucial historical developments: (1) the response to, and consequences of, the impact of Western medicine; and (2) the revival of traditional medicine in these two countries and its prospects. From a comparative perspective, the analysis leads to the conclusion that the rise and fall of traditional medicine is closely related to social and political issues, and that the development of traditional medicines requires national policy and financial support from governments, human resource development, improvement of service quality, and dissemination of traditional medicine knowledge to the public. In addition, the paper suggests deepening exchanges and cooperation between China and Thailand and strengthening cooperation between traditional medicine and medical tourism. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Tensor-GMRES method for large sparse systems of nonlinear equations

    NASA Technical Reports Server (NTRS)

    Feng, Dan; Pulliam, Thomas H.

    1994-01-01

    This paper introduces a tensor-Krylov method, the tensor-GMRES method, for large sparse systems of nonlinear equations. This method is a coupling of tensor model formation and solution techniques for nonlinear equations with Krylov subspace projection techniques for unsymmetric systems of linear equations. Traditional tensor methods for nonlinear equations are based on a quadratic model of the nonlinear function, a standard linear model augmented by a simple second order term. These methods are shown to be significantly more efficient than standard methods both on nonsingular problems and on problems where the Jacobian matrix at the solution is singular. A major disadvantage of the traditional tensor methods is that the solution of the tensor model requires the factorization of the Jacobian matrix, which may not be suitable for problems where the Jacobian matrix is large and has a 'bad' sparsity structure for an efficient factorization. We overcome this difficulty by forming and solving the tensor model using an extension of a Newton-GMRES scheme. Like traditional tensor methods, we show that the new tensor method has significant computational advantages over the analogous Newton counterpart. Consistent with Krylov subspace based methods, the new tensor method does not depend on the factorization of the Jacobian matrix. As a matter of fact, the Jacobian matrix is never needed explicitly.
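
    As background for the method above, a Jacobian-free Newton-GMRES iteration approximates Jacobian-vector products by finite differences and solves each local linear model with a Krylov method; the tensor variant described in the paper augments that local model with a second-order term. The sketch below is the standard Newton-GMRES on an invented two-equation system, not the tensor-GMRES algorithm itself.

      # Jacobian-free Newton-GMRES sketch on a small invented nonlinear system.
      import numpy as np
      from scipy.sparse.linalg import LinearOperator, gmres

      def F(x):
          return np.array([x[0]**2 + x[1] - 3.0,
                           x[0] + x[1]**2 - 5.0])   # solution is (1, 2)

      def newton_gmres(F, x, iters=20, eps=1e-7):
          for _ in range(iters):
              fx = F(x)
              if np.linalg.norm(fx) < 1e-10:
                  break
              # Jacobian-vector product by finite differences: J v ~ (F(x + eps v) - F(x)) / eps
              J = LinearOperator((x.size, x.size), matvec=lambda v: (F(x + eps * v) - fx) / eps)
              step, _ = gmres(J, -fx)    # Krylov solve of the local linear model
              x = x + step
          return x

      print(newton_gmres(F, np.array([1.0, 1.0])))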

  19. Environmental DNA sampling is more sensitive than a traditional survey technique for detecting an aquatic invader.

    PubMed

    Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A

    2015-10-01

    Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales.
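
    One way to read the detection probabilities above is through the cumulative chance of at least one detection at an occupied site after n independent surveys, 1 - (1 - p)^n. A short sketch, with per-survey probabilities chosen from within the reported ranges purely for illustration:

      # Cumulative detection probability after n independent surveys.
      # The per-survey probabilities are illustrative picks from the reported ranges.
      def cumulative_detection(p_single, n_surveys):
          return 1 - (1 - p_single) ** n_surveys

      for method, p in [("bottle trap", 0.10), ("eDNA sample", 0.60)]:
          print(method, [round(cumulative_detection(p, n), 2) for n in (1, 5, 10)])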

  20. Machine cost analysis using the traditional machine-rate method and ChargeOut!

    Treesearch

    E. M. (Ted) Bilek

    2009-01-01

    Forestry operations require ever more use of expensive capital equipment. Mechanization is frequently necessary to perform cost-effective and safe operations. Increased capital should mean more sophisticated capital costing methodologies. However the machine rate method, which is the costing methodology most frequently used, dates back to 1942. CHARGEOUT!, a recently...

  1. 3. Evaluation of unstable lands for interagency watershed analysis

    Treesearch

    Leslie M. Reid; Robert R. Ziemer; Mark E. Smith; Colin Close

    1994-01-01

    Although methods for evaluating landslide rates and distributions are well developed, much less attention has been paid to evaluating the biological and physical role of landsliding. New directions in land management on Federal lands of the Pacific Northwest now require such evaluations for designing Riparian Reserves. Traditional analysis methods are no...

  2. Turning Teaching Upside Down

    ERIC Educational Resources Information Center

    Seeley, Cathy L.

    2017-01-01

    The traditional method of teaching math--showing students how to do a procedure, then assigning problems that require them to use that exact procedure--leads to adults who don't know how to approach problems that don't look like those in their math book. Seeley describes an alternative teaching method (upside-down teaching) in which teachers give…

  3. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  4. Task analysis method for procedural training curriculum development.

    PubMed

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments.

  5. Evaluation of Team-Based Learning and Traditional Instruction in Teaching Removable Partial Denture Concepts.

    PubMed

    Echeto, Luisa F; Sposetti, Venita; Childs, Gail; Aguilar, Maria L; Behar-Horenstein, Linda S; Rueda, Luis; Nimmo, Arthur

    2015-09-01

    The aim of this study was to evaluate the effectiveness of team-based learning (TBL) methodology on dental students' retention of knowledge regarding removable partial denture (RPD) treatment. The process of learning RPD treatment requires that students first acquire foundational knowledge and then use critical thinking skills to apply that knowledge to a variety of clinical situations. The traditional approach to teaching, characterized by a reliance on lectures, is not the most effective method for learning clinical applications. To address the limitations of that approach, the teaching methodology of the RPD preclinical course at the University of Florida was changed to TBL, which has been shown to motivate student learning and improve clinical performance. A written examination was constructed to compare the impact of TBL with that of traditional teaching regarding students' retention of knowledge and their ability to evaluate, diagnose, and treatment plan a partially edentulous patient with an RPD prosthesis. Students taught using traditional and TBL methods took the same examination. The response rate (those who completed the examination) for the class of 2013 (traditional method) was 94% (79 students of 84); for the class of 2014 (TBL method), it was 95% (78 students of 82). The results showed that students who learned RPD with TBL scored higher on the examination than those who learned RPD with traditional methods. Compared to the students taught with the traditional method, the TBL students' proportion of passing grades was statistically significantly higher (p=0.002), and 23.7% more TBL students passed the examination. The mean score for the TBL class (0.758) compared to the conventional class (0.700) was statistically significant with a large effect size, also demonstrating the practical significance of the findings. The results of the study suggest that TBL methodology is a promising approach to teaching RPD with successful outcomes.

  6. The Evolution of Genetics: Alzheimer's and Parkinson's Diseases.

    PubMed

    Singleton, Andrew; Hardy, John

    2016-06-15

    Genetic discoveries underlie the majority of the current thinking in neurodegenerative disease. This work has been driven by the significant gains made in identifying causal mutations; however, the translation of genetic causes of disease into pathobiological understanding remains a challenge. The application of a second generation of genetics methods allows the dissection of moderate and mild genetic risk factors for disease. This requires new thinking in two key areas: what constitutes proof of pathogenicity, and how we translate these findings into biological understanding. Here we describe the progress and ongoing evolution in genetics. We describe a view that rejects the tradition that genetic proof has to be absolute before functional characterization and centers on a multi-dimensional approach integrating genetics, reference data, and functional work. We also argue that these challenges cannot be efficiently met by traditional hypothesis-driven methods but that high-content, system-wide efforts are required. Published by Elsevier Inc.

  7. Pain Perception: Computerized versus Traditional Local Anesthesia in Pediatric Patients.

    PubMed

    Mittal, M; Kumar, A; Srivastava, D; Sharma, P; Sharma, S

    2015-01-01

    Local anesthetic injection is one of the most anxiety-provoking procedures for both children and adult patients in dentistry. A computerized system for slow delivery of local anesthetic has been developed as a possible solution to reduce the pain related to the local anesthetic injection. The present study was conducted to evaluate and compare pain perception rates in pediatric patients with a computerized system and the traditional method, both objectively and subjectively. This randomized controlled study enrolled one hundred children aged 8-12 years in good physical and mental health, assessed as cooperative and requiring extraction of maxillary primary molars. Children were divided into two groups by random sampling: Group A received buccal and palatal infiltration injections using the Wand, while Group B received buccal and palatal infiltration using a traditional syringe. A Visual Analog Scale (VAS) was used for the patient's subjective evaluation of pain perception. The Sound, Eye, Motor (SEM) scale was used as an objective method, in which the sound, eye, and motor reactions of the patient were observed, and heart rate measured with a pulse oximeter served as the physiological parameter for objective evaluation. Patients experienced significantly less injection pain with the computerized method during palatal infiltration, whereas the difference during buccal infiltration was not statistically significant. Heart rate increased during both buccal and palatal infiltration with traditional and computerized local anesthesia, but the difference between the traditional and computerized methods was not statistically significant. It was concluded that pain perception was significantly greater during traditional palatal infiltration than during computerized palatal infiltration, while there was no difference in pain perception during buccal infiltration between the two groups.

  8. High-throughput screening for bioactive components from traditional Chinese medicine.

    PubMed

    Zhu, Yanhui; Zhang, Zhiyun; Zhang, Meng; Mais, Dale E; Wang, Ming-Wei

    2010-12-01

    Throughout the centuries, traditional Chinese medicine has been a rich resource in the development of new drugs. Modern drug discovery, which relies increasingly on automated high throughput screening and quick hit-to-lead development, however, is confronted with the challenges of the chemical complexity associated with natural products. New technologies for biological screening as well as library building are in great demand in order to meet the requirements. Here we review developments in these techniques from the perspective of their applicability to natural product drug discovery. Methods for library building, component characterization, and biological evaluation, as well as other screening methods including NMR and X-ray diffraction, are discussed.

  9. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  10. Table-top job analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-12-01

    The purpose of this Handbook is to establish general training program guidelines for training personnel in developing training for operation, maintenance, and technical support personnel at Department of Energy (DOE) nuclear facilities. TTJA is not the only method of job analysis; however, when conducted properly TTJA can be cost effective, efficient, and self-validating, and represents an effective method of defining job requirements. The table-top job analysis is suggested in the DOE Training Accreditation Program manuals as an acceptable alternative to traditional methods of analyzing job requirements. DOE 5480-20A strongly endorses and recommends it as the preferred method for analyzing jobs for positions addressed by the Order.

  11. Improving traditional balancing methods for high-speed rotors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ling, J.; Cao, Y.

    1996-01-01

    This paper introduces frequency response functions, analyzes the relationships between the frequency response functions and influence coefficients theoretically, and derives corresponding mathematical equations for high-speed rotor balancing. The relationships between the imbalance masses on the rotor and frequency response functions are also analyzed based upon the modal balancing method, and the equations related to the static and dynamic imbalance masses and the frequency response function are obtained. Experiments on a high-speed rotor balancing rig were performed to verify the theory, and the experimental data agree satisfactorily with the analytical solutions. The improvement on the traditional balancing method proposed in this paper will substantially reduce the number of rotor startups required during the balancing process of rotating machinery.

  12. Comparing the long-term retention of a physiology course for medical students with the traditional and problem-based learning.

    PubMed

    Pourshanazari, A A; Roohbakhsh, A; Khazaei, M; Tajadini, H

    2013-03-01

    The rapid improvements in medical sciences and the ever-increasing volume of related data require novel methods of instruction. One such method, which has been given less than due attention in Iran, is problem-based learning (PBL). In this study, we aimed to evaluate the impact of study skills and of the PBL method on short- and long-term retention of information provided to medical students in a respiratory physiology course and to compare it with the traditional learning method. Thirty-nine medical students from the Medical School of Kerman University of Medical Sciences, Kerman, Iran (2006-2010) were enrolled and randomly allocated into three equal groups (13 in each group). All groups took a pre-test to assess their baseline knowledge of respiratory physiology. Two groups were instructed using the traditional method, and one group used PBL. Of the two traditional-method groups, one received instruction in study skills and the other did not. After the PBL group attended the study skills workshop, tutors guided their learning. In the final term test, the students who had learned study skills and were instructed with the traditional method scored higher than the other groups (p < 0.05). However, in the 1-year (p < 0.05) and 4-year (p < 0.01) follow-up examinations, the PBL group achieved significantly higher scores. Although PBL had no positive effect on students' final term exam, it yielded a deeper and better retained understanding of the course material. Moreover, considering the positive effect of study skills on long-term student scores, we recommend that students receive instruction in appropriate study skills when they enter university.

  13. Methods for Automating Analysis of Glacier Morphology for Regional Modelling: Centerlines, Extensions, and Elevation Bands

    NASA Astrophysics Data System (ADS)

    Viger, R. J.; Van Beusekom, A. E.

    2016-12-01

    The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically-based, spatially distributed daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a fully automated variation of logic previously presented in the literature for defining the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable. Instead, a path is derived from a cost function. Although only a single path is presented in our results, the method can be easily modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows. More usefully, this method can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps. These include synthesizing the glacier centerline network with one developed with a traditional DEM analysis, ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.
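
    To make the cost-function idea concrete, the following minimal sketch traces a least-cost path over a gridded glacier surface with a simple Dijkstra search. The synthetic DEM, the distance-from-margin term, and the cost weights are illustrative assumptions, not the parameterization used by the authors.

    ```python
    # Minimal sketch: trace a glacier "centerline" as the least-cost path across a
    # cost grid. The cost combines a hypothetical elevation term and a
    # distance-from-margin term; both weights are placeholders.
    import heapq

    import numpy as np

    def least_cost_path(cost, start, end):
        """Dijkstra shortest path on a 2D cost grid (8-connected)."""
        nrow, ncol = cost.shape
        dist = np.full(cost.shape, np.inf)
        prev = {}
        dist[start] = 0.0
        pq = [(0.0, start)]
        while pq:
            d, node = heapq.heappop(pq)
            if node == end:
                break
            if d > dist[node]:
                continue
            r, c = node
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == 0 and dc == 0:
                        continue
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < nrow and 0 <= cc < ncol:
                        nd = d + cost[rr, cc] * np.hypot(dr, dc)
                        if nd < dist[rr, cc]:
                            dist[rr, cc] = nd
                            prev[(rr, cc)] = node
                            heapq.heappush(pq, (nd, (rr, cc)))
        # walk back from the terminus to the head to recover the path
        path, node = [end], end
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1]

    # Synthetic 30 x 50 glacier: elevation drops left to right, cost is lowest
    # away from the margins and toward lower elevations.
    dem = np.tile(np.linspace(2000.0, 1200.0, 50), (30, 1))        # elevation, m
    rows = np.arange(30)
    margin_dist = np.minimum(rows, rows[::-1])[:, None] * np.ones((1, 50))
    cost = 1.0 + 0.001 * (dem - dem.min()) + 1.0 / (1.0 + margin_dist)

    centerline = least_cost_path(cost, start=(15, 0), end=(15, 49))
    print(len(centerline), "cells along the traced centerline")
    ```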

  14. CB4-03: An Eye on the Future: A Review of Data Virtualization Techniques to Improve Research Analytics

    PubMed Central

    Richter, Jack; McFarland, Lela; Bredfeldt, Christine

    2012-01-01

    Background/Aims Integrating data across systems can be a daunting process. The traditional method of moving data to a common location, mapping fields with different formats and meanings, and performing data cleaning activities to ensure valid and reliable integration across systems can be both expensive and extremely time consuming. As the scope of needed research data increases, the traditional methodology may not be sustainable. Data Virtualization provides an alternative to traditional methods that may reduce the effort required to integrate data across disparate systems. Objective Our goal was to survey new methods in data integration, cloud computing, enterprise data management and virtual data management for opportunities to increase the efficiency of producing VDW and similar data sets. Methods Kaiser Permanente Information Technology (KPIT), in collaboration with the Mid-Atlantic Permanente Research Institute (MAPRI), reviewed methodologies in the burgeoning field of Data Virtualization. We identified potential strengths and weaknesses of new approaches to data integration. For each method, we evaluated its potential application for producing effective research data sets. Results Data Virtualization provides opportunities to reduce the amount of data movement required to integrate data sources on different platforms in order to produce research data sets. Additionally, Data Virtualization includes methods for managing “fuzzy” matching used to match fields known to have poor reliability such as names, addresses and social security numbers. These methods could improve the efficiency of integrating state and federal data such as patient race, death, and tumors with internal electronic health record data. Discussion The emerging field of Data Virtualization has considerable potential for increasing the efficiency of producing research data sets. An important next step will be to develop a proof-of-concept project that will help us understand the benefits and drawbacks of these techniques.

  15. Least-cost transportation planning in ODOT : feasibility report.

    DOT National Transportation Integrated Search

    1995-03-01

    Least-Cost Planning or Integrated Resource Planning is used in the electric utility industry to broaden the scope of choices to meet service requirements. This typically includes methods to reduce the demand for electricity as well as the more tradition...

  16. EPA Biofuels Research: Biofuel Vapor Generation and Monitoring Methods

    EPA Science Inventory

    The interest in renewable fuels and alternative energy sources has stimulated development of alternatives to traditional petroleum-based fuels. The EPA's Office of Transportation Air Quality (OTAQ) requires information regarding the potential health hazards of these fuels regardin...

  17. Estimation of crossing conflict at signalized intersection using high-resolution traffic data : final report.

    DOT National Transportation Integrated Search

    2017-03-01

    This project explores the possibility of using high-resolution traffic signal data to evaluate intersection safety. : Traditional methods using historical crash data collected from infrequently and randomly occurring vehicle : collisions can require ...

  18. A single center's experience with the bedside subdural evacuating port system: a useful alternative to traditional methods for chronic subdural hematoma evacuation.

    PubMed

    Safain, Mina; Roguski, Marie; Antoniou, Alexander; Schirmer, Clemens M; Schirmer, Clemens S; Malek, Adel M; Riesenburger, Ron

    2013-03-01

    Object The traditional methods for managing symptomatic chronic subdural hematoma (SDH) include evacuation via a bur hole or craniotomy, with or without drain placement. Because chronic SDH frequently occurs in elderly patients with multiple comorbidities, the bedside approach afforded by the subdural evacuating port system (SEPS) is an attractive alternative method that is performed under local anesthesia and conscious sedation. The goal of this study was to evaluate the radiographic and clinical outcomes of SEPS as compared with traditional methods. Methods A prospectively maintained database of 23 chronic SDHs treated by bur hole or craniotomy and of 23 chronic SDHs treated by SEPS drainage at Tufts Medical Center was compiled, and a retrospective chart review was performed. Information regarding demographics, comorbidities, presenting symptoms, and outcome was collected. The volume of SDH before and after treatment was semiautomatically measured using imaging software. Results There was no significant difference in initial SDH volume (94.5 cm(3) vs 112.6 cm(3), respectively; p = 0.25) or final SDH volume (31.9 cm(3) vs 28.2 cm(3), respectively; p = 0.65) between SEPS drainage and traditional methods. In addition, there was no difference in mortality (4.3% vs 9.1%, respectively; p = 0.61), length of stay (11 days vs 9.1 days, respectively; p = 0.48), or stability of subdural evacuation (94.1% vs 83.3%, respectively; p = 0.60) for the SEPS and traditional groups at an average follow-up of 12 and 15 weeks, respectively. Only 2 of 23 SDHs treated by SEPS required further treatment by bur hole or craniotomy due to inadequate evacuation of subdural blood. Conclusions The SEPS is a safe and effective alternative to traditional methods of evacuation of chronic SDHs and should be considered in patients presenting with a symptomatic chronic SDH.

  19. Evaluating Student-Generated Film as a Learning Tool for Qualitative Methods: Geographical "Drifts" and the City

    ERIC Educational Resources Information Center

    Anderson, Jon

    2013-01-01

    Film as a tool for learning offers considerable opportunity for enhancing student understanding. This paper reflects on the experiences of a project that required students to make a short film demonstrating their practical understanding of qualitative methods. In the psychogeographical tradition, students were asked to "drift" across the…

  20. A Model for Engaging Students in a Research Experience Involving Variational Techniques, Mathematica, and Descent Methods.

    ERIC Educational Resources Information Center

    Mahavier, W. Ted

    2002-01-01

    Describes a two-semester numerical methods course that serves as a research experience for undergraduate students without requiring external funding or the modification of current curriculum. Uses an engineering problem to introduce students to constrained optimization via a variation of the traditional isoperimetric problem of finding the curve…

  1. Using Student-Centered Cases in the Classroom: An Action Inquiry Approach to Leadership Development

    ERIC Educational Resources Information Center

    Foster, Pacey; Carboni, Inga

    2009-01-01

    This article addresses the concern that business schools are not adequately developing the practical leadership skills that are required in the real world of management. The article begins by discussing the limitations of traditional case methods for teaching behavioral skills. This approach is contrasted with an alternative case method drawn from…

  2. A fast algorithm for forward-modeling of gravitational fields in spherical coordinates with 3D Gauss-Legendre quadrature

    NASA Astrophysics Data System (ADS)

    Zhao, G.; Liu, J.; Chen, B.; Guo, R.; Chen, L.

    2017-12-01

    Forward modeling of gravitational fields at large scale requires considering the curvature of the Earth and evaluating Newton's volume integral in spherical coordinates. To obtain fast and accurate gravitational effects of subsurface structures, the subsurface mass distribution is usually discretized into small spherical prisms (called tesseroids). The gravity fields of tesseroids are generally calculated numerically. One of the commonly used numerical methods is the 3D Gauss-Legendre quadrature (GLQ). However, the traditional GLQ integration suffers from low computational efficiency and relatively poor accuracy when the observation surface is close to the source region. We developed a fast, high-accuracy 3D GLQ integration based on kernel matrix equivalence, adaptive discretization, and parallelization using OpenMP. The kernel matrix equivalence strategy increases efficiency and reduces memory consumption by calculating and storing identical matrix elements only once. The adaptive discretization strategy is used to improve accuracy. Numerical investigations show that the execution time of the proposed method is reduced by two orders of magnitude compared with the traditional method without these optimizations. High-accuracy results are also guaranteed no matter how close the computation points are to the source region. In addition, the algorithm reduces the memory requirement by a factor of N compared with the traditional method, where N is the number of discretization cells of the source region in the longitudinal direction. This makes large-scale gravity forward modeling and inversion with fine discretization feasible.
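
    As a point of reference for the traditional scheme that the optimized method accelerates, the sketch below evaluates the Newtonian potential of a single tesseroid with plain 3D Gauss-Legendre quadrature. The node count, density, and test geometry are illustrative choices, not values from the study.

    ```python
    # 3D Gauss-Legendre quadrature (GLQ) of Newton's volume integral for one
    # tesseroid (spherical prism), observed at a point above it.
    import numpy as np

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def tesseroid_potential(r1, r2, lat1, lat2, lon1, lon2, rho, obs, n=8):
        """Potential at obs = (r, lat, lon) [m, rad, rad] via n-node GLQ per axis."""
        x, w = np.polynomial.legendre.leggauss(n)           # nodes/weights on [-1, 1]
        rr = 0.5 * (r2 - r1) * x + 0.5 * (r2 + r1)          # map nodes to each interval
        pp = 0.5 * (lat2 - lat1) * x + 0.5 * (lat2 + lat1)
        ll = 0.5 * (lon2 - lon1) * x + 0.5 * (lon2 + lon1)
        jac = 0.125 * (r2 - r1) * (lat2 - lat1) * (lon2 - lon1)
        r0, p0, l0 = obs
        total = 0.0
        for i, ri in enumerate(rr):
            for j, pj in enumerate(pp):
                for k, lk in enumerate(ll):
                    cospsi = (np.sin(p0) * np.sin(pj)
                              + np.cos(p0) * np.cos(pj) * np.cos(l0 - lk))
                    dist = np.sqrt(r0**2 + ri**2 - 2.0 * r0 * ri * cospsi)
                    total += w[i] * w[j] * w[k] * ri**2 * np.cos(pj) / dist
        return G * rho * jac * total

    # Example: a 1 x 1 degree, 10 km thick tesseroid of density 2670 kg/m^3,
    # observed 50 km above its top surface.
    R = 6371e3
    V = tesseroid_potential(R - 10e3, R, np.radians(0.0), np.radians(1.0),
                            np.radians(0.0), np.radians(1.0), 2670.0,
                            obs=(R + 50e3, np.radians(0.5), np.radians(0.5)))
    print(f"potential V = {V:.3e} m^2/s^2")
    ```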

  3. A Summary of the Space-Time Conservation Element and Solution Element (CESE) Method

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.

    2015-01-01

    The space-time Conservation Element and Solution Element (CESE) method for solving conservation laws is examined for its development motivation and design requirements. The characteristics of the resulting scheme are discussed. The discretization of the Euler equations is presented to show readers how to construct a scheme based on the CESE method. The differences and similarities between the CESE method and other traditional methods are discussed. The strengths and weaknesses of the method are also addressed.

  4. The Determinants of Traditional Medicine Use in Northern Tanzania: A Mixed-Methods Study

    PubMed Central

    Stanifer, John W.; Patel, Uptal D.; Karia, Francis; Thielman, Nathan; Maro, Venance; Shimbi, Dionis; Kilaweh, Humphrey; Lazaro, Matayo; Matemu, Oliver; Omolo, Justin; Boyd, David

    2015-01-01

    Introduction Traditional medicines are an important part of healthcare in sub-Saharan Africa, and building successful disease treatment programs that are sensitive to traditional medicine practices will require an understanding of their current use and roles, including from a biomedical perspective. Therefore, we conducted a mixed-method study in Northern Tanzania in order to characterize the extent of and reasons for the use of traditional medicines among the general population so that we can better inform public health efforts in the region. Methods Between December 2013 and June 2014 in Kilimanjaro, Tanzania, we conducted 5 focus group discussions and 27 in-depth interviews of key informants. The data from these sessions were analyzed using an inductive framework method with cultural insider-outsider coding. From these results, we developed a structured survey designed to test different aspects of traditional medicine use and administered it to a random sample of 655 adults from the community. The results were triangulated to explore converging and diverging themes. Results Most structured survey participants (68%) reported knowing someone who frequently used traditional medicines, and the majority (56%) reported using them themselves in the previous year. The most common uses were for symptomatic ailments (42%), chronic diseases (15%), reproductive problems (11%), and malaria/febrile illnesses (11%). We identified five major determinants for traditional medicine use in Northern Tanzania: biomedical healthcare delivery, credibility of traditional practices, strong cultural identities, individual health status, and disease understanding. Conclusions In order to better formulate effective local disease management programs that are sensitive to TM practices, we described the determinants of TM use. Additionally, we found TM use to be high in Northern Tanzania and that its use is not limited to lower-income areas or rural settings. After symptomatic ailments, chronic diseases were reported as the most common reason for TM use which may be particularly important in Northern Tanzania where non-communicable diseases are a rapidly growing burden. PMID:25848762

  5. A dynamic access control method based on QoS requirement

    NASA Astrophysics Data System (ADS)

    Li, Chunquan; Wang, Yanwei; Yang, Baoye; Hu, Chunyang

    2013-03-01

    A dynamic access control method is put forward to ensure the security of shared services in Cloud Manufacturing, in accordance with the application characteristics of cloud manufacturing collaborative tasks. In this method, the role-based access control (RBAC) model is extended according to the characteristics of cloud manufacturing. On top of traditional static authorization, constraints derived from the QoS requirements of the task context are applied to access control. Fuzzy policy rules are established over the weighted interval values of permissions. Users' access rights to executable services are dynamically adjusted through fuzzy reasoning based on the QoS requirements of the task. The main elements of the model are described, and the fuzzy reasoning algorithm based on weighted interval values and QoS requirements is studied. An effective method is thus provided for access control in cloud manufacturing.
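
    The abstract does not reproduce the actual fuzzy policy rules, so the toy sketch below only illustrates the general idea: a task QoS score is graded by simple triangular membership functions and mapped to an effective permission weight inside a weighted interval. All membership functions, weights, and thresholds are invented for illustration.

    ```python
    # Toy QoS-driven adjustment of an RBAC permission; every numeric value here is
    # a placeholder, not a rule from the paper.
    from dataclasses import dataclass

    @dataclass
    class Permission:
        name: str
        low: float    # lower bound of the weighted interval value
        high: float   # upper bound of the weighted interval value

    def triangular(x, a, b, c):
        """Triangular fuzzy membership function on [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def effective_permission(perm, qos_score):
        """Defuzzify the permission's weighted interval using QoS grades."""
        grades = {
            "low":    triangular(qos_score, 0.0, 0.0, 0.5),
            "medium": triangular(qos_score, 0.2, 0.5, 0.8),
            "high":   triangular(qos_score, 0.5, 1.0, 1.0),
        }
        centers = {"low": perm.low,
                   "medium": 0.5 * (perm.low + perm.high),
                   "high": perm.high}
        num = sum(g * centers[k] for k, g in grades.items())
        den = sum(grades.values()) or 1.0
        return num / den

    run_service = Permission("execute_shared_service", low=0.2, high=0.9)
    for qos in (0.3, 0.6, 0.95):
        weight = effective_permission(run_service, qos)
        print(f"QoS={qos:.2f} -> permission weight {weight:.2f}, "
              f"{'granted' if weight >= 0.5 else 'denied'}")
    ```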

  6. The Traditional Chinese Medicine and Relevant Treatment for the Efficacy and Safety of Atopic Dermatitis: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Shi, Zhao-feng; Song, Tie-bing; Xie, Juan; Yan, Yi-quan

    2017-01-01

    Background Atopic dermatitis (AD) has become a common skin disease that requires systematic and comprehensive treatment to achieve adequate clinical control. Traditional Chinese medicines and related treatments have shown clinical effects for AD in many studies, but systematic reviews and meta-analyses of these treatments are lacking. Objective A systematic review and meta-analysis based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was conducted to evaluate the efficacy and safety of traditional Chinese medicines and related treatments for AD. Methods Randomized controlled trials (RCTs) were searched using standardized search rules in eight medical databases from inception to December 2016, and a total of 24 articles with 1,618 patients were included in this meta-analysis. Results Traditional Chinese medicines and related treatments did not show statistically significant differences from the control group in clinical effectiveness, SCORAD amelioration, or SSRI amelioration for AD. However, EASI amelioration with traditional Chinese medicines and related treatments was superior to that of the control group. Conclusion Conclusions about the efficacy and safety of traditional Chinese medicine and related treatments for AD therapy must be drawn cautiously. More standardized, multicenter, double-blind RCTs of traditional Chinese medicine and related treatments for AD are required to provide further clinical evidence. PMID:28713436

  7. Comparing the Long-Term Retention of a Physiology Course for Medical Students with the Traditional and Problem-Based Learning

    ERIC Educational Resources Information Center

    Pourshanazari, A. A.; Roohbakhsh, A.; Khazaei, M.; Tajadini, H.

    2013-01-01

    The rapid improvements in medical sciences and the ever-increasing related data, however, require novel methods of instruction. One such method, which has been given less than due attention in Iran, is problem-based learning (PBL). In this study, we aimed to evaluate the impact of study skills and the PBL methods on short and long-term retention…

  8. The relationship between human security, demand for arms and disarmament in the horn of Africa.

    PubMed

    Gebrewold, Kiflemariam

    2002-01-01

    The drive to find security through possession of weapons is linked to the history and culture of a social group. Amongst pastoralists in the Horn of Africa, state security systems such as the police have failed, and less-lethal traditional weapons have recently been replaced by small arms and other light weapons. A warrior or vendetta culture with these arms leads to violent inter-clan clashes with many casualties, although traditional methods of weapons control still seem operational within clans. Understanding the drive to seek weapons is essential in finding ways to control their use. Improving the capacities of the police must come hand in hand with human rights training and an end to corruption. Further work is required on how traditional methods of arms control can be co-operatively linked with state controls.

  9. Iodine Absorption Cells Purity Testing.

    PubMed

    Hrabina, Jan; Zucco, Massimo; Philippe, Charles; Pham, Tuan Minh; Holá, Miroslava; Acef, Ouali; Lazar, Josef; Číp, Ondřej

    2017-01-06

    This article deals with the evaluation of the chemical purity of iodine-filled absorption cells and the optical frequency references used for the frequency locking of laser standards. We summarize the recent trends and progress in absorption cell technology and we focus on methods for iodine cell purity testing. We compare two independent experimental systems based on the laser-induced fluorescence method, showing an improvement of measurement uncertainty by introducing a compensation system reducing unwanted influences. We show the advantages of this technique, which is relatively simple and does not require extensive hardware equipment. As an alternative to the traditionally used methods we propose an approach of hyperfine transitions' spectral linewidth measurement. The key characteristic of this method is demonstrated on a set of testing iodine cells. The relationship between laser-induced fluorescence and transition linewidth methods will be presented as well as a summary of the advantages and disadvantages of the proposed technique (in comparison with traditional measurement approaches).

  10. Iodine Absorption Cells Purity Testing

    PubMed Central

    Hrabina, Jan; Zucco, Massimo; Philippe, Charles; Pham, Tuan Minh; Holá, Miroslava; Acef, Ouali; Lazar, Josef; Číp, Ondřej

    2017-01-01

    This article deals with the evaluation of the chemical purity of iodine-filled absorption cells and the optical frequency references used for the frequency locking of laser standards. We summarize the recent trends and progress in absorption cell technology and we focus on methods for iodine cell purity testing. We compare two independent experimental systems based on the laser-induced fluorescence method, showing an improvement of measurement uncertainty by introducing a compensation system reducing unwanted influences. We show the advantages of this technique, which is relatively simple and does not require extensive hardware equipment. As an alternative to the traditionally used methods we propose an approach of hyperfine transitions’ spectral linewidth measurement. The key characteristic of this method is demonstrated on a set of testing iodine cells. The relationship between laser-induced fluorescence and transition linewidth methods will be presented as well as a summary of the advantages and disadvantages of the proposed technique (in comparison with traditional measurement approaches). PMID:28067834

  11. Rapid detection of Escherichia coli and enterococci in recreational water using an immunomagnetic separation/adenosine triphosphate technique

    USGS Publications Warehouse

    Bushon, R.N.; Brady, A.M.; Likirdopulos, C.A.; Cireddu, J.V.

    2009-01-01

    Aims: The aim of this study was to examine a rapid method for detecting Escherichia coli and enterococci in recreational water. Methods and Results: Water samples were assayed for E. coli and enterococci by traditional and immunomagnetic separation/adenosine triphosphate (IMS/ATP) methods. Three sample treatments were evaluated for the IMS/ATP method: double filtration, single filtration, and direct analysis. Pearson's correlation analysis showed strong, significant, linear relations between IMS/ATP and traditional methods for all sample treatments; strongest linear correlations were with the direct analysis (r = 0.62 and 0.77 for E. coli and enterococci, respectively). Additionally, simple linear regression was used to estimate bacteria concentrations as a function of IMS/ATP results. The correct classification of water-quality criteria was 67% for E. coli and 80% for enterococci. Conclusions: The IMS/ATP method is a viable alternative to traditional methods for faecal-indicator bacteria. Significance and Impact of the Study: The IMS/ATP method addresses critical public health needs for the rapid detection of faecal-indicator contamination and has potential for satisfying US legislative mandates requiring methods to detect bathing water contamination in 2 h or less. Moreover, IMS/ATP equipment is considerably less costly and more portable than that for molecular methods, making the method suitable for field applications. © 2009 The Authors.
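
    The regression step described above can be illustrated with a short sketch that fits traditional culture counts against IMS/ATP luminescence and then applies the fit to a new reading. The paired data are synthetic placeholders, and the 235 CFU/100 mL single-sample E. coli value is cited only as an example criterion.

    ```python
    # Fit log-transformed culture counts to log-transformed IMS/ATP signal and
    # classify a new sample against an example recreational water criterion.
    import numpy as np

    # hypothetical paired measurements (log10-transformed)
    log_rlu = np.array([2.1, 2.6, 3.0, 3.4, 3.9, 4.3])   # log10 RLU (IMS/ATP)
    log_cfu = np.array([1.8, 2.3, 2.9, 3.2, 3.8, 4.1])   # log10 CFU/100 mL (culture)

    r = np.corrcoef(log_rlu, log_cfu)[0, 1]               # Pearson correlation
    slope, intercept = np.polyfit(log_rlu, log_cfu, deg=1)
    print(f"r = {r:.2f}, log10(CFU) = {slope:.2f} * log10(RLU) + {intercept:.2f}")

    # Predict the concentration for a new IMS/ATP reading and compare it with an
    # example single-sample criterion of 235 CFU/100 mL for E. coli.
    new_rlu = 10 ** 3.6
    predicted_cfu = 10 ** (slope * np.log10(new_rlu) + intercept)
    verdict = "exceeds criterion" if predicted_cfu > 235 else "meets criterion"
    print(f"predicted {predicted_cfu:.0f} CFU/100 mL -> {verdict}")
    ```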

  12. Semi-automating the manual literature search for systematic reviews increases efficiency.

    PubMed

    Chapman, Andrea L; Morgan, Laura C; Gartlehner, Gerald

    2010-03-01

    To minimise retrieval bias, manual literature searches are a key part of the search process of any systematic review. Considering the need to have accurate information, valid results of the manual literature search are essential to ensure scientific standards; likewise, efficient approaches that minimise the amount of personnel time required to conduct a manual literature search are of great interest. The objective of this project was to determine the validity and efficiency of a new manual search method that utilises the scopus database. We used the traditional manual search approach as the gold standard to determine the validity and efficiency of the proposed scopus method. Outcome measures included completeness of article detection and personnel time involved. Applying both methods independently, we compared them in terms of the accuracy of the results (validity) and the time spent conducting the search (efficiency). Regarding accuracy, the scopus method identified the same studies as the traditional approach, indicating its validity. In terms of efficiency, using scopus led to a time saving of 62.5% compared with the traditional approach (3 h versus 8 h). The scopus method can significantly improve the efficiency of manual searches and thus of systematic reviews.

  13. Aspheres for high speed cine lenses

    NASA Astrophysics Data System (ADS)

    Beder, Christian

    2005-09-01

    To fulfil the requirements of today's high-performance cine lenses, aspheres are an indispensable part of lens design. Besides making them manageable in shape and size, tolerancing aspheres is an essential part of the development process. The traditional method of tolerancing individual aspherical coefficients yields only theoretical figures that cannot be used in practice. To obtain viable parameters that can easily be handled in a production line, more advanced techniques are required. In this presentation, a method of simulating characteristic manufacturing errors and deducing surface deviation and slope error tolerances will be shown.

  14. The CREATE Method Does Not Result in Greater Gains in Critical Thinking than a More Traditional Method of Analyzing the Primary Literature †

    PubMed Central

    Segura-Totten, Miriam; Dalman, Nancy E.

    2013-01-01

    Analysis of the primary literature in the undergraduate curriculum is associated with gains in student learning. In particular, the CREATE (Consider, Read, Elucidate hypotheses, Analyze and interpret the data, and Think of the next Experiment) method is associated with an increase in student critical thinking skills. We adapted the CREATE method within a required cell biology class and compared the learning gains of students using CREATE to those of students involved in less structured literature discussions. We found that while both sets of students had gains in critical thinking, students who used the CREATE method did not show significant improvement over students engaged in a more traditional method for dissecting the literature. Students also reported similar learning gains for both literature discussion methods. Our study suggests that, at least in our educational context, the CREATE method does not lead to higher learning gains than a less structured way of reading primary literature. PMID:24358379

  15. A Residual Mass Ballistic Testing Method to Compare Armor Materials or Components (Residual Mass Ballistic Testing Method)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin Langhorst; Thomas M Lillo; Henry S Chu

    2014-05-01

    A statistics-based ballistic test method is presented for use when comparing multiple groups of test articles of unknown relative ballistic perforation resistance. The method is intended to be more efficient than many traditional methods for research and development testing. To establish the validity of the method, it is employed in this study to compare test groups of known relative ballistic performance. Multiple groups of test articles were perforated using consistent projectiles and impact conditions. Test groups were made of rolled homogeneous armor (RHA) plates and differed in thickness. After perforation, each residual projectile was captured behind the target and its mass was measured. The residual masses measured for each test group were analyzed to provide ballistic performance rankings with associated confidence levels. When compared to traditional V50 methods, the residual mass (RM) method was found to require fewer test events and be more tolerant of variations in impact conditions.
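
    The abstract does not spell out the exact statistical procedure, so the sketch below uses a generic Welch t-test on synthetic residual masses to show how a lower mean residual mass can be turned into a performance ranking with an associated confidence level.

    ```python
    # Compare residual projectile masses for two armor groups; a lower mean
    # residual mass is taken to indicate greater perforation resistance. Data are
    # synthetic and the Welch t-test is a generic stand-in for the study's method.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    residual_thin = rng.normal(loc=9.5, scale=0.8, size=12)    # grams, thinner plate
    residual_thick = rng.normal(loc=8.1, scale=0.9, size=12)   # grams, thicker plate

    t, p_two_sided = stats.ttest_ind(residual_thick, residual_thin, equal_var=False)
    # one-sided p for H1: the thicker plate leaves a smaller residual mass
    p_one_sided = p_two_sided / 2 if t < 0 else 1 - p_two_sided / 2

    print(f"mean residual mass: thick {residual_thick.mean():.2f} g, "
          f"thin {residual_thin.mean():.2f} g")
    print(f"one-sided p = {p_one_sided:.4f}; the thicker plate is ranked more "
          f"resistant at roughly the {100 * (1 - p_one_sided):.1f}% confidence level")
    ```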

  16. Personality preference distribution of dental students admitted to one dental school using different selection methods.

    PubMed

    von Bergmann, Hsingchi; Dalrymple, Kirsten R; Shuler, Charles F

    2014-04-01

    This study sought to determine whether using the Myers-Briggs Type Indicator (MBTI) would detect differences in personality preferences in first-year dental students admitted to the same dental school through different admission methods. First-year dental students admitted in 2000 and 2001 were given the MBTI instrument during orientation prior to the start of classes. In fall 2000, the Class of 2004 had 140 students, with 116 in the traditional track and twenty-four in the parallel problem-based learning (PBL) track. In fall 2001, the Class of 2005 had 144 students, all enrolled in the PBL curriculum. All students admitted to the PBL track had experienced a process that included evaluation of their participation in a small group. Students in the traditional track had individual interviews with faculty members. Both student groups were required to meet the same baseline grade point average and Dental Admission Test standards. In 2000, the PBL students showed personality preferences that were distinctly different from the personality preferences of traditional track students in the categories of Extroversion (89 percent PBL, 44 percent traditional) and Thinking (72 percent PBL, 39 percent traditional). In 2001, the all-PBL class retained the trend towards Extroversion (69 percent). This study suggests that admission method may effectively change the personality preference distribution exhibited by the students who are admitted to dental school.

  17. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules, and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its successful incorporation into current JPL development policies and processes.

  18. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL), a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules, and procedures for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its successful incorporation into current JPL development policies.

  19. A new method for determining which stars are near a star sensor field-of-view

    NASA Technical Reports Server (NTRS)

    Yates, Russell E., Jr.; Vedder, John D.

    1991-01-01

    A new method is described for determining which stars in a navigation star catalog are near a star sensor field of view (FOV). This method assumes that an estimate of spacecraft inertial attitude is known. Vector component ranges for the star sensor FOV are computed, so that stars whose vector components lie within these ranges are near the star sensor FOV. This method requires no presorting of the navigation star catalog, and is more efficient than traditional methods.
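
    A minimal sketch of the idea follows: the angle between the boresight and each coordinate axis bounds the admissible Cartesian components of star unit vectors, giving a cheap box test that can precede an exact dot-product check. The synthetic catalog, the particular bounding construction, and the 8-degree half-angle are assumptions for illustration, not the paper's exact formulation.

    ```python
    # Pre-filter catalog stars by per-component ranges, then confirm with an exact
    # angular test against the sensor boresight.
    import numpy as np

    def radec_to_unit(ra_deg, dec_deg):
        ra, dec = np.radians(ra_deg), np.radians(dec_deg)
        return np.column_stack([np.cos(dec) * np.cos(ra),
                                np.cos(dec) * np.sin(ra),
                                np.sin(dec)])

    def stars_near_fov(catalog_xyz, boresight, half_angle_deg):
        theta = np.radians(half_angle_deg)
        # spherical triangle bound: a star within theta of the boresight has an
        # angle to axis i within theta of the boresight-to-axis angle
        alpha = np.arccos(np.clip(boresight, -1.0, 1.0))
        lo = np.cos(np.minimum(alpha + theta, np.pi))      # component lower bounds
        hi = np.cos(np.maximum(alpha - theta, 0.0))        # component upper bounds
        in_box = np.all((catalog_xyz >= lo) & (catalog_xyz <= hi), axis=1)
        exact = catalog_xyz[in_box] @ boresight >= np.cos(theta)
        return np.flatnonzero(in_box)[exact]

    # synthetic catalog; sensor with an 8-degree half-angle pointed at RA 150, Dec +20
    rng = np.random.default_rng(0)
    catalog = radec_to_unit(rng.uniform(0, 360, 5000), rng.uniform(-90, 90, 5000))
    boresight = radec_to_unit(np.array([150.0]), np.array([20.0]))[0]
    idx = stars_near_fov(catalog, boresight, half_angle_deg=8.0)
    print(f"{idx.size} catalog stars fall inside the sensor field of view")
    ```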

  20. Unidirectional Fabric Drape Testing Method

    PubMed Central

    Mei, Zaihuan; Yang, Jingzhi; Zhou, Ting; Zhou, Hua

    2015-01-01

    In most cases, fabrics such as curtains, skirts, and suit pants drape under their own weight acting parallel to the fabric plane, whereas in the traditional drape testing method gravity is perpendicular to the fabric plane. As a result, the traditional test does not reflect the actual situation and its data are not convincing enough. To overcome this problem, this paper presents a novel method that simulates the real mechanical conditions and ensures that gravity acts parallel to the fabric plane. The method applies a low-cost Kinect sensor to capture the 3-dimensional (3D) drape profile, from which drape degree parameters and aesthetic parameters are obtained by 3D reconstruction and image processing and analysis techniques. The experiment was conducted on our self-devised drape-testing instrument with fabrics of different weave structures as test samples, and the results were compared with those of the traditional method and subjective evaluation. Through regression and correlation analysis we found that this novel testing method was significantly correlated with the traditional and subjective evaluation methods. We thus achieved a new, non-contact 3D measurement method for drape testing, namely the unidirectional fabric drape testing method. This method is more suitable for evaluating drape behavior because it is more in line with the actual mechanical conditions of draped fabrics and agrees well with the visual and aesthetic assessment of fabric style. PMID:26600387

  1. Lattice Boltzmann model for simulation of magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Chen, Shiyi; Chen, Hudong; Martinez, Daniel; Matthaeus, William

    1991-01-01

    A numerical method, based on a discrete Boltzmann equation, is presented for solving the equations of magnetohydrodynamics (MHD). The algorithm provides advantages similar to the cellular automaton method in that it is local and easily adapted to parallel computing environments. Because of much lower noise levels and less stringent requirements on lattice size, the method appears to be more competitive with traditional solution methods. Examples show that the model accurately reproduces both linear and nonlinear MHD phenomena.

  2. Parallel Exploration of Interaction Space by BioID and Affinity Purification Coupled to Mass Spectrometry.

    PubMed

    Hesketh, Geoffrey G; Youn, Ji-Young; Samavarchi-Tehrani, Payman; Raught, Brian; Gingras, Anne-Claude

    2017-01-01

    Complete understanding of cellular function requires knowledge of the composition and dynamics of protein interaction networks, the importance of which spans all molecular cell biology fields. Mass spectrometry-based proteomics approaches are instrumental in this process, with affinity purification coupled to mass spectrometry (AP-MS) now widely used for defining interaction landscapes. Traditional AP-MS methods are well suited to providing information regarding the temporal aspects of soluble protein-protein interactions, but the requirement to maintain protein-protein interactions during cell lysis and AP means that both weak-affinity interactions and spatial information is lost. A more recently developed method called BioID employs the expression of bait proteins fused to a nonspecific biotin ligase, BirA*, that induces in vivo biotinylation of proximal proteins. Coupling this method to biotin affinity enrichment and mass spectrometry negates many of the solubility and interaction strength issues inherent in traditional AP-MS methods, and provides unparalleled spatial context for protein interactions. Here we describe the parallel implementation of both BioID and FLAG AP-MS allowing simultaneous exploration of both spatial and temporal aspects of protein interaction networks.

  3. Rapid measurement of protein osmotic second virial coefficients by self-interaction chromatography.

    PubMed Central

    Tessier, Peter M; Lenhoff, Abraham M; Sandler, Stanley I

    2002-01-01

    Weak protein interactions are often characterized in terms of the osmotic second virial coefficient (B(22)), which has been shown to correlate with protein phase behavior, such as crystallization. Traditional methods for measuring B(22), such as static light scattering, are too expensive in terms of both time and protein to allow extensive exploration of the effects of solution conditions on B(22). In this work we have measured protein interactions using self-interaction chromatography, in which protein is immobilized on chromatographic particles and the retention of the same protein is measured in isocratic elution. The relative retention of the protein reflects the average protein interactions, which we have related to the second virial coefficient via statistical mechanics. We obtain quantitative agreement between virial coefficients measured by self-interaction chromatography and traditional characterization methods for both lysozyme and chymotrypsinogen over a wide range of pH and ionic strengths, yet self-interaction chromatography requires at least an order of magnitude less time and protein than other methods. The method thus holds significant promise for the characterization of protein interactions requiring only commonly available laboratory equipment, little specialized expertise, and relatively small investments of both time and protein. PMID:11867474

  4. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    PubMed

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently allocate newly recruited patients to the different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to minimize the variance of the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence of changing the prior distributions on the design. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when the total sample size is fixed, the proposed design can achieve greater power and/or a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
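
    One standard way to realize a minimum-variance randomization rate for a difference-in-means statistic is Neyman-type allocation, in which patients are assigned in proportion to the estimated outcome standard deviations of the arms. The sketch below illustrates that generic rule on synthetic interim data; it is not the authors' exact algorithm.

    ```python
    # Neyman-type allocation: randomize to arm A with probability sd_A/(sd_A+sd_B),
    # which (approximately) minimizes the variance of the difference in means.
    import numpy as np

    def allocation_rate(outcomes_a, outcomes_b):
        sd_a = np.std(outcomes_a, ddof=1)
        sd_b = np.std(outcomes_b, ddof=1)
        return sd_a / (sd_a + sd_b)

    rng = np.random.default_rng(1)
    interim_a = rng.normal(10.0, 4.0, size=30)   # arm A: noisier responses
    interim_b = rng.normal(9.0, 2.0, size=30)    # arm B: less variable

    p_a = allocation_rate(interim_a, interim_b)
    print(f"randomize the next patient to arm A with probability {p_a:.2f}")

    # Variance of the difference in means for a block of 100 new patients,
    # adaptive allocation versus fixed 1:1 allocation.
    n = 100
    var_a, var_b = np.var(interim_a, ddof=1), np.var(interim_b, ddof=1)
    var_adaptive = var_a / (p_a * n) + var_b / ((1 - p_a) * n)
    var_equal = var_a / (n / 2) + var_b / (n / 2)
    print(f"variance of difference: adaptive {var_adaptive:.3f} vs equal {var_equal:.3f}")
    ```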

  5. Manufacturing PDMS micro lens array using spin coating under a multiphase system

    NASA Astrophysics Data System (ADS)

    Sun, Rongrong; Yang, Hanry; Rock, D. Mitchell; Danaei, Roozbeh; Panat, Rahul; Kessler, Michael R.; Li, Lei

    2017-05-01

    The development of micro lens arrays has garnered much interest due to the increased demand for miniaturized systems. Traditional methods for manufacturing micro lens arrays have several shortcomings. For example, they require expensive facilities and long lead times, and traditional lens materials (i.e., glass) are typically heavy, costly, and difficult to manufacture. In this paper, we explore a method for manufacturing a polydimethylsiloxane (PDMS) micro lens array using a simple spin coating technique. The micro lens array is formed in an interfacial-tension-dominated system, and the influence of material properties and process parameters on the fabricated lens shape is examined. The lenses fabricated using this method show comparable optical properties, including surface finish and image quality, with reduced cost and manufacturing lead time.

  6. Using Penelope to assess the correctness of NASA Ada software: A demonstration of formal methods as a counterpart to testing

    NASA Technical Reports Server (NTRS)

    Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey

    1993-01-01

    Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.

  7. Analysis and evaluation of methods for backcalculation of Mr values : volume 1 : research report : final report.

    DOT National Transportation Integrated Search

    1993-01-01

    Use of the 1986 AASHTO Design Guide requires accurate estimates of the resilient modulus of flexible pavement materials. Traditionally, these properties have been determined from either laboratory testing or by backcalculation from deflection data. S...

  8. RECENT APPLICATIONS OF SOURCE APPORTIONMENT METHODS AND RELATED NEEDS

    EPA Science Inventory

    Traditional receptor modeling studies have utilized factor analysis (like principal component analysis, PCA) and/or Chemical Mass Balance (CMB) to assess source influences. The limitations of these approaches are that PCA is qualitative and CMB requires the input of source pr...

  9. Advanced Computing for Science.

    ERIC Educational Resources Information Center

    Hut, Piet; Sussman, Gerald Jay

    1987-01-01

    Discusses some of the contributions that high-speed computing is making to the study of science. Emphasizes the use of computers in exploring complicated systems without the simplification required in traditional methods of observation and experimentation. Provides examples of computer assisted investigations in astronomy and physics. (TW)

  10. Real-Time Culture Change Improves Lean Success: Sequenced Culture Change Gets Failing Grades.

    PubMed

    Kusy, Mitchell; Diamond, Marty; Vrchota, Scott

    2015-01-01

    Success with the Lean management system is rooted in a culture of stakeholder engagement and commitment. Unfortunately, many leaders view Lean as an "add-on" tool instead of one that requires a new way of thinking and approaching culture. This article addresses the "why, how, and what" to promote a Lean culture that works. We present a five-phased approach grounded in evidence-based practices of real-time culture change. We further help healthcare leaders understand the differences between traditional "sequenced" approaches to culture change and "real-time" methods--and why these real-time practices are more sustainable and ultimately more successful than traditional culture change methods.

  11. Comparison of Conjugate Gradient Density Matrix Search and Chebyshev Expansion Methods for Avoiding Diagonalization in Large-Scale Electronic Structure Calculations

    NASA Technical Reports Server (NTRS)

    Bates, Kevin R.; Daniels, Andrew D.; Scuseria, Gustavo E.

    1998-01-01

    We report a comparison of two linear-scaling methods which avoid the diagonalization bottleneck of traditional electronic structure algorithms. The Chebyshev expansion method (CEM) is implemented for carbon tight-binding calculations of large systems and its memory and timing requirements compared to those of our previously implemented conjugate gradient density matrix search (CG-DMS). Benchmark calculations are carried out on icosahedral fullerenes from C60 to C8640 and the linear scaling memory and CPU requirements of the CEM demonstrated. We show that the CPU requisites of the CEM and CG-DMS are similar for calculations with comparable accuracy.
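
    The core of the CEM can be shown on a toy problem: expand the Fermi function of a small tight-binding Hamiltonian in Chebyshev polynomials, build f(H) with the three-term matrix recursion, and compare against direct diagonalization. The chain Hamiltonian, inverse temperature, and expansion order are illustrative choices only.

    ```python
    # Chebyshev expansion of a matrix function: density matrix P = f(H) without
    # diagonalizing H (the reference diagonalization is used only to check error).
    import numpy as np

    def chebyshev_coeffs(f, order):
        """Chebyshev coefficients of f on [-1, 1] from Chebyshev-node sampling."""
        n = order + 1
        theta = np.pi * (np.arange(n) + 0.5) / n
        fx = f(np.cos(theta))
        k = np.arange(n)[:, None]
        c = 2.0 / n * (fx * np.cos(k * theta)).sum(axis=1)
        c[0] *= 0.5
        return c

    def matrix_function_cheb(H, f, order):
        """Approximate f(H) with the three-term Chebyshev recursion on matrices."""
        radius = np.abs(H).sum(axis=1).max()        # Gershgorin bound on |eigenvalues|
        Hs = H / radius                             # spectrum scaled into [-1, 1]
        c = chebyshev_coeffs(lambda x: f(x * radius), order)
        T_prev, T_curr = np.eye(H.shape[0]), Hs.copy()
        out = c[0] * T_prev + c[1] * T_curr
        for ck in c[2:]:
            T_prev, T_curr = T_curr, 2.0 * Hs @ T_curr - T_prev
            out += ck * T_curr
        return out

    # nearest-neighbour tight-binding chain at half filling (chemical potential 0)
    n_sites, beta, mu = 40, 10.0, 0.0
    H = -1.0 * (np.eye(n_sites, k=1) + np.eye(n_sites, k=-1))
    fermi = lambda e: 1.0 / (1.0 + np.exp(beta * (e - mu)))

    P_cheb = matrix_function_cheb(H, fermi, order=60)
    evals, evecs = np.linalg.eigh(H)                     # reference by diagonalization
    P_exact = evecs @ np.diag(fermi(evals)) @ evecs.T
    print("max |P_cheb - P_exact| =", np.abs(P_cheb - P_exact).max())
    ```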

  12. Reliable oligonucleotide conformational ensemble generation in explicit solvent for force field assessment using reservoir replica exchange molecular dynamics simulations

    PubMed Central

    Henriksen, Niel M.; Roe, Daniel R.; Cheatham, Thomas E.

    2013-01-01

    Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 microseconds of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations. PMID:23477537

  13. Reliable oligonucleotide conformational ensemble generation in explicit solvent for force field assessment using reservoir replica exchange molecular dynamics simulations.

    PubMed

    Henriksen, Niel M; Roe, Daniel R; Cheatham, Thomas E

    2013-04-18

    Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example, by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 μs of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations.

  14. A modified indirect mathematical model for evaluation of ethanol production efficiency in industrial-scale continuous fermentation processes.

    PubMed

    Canseco Grellet, M A; Castagnaro, A; Dantur, K I; De Boeck, G; Ahmed, P M; Cárdenas, G J; Welin, B; Ruiz, R M

    2016-10-01

    To calculate fermentation efficiency in a continuous ethanol production process, we aimed to develop a robust mathematical method based on the analysis of metabolic by-product formation. This method is in contrast to the traditional way of calculating ethanol fermentation efficiency, where the ratio between the ethanol produced and the sugar consumed is expressed as a percentage of the theoretical conversion yield. Comparison between the two methods, at industrial scale and in sensitivity studies, showed that the indirect method was more robust and gave slightly higher fermentation efficiency values, although fermentation efficiency of the industrial process was found to be low (~75%). The traditional calculation method is simpler than the indirect method as it only requires a few chemical determinations in samples collected. However, a minor error in any measured parameter will have an important impact on the calculated efficiency. In contrast, the indirect method of calculation requires a greater number of determinations but is much more robust since an error in any parameter will only have a minor effect on the fermentation efficiency value. The application of the indirect calculation methodology in order to evaluate the real situation of the process and to reach an optimum fermentation yield for an industrial-scale ethanol production is recommended. Once a high fermentation yield has been reached the traditional method should be used to maintain the control of the process. Upon detection of lower yields in an optimized process the indirect method should be employed as it permits a more accurate diagnosis of causes of yield losses in order to correct the problem rapidly. The low fermentation efficiency obtained in this study shows an urgent need for industrial process optimization where the indirect calculation methodology will be an important tool to determine process losses. © 2016 The Society for Applied Microbiology.
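
    As a worked illustration of the two calculations contrasted above, the sketch below compares a traditional (direct) efficiency with an indirect, by-product-based estimate. The 0.511 g/g figure is the standard theoretical ethanol yield per gram of hexose sugar, but the by-product list and conversion factors are placeholders, not the coefficients of the model in the paper.

```python
THEORETICAL_YIELD = 0.511  # g ethanol per g hexose sugar (theoretical Gay-Lussac yield)

def direct_efficiency(ethanol_g, sugar_consumed_g):
    """Traditional method: ethanol produced as a percentage of the theoretical maximum."""
    return 100.0 * ethanol_g / (THEORETICAL_YIELD * sugar_consumed_g)

def indirect_efficiency(sugar_consumed_g, byproducts_g, sugar_per_g_byproduct):
    """Indirect method (illustrative form): estimate the sugar diverted to measured
    by-products and treat the remainder as sugar converted to ethanol."""
    diverted = sum(sugar_per_g_byproduct[name] * grams
                   for name, grams in byproducts_g.items())
    return 100.0 * (sugar_consumed_g - diverted) / sugar_consumed_g

# Hypothetical figures for one fermentation cycle (for illustration only):
print(direct_efficiency(ethanol_g=38.0, sugar_consumed_g=100.0))
print(indirect_efficiency(
    sugar_consumed_g=100.0,
    byproducts_g={"glycerol": 4.0, "organic_acids": 1.5, "biomass": 6.0},
    sugar_per_g_byproduct={"glycerol": 1.02, "organic_acids": 1.0, "biomass": 2.0}))
```

    The direct figure moves sharply with any error in the ethanol or sugar assays, whereas the indirect figure only shifts slightly when one by-product measurement is off, which is the robustness argument made in the abstract.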

  15. Developing indicators of pattern identification in patients with stroke using traditional Korean medicine

    PubMed Central

    2012-01-01

    Background The traditional Korean medical diagnoses employ pattern identification (PI), a diagnostic system that entails the comprehensive analysis of symptoms and signs. The PI needs to be standardized due to its ambiguity. Therefore, this study was performed to establish standard indicators of the PI for stroke through the traditional Korean medical literature, expert consensus and a clinical field test. Methods We sorted out stroke patterns with an expert committee organized by the Korean Institute of Oriental Medicine. The expert committee composed a document for a standardized pattern of identification for stroke based on the traditional Korean medical literature, and we evaluated the clinical significance of the document through a field test. Results We established five stroke patterns from the traditional Korean medical literature and extracted 117 indicators required for diagnosis. The indicators were evaluated by a field test and verified by the expert committee. Conclusions This study sought to develop indicators of PI based on the traditional Korean medical literature. This process contributed to the standardization of traditional Korean medical diagnoses. PMID:22410195

  16. A combined finite element-boundary element formulation for solution of two-dimensional problems via CGFFT

    NASA Technical Reports Server (NTRS)

    Collins, Jeffery D.; Jin, Jian-Ming; Volakis, John L.

    1990-01-01

    A method for the computation of electromagnetic scattering from arbitrary two-dimensional bodies is presented. The method combines the finite element and boundary element methods leading to a system for solution via the conjugate gradient Fast Fourier Transform (FFT) algorithm. Two forms of boundaries aimed at reducing the storage requirement of the boundary integral are investigated. It is shown that the boundary integral becomes convolutional when a circular enclosure is chosen, resulting in reduced storage requirement when the system is solved via the conjugate gradient FFT method. The same holds for the ogival enclosure, except that some of the boundary integrals are not convolutional and must be carefully treated to maintain O(N) memory requirement. Results for several circular and ogival structures are presented and shown to be in excellent agreement with those obtained by traditional methods.
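
    The storage saving mentioned above comes from the fact that a convolutional (circulant) operator can be applied with an FFT instead of being stored as a full matrix. The sketch below shows that kernel inside a plain conjugate gradient loop; it assumes a real, symmetric positive-definite operator, whereas the electromagnetic formulation involves complex-valued systems and a suitable CG variant.

```python
import numpy as np

def circulant_matvec(c, x):
    """y = C x for the circulant matrix whose first column is c, computed in
    O(N log N) with the FFT; only the vector c is ever stored, not the matrix."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

def cg_fft(c, b, tol=1e-10, maxit=1000):
    """Conjugate gradient for (I + C) x = b, with C applied by FFT.
    Assumes the operator is symmetric positive definite (c[k] == c[-k], small norm)."""
    matvec = lambda v: v + circulant_matvec(c, v)
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Small self-check with a symmetric, diagonally dominated circulant kernel:
n = 64
c = np.zeros(n); c[1] = c[-1] = 0.2; c[2] = c[-2] = 0.1
b = np.random.default_rng(0).standard_normal(n)
x = cg_fft(c, b)
print(np.linalg.norm(b - (x + circulant_matvec(c, x))))   # residual ~ 0
```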

  17. Transforming Multidisciplinary Customer Requirements to Product Design Specifications

    NASA Astrophysics Data System (ADS)

    Ma, Xiao-Jie; Ding, Guo-Fu; Qin, Sheng-Feng; Li, Rong; Yan, Kai-Yin; Xiao, Shou-Ne; Yang, Guang-Wu

    2017-09-01

    With the increasing of complexity of complex mechatronic products, it is necessary to involve multidisciplinary design teams, thus, the traditional customer requirements modeling for a single discipline team becomes difficult to be applied in a multidisciplinary team and project since team members with various disciplinary backgrounds may have different interpretations of the customers' requirements. A new synthesized multidisciplinary customer requirements modeling method is provided for obtaining and describing the common understanding of customer requirements (CRs) and more importantly transferring them into a detailed and accurate product design specifications (PDS) to interact with different team members effectively. A case study of designing a high speed train verifies the rationality and feasibility of the proposed multidisciplinary requirement modeling method for complex mechatronic product development. This proposed research offersthe instruction to realize the customer-driven personalized customization of complex mechatronic product.

  18. Unsupervised Cryo-EM Data Clustering through Adaptively Constrained K-Means Algorithm

    PubMed Central

    Xu, Yaofang; Wu, Jiayi; Yin, Chang-Cheng; Mao, Youdong

    2016-01-01

    In single-particle cryo-electron microscopy (cryo-EM), the K-means clustering algorithm is widely used in unsupervised 2D classification of projection images of biological macromolecules. 3D ab initio reconstruction requires accurate unsupervised classification in order to separate molecular projections of distinct orientations. Due to background noise in single-particle images and uncertainty of molecular orientations, the traditional K-means clustering algorithm may classify images into wrong classes and produce classes with a large variation in membership. Overcoming these limitations requires further development of clustering algorithms for cryo-EM data analysis. We propose a novel unsupervised data clustering method building upon the traditional K-means algorithm. By introducing an adaptive constraint term in the objective function, our algorithm not only avoids a large variation in class sizes but also produces more accurate data clustering. Applications of this approach to both simulated and experimental cryo-EM data demonstrate that our algorithm is a significantly improved alternative to the traditional K-means algorithm in single-particle cryo-EM analysis. PMID:27959895
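
    A minimal sketch of the size-balancing idea is given below: the assignment cost for each point is its squared distance to a class mean plus a penalty that grows with the current size of that class. The linear penalty and its weight are illustrative stand-ins for the adaptive constraint term used by the authors, and the toy 2-D data take the place of cryo-EM projection images.

```python
import numpy as np

def size_penalized_kmeans(X, k, lam=1.0, iters=30, seed=0):
    """K-means with a size penalty in the assignment step (illustrative, not the
    authors' exact adaptive constraint): crowded classes become more expensive."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        counts = np.bincount(labels, minlength=k).astype(float)
        for i, x in enumerate(X):
            counts[labels[i]] -= 1                    # remove point i from its class
            d2 = ((centers - x) ** 2).sum(axis=1)
            cost = d2 + lam * counts / len(X)         # distance + crowding penalty
            labels[i] = int(np.argmin(cost))
            counts[labels[i]] += 1
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers

# Toy example: three overlapping 2-D blobs standing in for projection-image features.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.6, size=(60, 2)) for m in ([0, 0], [3, 0], [0, 3])])
labels, centers = size_penalized_kmeans(X, k=3, lam=2.0)
print(np.bincount(labels))   # class sizes stay comparatively even
```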

  19. Unsupervised Cryo-EM Data Clustering through Adaptively Constrained K-Means Algorithm.

    PubMed

    Xu, Yaofang; Wu, Jiayi; Yin, Chang-Cheng; Mao, Youdong

    2016-01-01

    In single-particle cryo-electron microscopy (cryo-EM), the K-means clustering algorithm is widely used in unsupervised 2D classification of projection images of biological macromolecules. 3D ab initio reconstruction requires accurate unsupervised classification in order to separate molecular projections of distinct orientations. Due to background noise in single-particle images and uncertainty of molecular orientations, the traditional K-means clustering algorithm may classify images into wrong classes and produce classes with a large variation in membership. Overcoming these limitations requires further development of clustering algorithms for cryo-EM data analysis. We propose a novel unsupervised data clustering method building upon the traditional K-means algorithm. By introducing an adaptive constraint term in the objective function, our algorithm not only avoids a large variation in class sizes but also produces more accurate data clustering. Applications of this approach to both simulated and experimental cryo-EM data demonstrate that our algorithm is a significantly improved alternative to the traditional K-means algorithm in single-particle cryo-EM analysis.

  20. Application of capability indices and control charts in the analytical method control strategy.

    PubMed

    Oliva, Alexis; Llabres Martinez, Matías

    2017-08-01

    In this study, we assessed the usefulness of control charts in combination with the process capability indices, Cpm and Cpk, in the control strategy of an analytical method. The traditional X-chart and moving range chart were used to monitor the analytical method over a 2-year period. The results confirmed that the analytical method is in-control and stable. Different criteria were used to establish the specification limits (i.e. analyst requirements) for fixed method performance (i.e. method requirements). If the specification limits and control limits are equal in breadth, the method can be considered "capable" (Cpm = 1), but it does not satisfy the minimum method capability requirements proposed by Pearn and Shu (2003). Similar results were obtained using the Cpk index. The method capability was also assessed as a function of method performance for fixed analyst requirements. The results indicate that the method does not meet the requirements of the analytical target approach. Real data from a SEC method with light-scattering detection were used as a model, whereas previously published data were used to illustrate the applicability of the proposed approach. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
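
    For reference, the two indices named above have standard definitions: Cpk = min(USL - μ, μ - LSL) / (3σ) and Cpm = (USL - LSL) / (6·sqrt(σ² + (μ - T)²)), where USL/LSL are the specification limits and T is the target value. A small sketch with made-up assay data is shown below; the limits and numbers are illustrative, not those of the SEC method in the study.

```python
import numpy as np

def cpk(x, lsl, usl):
    """Process capability index Cpk = min(USL - mean, mean - LSL) / (3 * sd)."""
    mu, sd = np.mean(x), np.std(x, ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sd)

def cpm(x, lsl, usl, target):
    """Taguchi capability index Cpm = (USL - LSL) / (6 * sqrt(sd^2 + (mean - target)^2))."""
    mu, sd = np.mean(x), np.std(x, ddof=1)
    return (usl - lsl) / (6.0 * np.sqrt(sd**2 + (mu - target)**2))

# Illustrative recovery data (%) for an assay monitored over time:
x = np.array([99.2, 100.4, 98.7, 101.1, 99.8, 100.9, 99.5, 100.2, 98.9, 100.6])
print(cpk(x, lsl=97.0, usl=103.0), cpm(x, lsl=97.0, usl=103.0, target=100.0))
```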

  1. Comparison of Passive Samplers for Monitoring Dissolved Organic Contaminants in Water Column Deployments

    EPA Science Inventory

    Nonionic organic contaminants (NOCs) are difficult to measure in the water column due to their inherent chemical properties resulting in low water solubility and high particle activity. Traditional sampling methods require large quantities of water to be extracted and interferen...

  2. Soil texture classification algorithm using RGB characteristics of soil images

    USDA-ARS?s Scientific Manuscript database

    Soil texture has an important influence on agriculture, affecting crop selection, movement of nutrients and water, soil electrical conductivity, and crop growth. Soil texture has traditionally been determined in the laboratory using pipette and hydrometer methods that require a considerable amount o...

  3. [Study on two preparation methods for beta-CD inclusion compound of four traditional Chinese medicine volatile oils].

    PubMed

    Li, Hailiang; Cui, Xiaoli; Tong, Yan; Gong, Muxin

    2012-04-01

    To compare the inclusion effects and process conditions of two preparation methods-colloid mill and saturated solution-for beta-CD inclusion compounds of four traditional Chinese medicine volatile oils, and to study the relationship between each process condition and the physical properties of the volatile oils, as well as the selectivity of inclusion for volatile oil components. Volatile oils from Nardostachyos Radix et Rhizoma, Amomi Fructus, Zingiberis Rhizoma and Angelicae sinensis Radix were prepared using the two methods in an orthogonal test. The inclusion compounds obtained by the optimized processes were assessed and compared by TLC, IR and scanning electron microscopy. Included oils were extracted by steam distillation, and the components found before and after inclusion were analyzed by GC-MS. Analysis showed that new inclusion compounds were formed, but the inclusion compounds prepared by the two processes differed to some extent. The colloid mill method showed a better inclusion effect than the saturated solution method, and the process conditions were related to the physical properties of the volatile oils. The two methods also differed in their selectivity for particular components. The colloid mill method is more suitable for industrial requirements. To prepare inclusion compounds of volatile oils with high specific gravity and high refractive index, the colloid mill method needs longer time and more water, while the saturated solution method requires higher temperature and more beta-cyclodextrin. The inclusion complex prepared with the colloid mill method contained components over a wider molecular-weight range, but fewer kinds of components.

  4. Hybrid sentiment analysis utilizing multiple indicators to determine temporal shifts of opinion in OSNs

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Hall, Robert T.; Fields, Jeremy; White, Holly M.

    2016-05-01

    Utilization of traditional sentiment analysis for predicting the outcome of an event on a social network depends on: precise understanding of what topics relate to the event, selective elimination of trends that don't fit, and in most cases, expert knowledge of the major players of the event. Sentiment analysis has traditionally taken one of two approaches to derive a quantitative value from qualitative text. These approaches are the "bag of words" model and the use of natural language processing (NLP) to attempt a real understanding of the text. Each of these methods yields very similar accuracy results, with the exception of some special use cases. In doing so, however, they both impose a large computational burden on the analytic system. Newer approaches have this same problem. No matter what approach is used, SA typically caps out around 80% in accuracy. However, accuracy reflects only polarity and degree of polarity, nothing else. In this paper we present a method for hybridizing traditional SA methods to better determine shifts in opinion over time within social networks. This hybridization process involves augmenting traditional SA measurements with contextual understanding and knowledge about writers' demographics. Our goal is not only to improve accuracy, but to do so with minimal impact on computational requirements.

  5. Layer stacking: A novel algorithm for individual forest tree segmentation from LiDAR point clouds

    Treesearch

    Elias Ayrey; Shawn Fraver; John A. Kershaw; Laura S. Kenefic; Daniel Hayes; Aaron R. Weiskittel; Brian E. Roth

    2017-01-01

    As light detection and ranging (LiDAR) technology advances, it has become common for datasets to be acquired at a point density high enough to capture structural information from individual trees. To process these data, an automatic method of isolating individual trees from a LiDAR point cloud is required. Traditional methods for segmenting trees attempt to isolate...

  6. Reducing the width of confidence intervals for the difference between two population means by inverting adaptive tests.

    PubMed

    O'Gorman, Thomas W

    2018-05-01

    In the last decade, it has been shown that an adaptive testing method could be used, along with the Robbins-Monro search procedure, to obtain confidence intervals that are often narrower than traditional confidence intervals. However, these confidence interval limits require a great deal of computation and some familiarity with stochastic search methods. We propose a method for estimating the limits of confidence intervals that uses only a few tests of significance. We compare these limits to those obtained by a lengthy Robbins-Monro stochastic search and find that the proposed method is nearly as accurate as the Robbins-Monro search. Adaptive confidence intervals that are produced by the proposed method are often narrower than traditional confidence intervals when the distributions are long-tailed, skewed, or bimodal. Moreover, the proposed method of estimating confidence interval limits is easy to understand, because it is based solely on the p-values from a few tests of significance.
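
    The inversion idea described above can be illustrated without the adaptive machinery: a value delta belongs to the confidence interval exactly when the test of H0: mu_x - mu_y = delta is not rejected. The sketch below inverts an ordinary two-sample t-test by bisection (which simply recovers the classical interval); the paper replaces the t-test with an adaptive test and the bisection with a Robbins-Monro search.

```python
import numpy as np
from scipy import stats

def ci_by_test_inversion(x, y, alpha=0.05, span=100.0, tol=1e-6):
    """Confidence limits for mu_x - mu_y found by inverting a two-sample t-test:
    delta is inside the interval iff H0: mu_x - mu_y = delta is not rejected."""
    pvalue = lambda delta: stats.ttest_ind(x - delta, y).pvalue
    point = np.mean(x) - np.mean(y)

    def crossing(a, b):
        # Bisection for the delta where pvalue(delta) crosses alpha; assumes
        # pvalue(a) and pvalue(b) lie on opposite sides of alpha (span wide enough).
        sign_a = np.sign(pvalue(a) - alpha)
        while b - a > tol:
            m = 0.5 * (a + b)
            if np.sign(pvalue(m) - alpha) == sign_a:
                a = m
            else:
                b = m
        return 0.5 * (a + b)

    return crossing(point - span, point), crossing(point, point + span)

rng = np.random.default_rng(0)
x, y = rng.normal(10, 2, 40), rng.normal(9, 2, 40)
print(ci_by_test_inversion(x, y))   # matches the usual two-sample t interval
```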

  7. How does tele-learning compare with other forms of education delivery? A systematic review of tele-learning educational outcomes for health professionals.

    PubMed

    Tomlinson, Jo; Shaw, Tim; Munro, Ana; Johnson, Ros; Madden, D Lynne; Phillips, Rosemary; McGregor, Deborah

    2013-11-01

    Telecommunication technologies, including audio and videoconferencing facilities, afford geographically dispersed health professionals the opportunity to connect and collaborate with others. Recognised for enabling tele-consultations and tele-collaborations between teams of health care professionals and their patients, these technologies are also well suited to the delivery of distance learning programs, known as tele-learning. To determine whether tele-learning delivery methods achieve equivalent learning outcomes when compared with traditional face-to-face education delivery methods. A systematic literature review was commissioned by the NSW Ministry of Health to identify results relevant to programs applying tele-learning delivery methods in the provision of education to health professionals. The review found few studies that rigorously compared tele-learning with traditional formats. There was some evidence, however, to support the premise that tele-learning models achieve comparable learning outcomes and that participants are generally satisfied with and accepting of this delivery method. The review illustrated that tele-learning technologies not only enable distance learning opportunities, but achieve comparable learning outcomes to traditional face-to-face models. More rigorous evidence is required to strengthen these findings and should be the focus of future tele-learning research.

  8. [Research on NIR equivalent spectral measurement].

    PubMed

    Wang, Zhi-Hong; Liu, Jie; Sun, Yu-Yang; Teng, Fei; Lin, Jun

    2013-04-01

    When the spectra of the diffuse reflectance of low-reflectivity samples or the transmittance of low-transmissivity samples are measured by a portable near infrared (NIR) spectrometer, spectrometer noise means that the smaller the reflectance or transmittance of the sample, the lower its SNR. Even after denoising, such spectra cannot meet the requirements of NIR analysis. An equivalent spectral measurement method was therefore investigated. Based on the intensity of the signal reflected or transmitted by the sample under traditional measurement conditions, the light level of the spectrometer was increased so that the sample signal increased, while the reflected or transmitted light of the measurement reference was attenuated to keep the reference signal within range. The equivalent spectrum of the sample was then calculated so that it matched the spectrum measured by the traditional method, thereby improving the NIR spectral SNR. Theoretical analysis and experiments show that if the light level of the spectrometer is increased appropriately according to the reflected or transmitted signal of a low-reflectivity or low-transmissivity sample, the equivalent spectrum is the same as the spectrum measured by the traditional method and its SNR is improved.

  9. Clinic Design and Continuity in Internal Medicine Resident Clinics: Findings of the Educational Innovations Project Ambulatory Collaborative

    PubMed Central

    Francis, Maureen D.; Wieland, Mark L.; Drake, Sean; Gwisdalla, Keri Lyn; Julian, Katherine A.; Nabors, Christopher; Pereira, Anne; Rosenblum, Michael; Smith, Amy; Sweet, David; Thomas, Kris; Varney, Andrew; Warm, Eric; Wininger, David; Francis, Mark L.

    2015-01-01

    Background Many internal medicine (IM) programs have reorganized their resident continuity clinics to improve trainees' ambulatory experience. Downstream effects on continuity of care and other clinical and educational metrics are unclear. Methods This multi-institutional, cross-sectional study included 713 IM residents from 12 programs. Continuity was measured using the usual provider of care method (UPC) and the continuity for physician method (PHY). Three clinic models (traditional, block, and combination) were compared using analysis of covariance. Multivariable linear regression analysis was used to analyze the effect of practice metrics and clinic model on continuity. Results UPC, reflecting continuity from the patient perspective, was significantly different, and was highest in the block model, midrange in combination model, and lowest in the traditional model programs. PHY, reflecting continuity from the perspective of the resident provider, was significantly lower in the block model than in combination and traditional programs. Panel size, ambulatory workload, utilization, number of clinics attended in the study period, and clinic model together accounted for 62% of the variation found in UPC and 26% of the variation found in PHY. Conclusions Clinic model appeared to have a significant effect on continuity measured from both the patient and resident perspectives. Continuity requires balance between provider availability and demand for services. Optimizing this balance to maximize resident education, and the health of the population served, will require consideration of relevant local factors and priorities in addition to the clinic model. PMID:26217420
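
    For readers unfamiliar with the continuity measures named above, the usual provider of care (UPC) index for a patient is the fraction of that patient's visits made to the provider seen most often; the physician-side measure (PHY) is computed analogously from the provider's perspective. A minimal sketch of the patient-side calculation, with hypothetical visit logs, is shown below.

```python
from collections import Counter

def upc(provider_per_visit):
    """Usual provider of care: share of a patient's visits made to the provider
    they saw most often (1.0 = perfect continuity)."""
    counts = Counter(provider_per_visit)
    return max(counts.values()) / len(provider_per_visit)

# Hypothetical visit logs for two patients over a study period:
print(upc(["dr_a", "dr_a", "dr_b", "dr_a"]))   # 0.75
print(upc(["dr_a", "dr_b", "dr_c", "dr_d"]))   # 0.25
```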

  10. Identification of emergent off-nominal operational requirements during conceptual architecting of the more electric aircraft

    NASA Astrophysics Data System (ADS)

    Armstrong, Michael James

    Increases in power demands and changes in the design practices of overall equipment manufacturers have led to a new paradigm in vehicle systems definition. The development of unique power systems architectures is of increasing importance to overall platform feasibility and must be pursued early in the aircraft design process. Many vehicle systems architecture trades must be conducted concurrently with platform definition. With the increased complexity introduced during conceptual design, accurate predictions of unit-level sizing requirements must be made. Architecture-specific emergent requirements must be identified which arise due to the complex integrated effect of unit behaviors. Off-nominal operating scenarios present sizing-critical requirements to the aircraft vehicle systems. These requirements are architecture specific and emergent. Standard heuristically defined failure mitigation is sufficient for sizing traditional and evolutionary architectures. However, architecture concepts which vary significantly in terms of structure and composition require that unique failure mitigation strategies be defined for accurate estimations of unit-level requirements. Identifying these off-nominal emergent operational requirements requires extensions to traditional safety and reliability tools and the systematic identification of optimal performance degradation strategies. Discrete operational constraints posed by traditional Functional Hazard Assessment (FHA) are replaced by continuous relationships between function loss and operational hazard. These relationships pose the objective function for hazard minimization. Load shedding optimization is performed for all statistically significant failures by varying the allocation of functional capability throughout the vehicle systems architecture. Expressing hazards, and thereby reliability requirements, as continuous relationships with the magnitude and duration of functional failure requires augmentations to the traditional means for system safety assessment (SSA). The traditional two-state, discrete system reliability assessment proves insufficient. Reliability is, therefore, handled in an analog fashion: as a function of magnitude of failure and failure duration. A series of metrics are introduced which characterize system performance in terms of analog hazard probabilities. These include analog and cumulative system and functional risk, hazard correlation, and extensions to the traditional component importance metrics. Continuous FHA, load shedding optimization, and analog SSA constitute the SONOMA process (Systematic Off-Nominal Requirements Analysis). Analog system safety metrics inform both architecture optimization (changes in unit-level capability and reliability) and architecture augmentation (changes in architecture structure and composition). This process was applied to two vehicle systems concepts (conventional and 'more-electric') in terms of loss/hazard relationships with varying degrees of fidelity. Application of this process shows that the traditional assumptions regarding the structure of the function loss vs. hazard relationship apply undue design bias to functions and components during exploratory design. This bias is illustrated in terms of inaccurate estimations of the system- and function-level risk and unit-level importance. It was also shown that off-nominal emergent requirements must be defined specific to each architecture concept.
Quantitative comparisons of architecture-specific off-nominal performance were obtained which provide evidence of the need for accurate definition of load shedding strategies during architecture exploratory design. Formally expressing performance degradation strategies in terms of the minimization of a continuous hazard space enhances the system architect's ability to accurately predict sizing-critical emergent requirements concurrent with architecture definition. Furthermore, the methods and frameworks generated here provide a structured and flexible means for eliciting these architecture-specific requirements during the performance of architecture trades.

  11. Using geophysical images of a watershed subsurface to predict soil textural properties

    USDA-ARS?s Scientific Manuscript database

    Subsurface architecture, in particular changes in soil type across the landscape, is an important control on the hydrological and ecological function of a watershed. Traditional methods of mapping soils involving subjective assignment of soil boundaries are inadequate for studies requiring a quantit...

  12. Overcoming Content-Associated Challenges Using Attention-Focused Methods

    ERIC Educational Resources Information Center

    Lebec, Michael T.; Kesteloot, Lauren

    2015-01-01

    A common challenge in higher education is teaching required content for which students traditionally have lower levels of interest. Physical therapist education programs experience this challenge when training entry-level students to document in the medical record. The authors compared learning outcomes among physical therapy students taught…

  13. An Introduction to Modern Missing Data Analyses

    ERIC Educational Resources Information Center

    Baraldi, Amanda N.; Enders, Craig K.

    2010-01-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous to traditional techniques (e.g. deletion and mean imputation techniques) because they require less stringent assumptions and mitigate the pitfalls of traditional…
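
    As a brief illustration of the contrast described above, the sketch below compares single mean imputation with a small multiple-imputation loop built on scikit-learn's IterativeImputer, pooling estimates across imputed datasets. The data and the estimated quantity (a correlation) are made up for illustration only.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import SimpleImputer, IterativeImputer

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=200)
X_missing = X.copy()
X_missing[rng.random(200) < 0.3, 1] = np.nan          # 30% missing on column 2

# Traditional single mean imputation (tends to distort variances and correlations):
mean_filled = SimpleImputer(strategy="mean").fit_transform(X_missing)

# Multiple imputation: several stochastic imputations, estimates pooled afterwards.
estimates = []
for m in range(5):
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    X_m = imp.fit_transform(X_missing)
    estimates.append(np.corrcoef(X_m.T)[0, 1])

print(np.corrcoef(mean_filled.T)[0, 1], np.mean(estimates))
```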

  14. The Building Commissioning Handbook.

    ERIC Educational Resources Information Center

    Heinz, John A.; Casault, Rick

    This book discusses building commissioning, which is the process of certifying that a new facility meets the required specifications. As buildings have become more complex, the traditional methods for building start-up and final acceptance have been proven inadequate, and building commissioning has been developed, which often necessitates the use…

  15. Creating a smART Camp

    ERIC Educational Resources Information Center

    Maurer, Matthew J.; Tokarsky, Rebecca; Zalewsky, Laura

    2011-01-01

    Many of the skills and talents required to be a successful scientist, such as analysis, experimentation, and creativity, can be developed and reinforced through art. Both science and art challenge students to make observations, experiment with different techniques, and use both traditional and nontraditional methods to express their ideas. The…

  16. Direct Allocation Costing: Informed Management Decisions in a Changing Environment.

    ERIC Educational Resources Information Center

    Mancini, Cesidio G.; Goeres, Ernest R.

    1995-01-01

    It is argued that colleges and universities can use direct allocation costing to provide quantitative information needed for decision making. This method of analysis requires institutions to modify traditional ideas of costing, looking to the private sector for examples of accurate costing techniques. (MSE)

  17. Method and algorithm for image processing

    DOEpatents

    He, George G.; Moon, Brain D.

    2003-12-16

    The present invention is a modified Radon transform. It is similar to the traditional Radon transform for the extraction of line parameters and similar to the traditional slant stack for the intensity summation of pixels away from a given pixel, for example ray paths that span 360 degrees at a given grid point in the time and offset domain. However, the present invention differs from these methods in that the intensity and direction of a composite intensity for each pixel are maintained separately instead of being combined after the transformation. An advantage of this approach is elimination of the work required to extract the line parameters in the transformed domain. The advantage of the modified Radon transform method is amplified when many lines are present in the imagery or when the lines are just short segments, both of which occur in actual imagery.
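
    A rough sense of keeping "the intensity and direction of a composite intensity for each pixel" can be conveyed with the short sketch below: for every pixel it sums intensities along short line segments at a set of angles and stores the best sum and its angle in two separate maps. This is only an illustration of the idea in the abstract, not the patented algorithm.

```python
import numpy as np

def per_pixel_line_response(img, angles_deg, half_len=10):
    """For every pixel, sum intensities along short line segments at each angle
    and keep (best_sum, best_angle) separately, in the spirit of the modified
    Radon / slant-stack idea described above (illustrative only)."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    best_sum = np.full((h, w), -np.inf)
    best_ang = np.zeros((h, w))
    t = np.arange(-half_len, half_len + 1)
    for ang in angles_deg:
        dy, dx = np.sin(np.deg2rad(ang)), np.cos(np.deg2rad(ang))
        acc = np.zeros((h, w))
        for ti in t:
            ys = np.clip(np.rint(yy + ti * dy).astype(int), 0, h - 1)
            xs = np.clip(np.rint(xx + ti * dx).astype(int), 0, w - 1)
            acc += img[ys, xs]
        better = acc > best_sum
        best_sum[better] = acc[better]
        best_ang[better] = ang
    return best_sum, best_ang

img = np.zeros((64, 64)); img[32, 10:54] = 1.0          # a horizontal line
s, a = per_pixel_line_response(img, angles_deg=range(0, 180, 15))
print(a[32, 32])                                         # best direction ~ 0 degrees
```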

  18. [A new method of fabricating photoelastic model by rapid prototyping].

    PubMed

    Fan, Li; Huang, Qing-feng; Zhang, Fu-qiang; Xia, Yin-pei

    2011-10-01

    To explore a novel method of fabricating photoelastic models using a rapid prototyping technique. A mandible model was made by rapid prototyping with computerized three-dimensional reconstruction, and the photoelastic model with teeth was then fabricated by traditional impression duplicating and mould casting. The photoelastic model of the mandible with teeth, which was fabricated indirectly by rapid prototyping, was very similar to the prototype in geometry and physical parameters. The model was of high optical sensitivity and met the experimental requirements. A photoelastic model of the mandible with teeth indirectly fabricated by rapid prototyping meets the photoelastic experimental requirements well.

  19. Improved Sensor Fault Detection, Isolation, and Mitigation Using Multiple Observers Approach

    PubMed Central

    Wang, Zheng; Anand, D. M.; Moyne, J.; Tilbury, D. M.

    2017-01-01

    Traditional Fault Detection and Isolation (FDI) methods analyze a residual signal to detect and isolate sensor faults. The residual signal is the difference between the sensor measurements and the estimated outputs of the system based on an observer. The traditional residual-based FDI methods, however, have some limitations. First, they require that the observer has reached its steady state. In addition, residual-based methods may not detect some sensor faults, such as faults on critical sensors that result in an unobservable system. Furthermore, the system may be in jeopardy if actions required for mitigating the impact of the faulty sensors are not taken before the faulty sensors are identified. The contribution of this paper is to propose three new methods to address these limitations. Faults that occur during the observers' transient state can be detected by analyzing the convergence rate of the estimation error. Open-loop observers, which do not rely on sensor information, are used to detect faults on critical sensors. By switching among different observers, we can potentially mitigate the impact of the faulty sensor during the FDI process. These three methods are systematically integrated with a previously developed residual-based method to provide an improved FDI and mitigation capability framework. The overall approach is validated mathematically, and the effectiveness of the overall approach is demonstrated through simulation on a 5-state suspension system. PMID:28924303
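
    The residual logic described above can be made concrete with a small discrete-time sketch: a Luenberger observer tracks the plant from the measured output, and the residual y - C x_hat is compared against a threshold. The two-state model, the observer gain, and the bias-type sensor fault are all illustrative choices, not the 5-state suspension system used in the paper.

```python
import numpy as np

# Illustrative 2-state plant x_{k+1} = A x_k + B u_k, y_k = C x_k (+ sensor fault)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5], [0.5]])        # observer gain, chosen so A - L C is Schur stable

def simulate(steps=200, fault_at=120, fault_bias=0.5, threshold=0.1):
    x = np.zeros((2, 1)); x_hat = np.zeros((2, 1))
    rng = np.random.default_rng(0)
    for k in range(steps):
        u = np.array([[np.sin(0.05 * k)]])
        y = C @ x + (fault_bias if k >= fault_at else 0.0) + 0.01 * rng.standard_normal()
        r = (y - C @ x_hat).item()               # residual: measurement vs. estimate
        if abs(r) > threshold:
            print(f"fault flagged at step {k}, residual {r:.3f}")
            return
        x_hat = A @ x_hat + B @ u + L * r        # Luenberger observer update
        x = A @ x + B @ u                        # plant update
    print("no fault flagged")

simulate()
```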

  20. Feasibility study for converting traditional line assembly into work cells for termination of fiber optics cable

    NASA Astrophysics Data System (ADS)

    Caldeira, Rylan; Honnungar, Sunilkumar

    2018-04-01

    Most small to medium industries tend to follow traditional systems of manufacturing that aim at maximum resource utilization while paying little attention to volatile customer demand. In recent times, manufacturing has shifted to being consumer centered, with intense competition among industries to satisfy customer needs in the required quantity and at the right time. To achieve this, companies investigate the possibility of implementing cellular manufacturing, which is characterized by high variety with optimum usage of resources. A cellular layout, coupled with the application of lean methodology, places the focus on the production process rather than the production methods so as to identify waste and apply methods to further improve productivity.

  1. HIV Epidemic Appraisals for Assisting in the Design of Effective Prevention Programmes: Shifting the Paradigm Back to Basics

    PubMed Central

    Mishra, Sharmistha; Sgaier, Sema K.; Thompson, Laura H.; Moses, Stephen; Ramesh, B. M.; Alary, Michel; Wilson, David; Blanchard, James F.

    2012-01-01

    Background To design HIV prevention programmes, it is critical to understand the temporal and geographic aspects of the local epidemic and to address the key behaviours that drive HIV transmission. Two methods have been developed to appraise HIV epidemics and guide prevention strategies. The numerical proxy method classifies epidemics based on current HIV prevalence thresholds. The Modes of Transmission (MOT) model estimates the distribution of incidence over one year among risk-groups. Both methods focus on the current state of an epidemic and provide short-term metrics which may not capture the epidemiologic drivers. Through a detailed analysis of country and sub-national data, we explore the limitations of the two traditional methods and propose an alternative approach. Methods and Findings We compared outputs of the traditional methods in five countries for which results were published, and applied the numeric and MOT model to India and six districts within India. We discovered three limitations of the current methods for epidemic appraisal: (1) their results failed to identify the key behaviours that drive the epidemic; (2) they were difficult to apply to local epidemics with heterogeneity across district-level administrative units; and (3) the MOT model was highly sensitive to input parameters, many of which required extraction from non-regional sources. We developed an alternative decision-tree framework for HIV epidemic appraisals, based on a qualitative understanding of epidemiologic drivers, and demonstrated its applicability in India. The alternative framework offered a logical algorithm to characterize epidemics; it required minimal but key data. Conclusions Traditional appraisals that utilize the distribution of prevalent and incident HIV infections in the short-term could misguide prevention priorities and potentially impede efforts to halt the trajectory of the HIV epidemic. An approach that characterizes local transmission dynamics provides a potentially more effective tool with which policy makers can design intervention programmes. PMID:22396756

  2. The use of handwriting examinations beyond the traditional court purpose.

    PubMed

    Agius, Anna; Jones, Kylie; Epple, Rochelle; Morelato, Marie; Moret, Sébastien; Chadwick, Scott; Roux, Claude

    2017-09-01

    Traditionally, forensic science has predominantly focused its resources and objectives on addressing court related questions. However, this view restricts the contribution of forensic science to one function and results in lost opportunities as investigative and intelligence roles are often overlooked. A change of perspective and expansion of the contributions of forensic science is required to take advantage of the benefits of abductive and inductive thought processes throughout the investigative and intelligence functions. One forensic discipline that has the potential to broaden its traditional focus is handwriting examination. Typically used in investigations that are focused on both criminal and civil cases, the examination procedure and outcome are time consuming and subjective, requiring a detailed study of the features of the handwriting in question. Traditionally, the major handwriting features exploited are characteristics that are often considered individual (or at least highly polymorphic) and habitual. However, handwriting can be considered as an information vector in an intelligence framework. One such example is the recognition of key elements related to the author's native language. This paper discusses the traditional method generally used around the world and proposes a theoretical approach to expand the application of handwriting examination towards gaining additional information for intelligence purposes. This concept will be designed and tested in a future research project. Copyright © 2017 The Chartered Society of Forensic Sciences. All rights reserved.

  3. Solid fat content as a substitute for total polar compound analysis in edible oils

    USDA-ARS?s Scientific Manuscript database

    The solid fat contents (SFC) of heated edible oil samples were measured and found to correlate positively with total polar compounds (TPC) and inversely with triglyceride concentration. Traditional methods for determination of total polar compounds require a laboratory setting and are time intensiv...

  4. Toxicity Screening of Volatile Chemicals Using a Novel Air-Liquid Interface In Vitro Exposure System

    EPA Science Inventory

    Traditional in vitro dosing methods require, for example, the addition of particulate matter (PM), PM extracts, or chemicals in dimethyl sulfoxide (DMSO) or water into cell culture medium. However, about 10% of chemicals nominated for study in the U.S Environmental Protection Age...

  5. Comparison of Passive Samplers for Monitoring Dissolved Organic Contaminants in Water Column Deployments NAC/SETAC 2012

    EPA Science Inventory

    Nonionic organic contaminants (NOCs) are difficult to measure in the water column due to their inherent chemical properties resulting in low water solubility and high particle activity. Traditional sampling methods require large quantities of water to be extracted and interferen...

  6. Comparison of Passive Samplers for Monitoring Dissolved Organic Contaminants in Water Column Deployments (SETAC Europe 22nd Annual Meeting)

    EPA Science Inventory

    Nonionic organic contaminants (NOCs) are difficult to measure in the water column due to their inherent chemical properties resulting in low water solubility and high particle activity. Traditional sampling methods require large quantities of water to be extracted and interferen...

  7. Levels of Understanding--A Guide to the Teaching and Assessment of Knowledge

    ERIC Educational Resources Information Center

    White, Charles S.

    2007-01-01

    Traditional education, employing lectures or telecommunicative instruction methods, has been very effective in providing topical facts. However, the development of student skills and thinking ability require higher levels of instruction and more opportunity to practice and apply acquired knowledge. As students progress through a particular…

  8. The Art of E-Teaching

    ERIC Educational Resources Information Center

    Hoskins, Barbara J.

    2010-01-01

    Today, teachers are facing a new generation of students known as the Millennials, or the digital generation. They have grown up with the Internet, cell phones, and multiple methods of electronic communication; however, they learned in traditional classrooms where they were required to disconnect. Faculty members generally fall into the Baby Boomer…

  9. Public Debates Shaping Forestry's Future: An Analysis.

    Treesearch

    David Fan; David Bengston

    1997-01-01

    The job of forest managers and policy makers is growing increasingly complex because of rapid change in the social, political, economic, and scientific environments in which forest management is carried out. Managing public lands in ways that are responsive to changing social conditions requires continuous monitoring and assessment. But traditional methods for...

  10. TEAMS. Team Exercise for Action Management Skills: A Semester-Long Team-Management Simulation.

    ERIC Educational Resources Information Center

    Wagenheim, Gary

    A team-oriented approach is replacing the traditional management style in today's organizations. Because team management skills differ, they require different teaching methods. This paper describes an administrator education course designed to develop team management skills from an applied and behavioral viewpoint. Students participate in…

  11. Admission Policy Evolution in Emerging Professional Programs: A Case Study

    ERIC Educational Resources Information Center

    Holley, Paul W.

    2006-01-01

    Professional program admission at U.S. universities has become increasingly competitive in the last 20 years, due to enrollment caps, core class requirements, transfer course acceptance, industry draw, and the appeal of starting salaries. As the competition steadily increases, students often find methods to exploit traditional policy, resulting in…

  12. Rapid and potentially portable detection and quantification technologies for foodborne pathogens

    USDA-ARS?s Scientific Manuscript database

    Introduction Traditional microbial culture methods are able to detect and identify a single specific bacterium, but may require days or weeks and typically do not produce quantitative data. The quest for faster, quantitative results has spurred development of “rapid methods” which usually employ bio...

  13. A one pot organic/CdSe nanoparticle hybrid material synthesis with in situ π-conjugated ligand functionalization.

    PubMed

    Mazzio, Katherine A; Okamoto, Ken; Li, Zhi; Gutmann, Sebastian; Strein, Elisabeth; Ginger, David S; Schlaf, Rudy; Luscombe, Christine K

    2013-02-14

    A one pot method for organic/colloidal CdSe nanoparticle hybrid material synthesis is presented. Relative to traditional ligand exchange processes, these materials require smaller amounts of the desired capping ligand, shorter syntheses and fewer processing steps, while maintaining nanoparticle morphology.

  14. Researching the Effects of Frame-Focused Instruction on Second Language Acquisition

    ERIC Educational Resources Information Center

    Sokolova, Elena; Burmistrova, Anna

    2012-01-01

    In the context of globalization, the research of innovative teaching methods and techniques becomes relevant. The traditional teaching approach where the training of practice material is preceded by rule-presentation (explanation + mechanical form-oriented practice) does not meet the requirements of constantly developing rational language…

  15. Aiding Participation and Engagement in a Blended Learning Environment

    ERIC Educational Resources Information Center

    Alrushiedat, Nimer; Olfman, Lorne

    2013-01-01

    This research was conducted as a field experiment that explored the potential benefits of anchoring in asynchronous online discussions for business statistics classes required for information systems majors. These classes are usually taught using traditional methods with emphasis on lecturing, knowledge reproduction, and treatment of students as…

  16. Pig Mandible as a Valuable Tool to Improve Periodontal Surgery Techniques

    ERIC Educational Resources Information Center

    Zangrando, Mariana S. Ragghianti; Sant'Ana, Adriana C. P.; Greghi, Sebastião L. A.; de Rezende, Maria Lucia R.; Damante, Carla A.

    2014-01-01

    Clinical education in dental practice is a challenge for professionals and students. The traditional method of clinical training in Periodontology usually is based on following the procedure and practicing under supervision, until achieving proficiency. However, laboratory practice is required before direct care in patients. Specific anatomic…

  17. A location-based multiple point statistics method: modelling the reservoir with non-stationary characteristics

    NASA Astrophysics Data System (ADS)

    Yin, Yanshu; Feng, Wenjie

    2017-12-01

    In this paper, a location-based multiple point statistics method is developed to model a non-stationary reservoir. The proposed method characterizes the relationship between the sedimentary pattern and the deposit location using the relative central position distance function, which alleviates the requirement that the training image and the simulated grids have the same dimension. The weights in every direction of the distance function can be changed to characterize the reservoir heterogeneity in various directions. The local integral replacements of data events, structured random path, distance tolerance and multi-grid strategy are applied to reproduce the sedimentary patterns and obtain a more realistic result. This method is compared with the traditional Snesim method using a synthesized 3-D training image of Poyang Lake and a reservoir model of Shengli Oilfield in China. The results indicate that the new method can reproduce the non-stationary characteristics better than the traditional method and is more suitable for simulation of delta-front deposits. These results show that the new method is a powerful tool for modelling a reservoir with non-stationary characteristics.

  18. Finite Element Analysis in Concurrent Processing: Computational Issues

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Watson, Brian; Vanderplaats, Garrett

    2004-01-01

    The purpose of this research is to investigate the potential application of new methods for solving large-scale static structural problems on concurrent computers. It is well known that traditional single-processor computational speed will be limited by inherent physical limits. The only path to achieve higher computational speeds lies through concurrent processing. Traditional factorization solution methods for sparse matrices are ill suited for concurrent processing because the null entries get filled, leading to high communication and memory requirements. The research reported herein investigates alternatives to factorization that promise a greater potential to achieve high concurrent computing efficiency. Two methods, and their variants, based on direct energy minimization are studied: a) minimization of the strain energy using the displacement method formulation; b) constrained minimization of the complementary strain energy using the force method formulation. Initial results indicated that in the context of the direct energy minimization the displacement formulation experienced convergence and accuracy difficulties while the force formulation showed promising potential.
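
    The displacement-method variant described above amounts to minimizing the quadratic strain energy E(u) = 0.5 u^T K u - f^T u, whose minimizer satisfies K u = f; the gradient K u - f needs only matrix-vector products, not a factorization. The sketch below does this serially with conjugate gradients on a small spring-chain stiffness matrix, purely to show the equivalence; it does not reproduce the paper's concurrent implementation or the force-method variant.

```python
import numpy as np

def spring_chain_stiffness(n, k=1.0):
    """Stiffness matrix of n nodes joined by identical springs, anchored at one end."""
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] = 2.0 * k if i < n - 1 else k
        if i > 0:
            K[i, i - 1] = K[i - 1, i] = -k
    return K

def minimize_strain_energy(K, f, tol=1e-10):
    """Conjugate-gradient minimization of E(u) = 0.5 u^T K u - f^T u.
    The gradient is K u - f, so the minimizer is the static solution of K u = f."""
    u = np.zeros_like(f)
    r = f - K @ u                    # negative gradient
    p = r.copy()
    rs = r @ r
    while np.sqrt(rs) > tol:
        Kp = K @ p
        alpha = rs / (p @ Kp)
        u += alpha * p
        r -= alpha * Kp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return u

K = spring_chain_stiffness(10)
f = np.ones(10)
u = minimize_strain_energy(K, f)
print(np.allclose(K @ u, f))         # True: energy minimum equals equilibrium solution
```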

  19. Longevity and optimal health: working toward an integrative methodology.

    PubMed

    Oz, Mehmet; Tallent, Jeremy

    2009-08-01

    Efforts to foster a research dialogue between traditions as seemingly divergent as Western biomedicine and Indo-Tibetan medical and self-regulatory practice require a carefully conceived set of methodological guidelines. To approach a useful methodology, some specific structural differences between traditions must be negotiated, for example the Indo-Tibetan emphasis on holism in medicine and ethics, which appears to run contrary to Western trends toward specialization in both clinical and research contexts. Certain pitfalls must be avoided as well, including the tendency to appropriate elements of either tradition in a reductionistic manner. However, research methods offering creative solutions to these problems are now emerging, successfully engendering quantitative insight without subsuming one tradition within the terms of the other. Only through continued, creative work exploring both the potentials and limitations of this dialogue can collaborative research insight be attained, and an appropriate and useful set of methodological principles be approached.

  20. Remote Sensing of Soil Moisture: A Comparison of Optical and Thermal Methods

    NASA Astrophysics Data System (ADS)

    Foroughi, H.; Naseri, A. A.; Boroomandnasab, S.; Sadeghi, M.; Jones, S. B.; Tuller, M.; Babaeian, E.

    2017-12-01

    Recent technological advances in satellite and airborne remote sensing have provided new means for large-scale soil moisture monitoring. Traditional methods for soil moisture retrieval require thermal and optical RS observations. In this study we compared the traditional trapezoid model parameterized based on the land surface temperature - normalized difference vegetation index (LST-NDVI) space with the recently developed optical trapezoid model OPTRAM parameterized based on the shortwave infrared transformed reflectance (STR)-NDVI space for an extensive sugarcane field located in Southwestern Iran. Twelve Landsat-8 satellite images were acquired during the sugarcane growth season (April to October 2016). Reference in situ soil moisture data were obtained at 22 locations at different depths via core sampling and oven-drying. The obtained results indicate that the thermal/optical and optical prediction methods are comparable, both with volumetric moisture content estimation errors of about 0.04 cm³ cm⁻³. However, the OPTRAM model is more efficient because it does not require thermal data and can be universally parameterized for a specific location, because unlike the LST-soil moisture relationship, the reflectance-soil moisture relationship does not significantly vary with environmental variables (e.g., air temperature, wind speed, etc.).
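
    The optical trapezoid referenced above normalizes each pixel's shortwave-infrared transformed reflectance (STR) between "dry" and "wet" edges that vary linearly with NDVI, W = (STR - STR_dry) / (STR_wet - STR_dry), with STR_dry = i_d + s_d*NDVI and STR_wet = i_w + s_w*NDVI. The sketch below applies that normalization with made-up edge parameters; fitting the edges to the pixel cloud of an actual scene is a separate step not shown here.

```python
import numpy as np

def optram_soil_moisture(str_band, ndvi, i_dry, s_dry, i_wet, s_wet):
    """Normalized soil-moisture proxy W in [0, 1] from the optical trapezoid:
    W = (STR - STR_dry(NDVI)) / (STR_wet(NDVI) - STR_dry(NDVI))."""
    str_dry = i_dry + s_dry * ndvi
    str_wet = i_wet + s_wet * ndvi
    w = (str_band - str_dry) / (str_wet - str_dry)
    return np.clip(w, 0.0, 1.0)

# Made-up pixel values and edge parameters, for illustration only:
ndvi = np.array([0.2, 0.5, 0.8])
str_band = np.array([1.5, 3.0, 5.5])
print(optram_soil_moisture(str_band, ndvi, i_dry=0.5, s_dry=1.0, i_wet=4.0, s_wet=3.0))
```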

  1. Design of a real-time system of moving ship tracking on-board based on FPGA in remote sensing images

    NASA Astrophysics Data System (ADS)

    Yang, Tie-jun; Zhang, Shen; Zhou, Guo-qing; Jiang, Chuan-xian

    2015-12-01

    With the broad attention of countries in the areas of sea transportation and trade safety, the requirements for efficiency and accuracy of moving ship tracking are becoming higher. Therefore, a systematic design of on-board moving ship tracking based on FPGA is proposed, which uses the Adaptive Inter Frame Difference (AIFD) method to track ships of different speeds. Because the Frame Difference (FD) method is simple but computationally heavy, it is well suited to parallel implementation on an FPGA. However, the Frame Intervals (FIs) of the traditional FD method are fixed, and in remote sensing images a ship looks very small (depicted by only dozens of pixels) and moves slowly. With invariant FIs, the accuracy of FD for moving ship tracking is unsatisfactory and the calculation is highly redundant. We therefore use an adaptation of FD based on adaptive extraction of key frames for moving ship tracking. An FPGA development board of the Xilinx Kintex-7 series is used for simulation. The experiments show that compared with the traditional FD method, the proposed one can achieve higher accuracy of moving ship tracking and can meet the requirement of real-time tracking at high image resolution.
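
    The adaptation mentioned above can be sketched as frame differencing whose interval grows when little motion is detected and shrinks when motion is strong, so that slow-moving ships still produce a usable difference image. The update rule and thresholds below are illustrative assumptions; the paper's FPGA implementation and its key-frame extraction are not reproduced.

```python
import numpy as np

def adaptive_frame_difference(frames, motion_thresh=2.0, min_fi=1, max_fi=8):
    """Frame differencing with an adaptive frame interval (FI): compare the current
    frame with a key frame FI steps back; widen FI when the scene barely changes."""
    detections, fi, key_idx = [], min_fi, 0
    for k in range(1, len(frames)):
        if k - key_idx < fi:
            continue
        diff = np.abs(frames[k].astype(float) - frames[key_idx].astype(float))
        motion = diff.mean()
        mask = diff > 4.0 * diff.std() if diff.std() > 0 else np.zeros_like(diff, bool)
        detections.append((k, mask))
        # Adapt the interval: weak apparent motion -> longer FI, strong motion -> shorter.
        fi = min(max_fi, fi + 1) if motion < motion_thresh else max(min_fi, fi - 1)
        key_idx = k
    return detections

# Synthetic sequence: a small bright "ship" drifting slowly across a dark sea.
frames = []
for t in range(40):
    img = np.zeros((64, 64), dtype=np.uint8)
    col = 5 + t // 4                      # moves one pixel every four frames
    img[30:33, col:col + 3] = 200
    frames.append(img)
print(len(adaptive_frame_difference(frames)))
```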

  2. A Robust Cooperated Control Method with Reinforcement Learning and Adaptive H∞ Control

    NASA Astrophysics Data System (ADS)

    Obayashi, Masanao; Uchiyama, Shogo; Kuremoto, Takashi; Kobayashi, Kunikazu

    This study proposes a robust cooperated control method combining reinforcement learning with robust control. A remarkable characteristic of reinforcement learning is that it does not require a model formula; however, it does not guarantee the stability of the system. On the other hand, robust control guarantees stability and robustness, but it requires a model formula. We employ both the actor-critic method, a kind of reinforcement learning that controls continuous-valued actions with a minimal amount of computation, and traditional robust control, that is, H∞ control. The proposed method was compared with the conventional control method (the actor-critic alone) through computer simulation of controlling the angle and the position of a crane system, and the simulation results showed the effectiveness of the proposed method.

  3. Event DAS-444Ø6-6 soybean grown in Brazil is compositionally equivalent to non-transgenic soybean.

    PubMed

    Fast, Brandon J; Galan, Maria P; Schafer, Ariane C

    2016-04-02

    Soybean event DAS-444Ø6-6 is tolerant to the herbicides 2,4-D, glyphosate, and glufosinate. An investigation of potential unintended adverse compositional changes in a genetically modified crop is required to meet government regulatory requirements in various geographies. A study to meet these requirements in Brazil was completed demonstrating compositional equivalency between DAS-444Ø6-6 and non-transgenic soybean. This study supplements the extensive literature supporting transgenesis as less disruptive of crop composition compared with traditional breeding methods.

  4. Student preparation time for traditional lecture versus team-based learning in a pharmacotherapy course.

    PubMed

    DeJongh, Beth; Lemoine, Nicia; Buckley, Elizabeth; Traynor, Laura

    2018-03-01

    Determine how much time students spent preparing for traditional lecture versus team-based learning (TBL) for a pharmacotherapy course and determine if time spent in each pedagogy was within stated expectations for the course. Instructors used a combination of traditional lecture and TBL to deliver material. Before each lecture, instructors recorded the amount of time students spent preparing for each method using a one-question clicker-response survey. Instructors delivered 16 hours of TBL, 32 hours of traditional lecture, and eight hours of a mix of TBL and traditional lecture. The median of students completing the survey each week was 89. A large percentage of the class (40.9%) did not prepare for traditional lecture while only 3.4% did not prepare for TBL. About 61% of students spent between 30 min and two hours preparing for a two-hour TBL session and only 10% spent more than three hours preparing. Results of this project show students spend little time preparing for traditional lectures without in-class accountability, which may give students the perception that TBL requires too much preparation time. Copyright © 2017. Published by Elsevier Inc.

  5. Cochrane Qualitative and Implementation Methods Group guidance series-paper 5: methods for integrating qualitative and implementation evidence within intervention effectiveness reviews.

    PubMed

    Harden, Angela; Thomas, James; Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Flemming, Kate; Booth, Andrew; Garside, Ruth; Hannes, Karin; Noyes, Jane

    2018-05-01

    The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method evidence from process evaluations. Despite a proliferation of methods for the synthesis of qualitative research, less attention has focused on how to integrate these syntheses within intervention effectiveness reviews. In this article, we report updated guidance from the group on approaches, methods, and tools, which can be used to integrate the findings from quantitative studies evaluating intervention effectiveness with those from qualitative studies and process evaluations. We draw on conceptual analyses of mixed methods systematic review designs and the range of methods and tools that have been used in published reviews that have successfully integrated different types of evidence. We outline five key methods and tools as devices for integration which vary in terms of the levels at which integration takes place; the specialist skills and expertise required within the review team; and their appropriateness in the context of limited evidence. In situations where the requirement is the integration of qualitative and process evidence within intervention effectiveness reviews, we recommend the use of a sequential approach. Here, evidence from each tradition is synthesized separately using methods consistent with each tradition before integration takes place using a common framework. Reviews which integrate qualitative and process evaluation evidence alongside quantitative evidence on intervention effectiveness in a systematic way are rare. This guidance aims to support review teams to achieve integration and we encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Technology-based vs. traditional instruction. A comparison of two methods for teaching the skill of performing a 12-lead ECG.

    PubMed

    Jeffries, Pamela R; Woolf, Shirley; Linde, Beverly

    2003-01-01

    The purpose of this study was to compare the effectiveness of an interactive, multimedia CD-ROM with traditional methods of teaching the skill of performing a 12-lead ECG. A randomized pre/posttest experimental design was used. Seventy-seven baccalaureate nursing students in a required, senior-level critical-care course at a large midwestern university were recruited for the study. Two teaching methods were compared. The traditional method included a self-study module, a brief lecture and demonstration by an instructor, and hands-on experience using a plastic manikin and a real 12-lead ECG machine in the learning laboratory. The second method covered the same content using an interactive, multimedia CD-ROM embedded with virtual reality and supplemented with a self-study module. There were no significant (p < .05) baseline differences in pretest scores between the two groups and no significant differences by group in cognitive gains, student satisfaction with their learning method, or perception of self-efficacy in performing the skill. Overall results indicated that both groups were satisfied with their instructional method and were similar in their ability to demonstrate the skill correctly on a live, simulated patient. This evaluation study is a beginning step to assess new and potentially more cost-effective teaching methods and their effects on student learning outcomes and behaviors, including the transfer of skill acquisition via a computer simulation to a real patient.

  7. Three-channel dynamic photometric stereo: a new method for 4D surface reconstruction and volume recovery

    NASA Astrophysics Data System (ADS)

    Schroeder, Walter; Schulze, Wolfram; Wetter, Thomas; Chen, Chi-Hsien

    2008-08-01

    Three-dimensional (3D) body surface reconstruction is an important field in health care. A popular method for this purpose is laser scanning. However, using Photometric Stereo (PS) to record lumbar lordosis and the surface contour of the back offers a viable alternative due to its lower cost and greater flexibility compared with laser techniques and other methods of three-dimensional body surface reconstruction. In this work, we extended the traditional PS method and proposed a new method for obtaining surface and volume data of a moving object. Traditional Photometric Stereo uses at least three images of a static object taken under different light sources to obtain 3D information about the object. Instead of white light, the light sources in the proposed method use the three colors of the RGB color model: red, green and blue. A series of pictures taken with a video camera can then be separated into the different color channels, and each set of three images can be used to calculate the surface normals as in traditional PS. This method removes the requirement, common to almost all other body surface reconstruction methods, that the imaged object be kept still. By placing two cameras on opposite sides of a moving object and lighting it with the colored sources, the time-varying surface (4D) data can easily be calculated. The obtained information can be used in many medical fields such as rehabilitation, diabetes screening or orthopedics.
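
    The following Python sketch shows the classical photometric stereo computation that the color-multiplexed method builds on: given three intensity images taken under known light directions, solve per pixel for an albedo-scaled normal. It is a generic illustration under assumed inputs, not the authors' pipeline; in the RGB variant the three "images" would be the red, green, and blue channels of a single video frame, each lit by a differently colored source.

    ```python
    import numpy as np

    def photometric_stereo(images, light_dirs):
        """Classical three-light photometric stereo.

        images: array of shape (3, H, W) with intensities under each light.
        light_dirs: (3, 3) matrix whose rows are unit light direction vectors.
        Returns per-pixel unit surface normals of shape (H, W, 3) and the albedo.
        """
        n_imgs, H, W = images.shape
        I = images.reshape(n_imgs, -1)               # (3, H*W) stacked intensities
        G = np.linalg.solve(light_dirs, I)           # g = albedo * normal, per pixel
        albedo = np.linalg.norm(G, axis=0)           # (H*W,)
        normals = G / np.clip(albedo, 1e-8, None)    # normalize to unit normals
        return normals.T.reshape(H, W, 3), albedo.reshape(H, W)
    ```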

  8. Pharmacovigilance of herbal medicines: the potential contributions of ethnobotanical and ethnopharmacological studies.

    PubMed

    Rodrigues, Eliana; Barnes, Joanne

    2013-01-01

    Typically, ethnobotanical/ethnopharmacological (EB/EP) surveys are used to describe uses, doses/dosages, sources and methods of preparation of traditional herbal medicines; their application to date in examining the adverse effects, contraindications and other safety aspects of these preparations is limited. From a pharmacovigilance perspective, numerous challenges exist in applying its existing methods to studying the safety profile of herbal medicines, particularly where used by indigenous cultures. This paper aims to contribute to the methodological aspects of EB/EP field work, and to extend the reach of pharmacovigilance, by proposing a tool comprising a list of questions that could be applied during interview and observational studies. The questions focus on the collection of information on the safety profile of traditional herbal medicines as it is embedded in traditional knowledge, as well as on identifying personal experiences (spontaneous reports) of adverse or undesirable effects associated with the use of traditional herbal medicines. Questions on the precise composition of traditional prescriptions or 'recipes', their preparation, storage, administration and dosing are also included. Strengths and limitations of the tool are discussed. From this interweaving of EB/EP and pharmacovigilance arises a concept of ethnopharmacovigilance for traditional herbal medicines: the scope of EB/EP is extended to include exploration of the potential harmful effects of medicinal plants, and the incorporation of pharmacovigilance questions into EB/EP studies provides a new opportunity for collection of 'general' traditional knowledge on the safety of traditional herbal medicines and, importantly, a conduit for collection of spontaneous reports of suspected adverse effects. Whether the proposed tool can yield data sufficiently rich and of an appropriate quality for application of EB/EP (e.g. data verification and quantitative analysis tools) and pharmacovigilance techniques (e.g. causality assessment and data mining) requires field testing.

  9. Analysis of drugs in human tissues by supercritical fluid extraction/immunoassay

    NASA Astrophysics Data System (ADS)

    Furton, Kenneth G.; Sabucedo, Alberta; Rein, Joseph; Hearn, W. L.

    1997-02-01

    A rapid, readily automated method has been developed for the quantitative analysis of phenobarbital from human liver tissues based on supercritical carbon dioxide extraction followed by fluorescence enzyme immunoassay. The method developed significantly reduces sample handling and utilizes the entire liver homogenate. The current method yields comparable recoveries and precision and does not require the use of an internal standard, although traditional GC/MS confirmation can still be performed on sample extracts. Additionally, the proposed method uses non-toxic, inexpensive carbon dioxide, thus eliminating the use of halogenated organic solvents.

  10. On the application of the lattice Boltzmann method to the investigation of glottal flow

    PubMed Central

    Kucinschi, Bogdan R.; Afjeh, Abdollah A.; Scherer, Ronald C.

    2008-01-01

    The production of voice is directly related to the vibration of the vocal folds, which is generated by the interaction between the glottal flow and the tissue of the vocal folds. In the current study, the aerodynamics of the symmetric glottis is investigated numerically for a number of static configurations. The numerical investigation is based on the lattice Boltzmann method (LBM), which is an alternative approach within computational fluid dynamics. Compared to the traditional Navier–Stokes computational fluid dynamics methods, the LBM is relatively easy to implement and can deal with complex geometries without requiring a dedicated grid generator. The multiple relaxation time model was used to improve the numerical stability. The results obtained with LBM were compared to the results provided by a traditional Navier–Stokes solver and experimental data. It was shown that LBM results are satisfactory for all the investigated cases. PMID:18646995
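
    To make the structure of the lattice Boltzmann method concrete, here is a minimal single-relaxation-time (BGK) D2Q9 sketch in Python on a periodic domain. The paper uses a multiple relaxation time model and glottis-shaped geometry; this simplified sketch only illustrates the collision and streaming steps, and all grid sizes and parameters are illustrative.

    ```python
    import numpy as np

    NX, NY, TAU = 200, 80, 0.6
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])          # D2Q9 lattice velocities
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)                    # lattice weights

    def equilibrium(rho, ux, uy):
        cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
        usq = 1.5 * (ux**2 + uy**2)
        return rho * w[:, None, None] * (1 + cu + 0.5 * cu**2 - usq)

    # Initialise a uniform fluid with a small sinusoidal velocity perturbation.
    rho = np.ones((NX, NY))
    ux = 0.05 * np.sin(2 * np.pi * np.arange(NY) / NY)[None, :] * np.ones((NX, NY))
    uy = np.zeros((NX, NY))
    f = equilibrium(rho, ux, uy)

    for step in range(1000):
        # Collision: BGK relaxation toward the local equilibrium distribution.
        rho = f.sum(axis=0)
        ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
        f += (equilibrium(rho, ux, uy) - f) / TAU
        # Streaming on a periodic domain.
        for i in range(9):
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    ```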

  11. A fast and accurate frequency estimation algorithm for sinusoidal signal with harmonic components

    NASA Astrophysics Data System (ADS)

    Hu, Jinghua; Pan, Mengchun; Zeng, Zhidun; Hu, Jiafei; Chen, Dixiang; Tian, Wugang; Zhao, Jianqiang; Du, Qingfa

    2016-10-01

    Frequency estimation is a fundamental problem in many applications, such as traditional vibration measurement, power system supervision, and microelectromechanical system sensor control. In this paper, a fast and accurate frequency estimation algorithm is proposed to address the low efficiency of traditional methods. The proposed algorithm consists of coarse and fine frequency estimation steps, and we demonstrate that applying a modified zero-crossing technique is more efficient than conventional searching methods for obtaining the coarse frequency estimate (locating the peak of the FFT amplitude). Thus, the proposed estimation algorithm requires fewer hardware and software resources and achieves even higher efficiency as the amount of experimental data increases. Experimental results with a modulated magnetic signal show that the root mean square error of frequency estimation is below 0.032 Hz with the proposed algorithm, which has lower computational complexity and better global performance than conventional frequency estimation methods.
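
    A generic two-step estimator in the spirit of the abstract (coarse estimate from zero crossings, fine estimate by searching the DFT magnitude near the coarse value) might look like the Python sketch below; it is not the authors' exact algorithm, and the signal parameters are invented for illustration.

    ```python
    import numpy as np

    def estimate_frequency(signal, fs):
        """Two-step frequency estimate: coarse via zero crossings, fine via a
        local search of the DFT magnitude around the coarse value."""
        s = signal - signal.mean()
        signs = np.signbit(s).astype(np.int8)
        crossings = np.count_nonzero(np.diff(signs))
        f_coarse = crossings * fs / (2.0 * len(s))      # crossings per period = 2

        t = np.arange(len(s)) / fs
        candidates = np.linspace(0.9 * f_coarse, 1.1 * f_coarse, 2001)
        power = [abs(np.exp(-2j * np.pi * f * t) @ s) for f in candidates]
        return candidates[int(np.argmax(power))]

    # Example: 50.3 Hz fundamental with a weak third harmonic and noise.
    fs = 1000.0
    t = np.arange(0, 1, 1 / fs)
    x = np.sin(2*np.pi*50.3*t) + 0.2*np.sin(2*np.pi*150.9*t) + 0.05*np.random.randn(t.size)
    print(estimate_frequency(x, fs))
    ```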

  12. Outcome modelling strategies in epidemiology: traditional methods and basic alternatives

    PubMed Central

    Greenland, Sander; Daniel, Rhian; Pearce, Neil

    2016-01-01

    Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the ‘change-in-estimate’ (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE). PMID:27097747
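
    For readers unfamiliar with the change-in-estimate (CIE) approach mentioned above, the following Python sketch illustrates the basic idea on ordinary least squares: a candidate confounder is dropped if removing it changes the exposure coefficient by less than a chosen percentage (commonly 10%). This is only a schematic illustration of the strategy being reviewed, not a recommended analysis; the data and threshold are invented.

    ```python
    import numpy as np

    def change_in_estimate_selection(y, exposure, covariates, threshold=0.10):
        """Drop a covariate if removing it changes the exposure coefficient
        by less than `threshold` (relative change); OLS via least squares."""
        def exposure_coef(cols):
            X = np.column_stack([np.ones(len(y)), exposure] + cols)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return beta[1]

        kept = list(range(covariates.shape[1]))
        full = exposure_coef([covariates[:, j] for j in kept])
        for j in list(kept):
            reduced = exposure_coef([covariates[:, k] for k in kept if k != j])
            if abs(reduced - full) / max(abs(full), 1e-12) < threshold:
                kept.remove(j)
                full = exposure_coef([covariates[:, k] for k in kept])
        return kept

    # Hypothetical data: one true confounder (column 0) and two irrelevant covariates.
    rng = np.random.default_rng(0)
    Z = rng.normal(size=(500, 3))
    E = 0.5 * Z[:, 0] + rng.normal(size=500)
    Y = 1.0 * E + 0.8 * Z[:, 0] + rng.normal(size=500)
    print(change_in_estimate_selection(Y, E, Z))   # expected to keep only column 0
    ```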

  13. A method to measure internal contact angle in opaque systems by magnetic resonance imaging.

    PubMed

    Zhu, Weiqin; Tian, Ye; Gao, Xuefeng; Jiang, Lei

    2013-07-23

    The internal contact angle is an important parameter for internal wettability characterization. However, due to the limitations of optical imaging, methods available for contact angle measurement are only suitable for transparent or open systems. For most practical situations that require contact angle measurement in opaque or enclosed systems, the traditional methods are not effective. To address this requirement, a method suitable for contact angle measurement in nontransparent systems is developed by employing MRI technology. In this article, the method is demonstrated by measuring internal contact angles in opaque cylindrical tubes. The method also proves feasible in transparent situations and in opaque capillary systems. Using this method, contact angles in opaque systems can be measured successfully, which is significant for understanding wetting behaviors in nontransparent systems and calculating interfacial parameters in enclosed systems.

  14. Pressure balance cross-calibration method using a pressure transducer as transfer standard

    PubMed Central

    Olson, D; Driver, R. G.; Yang, Y

    2016-01-01

    Piston gauges or pressure balances are widely used to realize the SI unit of pressure, the pascal, and to calibrate pressure sensing devices. However, their calibration is time consuming and requires a lot of technical expertise. In this paper, we propose an alternate method of performing a piston gauge cross calibration that incorporates a pressure transducer as an immediate in-situ transfer standard. For a sufficiently linear transducer, the requirement to exactly balance the weights on the two pressure gauges under consideration is greatly relaxed. Our results indicate that this method can be employed without a significant increase in measurement uncertainty. Indeed, in the test case explored here, our results agreed with the traditional method within standard uncertainty, which was less than 6 parts per million. PMID:28303167

  15. Stitching interferometry of a full cylinder without using overlap areas

    NASA Astrophysics Data System (ADS)

    Peng, Junzheng; Chen, Dingfu; Yu, Yingjie

    2017-08-01

    Traditional stitching interferometry requires finding out the overlap correspondence and computing the discrepancies in the overlap regions, which makes it complex and time-consuming to obtain the 360° form map of a cylinder. In this paper, we develop a cylinder stitching model based on a new set of orthogonal polynomials, termed Legendre Fourier (LF) polynomials. With these polynomials, individual subaperture data can be expanded as a composition of the inherent form of a partial cylinder surface and additional misalignment parameters. Then the 360° form map can be acquired by simultaneously fitting all subaperture data with the LF polynomials. A metal shaft was measured to experimentally verify the proposed method. In contrast to traditional stitching interferometry, our technique does not require overlapping of adjacent subapertures, thus significantly reducing the measurement time and making the stitching algorithm simple.

  16. Computer Aided Phenomenography: The Role of Leximancer Computer Software in Phenomenographic Investigation

    ERIC Educational Resources Information Center

    Penn-Edwards, Sorrel

    2010-01-01

    The qualitative research methodology of phenomenography has traditionally required a manual sorting and analysis of interview data. In this paper I explore a potential means of streamlining this procedure by considering a computer aided process not previously reported upon. Two methods of lexicological analysis, manual and automatic, were examined…

  17. Membrane Potential Simulation Program for IBM-PC-Compatible Equipment for Physiology and Biology Students.

    ERIC Educational Resources Information Center

    Barry, Peter H.

    1990-01-01

    A graphic, interactive software program that is suitable for teaching students about the measurement and ion dependence of cell membrane potentials is described. The hardware requirements, the aim of the program, how to use the program, other related programs, and its advantages over traditional methods are included. (KR)

  18. Beyond the Google Search Bar: Evaluating Source Credibility in Contemporary Research

    ERIC Educational Resources Information Center

    Sorenson, Mary E.

    2016-01-01

    Courses: Research Methods, Public Speaking, Communication Theory, any other course that requires college students to engage in a formal research process. Can be conducted in traditional, online, or hybrid courses. Objectives: In this original single-class activity, students will be able to evaluate source credibility for resources that extend…

  19. Citizen Science: A Gateway for Innovation in Disease-Carrying Mosquito Management?

    PubMed

    Bartumeus, Frederic; Oltra, Aitana; Palmer, John R B

    2018-05-21

    Traditional methods for tracking disease-carrying mosquitoes are hitting budget constraints as the scales over which they must be implemented grow exponentially. Citizen science offers a novel solution to this problem but requires new models of innovation in the public health sector. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. High-resolution solution-state NMR of unfractionated plant cell walls

    Treesearch

    John Ralph; Fachuang Lu; Hoon Kim; Dino Ress; Daniel J. Yelle; Kenneth E. Hammel; Sally A. Ralph; Bernadette Nanayakkara; Armin Wagner; Takuya Akiyama; Paul F. Schatz; Shawn D. Mansfield; Noritsugu Terashima; Wout Boerjan; Bjorn Sundberg; Mattias Hedenstrom

    2009-01-01

    Detailed structural studies on the plant cell wall have traditionally been difficult. NMR is one of the preeminent structural tools, but obtaining high-resolution solution-state spectra has typically required fractionation and isolation of components of interest. With recent methods for dissolution of, admittedly, finely divided plant cell wall material, the wall can...

  1. An Evaluation of Outcomes Following the Replacement of Traditional Histology Laboratories with Self-Study Modules

    ERIC Educational Resources Information Center

    Thompson, Andrew R.; Lowrie, Donald J., Jr.

    2017-01-01

    Changes in medical school curricula often require educators to develop teaching strategies that decrease contact hours while maintaining effective pedagogical methods. When faced with this challenge, faculty at the University of Cincinnati College of Medicine converted the majority of in-person histology laboratory sessions to self-study modules…

  2. 20 CFR 655.140 - Review of applications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Certification § 655.140 Review of applications. (a) NPC review. The CO will promptly review the Application for.... Any notice or request sent by the CO(s) to an employer requiring a response will be sent using the provided address via traditional methods to assure next day delivery. The employer's response to such a...

  3. 20 CFR 655.140 - Review of applications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Certification § 655.140 Review of applications. (a) NPC review. The CO will promptly review the Application for.... Any notice or request sent by the CO(s) to an employer requiring a response will be sent using the provided address via traditional methods to assure next day delivery. The employer's response to such a...

  4. Group Comparisons of Mathematics Performance from a Cognitive Diagnostic Perspective

    ERIC Educational Resources Information Center

    Chen, Yi-Hsin; Ferron, John M.; Thompson, Marilyn S.; Gorin, Joanna S.; Tatsuoka, Kikumi K.

    2010-01-01

    Traditional comparisons of test score means identify group differences in broad academic areas, but fail to provide substantive description of how the groups differ on the specific cognitive attributes required for success in the academic area. The rule space method (RSM) allows for group comparisons at the cognitive attribute level, which…

  5. More rapid edgewise crush test methods

    Treesearch

    Thomas J. Urbanik; Arthur H. Catlin; Davide R. Friedman; Richard C. Lund

    1993-01-01

    The use of paraffin wax to reinforce the loading edges of corrugated fiberboard edge-crush specimens requires that the specimens be reconditioned after waxing. The traditional practice employing a 24-h reconditioning period is a conservative approach based on the moisture response rate of corrugated containers. An interlaboratory study was conducted to determine the...

  6. Teaching and Learning with Technology: Effectiveness of ICT Integration in Schools

    ERIC Educational Resources Information Center

    Ghavifekr, Simin; Rosdy, Wan Athirah Wan

    2015-01-01

    Integration of Information and Communication Technology (ICT) will assist teachers in meeting the global requirement to replace traditional teaching methods with technology-based teaching and learning tools and facilities. In Malaysia, ICT is considered one of the main elements in transforming the country toward future development. The Ministry of…

  7. Analytical alternatives for an annual inventory system

    Treesearch

    Francis A. Roesch; Gregory A. Reams

    1999-01-01

    Methods for analyzing data from the Southern Annual Forest Inventory System (SAFIS) are discussed. Differences between the annual inventory approach and the more traditional periodic approach require that we revisit the previous assumption that there are no important spatial and temporal trends in the data. Over the next few years, the USDA Forest Service Southern...

  8. Views of Pre-Service Teachers on Blog use for Instruction and Social Interaction

    ERIC Educational Resources Information Center

    Kuzu, Abdullah

    2007-01-01

    Rapid development of technology and the unique characteristics of the creative society require a shift from traditional teaching concepts to student-centered learning in education. One of the methods to bring about this change is creating teaching environments enriched by the Internet. Blog (weblog) service offered to learners and teachers through the Internet…

  9. A Programme for Future Audit Professionals: Using Action Research to Nurture Student Engagement

    ERIC Educational Resources Information Center

    Van Peursem, Karen; Samujh, R. Helen; Nath, Nirmala

    2016-01-01

    Professionals require decision-making skills as well as technical knowledge. One might assume that their university education prepares them for this role, yet, at least for future audit professionals, traditional text- and lecture-based methods dominate teaching practice. This Participation Action Research study develops with auditing students a…

  10. Definition of Alaskan Aviation Training Requirements. Final Report.

    ERIC Educational Resources Information Center

    Mitchell, M. K.; And Others

    Because of high accident rates and the unique conditions faced in Arctic flying, a project was conducted to develop a training program for airline pilots flying over Alaska. Data were gathered, through the critical incident method in conjunction with traditional job-analysis procedures, about how experienced Alaskan pilots learned to cope with the…

  11. Twenty-First Century Literacy: A Matter of Scale from Micro to Mega

    ERIC Educational Resources Information Center

    Brown, Abbie; Slagter van Tryon, Patricia J.

    2010-01-01

    Twenty-first century technologies require educators to look for new ways to teach literacy skills. Current communication methods are combinations of traditional and newer, network-driven forms. This article describes the changes twenty-first century technologies cause in the perception of time, size, distance, audience, and available data, and…

  12. Intrauterine devices and other forms of contraception: thinking outside the pack.

    PubMed

    Allen, Caitlin; Kolehmainen, Christine

    2015-05-01

    A variety of contraception options are available in addition to traditional combined oral contraceptive pills. Newer long-acting reversible contraceptive (LARC) methods such as intrauterine devices and subcutaneous implants are preferred because they do not depend on patient compliance. They are highly effective and appropriate for most women. Female and male sterilization are also effective options, but they are irreversible and require counseling to minimize regret. The contraceptive injection, patch, and ring do not require daily administration, but their typical efficacy rates are lower than those of LARC methods and similar to those for combined oral contraceptive pills. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Specification and Design of Electrical Flight System Architectures with SysML

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L., Jr.; Jimenez, Alejandro

    2012-01-01

    Modern space flight systems are required to perform more complex functions than previous generations to support space missions. This demand is driving the trend to deploy more electronics to realize system functionality. The traditional approach for the specification, design, and deployment of electrical system architectures in space flight systems includes the use of informal definitions and descriptions that are often embedded within loosely coupled but highly interdependent design documents. Traditional methods become inefficient at coping with increasing system complexity and evolving requirements while meeting project budget and time constraints. Thus, there is a need for more rigorous methods to capture the relevant information about the electrical system architecture as the design evolves. In this work, we propose a model-centric approach to support the specification and design of electrical flight system architectures using the Systems Modeling Language (SysML). In our approach, we develop a domain specific language for specifying electrical system architectures, and we propose a design flow for the specification and design of electrical interfaces. Our approach is applied to a practical flight system.

  14. Neural Network Based Sensory Fusion for Landmark Detection

    NASA Technical Reports Server (NTRS)

    Kumbla, Kishan -K.; Akbarzadeh, Mohammad R.

    1997-01-01

    NASA is planning to send numerous unmanned planetary missions to explore space. This requires autonomous robotic vehicles which can navigate in an unstructured, unknown, and uncertain environment. Landmark-based navigation is a new area of research which differs from traditional goal-oriented navigation, where a mobile robot starts from an initial point and reaches a destination in accordance with a pre-planned path. Landmark-based navigation has the advantage of allowing the robot to find its way without communication with the mission control station and without exact knowledge of its coordinates. Current landmark navigation algorithms, however, pose several constraints. First, they require large memories to store the images. Second, the task of comparing the images using traditional methods is computationally intensive, and consequently real-time implementation is difficult. The method proposed here consists of three stages. The first stage utilizes a heuristic-based algorithm to identify significant objects. The second stage utilizes a neural network (NN) to efficiently classify images of the identified objects. The third stage combines distance information with the classification results of the neural network for efficient and intelligent navigation.

  15. Optimized star sensors laboratory calibration method using a regularization neural network.

    PubMed

    Zhang, Chengfen; Niu, Yanxiong; Zhang, Hao; Lu, Jiazhen

    2018-02-10

    High-precision ground calibration is essential to ensure the performance of star sensors. However, complex distortion and multi-error coupling pose great difficulties for traditional calibration methods, especially for large field of view (FOV) star sensors. Although increasing the complexity of models is an effective way to improve calibration accuracy, it significantly increases the demand for calibration data. In order to achieve high-precision calibration of star sensors with a large FOV, a novel laboratory calibration method based on a regularization neural network is proposed. A multi-layer neural network is designed to represent the mapping between the star vector and the corresponding star point coordinate directly. To ensure the generalization performance of the network, regularization strategies are incorporated into the network structure and the training algorithm. Simulation and experimental results demonstrate that the proposed method can achieve high precision with less calibration data and without any other a priori information. Compared with traditional methods, the calibration error of the star sensor decreased by about 30%. The proposed method can satisfy the precision requirement for large FOV star sensors.

  16. Ocean Wave Separation Using CEEMD-Wavelet in GPS Wave Measurement.

    PubMed

    Wang, Junjie; He, Xiufeng; Ferreira, Vagner G

    2015-08-07

    Monitoring ocean waves plays a crucial role in, for example, coastal environmental and protection studies. Traditional methods for measuring ocean waves are based on ultrasonic sensors and accelerometers. However, the Global Positioning System (GPS) has been introduced recently and has the advantage of being smaller, less expensive, and not requiring calibration in comparison with the traditional methods. Therefore, for accurately measuring ocean waves using GPS, further research on the separation of the wave signals from the vertical displacements of the GPS-mounted carrier is still necessary. In order to contribute to this topic, we present a novel method that combines complementary ensemble empirical mode decomposition (CEEMD) with a wavelet threshold denoising model (i.e., CEEMD-Wavelet). This method seeks to extract wave signals with less residual noise and without losing useful information. Compared with the wave parameters derived from the moving average method, high-pass filter and wave gauge, the results show that the accuracy of the wave parameters for the proposed method was improved, with errors of about 2 cm and 0.2 s for mean wave height and mean period, respectively, verifying the validity of the proposed method.
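
    As a partial illustration, the Python sketch below shows only the wavelet threshold denoising stage (using the PyWavelets package) applied to a hypothetical GPS vertical displacement series; the CEEMD decomposition stage that precedes it in the authors' method is omitted, and the wavelet, level, and threshold rule are assumptions.

    ```python
    import numpy as np
    import pywt

    def wavelet_denoise(signal, wavelet="db4", level=4):
        """Soft-threshold wavelet denoising of a 1-D displacement series."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        # Universal threshold from the noise level estimated on the finest scale.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(len(signal)))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(signal)]

    # Hypothetical example: a 0.2 Hz wave signal buried in GPS vertical noise.
    t = np.arange(0, 600, 0.1)
    heave = 0.5 * np.sin(2 * np.pi * 0.2 * t) + 0.1 * np.random.randn(t.size)
    clean = wavelet_denoise(heave)
    ```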

  17. FEM-based strain analysis study for multilayer sheet forming process

    NASA Astrophysics Data System (ADS)

    Zhang, Rongjing; Lang, Lihui; Zafar, Rizwan

    2015-12-01

    Fiber metal laminates have many advantages over traditional laminates (e.g., any type of fiber and resin material can be placed anywhere between the metallic layers without risk of failure of the composite fabric sheets). Furthermore, the process requirements to strictly control the temperature and punch force in fiber metal laminates are also less stringent than those in traditional laminates. To further explore the novel method, this study conducts a finite element method-based (FEM-based) strain analysis on multilayer blanks by using the 3A method. Different forming modes such as wrinkling and fracture are discussed by using experimental and numerical studies. Hydroforming is used for multilayer forming. The Barlat 2000 yield criteria and DYNAFORM/LS-DYNA are used for the simulations. Optimal process parameters are determined on the basis of fixed die-binder gap and variable cavity pressure. The results of this study will enhance the knowledge on the mechanics of multilayer structures formed by using the 3A method and expand its commercial applications.

  18. Conventional and dense gas techniques for the production of liposomes: a review.

    PubMed

    Meure, Louise A; Foster, Neil R; Dehghani, Fariba

    2008-01-01

    The aim of this review paper is to compare the potential of various techniques developed for production of homogenous, stable liposomes. Traditional techniques, such as Bangham, detergent depletion, ether/ethanol injection, reverse-phase evaporation and emulsion methods, were compared with the recent advanced techniques developed for liposome formation. The major hurdles for scaling up the traditional methods are the consumption of large quantities of volatile organic solvent, the stability and homogeneity of the liposomal product, as well as the lengthy multiple steps involved. The new methods have been designed to alleviate the current issues for liposome formulation. Dense gas liposome techniques are still in their infancy, however they have remarkable advantages in reducing the use of organic solvents, providing fast, single-stage production and producing stable, uniform liposomes. Techniques such as the membrane contactor and heating methods are also promising as they eliminate the use of organic solvent, however high temperature is still required for processing.

  19. A combined finite element-boundary integral formulation for solution of two-dimensional scattering problems via CGFFT. [Conjugate Gradient Fast Fourier Transformation

    NASA Technical Reports Server (NTRS)

    Collins, Jeffery D.; Volakis, John L.; Jin, Jian-Ming

    1990-01-01

    A new technique is presented for computing the scattering by 2-D structures of arbitrary composition. The proposed solution approach combines the usual finite element method with the boundary-integral equation to formulate a discrete system. This is subsequently solved via the conjugate gradient (CG) algorithm. A particular characteristic of the method is the use of rectangular boundaries to enclose the scatterer. Several of the resulting boundary integrals are therefore convolutions and may be evaluated via the fast Fourier transform (FFT) in the implementation of the CG algorithm. The solution approach offers the principal advantage of having O(N) memory demand and employs a 1-D FFT versus a 2-D FFT as required with a traditional implementation of the CGFFT algorithm. The speed of the proposed solution method is compared with that of the traditional CGFFT algorithm, and results for rectangular bodies are given and shown to be in excellent agreement with the moment method.
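
    The key computational idea, evaluating the convolution-type matrix-vector products inside the conjugate gradient iteration with FFTs, can be illustrated independently of the electromagnetic formulation. The Python sketch below solves a hypothetical symmetric system (I + C)x = b, where C is a circular convolution, using CG with an FFT-based operator; it is a schematic analogue, not the authors' FEM-boundary integral system.

    ```python
    import numpy as np

    def fft_matvec(kernel_fft, x):
        """Circular-convolution matrix-vector product evaluated with FFTs."""
        return np.real(np.fft.ifft(kernel_fft * np.fft.fft(x)))

    def conjugate_gradient(matvec, b, tol=1e-10, max_iter=500):
        """Plain CG for a symmetric positive-definite operator given as a matvec."""
        x = np.zeros_like(b)
        r = b - matvec(x)
        p = r.copy()
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = matvec(p)
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

    # Hypothetical example: solve (I + C) x = b with C a symmetric circular convolution
    # (symmetry keeps the operator SPD, so CG applies).
    n = 256
    k = np.arange(n)
    kernel = np.exp(-np.minimum(k, n - k) / 10.0)
    kernel /= kernel.sum()
    kernel_fft = np.fft.fft(kernel)
    matvec = lambda v: v + fft_matvec(kernel_fft, v)
    b = np.random.randn(n)
    x = conjugate_gradient(matvec, b)
    print(np.linalg.norm(matvec(x) - b))   # residual should be near the tolerance
    ```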

  20. A combined finite element and boundary integral formulation for solution via CGFFT of 2-dimensional scattering problems

    NASA Technical Reports Server (NTRS)

    Collins, Jeffery D.; Volakis, John L.

    1989-01-01

    A new technique is presented for computing the scattering by 2-D structures of arbitrary composition. The proposed solution approach combines the usual finite element method with the boundary integral equation to formulate a discrete system. This is subsequently solved via the conjugate gradient (CG) algorithm. A particular characteristic of the method is the use of rectangular boundaries to enclose the scatterer. Several of the resulting boundary integrals are therefore convolutions and may be evaluated via the fast Fourier transform (FFT) in the implementation of the CG algorithm. The solution approach offers the principal advantage of having O(N) memory demand and employs a 1-D FFT versus a 2-D FFT as required with a traditional implementation of the CGFFT algorithm. The speed of the proposed solution method is compared with that of the traditional CGFFT algorithm, and results for rectangular bodies are given and shown to be in excellent agreement with the moment method.

  1. The Impact of Social Media on Dissemination and Implementation of Clinical Practice Guidelines: A Longitudinal Observational Study

    PubMed Central

    Gronseth, Gary; Dubinsky, Richard; Penfold-Murray, Rebecca; Cox, Julie; Bever Jr, Christopher; Martins, Yolanda; Rheaume, Carol; Shouse, Denise; Getchius, Thomas SD

    2015-01-01

    Background Evidence-based clinical practice guidelines (CPGs) are statements that provide recommendations to optimize patient care for a specific clinical problem or question. Merely reading a guideline rarely leads to implementation of recommendations. The American Academy of Neurology (AAN) has a formal process of guideline development and dissemination. The last few years have seen a burgeoning of social media such as Facebook, Twitter, and LinkedIn, and newer methods of dissemination such as podcasts and webinars. The role of these media in guideline dissemination has not been studied. Systematic evaluation of dissemination methods and comparison of the effectiveness of newer methods with traditional methods is not available. It is also not known whether specific dissemination methods may be more effectively targeted to specific audiences. Objective Our aim was to (1) develop an innovative dissemination strategy by adding social media-based dissemination methods to traditional methods for the AAN clinical practice guidelines “Complementary and alternative medicine in multiple sclerosis” (“CAM in MS”) and (2) evaluate whether the addition of social media outreach improves awareness of the CPG and knowledge of CPG recommendations, and affects implementation of those recommendations. Methods Outcomes were measured by four surveys in each of the two target populations: patients and physicians/clinicians (“physicians”). The primary outcome was the difference in participants’ intent to discuss use of complementary and alternative medicine (CAM) with their physicians or patients, respectively, after novel dissemination, as compared with that after traditional dissemination. Secondary outcomes were changes in awareness of the CPG, knowledge of CPG content, and behavior regarding CAM use in multiple sclerosis (MS). Results Response rates were 25.08% (622/2480) for physicians and 43.5% (348/800) for patients. Awareness of the CPG increased after traditional dissemination (absolute difference, 95% confidence interval: physicians 36%, 95% CI 25-46, and patients 10%, 95% CI 1-11) but did not increase further after novel dissemination (physicians 0%, 95% CI -11 to 11, and patients -4%, 95% CI -6 to 14). Intent to discuss CAM also increased after traditional dissemination but did not change after novel dissemination (traditional: physicians 12%, 95% CI 2-22, and patients 19%, 95% CI 3-33; novel: physicians 11%, 95% CI -1 to -21, and patients -8%, 95% CI -22 to 8). Knowledge of CPG recommendations and behavior regarding CAM use in MS did not change after either traditional dissemination or novel dissemination. Conclusions Social media-based dissemination methods did not confer additional benefit over print-, email-, and Internet-based methods in increasing CPG awareness and changing intent in physicians or patients. Research on audience selection, message formatting, and message delivery is required to utilize Web 2.0 technologies optimally for dissemination. PMID:26272267

  2. Explosive component acceptance tester using laser interferometer technology

    NASA Technical Reports Server (NTRS)

    Wickstrom, Richard D.; Tarbell, William W.

    1993-01-01

    Acceptance testing of explosive components requires a reliable and simple-to-use testing method that can discern less-than-optimal performance. For hot-wire detonators, traditional techniques use dent blocks or photographic diagnostic methods. More complicated approaches are avoided because of their inherent problems with setup and maintenance. A recently developed tester is based on using a laser interferometer to measure the velocity of flying plates accelerated by explosively actuated detonators. Unlike ordinary interferometers that monitor displacement of the test article, this device measures velocity directly and is commonly used with non-specular surfaces. Most often referred to as the VISAR technique (Velocity Interferometer System for Any Reflecting Surface), it has become the most widely accepted choice for accurate measurement of velocities greater than 1 mm/μs. Traditional VISAR devices require extensive setup and adjustment and are therefore unacceptable in a production-testing environment. This paper describes a new VISAR approach which requires virtually no adjustments, yet provides data with accuracy comparable to the more complicated systems. The device, termed the Fixed-Cavity VISAR, is currently being developed to serve as a product verification tool for hot-wire detonators and slappers. An extensive data acquisition and analysis computer code was also created to automate the manipulation of raw data into final results.

  3. A high-performance liquid chromatography-electronic circular dichroism online method for assessing the absolute enantiomeric excess and conversion ratio of asymmetric reactions

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang; Wang, Mingchao; Li, Li; Yin, Dali

    2017-03-01

    Asymmetric reactions often need to be evaluated during the synthesis of chiral compounds. However, traditional evaluation methods require the isolation of the individual enantiomers, which is tedious and time-consuming. Thus, it is desirable to develop simple, practical online detection methods. We developed a method based on high-performance liquid chromatography-electronic circular dichroism (HPLC-ECD) that simultaneously analyzes the material conversion ratio and the absolute optical purity of each enantiomer. In particular, only a reverse-phase C18 column instead of a chiral column is required in our method, because the ECD measurement provides a g-factor that describes the ratio of the enantiomers in the mixture. We used our method to analyze the asymmetric hydrosilylation of β-enamino esters, and we discuss the advantages, feasibility, and effectiveness of this new methodology.
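
    A short numeric illustration of how a measured dissymmetry (g) factor yields the enantiomeric excess, assuming the observed g-factor of a mixture scales linearly with ee (the CD signals of the two enantiomers cancel while their absorbance adds); the numbers below are invented.

    ```python
    # Hypothetical values: g-factor of an enantiopure reference and of the product peak.
    g_pure = 4.0e-3        # assumed g-factor of the pure enantiomer
    g_obs = 2.6e-3         # g-factor measured for the reaction product
    ee = g_obs / g_pure    # fractional enantiomeric excess, since g_obs = ee * g_pure
    print(f"ee = {100 * ee:.1f}%")   # -> ee = 65.0%
    ```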

  4. Rapid analysis of scattering from periodic dielectric structures using accelerated Cartesian expansions.

    PubMed

    Baczewski, Andrew D; Miller, Nicholas C; Shanker, Balasubramaniam

    2012-04-01

    The analysis of fields in periodic dielectric structures arises in numerous applications of recent interest, ranging from photonic bandgap structures and plasmonically active nanostructures to metamaterials. To achieve an accurate representation of the fields in these structures using numerical methods, dense spatial discretization is required. This, in turn, affects the cost of analysis, particularly for integral-equation-based methods, for which traditional iterative methods require O(N²) operations, N being the number of spatial degrees of freedom. In this paper, we introduce a method for the rapid solution of volumetric electric field integral equations used in the analysis of doubly periodic dielectric structures. The crux of our method is the accelerated Cartesian expansion algorithm, which is used to evaluate the requisite potentials at O(N) cost. Results are provided that corroborate our claims of acceleration without compromising accuracy, as well as the application of our method to a number of compelling photonics applications.

  5. Minerals Intake Distributions in a Large Sample of Iranian at-Risk Population Using the National Cancer Institute Method: Do They Meet Their Requirements?

    PubMed

    Heidari, Zahra; Feizi, Awat; Azadbakht, Leila; Sarrafzadegan, Nizal

    2015-01-01

    Minerals are required for the body's normal function. The current study assessed the intake distribution of minerals and estimated the prevalence of inadequacy and excess among a representative sample of healthy middle aged and elderly Iranian people. In this cross-sectional study, the second follow up to the Isfahan Cohort Study (ICS), 1922 generally healthy people aged 40 and older were investigated. Dietary intakes were collected using 24 hour recalls and two or more consecutive food records. Distribution of minerals intake was estimated using traditional (averaging dietary intake days) and National Cancer Institute (NCI) methods, and the results obtained from the two methods, were compared. The prevalence of minerals intake inadequacy or excess was estimated using the estimated average requirement (EAR) cut-point method, the probability approach and the tolerable upper intake levels (UL). There were remarkable differences between values obtained using traditional and NCI methods, particularly in the lower and upper percentiles of the estimated intake distributions. A high prevalence of inadequacy of magnesium (50 - 100 %), calcium (21 - 93 %) and zinc (30 - 55 % for males > 50 years) was observed. Significant gender differences were found regarding inadequate intakes of calcium (21 - 76 % for males vs. 45 - 93 % for females), magnesium (92 % vs. 100 %), iron (0 vs. 15 % for age group 40 - 50 years) and zinc (29 - 55 % vs. 0 %) (all; p < 0.05). Severely imbalanced intakes of magnesium, calcium and zinc were observed among the middle-aged and elderly Iranian population. Nutritional interventions and population-based education to improve healthy diets among the studied population at risk are needed.
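
    The EAR cut-point calculation referred to above is simple enough to sketch: the prevalence of inadequacy is estimated as the share of the usual-intake distribution falling below the EAR, and excess as the share above the UL. The Python sketch below uses invented magnesium intakes and is not the study's NCI-method implementation.

    ```python
    import numpy as np

    def prevalence_inadequate_and_excess(usual_intakes, ear, ul=None):
        """EAR cut-point method: share of the usual-intake distribution below the
        EAR; optionally the share above the tolerable upper intake level (UL)."""
        usual_intakes = np.asarray(usual_intakes, dtype=float)
        below_ear = np.mean(usual_intakes < ear)
        above_ul = np.mean(usual_intakes > ul) if ul is not None else None
        return below_ear, above_ul

    # Hypothetical magnesium intakes (mg/day); the EAR value is illustrative only.
    intakes = np.random.lognormal(mean=np.log(230), sigma=0.35, size=1000)
    print(prevalence_inadequate_and_excess(intakes, ear=265))
    ```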

  6. Estimating local noise power spectrum from a few FBP-reconstructed CT scans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeng, Rongping, E-mail: rongping.zeng@fda.hhs.gov; Gavrielides, Marios A.; Petrick, Nicholas

    Purpose: Traditional ways to estimate the 2D CT noise power spectrum (NPS) involve an ensemble average of the power spectra of many noisy scans. When only a few scans are available, regions of interest are often extracted from different locations to obtain sufficient samples to estimate the NPS. Using image samples from different locations ignores the nonstationarity of CT noise and thus cannot accurately characterize its local properties. The purpose of this work is to develop a method to estimate the local NPS using only a few fan-beam CT scans. Methods: As a result of FBP reconstruction, the CT NPS has the same radial profile shape for all projection angles, with the magnitude varying with the noise level in the raw data measurement. This allows a 2D CT NPS to be factored into products of a 1D angular and a 1D radial function in polar coordinates. The polar separability of the CT NPS greatly reduces the data requirement for estimating the NPS. The authors use this property and derive a radial NPS estimation method: in brief, the radial profile shape is estimated from a traditional NPS based on image samples extracted at multiple locations. The amplitudes are estimated by fitting the traditional local NPS to the estimated radial profile shape. The estimated radial profile shape and amplitudes are then combined to form a final estimate of the local NPS. The accuracy of the radial NPS method was evaluated and compared to traditional NPS methods in terms of normalized mean squared error (NMSE) and signal detectability index. Results: For both simulated and real CT data sets, the local NPS estimated with no more than six scans using the radial NPS method was very close to the reference NPS, according to the metrics of NMSE and detectability index. Even with only two scans, the radial NPS method was able to achieve fairly good accuracy. Compared to those estimated using traditional NPS methods, the accuracy improvement was substantial when only a few scans were available. Conclusions: The radial NPS method was shown to be accurate and efficient in estimating the local NPS of FBP-reconstructed 2D CT images. It presents strong advantages over traditional NPS methods when the number of scans is limited and can be extended to estimate the in-plane NPS of cone-beam CT and multislice helical CT scans.
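
    The following Python sketch illustrates the two ingredients discussed above in simplified form: a traditional ensemble NPS estimate from noise-only ROIs, and the collapse of a 2D NPS onto a 1D radial profile (whose shape is assumed shared across locations for FBP noise). It is a schematic illustration, not the authors' implementation, and the normalization details are simplified.

    ```python
    import numpy as np

    def traditional_nps(rois, pixel_size=1.0):
        """Traditional local NPS: average |DFT|^2 of mean-subtracted noise ROIs."""
        rois = np.asarray(rois, dtype=float)              # shape (n_rois, N, N)
        _, N, _ = rois.shape
        spectra = [np.abs(np.fft.fft2(r - r.mean()))**2 for r in rois]
        return np.mean(spectra, axis=0) * pixel_size**2 / (N * N)

    def radial_profile(nps_2d, n_bins=32):
        """Collapse a 2D NPS onto a 1D radial profile by binning in spatial frequency."""
        N = nps_2d.shape[0]
        fy, fx = np.meshgrid(np.fft.fftfreq(N), np.fft.fftfreq(N), indexing="ij")
        r = np.sqrt(fx**2 + fy**2)
        bins = np.linspace(0, r.max(), n_bins + 1)
        idx = np.digitize(r.ravel(), bins) - 1
        prof = np.array([nps_2d.ravel()[idx == k].mean() if np.any(idx == k) else 0.0
                         for k in range(n_bins)])
        return 0.5 * (bins[:-1] + bins[1:]), prof

    # Hypothetical use: ROIs cut from repeated noise-only scans at one location.
    rois = np.random.randn(20, 64, 64)
    freqs, prof = radial_profile(traditional_nps(rois))
    ```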

  7. Improving gross anatomy learning using reciprocal peer teaching.

    PubMed

    Manyama, Mange; Stafford, Renae; Mazyala, Erick; Lukanima, Anthony; Magele, Ndulu; Kidenya, Benson R; Kimwaga, Emmanuel; Msuya, Sifael; Kauki, Julius

    2016-03-22

    The use of cadavers in human anatomy teaching requires an adequate number of anatomy instructors who can provide close supervision of the students. Most medical schools face the challenge of a lack of trained individuals to teach anatomy. Innovative techniques are therefore needed to impart adequate and relevant anatomical knowledge and skills. This study was conducted in order to evaluate the traditional teaching method and the reciprocal peer teaching (RPT) method during anatomy dissection. Debriefing surveys were administered to 227 first-year medical students regarding the merits, demerits and impact of both the RPT and traditional teaching experiences on students' preparedness prior to dissection, professionalism and communication skills. Of these, 159 (70 %) completed the survey on the traditional method while 148 (65.2 %) completed the survey on the RPT method. An observation tool for anatomy faculty was used to assess collaboration, professionalism and teaching skills among students. Students' scores on examinations taken before the introduction of RPT were compared with examination scores after the introduction of RPT. Our results show that the mean performance of students on objective examinations was significantly higher after the introduction of RPT compared to the performance before its introduction [63.7 ± 11.4 versus 58.6 ± 10, mean difference 5.1; 95 % CI = 4.0-6.3; p-value < 0.0001]. Students with low performance prior to RPT benefited more in terms of examination performance compared to those who had higher performance [mean difference 7.6; p-value < 0.0001]. Regarding students' opinions on the traditional method versus RPT, 83 % of students either agreed or strongly agreed that they were more likely to read the dissection manual before the RPT dissection session compared to 35 % for the traditional method. Over 85 % of respondents reported that RPT improved their confidence and ability to present information to peers and faculty compared to 38 % for the traditional method. The majority of faculty reported that the learning environment of the dissection groups was highly active during RPT sessions and that professionalism was observed by most students during discussions. Introduction of RPT in our anatomy dissection laboratory was generally beneficial to both students and faculty. Both objective (student performance) and subjective data indicate that RPT improved students' performance and had a positive impact on the learning experience. Our future plan is to continue the RPT practice and continually evaluate the RPT protocol.

  8. Systematizing the production of environmental plans: an Australian example

    NASA Astrophysics Data System (ADS)

    Davis, J. Richard

    1985-09-01

    Environmental planning legislation in New South Wales now requires local government authorities to draw up statutory plans that take into account, among other concerns, both the biophysical and the social environmental issues within their jurisdictions. The SIRO-PLAN method of plan production provides a systematic mechanism for fulfilling this requirement. This article describes the application of the method by planning researchers over 18 months to the production of a Local Environmental Plan for a rural local government in New South Wales. The policy formulation, the purposive data collection, and the deliberate adjustment of plans in order to recognize interest group requirements were all found to be valuable features of the method, while the translation of the ultimately chosen land-use plan into the explicit regulatory controls available to the local government authority was found to require further refinement. The capacity of SIRO-PLAN to quantify the resolution of competing environmental concerns in the final plan, although of value to planning researchers, proved too arcane for traditionally trained planners.

  9. Note: thermal imaging enhancement algorithm for gas turbine aerothermal characterization.

    PubMed

    Beer, S K; Lawson, S A

    2013-08-01

    An algorithm was developed to convert radiation intensity images acquired using a black and white CCD camera to thermal images without requiring knowledge of incident background radiation. This unique infrared (IR) thermography method was developed to determine aerothermal characteristics of advanced cooling concepts for gas turbine cooling application. Compared to IR imaging systems traditionally used for gas turbine temperature monitoring, the system developed for the current study is relatively inexpensive and does not require calibration with surface mounted thermocouples.

  10. Fast and Accurate Microplate Method (Biolog MT2) for Detection of Fusarium Fungicides Resistance/Sensitivity.

    PubMed

    Frąc, Magdalena; Gryta, Agata; Oszust, Karolina; Kotowicz, Natalia

    2016-01-01

    Finding fungicides effective against Fusarium is a key step in chemical plant protection and in selecting appropriate chemical agents. Existing, conventional methods for evaluating the resistance of Fusarium isolates to fungicides are costly, time-consuming and potentially environmentally harmful due to the large amounts of potentially toxic chemicals used. Therefore, the development of fast, accurate and effective methods for detecting Fusarium resistance to fungicides is urgently required. The MT2 microplate (Biolog(TM)) method is traditionally used for bacterial identification and for evaluating the ability of bacteria to utilize different carbon substrates. However, to the best of our knowledge, there are no reports on the use of this tool to determine the fungicide resistance of Fusarium isolates. For this reason, the objectives of this study were to develop a fast method for detecting Fusarium resistance to fungicides and to compare its effectiveness with the traditional hole-plate assay. In the presented study, an MT2 microplate-based assay was evaluated for potential use as an alternative resistance detection method. This was carried out using three commercially available fungicides containing the following active substances: triazoles (tebuconazole), benzimidazoles (carbendazim) and strobilurins (azoxystrobin), at six concentrations (0, 0.0005, 0.005, 0.05, 0.1, 0.2%), for nine selected Fusarium isolates. The particular concentrations of each fungicide were loaded into MT2 microplate wells, and the wells were inoculated with Fusarium mycelium suspended in PM4-IF inoculating fluid. Before inoculation, the suspension of each isolate was standardized to 75% transmittance. The traditional hole-plate method was used as a control assay, with fungicide concentrations of 0, 0.0005, 0.005, 0.05, 0.5, 1, 2, 5, 10, 25, and 50%. Strong relationships between the MT2 microplate and traditional hole-plate methods were observed regarding the detection of Fusarium resistance to the various fungicides and their concentrations. Tebuconazole was the most potent, inhibiting the growth of all tested isolates, whereas almost all tested isolates were resistant to the azoxystrobin-based fungicide. Overall, the MT2 microplate method was an effective and time-saving alternative for determining Fusarium resistance/sensitivity to fungicides compared with the traditional hole-plate approach.

  12. A New Method for Non-destructive Measurement of Biomass, Growth Rates, Vertical Biomass Distribution and Dry Matter Content Based on Digital Image Analysis

    PubMed Central

    Tackenberg, Oliver

    2007-01-01

    Background and Aims Biomass is an important trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive. Thus, they do not allow the development of individual plants to be followed and they require many individuals to be cultivated for repeated measurements. Non-destructive methods do not have these limitations. Here, a non-destructive method based on digital image analysis is presented, addressing not only above-ground fresh biomass (FBM) and oven-dried biomass (DBM), but also vertical biomass distribution as well as dry matter content (DMC) and growth rates. Methods Scaled digital images of the plants' silhouettes were taken for 582 individuals of 27 grass species (Poaceae). Above-ground biomass and DMC were measured using destructive methods. Using the image analysis software Zeiss KS 300, the projected area and the proportion of greenish pixels were calculated, and generalized linear models (GLMs) were developed with destructively measured parameters as dependent variables and parameters derived from image analysis as independent variables. A bootstrap analysis was performed to assess the number of individuals required for re-calibration of the models. Key Results The results of the developed models showed no systematic errors compared with traditionally measured values and explained most of their variance (R2 ≥ 0.85 for all models). The presented models can be directly applied to herbaceous grasses without further calibration. Applying the models to other growth forms might require a re-calibration which can be based on only 10–20 individuals for FBM or DMC and on 40–50 individuals for DBM. Conclusions The methods presented are time and cost effective compared with traditional methods, especially if development or growth rates are to be measured repeatedly. Hence, they offer an alternative way of determining biomass, especially as they are non-destructive and address not only FBM and DBM, but also vertical biomass distribution and DMC. PMID:17353204
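
    The regression step described above amounts to relating image-derived predictors (projected silhouette area and the proportion of greenish pixels) to destructively measured biomass. The sketch below illustrates this with an ordinary least-squares fit on made-up numbers; the original study used GLMs and the Zeiss KS 300 software, so treat this purely as an illustration of the calibration idea.

        # Illustrative only: relate fresh biomass to image-derived predictors.
        # All data values and the plain least-squares form are assumptions.
        import numpy as np

        area = np.array([120.0, 340.0, 55.0, 410.0, 260.0])   # projected area (cm^2)
        green = np.array([0.82, 0.75, 0.90, 0.70, 0.78])      # proportion of greenish pixels
        fbm = np.array([3.1, 8.9, 1.4, 10.2, 6.7])            # measured fresh biomass (g)

        X = np.column_stack([np.ones_like(area), area, green])  # design matrix with intercept
        coef, *_ = np.linalg.lstsq(X, fbm, rcond=None)

        predicted = X @ coef
        r2 = 1 - np.sum((fbm - predicted) ** 2) / np.sum((fbm - fbm.mean()) ** 2)
        print(coef, r2)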

  13. Adaptive training of cortical feature maps for a robot sensorimotor controller.

    PubMed

    Adams, Samantha V; Wennekers, Thomas; Denham, Sue; Culverhouse, Phil F

    2013-08-01

    This work investigates self-organising cortical feature maps (SOFMs) based upon the Kohonen Self-Organising Map (SOM) but implemented with spiking neural networks. In future work, the feature maps are intended as the basis for a sensorimotor controller for an autonomous humanoid robot. Traditional SOM methods require some modifications to be useful for autonomous robotic applications. Ideally the map training process should be self-regulating and not require predefined training files or the usual SOM parameter reduction schedules. It would also be desirable if the organised map had some flexibility to accommodate new information whilst preserving previously learnt patterns. Here methods are described which have been used to develop a cortical motor map training system which goes some way towards addressing these issues. The work is presented under the general term 'Adaptive Plasticity' and the main contribution is the development of a 'plasticity resource' (PR) which is modelled as a global parameter which expresses the rate of map development and is related directly to learning on the afferent (input) connections. The PR is used to control map training in place of a traditional learning rate parameter. In conjunction with the PR, random generation of inputs from a set of exemplar patterns is used rather than predefined datasets and enables maps to be trained without deciding in advance how much data is required. An added benefit of the PR is that, unlike a traditional learning rate, it can increase as well as decrease in response to the demands of the input and so allows the map to accommodate new information when the inputs are changed during training. Copyright © 2013 Elsevier Ltd. All rights reserved.
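
    One plausible reading of the 'plasticity resource' idea is sketched below as a toy rate-based Kohonen update in which a single global PR value replaces the fixed learning-rate schedule; the consumption/replenishment rule used here is an assumption for illustration only, not the authors' spiking-network implementation.

        # Toy SOM step where a global "plasticity resource" (PR) scales learning
        # instead of a scheduled learning rate. The PR update rule is assumed.
        import numpy as np

        rng = np.random.default_rng(0)
        weights = rng.random((10, 10, 2))   # 10x10 map of 2-D afferent weight vectors
        pr = 1.0                            # global plasticity resource

        def train_step(weights, x, pr, sigma=1.5):
            dists = np.linalg.norm(weights - x, axis=2)          # unit-to-input distances
            bi, bj = np.unravel_index(np.argmin(dists), dists.shape)
            ii, jj = np.indices(dists.shape)
            h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
            delta = pr * h[..., None] * (x - weights)            # PR replaces the learning rate
            weights += delta                                     # in-place afferent update
            # PR is consumed by learning, replenished when inputs are poorly matched
            return float(np.clip(pr - 0.5 * np.abs(delta).mean()
                                  + 0.05 * dists[bi, bj], 0.0, 1.0))

        for _ in range(500):
            pr = train_step(weights, rng.random(2), pr)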

  14. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    NASA Technical Reports Server (NTRS)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.

  15. An in vitro protocol for direct isolation of potential probiotic lactobacilli from raw bovine milk and traditional fermented milks.

    PubMed

    Baruzzi, Federico; Poltronieri, Palmiro; Quero, Grazia Marina; Morea, Maria; Morelli, Lorenzo

    2011-04-01

    A method for isolating potential probiotic lactobacilli directly from traditional milk-based foods was developed. The novel digestion/enrichment protocol was set up taking care to minimize the protective effect of milk proteins and fats, and was validated by testing three commercial fermented milks containing well-known probiotic Lactobacillus strains. Only the probiotic bacteria claimed on the label were isolated from two out of three commercial fermented milks. The application of the new protocol to 15 raw milk samples and 6 traditional fermented milk samples made it feasible to isolate 11 potential probiotic Lactobacillus strains belonging to the Lactobacillus brevis, Lactobacillus fermentum, Lactobacillus gasseri, Lactobacillus johnsonii, Lactobacillus plantarum, Lactobacillus reuteri, and Lactobacillus vaginalis species. Even though further analyses are needed to ascertain the functional properties of these lactobacilli, the novel protocol makes it feasible to quickly isolate potential probiotic strains from traditional milk-based foods, reducing the time required by traditional procedures, which in addition do not allow the isolation of microorganisms occurring as sub-dominant populations.

  16. Effects of New Funding Models for Patient-Centered Medical Homes on Primary Care Practice Finances and Services: Results of a Microsimulation Model

    PubMed Central

    Basu, Sanjay; Phillips, Russell S.; Song, Zirui; Landon, Bruce E.; Bitton, Asaf

    2016-01-01

    PURPOSE We assess the financial implications for primary care practices of participating in patient-centered medical home (PCMH) funding initiatives. METHODS We estimated practices’ changes in net revenue under 3 PCMH funding initiatives: increased fee-for-service (FFS) payments, traditional FFS with additional per-member-per-month (PMPM) payments, or traditional FFS with PMPM and pay-for-performance (P4P) payments. Net revenue estimates were based on a validated microsimulation model utilizing national practice surveys. Simulated practices reflecting the national range of practice size, location, and patient population were examined under several potential changes in clinical services: investments in patient tracking, communications, and quality improvement; increased support staff; altered visit templates to accommodate longer visits, telephone visits or electronic visits; and extended service delivery hours. RESULTS Under the status quo of traditional FFS payments, clinics operate near their maximum estimated possible net revenue levels, suggesting they respond strongly to existing financial incentives. Practices gained substantial additional net annual revenue per full-time physician under PMPM or PMPM plus P4P payments ($113,300 per year, 95% CI, $28,500 to $198,200) but not under increased FFS payments (−$53,500, 95% CI, −$69,700 to −$37,200), after accounting for costs of meeting PCMH funding requirements. Expanding services beyond minimum required levels decreased net revenue, because traditional FFS revenues decreased. CONCLUSIONS PCMH funding through PMPM payments could substantially improve practice finances but will not offer sufficient financial incentives to expand services beyond minimum requirements for PCMH funding. PMID:27621156

  17. The Impact of Social Media on Dissemination and Implementation of Clinical Practice Guidelines: A Longitudinal Observational Study.

    PubMed

    Narayanaswami, Pushpa; Gronseth, Gary; Dubinsky, Richard; Penfold-Murray, Rebecca; Cox, Julie; Bever, Christopher; Martins, Yolanda; Rheaume, Carol; Shouse, Denise; Getchius, Thomas S D

    2015-08-13

    Evidence-based clinical practice guidelines (CPGs) are statements that provide recommendations to optimize patient care for a specific clinical problem or question. Merely reading a guideline rarely leads to implementation of recommendations. The American Academy of Neurology (AAN) has a formal process of guideline development and dissemination. The last few years have seen a burgeoning of social media such as Facebook, Twitter, and LinkedIn, and newer methods of dissemination such as podcasts and webinars. The role of these media in guideline dissemination has not been studied. Systematic evaluation of dissemination methods and comparison of the effectiveness of newer methods with traditional methods is not available. It is also not known whether specific dissemination methods may be more effectively targeted to specific audiences. Our aim was to (1) develop an innovative dissemination strategy by adding social media-based dissemination methods to traditional methods for the AAN clinical practice guidelines "Complementary and alternative medicine in multiple sclerosis" ("CAM in MS") and (2) evaluate whether the addition of social media outreach improves awareness of the CPG and knowledge of CPG recommendations, and affects implementation of those recommendations. Outcomes were measured by four surveys in each of the two target populations: patients and physicians/clinicians ("physicians"). The primary outcome was the difference in participants' intent to discuss use of complementary and alternative medicine (CAM) with their physicians or patients, respectively, after novel dissemination, as compared with that after traditional dissemination. Secondary outcomes were changes in awareness of the CPG, knowledge of CPG content, and behavior regarding CAM use in multiple sclerosis (MS). Response rates were 25.08% (622/2480) for physicians and 43.5% (348/800) for patients. Awareness of the CPG increased after traditional dissemination (absolute difference, 95% confidence interval: physicians 36%, 95% CI 25-46, and patients 10%, 95% CI 1-11) but did not increase further after novel dissemination (physicians 0%, 95% CI -11 to 11, and patients -4%, 95% CI -6 to 14). Intent to discuss CAM also increased after traditional dissemination but did not change after novel dissemination (traditional: physicians 12%, 95% CI 2-22, and patients 19%, 95% CI 3-33; novel: physicians 11%, 95% CI -1 to -21, and patients -8%, 95% CI -22 to 8). Knowledge of CPG recommendations and behavior regarding CAM use in MS did not change after either traditional dissemination or novel dissemination. Social media-based dissemination methods did not confer additional benefit over print-, email-, and Internet-based methods in increasing CPG awareness and changing intent in physicians or patients. Research on audience selection, message formatting, and message delivery is required to utilize Web 2.0 technologies optimally for dissemination.

  18. Survey on multisensory feedback virtual reality dental training systems.

    PubMed

    Wang, D; Li, T; Zhang, Y; Hou, J

    2016-11-01

    Compared with traditional dental training methods, virtual reality training systems integrated with multisensory feedback possess potential advantages. However, there are many technical challenges in developing a satisfactory simulator. In this manuscript, we systematically survey several current dental training systems to identify the gaps between the capabilities of these systems and the clinical training requirements. After briefly summarising the components, functions and unique features of each system, we discuss the technical challenges behind these systems including the software, hardware and user evaluation methods. Finally, the clinical requirements of an ideal dental training system are proposed. Future research/development areas are identified based on an analysis of the gaps between current systems and clinical training requirements. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. A multiple distributed representation method based on neural network for biomedical event extraction.

    PubMed

    Wang, Anran; Wang, Jian; Lin, Hongfei; Zhang, Jianhai; Yang, Zhihao; Xu, Kan

    2017-12-20

    Biomedical event extraction is one of the frontier domains in biomedical research. The two main subtasks of biomedical event extraction are trigger identification and argument detection, both of which can be considered classification problems. However, traditional state-of-the-art methods are based on support vector machines (SVM) with massive manually designed one-hot features, which require enormous manual effort and fail to capture semantic relations among words. In this paper, we propose a multiple distributed representation method for biomedical event extraction. The method combines context features, built from dependency-based word embeddings, with task-based features represented in a distributed way, and uses them as the input for training deep learning models. Finally, a softmax classifier is used to label the candidate examples. The experimental results on the Multi-Level Event Extraction (MLEE) corpus show higher F-scores of 77.97% in trigger identification and 58.31% overall compared to the state-of-the-art SVM method. Our distributed representation method for biomedical event extraction avoids the semantic gap and dimensionality problems of traditional one-hot representation methods. The promising results demonstrate that the proposed method is effective for biomedical event extraction.

  20. Evaluation Measures and Methods: Some Intersections.

    ERIC Educational Resources Information Center

    Elliott, John

    The literature is reviewed for four combinations of evaluation measures and methods: traditional methods with traditional measures (T-Meth/T-Mea), nontraditional methods with traditional measures (N-Meth/T-Mea), traditional methods with nontraditional measures (T-Meth/N-Mea), and nontraditional methods with nontraditional measures (N-Meth/N-Mea).…

  1. Predicting Critical Power in Elite Cyclists: Questioning the Validity of the 3-Minute All-Out Test.

    PubMed

    Bartram, Jason C; Thewlis, Dominic; Martin, David T; Norton, Kevin I

    2017-07-01

    New applications of the critical-power concept, such as the modeling of intermittent-work capabilities, are exciting prospects for elite cycling. However, accurate calculation of the required parameters is traditionally time intensive and somewhat impractical. An alternative single-test protocol (3-min all-out) has recently been proposed, but validation in an elite population is lacking. The traditional approach for parameter establishment, but with fewer tests, could also prove an acceptable compromise. Six senior Australian endurance track-cycling representatives completed 6 efforts to exhaustion on 2 separate days over a 3-wk period. These included 1-, 4-, 6-, 8-, and 10-min self-paced efforts, plus the 3-min all-out protocol. Traditional work-vs-time calculations of CP and anaerobic energy contribution (W') using the 5 self-paced efforts were compared with calculations from the 3-min all-out protocol. The impact of using just 2 or 3 self-paced efforts for traditional CP and W' estimation was also explored using thresholds of agreement (8 W, 2.0 kJ, respectively). CP estimated from the 3-min all-out approach was significantly higher than from the traditional approach (402 ± 33, 351 ± 27 W, P < .001), while W' was lower (15.5 ± 3.0, 24.3 ± 4.0 kJ, P = .02). Five different combinations of 2 or 3 self-paced efforts led to CP estimates within the threshold of agreement, with only 1 combination deemed accurate for W'. In elite cyclists the 3-min all-out approach is not suitable to estimate CP when compared with the traditional method. However, reducing the number of tests used in the traditional method lessens testing burden while maintaining appropriate parameter accuracy.
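
    The traditional work-vs-time calculation referred to above reduces to a straight-line fit: the total work done in each exhaustive effort is regressed on effort duration, the slope estimating CP (in W) and the intercept W' (in J). A minimal sketch with invented numbers (not values from the study):

        # Work-time model: W_total = W' + CP * t. Data are invented.
        import numpy as np

        t = np.array([60.0, 240.0, 360.0, 480.0, 600.0])        # effort durations (s)
        power = np.array([650.0, 430.0, 405.0, 392.0, 385.0])   # mean power per effort (W)
        work = power * t                                        # total work (J)

        cp, w_prime = np.polyfit(t, work, 1)                    # slope = CP, intercept = W'
        print(f"CP = {cp:.0f} W, W' = {w_prime / 1000:.1f} kJ")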

  2. Computerized assessment of placental calcification post-ultrasound: a novel software tool.

    PubMed

    Moran, M; Higgins, M; Zombori, G; Ryan, J; McAuliffe, F M

    2013-05-01

    Placental calcification is associated with an increased risk of perinatal morbidity and mortality. The subjectivity of current ultrasound methods of assessment of placental calcification indicates that a more objective method is required. The aim of this study was to correlate the percentage of calcification defined by the clinician using a new software tool for calculating the extent of placental calcification with traditional ultrasound methods and with pregnancy outcome. Ninety placental images were individually assessed. An upper threshold was defined, based on high intensity, to quantify calcification within the placenta. Output metrics were then produced including the overall percentage of calcification with respect to the total number of pixels within the region of interest. The results were correlated with traditional ultrasound methods of assessment of placental calcification and with pregnancy outcome. The results demonstrate a significant correlation between placental calcification, as defined using the software, and traditional methods of Grannum grading of placental calcification. Whilst correlation with perinatal outcome and cord pH was not significant as a result of small numbers, patients with placental calcification assessed using the computerized software at the upper quartile had higher rates of poor perinatal outcome when compared with those at the lower quartile (8/22 (36%) vs 3/23 (13%); P = 0.069). These results suggest that this computerized software tool has the potential to become an alternative method of assessing placental calcification. Copyright © 2012 ISUOG. Published by John Wiley & Sons Ltd.
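
    The output metric described (the percentage of high-intensity pixels within the region of interest) is straightforward to reproduce; the sketch below uses an assumed threshold and stand-in arrays rather than the actual software or ultrasound data.

        # Percentage of high-intensity ("calcified") pixels inside an ROI.
        # Image, ROI and threshold are stand-ins for illustration.
        import numpy as np

        image = np.random.default_rng(1).integers(0, 256, size=(200, 200))
        roi_mask = np.zeros_like(image, dtype=bool)
        roi_mask[40:160, 30:170] = True          # hypothetical placental region

        threshold = 200                          # assumed upper-intensity cut-off
        calcified = (image >= threshold) & roi_mask
        percent = 100.0 * calcified.sum() / roi_mask.sum()
        print(f"{percent:.1f}% of ROI pixels above threshold")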

  3. Tuberculosis patients’ knowledge and beliefs about tuberculosis: a mixed methods study from the Pacific Island nation of Vanuatu

    PubMed Central

    2014-01-01

    Background The setting for this study was the Pacific island nation of Vanuatu, an archipelago of 82 islands, located in the South Pacific Ocean. Our objective was to assess the knowledge, attitudes and practices of tuberculosis (TB) patients towards TB. Methods This was a descriptive study using qualitative and quantitative methods. Quantitative analysis was based on the responses provided to closed questions, and we present frequencies to describe the TB patients’ knowledge, attitudes and practice relating to TB. Qualitative analysis was based on open questions permitting fuller explanations. We used thematic analysis and developed a posteriori inductive categories to draw conclusions. Results Thirty five TB patients were interviewed; 22 (63%) were male. They attributed TB to cigarettes, kava, alcohol, contaminated food, sharing eating utensils and “kastom” (the local term for the traditional way of life, but also for sorcery). Most (94%) did not attribute TB to a bacterial cause. However, almost all TB patients (89%) thought that TB was best treated at a hospital with antibiotics. Three quarters (74%) experienced stigma after their TB diagnosis. Seeking health care from a traditional healer was common; 54% of TB patients stated that they would first consult a traditional healer for any illness. When seeking a diagnosis for signs and symptoms of TB, 34% first consulted a traditional healer. Patients cited cost, distance and beliefs about TB causation as reasons for first consulting a traditional healer or going to the hospital. Of the TB patients who consulted a traditional healer first, there was an average of two weeks delay before they consulted the health service. In some cases, however, the delay was up to six years. Conclusion The majority of the TB patients interviewed did not attribute TB to a bacterial cause. Consulting a traditional healer for health care, including while seeking a diagnosis for TB symptoms, was common and may have delayed diagnosis. People require better information about TB to correct commonly held misperceptions about the disease. Traditional healers could also be engaged with the national TB programme, in order to refer people with signs and symptoms of TB to the nearest health service. PMID:24885057

  4. Analysis of munitions constituents in groundwater using a field-portable GC-MS.

    PubMed

    Bednar, A J; Russell, A L; Hayes, C A; Jones, W T; Tackett, P; Splichal, D E; Georgian, T; Parker, L V; Kirgan, R A; MacMillan, D K

    2012-05-01

    The use of munitions constituents (MCs) at military installations can produce soil and groundwater contamination that requires periodic monitoring even after training or manufacturing activities have ceased. Traditional groundwater monitoring methods require large volumes of aqueous samples (e.g., 2-4 L) to be shipped under chain of custody, to fixed laboratories for analysis. The samples must also be packed on ice and shielded from light to minimize degradation that may occur during transport and storage. The laboratory's turn-around time for sample analysis and reporting can be as long as 45 d. This process hinders the reporting of data to customers in a timely manner; yields data that are not necessarily representative of current site conditions owing to the lag time between sample collection and reporting; and incurs significant shipping costs for samples. The current work compares a field portable Gas Chromatograph-Mass Spectrometer (GC-MS) for analysis of MCs on-site with traditional laboratory-based analysis using High Performance Liquid Chromatography with UV absorption detection. The field method provides near real-time (within ~1 h of sampling) concentrations of MCs in groundwater samples. Mass spectrometry provides reliable confirmation of MCs and a means to identify unknown compounds that are potential false positives for methods with UV and other non-selective detectors. Published by Elsevier Ltd.

  5. Semi-automatic surface sediment sampling system - A prototype to be implemented in bivalve fishing surveys

    NASA Astrophysics Data System (ADS)

    Rufino, Marta M.; Baptista, Paulo; Pereira, Fábio; Gaspar, Miguel B.

    2018-01-01

    In the current work we propose a new method to sample surface sediment during bivalve fishing surveys. Fishing institutes all around the world carry out regular surveys with the aim of monitoring the stocks of commercial species. These surveys often comprise more than one hundred sampling stations and cover large geographical areas. Although superficial sediment grain sizes are among the main drivers of benthic communities and provide crucial information for studies on coastal dynamics, overall there is a strong lack of this type of data, possibly because traditional surface sediment sampling methods use grabs, which require considerable time and effort to deploy on a regular basis or over large areas. In view of these aspects, we developed an easy and inexpensive method to sample superficial sediments during bivalve fisheries monitoring surveys, without increasing survey time or human resources. The method was successfully evaluated and validated during a typical bivalve survey carried out on the Northwest coast of Portugal, confirming that it did not interfere with the survey objectives. Furthermore, the method was validated by collecting samples using a traditional Van Veen grab (traditional method), which showed a grain size composition similar to that of the samples collected by the new method at the same localities. We recommend that the procedure be implemented on regular bivalve fishing surveys, together with an image analysis system to analyse the collected samples. The new method will provide a substantial quantity of data on surface sediments in coastal areas in an inexpensive and efficient manner, with high potential for application in different fields of research.

  6. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  7. Teaching Vocabulary to Preschoolers with Disabilities Using Adult-Child Shared Bookreading: A Comparison of Traditional and Electronic Books

    ERIC Educational Resources Information Center

    Rhodehouse, Sara Bernice

    2013-01-01

    This study sought to validate adult-child shared storybook reading as a method for teaching target vocabulary words to preschool children with disabilities. The Vocabulary Learning through Books (VLTB) instructional procedure incorporates, adult-child book reading, questioning during reading requiring the child to answer with a target word, and…

  8. Feature Selection on Hyperspectral Data for Dismount Skin Analysis

    DTIC Science & Technology

    2014-03-27

    2.4.1 Melanosome Estimation … 2.4.2 Facial Recognition using … require compliant interaction in order to establish their identification. Previously, traditional facial recognition systems have been enhanced by HSI by … calculated as a fundamental method to differentiate between people [38]. In addition, the area of facial recognition has benefited from the rich spectral…

  9. Children as Innovators in Action--A Study of Microcontrollers in Finnish Comprehensive Schools

    ERIC Educational Resources Information Center

    Järvinen, Esa-Matti; Karsikas, Arto; Hintikka, Jouni

    2007-01-01

    In authoritative teaching methods, whereby the teacher controls the social interaction and other classroom activities, the actions of many children are often in response to what they perceive to be the teacher's expectations and the requirements of traditional school evaluation practices, such as examinations and tests. In this kind of school…

  10. Methods of Interoperability: Moodle and WeBWork

    ERIC Educational Resources Information Center

    Gage, Michael E.

    2017-01-01

    The first requirement for an online mathematics homework engine is to encourage students to practice and reinforce their mathematics skills in ways that are as good or better than traditional paper homework. The use of the computer and the internet should not limit the kind or quality of the mathematics that we teach and, if possible, it should…

  11. Pro-B selection method for uneven-aged management of longleaf pine forests

    Treesearch

    Dale G. Brockway; Edward F. Loewenstein; Kenneth W. Outcalt

    2015-01-01

    Interest in uneven-aged silviculture has increased since advent of ecosystem management programs, which place greater emphasis on ecological values and ecosystem services while also harvesting timber from the forest. However, traditional uneven-aged approaches (e.g., BDq) are often criticized as too complex, costly, and requiring highly-trained staff. The Proportional-...

  12. Virtual Project Management: Examining the Roles and Functions of Online Instructors in Creating Learning Applications with Value

    ERIC Educational Resources Information Center

    Barrett, Bob

    2012-01-01

    While many students and instructors are transitioning from brick-and-mortar classrooms to virtual classrooms, labs, and simulations, this requires a higher level of expertise, control, and perseverance by the instructor. Traditional methods of teaching, leading, managing, and organizing learning activities have changed in terms of the virtual…

  13. Participation and Collaborative Learning in Large Class Sizes: Wiki, Can You Help Me?

    ERIC Educational Resources Information Center

    de Arriba, Raúl

    2017-01-01

    Collaborative learning has a long tradition within higher education. However, its application in classes with a large number of students is complicated, since it is a teaching method that requires a high level of participation from the students and careful monitoring of the process by the educator. This article presents an experience in…

  14. Building Up the Other Side of Sesame Street. Organizing and Administering Delivery of Off Campus Continuing Professional Education.

    ERIC Educational Resources Information Center

    Sarlos, Beatrice E.

    Continuing professional education (CPE), defined as educational services offered to professionals (those who possess initial degrees required for practice) without the restrictions of traditional scheduling, credits, tuition, or instruction methods, is discussed. The importance of a uniform terminology to distinguish the specific area of CPE is…

  15. Utilizing Response Time Distributions for Item Selection in CAT

    ERIC Educational Resources Information Center

    Fan, Zhewen; Wang, Chun; Chang, Hua-Hua; Douglas, Jeffrey

    2012-01-01

    Traditional methods for item selection in computerized adaptive testing only focus on item information without taking into consideration the time required to answer an item. As a result, some examinees may receive a set of items that take a very long time to finish, and information is not accrued as efficiently as possible. The authors propose two…

  16. To Scale or Not to Scale: A Return on Investment Model for Evaluating Developmental Education Strategy

    ERIC Educational Resources Information Center

    McCormack, J. Brad

    2012-01-01

    As more and more students look to community colleges for their pursuit of higher education, the number of academically unprepared entrants continues to increase disproportionately. With over 41 percent of all entering community college students requiring some form of remediation (Bettinger & Long, 2009) and traditional methods of instruction…

  17. The Typicality Ranking Task: A New Method to Derive Typicality Judgments from Children.

    PubMed

    Djalal, Farah Mutiasari; Ameel, Eef; Storms, Gert

    2016-01-01

    An alternative method for deriving typicality judgments, applicable in young children that are not familiar with numerical values yet, is introduced, allowing researchers to study gradedness at younger ages in concept development. Contrary to the long tradition of using rating-based procedures to derive typicality judgments, we propose a method that is based on typicality ranking rather than rating, in which items are gradually sorted according to their typicality, and that requires a minimum of linguistic knowledge. The validity of the method is investigated and the method is compared to the traditional typicality rating measurement in a large empirical study with eight different semantic concepts. The results show that the typicality ranking task can be used to assess children's category knowledge and to evaluate how this knowledge evolves over time. Contrary to earlier held assumptions in studies on typicality in young children, our results also show that preference is not so much a confounding variable to be avoided, but that both variables are often significantly correlated in older children and even in adults.

  18. The Typicality Ranking Task: A New Method to Derive Typicality Judgments from Children

    PubMed Central

    Ameel, Eef; Storms, Gert

    2016-01-01

    An alternative method for deriving typicality judgments, applicable in young children that are not familiar with numerical values yet, is introduced, allowing researchers to study gradedness at younger ages in concept development. Contrary to the long tradition of using rating-based procedures to derive typicality judgments, we propose a method that is based on typicality ranking rather than rating, in which items are gradually sorted according to their typicality, and that requires a minimum of linguistic knowledge. The validity of the method is investigated and the method is compared to the traditional typicality rating measurement in a large empirical study with eight different semantic concepts. The results show that the typicality ranking task can be used to assess children’s category knowledge and to evaluate how this knowledge evolves over time. Contrary to earlier held assumptions in studies on typicality in young children, our results also show that preference is not so much a confounding variable to be avoided, but that both variables are often significantly correlated in older children and even in adults. PMID:27322371

  19. Flow cytometry as a method for the evaluation of raw material, product and process in the dairy industry.

    PubMed

    Ruszczyńska, A; Szteyn, J; Wiszniewska-Laszczych, A

    2007-01-01

    Producing dairy products which are safe for consumers requires the constant monitoring of the microbiological quality of the raw material, the production process itself and the end product. Traditional methods, still the "gold standard", require a specialized laboratory working with recognized and validated methods. Obtaining results is time- and labor-consuming and does not allow rapid evaluation. Hence, there is a need for a rapid, precise method enabling the real-time monitoring of microbiological quality, and flow cytometry serves this function well. It is based on labeling cells suspended in a solution with fluorescent dyes and pumping them into a measurement zone where they are exposed to a precisely focused laser beam. This paper is aimed at presenting the possibilities of applying flow cytometry in the dairy industry.

  20. Algebraic method for parameter identification of circuit models for batteries under non-zero initial condition

    NASA Astrophysics Data System (ADS)

    Devarakonda, Lalitha; Hu, Tingshu

    2014-12-01

    This paper presents an algebraic method for parameter identification of Thevenin's equivalent circuit models for batteries under non-zero initial condition. In traditional methods, it was assumed that all capacitor voltages have zero initial conditions at the beginning of each charging/discharging test. This would require a long rest time between two tests, leading to very lengthy tests for a charging/discharging cycle. In this paper, we propose an algebraic method which can extract the circuit parameters together with initial conditions. This would theoretically reduce the rest time to 0 and substantially accelerate the testing cycles.
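
    As a simplified stand-in for the idea, a single-RC Thevenin branch response can be fitted with the capacitor's initial voltage left as a free parameter, so the test need not start from a fully rested (zero-initial-condition) state. The authors derive the parameters algebraically; the nonlinear least-squares fit below, on synthetic data, only illustrates the non-zero-initial-condition point.

        # Single-RC branch under constant discharge current I:
        #   v_c(t) = I*R1 + (v_c0 - I*R1) * exp(-t / (R1*C1))
        # v_c0 is a free parameter, so no long rest period is assumed.
        import numpy as np
        from scipy.optimize import curve_fit

        I = 2.0                                  # discharge current (A), assumed known
        t = np.linspace(0, 600, 121)             # s

        def vc(t, R1, C1, v_c0):
            return I * R1 + (v_c0 - I * R1) * np.exp(-t / (R1 * C1))

        v_true = vc(t, 0.05, 1800.0, 0.04)       # synthetic "measurements"
        v_meas = v_true + np.random.default_rng(2).normal(0, 1e-4, t.size)

        (R1, C1, v_c0), _ = curve_fit(vc, t, v_meas, p0=(0.1, 1000.0, 0.0))
        print(R1, C1, v_c0)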

  1. Quantitation of sugar content in pyrolysis liquids after acid hydrolysis using high-performance liquid chromatography without neutralization.

    PubMed

    Johnston, Patrick A; Brown, Robert C

    2014-08-13

    A rapid method for the quantitation of total sugars in pyrolysis liquids using high-performance liquid chromatography (HPLC) was developed. The method avoids the tedious and time-consuming sample preparation required by current analytical methods. It is possible to directly analyze hydrolyzed pyrolysis liquids, bypassing the neutralization step usually required in determination of total sugars. A comparison with traditional methods was used to determine the validity of the results. The calibration curve coefficient of determination on all standard compounds was >0.999 using a refractive index detector. The relative standard deviation for the new method was 1.13%. The spiked sugar recoveries on the pyrolysis liquid samples were between 104 and 105%. The research demonstrates that it is possible to obtain excellent accuracy and efficiency using HPLC to quantitate glucose after acid hydrolysis of polymeric and oligomeric sugars found in fast pyrolysis bio-oils without neutralization.

  2. Comparison of DIGE and post-stained gel electrophoresis with both traditional and SameSpots analysis for quantitative proteomics.

    PubMed

    Karp, Natasha A; Feret, Renata; Rubtsov, Denis V; Lilley, Kathryn S

    2008-03-01

    2-DE is an important tool in quantitative proteomics. Here, we compare the deep purple (DP) system with DIGE using both a traditional and the SameSpots approach to gel analysis. Missing values in the traditional approach were found to be a significant issue for both systems. SameSpots attempts to address the missing value problem. SameSpots was found to increase the proportion of low volume data for DP but not for DIGE. For all the analysis methods applied in this study, the assumptions of parametric tests were met. Analysis of the same images gave significantly lower noise with SameSpots (over traditional) for DP, but no difference for DIGE. We propose that SameSpots gave lower noise with DP due to the stabilisation of the spot area by the common spot outline, but this was not seen with DIGE due to the co-detection process which stabilises the area selected. For studies where measurement of small abundance changes is required, a cost-benefit analysis highlights that DIGE was significantly cheaper regardless of the analysis methods. For studies analysing large changes, DP with SameSpots could be an effective alternative to DIGE but this will be dependent on the biological noise of the system under investigation.

  3. The guinea pig as an animal model for developmental and reproductive toxicology studies.

    PubMed

    Rocca, Meredith S; Wehner, Nancy G

    2009-04-01

    Regulatory guidelines for developmental and reproductive toxicology (DART) studies require selection of "relevant" animal models as determined by kinetic, pharmacological, and toxicological data. Traditionally, rats, mice, and rabbits are the preferred animal models for these studies. However, for test articles that are pharmacologically inactive in the traditional animal models, the guinea pig may be a viable option. This choice should not be made lightly, as guinea pigs have many disadvantages compared to the traditional species, including limited historical control data, variability in pregnancy rates, small and variable litter size, long gestation, relative maturity at birth, and difficulty in dosing and breeding. This report describes methods for using guinea pigs in DART studies and provides results of positive and negative controls. Standard study designs and animal husbandry methods were modified to allow mating on the postpartum estrus in fertility studies and were used for producing cohorts of pregnant females for developmental studies. A positive control study with the pregnancy-disrupting agent mifepristone resulted in the anticipated failure of embryo implantation and supported the use of the guinea pig model. Control data for reproductive endpoints collected from 5 studies are presented. In cases where the traditional animal models are not relevant, the guinea pig can be used successfully for DART studies. (c) 2009 Wiley-Liss, Inc.

  4. Encouraging physician appropriate prescribing of non-steroidal anti-inflammatory therapies: protocol of a randomized controlled trial [ISRCTN43532635

    PubMed Central

    Doupe, Malcolm; Katz, Alan; Kvern, Brent; Manness, Lori-Jean; Metge, Colleen; Thomson, Glen TD; Morrison, Laura; Rother, Kat

    2004-01-01

    Background Traditional non-steroidal anti-inflammatory drugs (NSAIDs) are a widely used class of therapy in the treatment of chronic pain and inflammation. The drugs are effective and can be relatively inexpensive thanks to available generic versions. Unfortunately the traditional NSAIDs are associated with gastrointestinal complications in a small proportion of patients, requiring costly co-therapy with gastro-protective agents. Recently, a new class of non-steroidal anti-inflammatory agents known as coxibs has become available, fashioned to be safer than the traditional NSAIDs but priced considerably higher than the traditional generics. To help physicians choose appropriately and cost-effectively from the expanded number of anti-inflammatory therapies, scientific bodies have issued clinical practice guidelines and third party payers have published restricted reimbursement policies. The objective of this study is to determine whether an educational intervention can prompt physicians to adjust their prescribing in accordance with these expert recommendations. Methods This is an ongoing, randomized controlled trial. All primary care physicians in Manitoba, Canada have been randomly assigned to a control group or an intervention study group. The educational intervention being evaluated consists of an audit and feedback mechanism combined with optional participation in a Continuing Medical Education interactive workshop. The primary outcome of the study is the change, from pre-to post-intervention, in physicians' appropriate prescribing of non-steroidal anti-inflammatory therapies for patients requiring chronic treatment. Three classes of non-steroidal anti-inflammatory therapies have been identified: coxib therapy, traditional NSAID monotherapy, and traditional NSAID therapy combined with gastro-protective agents. Appropriate prescribing is defined based on international clinical practice guidelines and the provincial drug reimbursement policy in Manitoba. PMID:15327694

  5. Comparison of ultrasonic shears and traditional suture ligature for vaginal hysterectomy: randomized controlled trial.

    PubMed

    Fitz-Gerald, Alison Louise; Tan, Jason; Chan, Kok-Weng; Polyakov, Alex; Edwards, Geoff N; Najjar, Haider; Tsaltas, Jim; Vollenhoven, Beverley

    2013-01-01

    To compare operating time, intraoperative blood loss, postoperative analgesia, and length of hospital stay using ultrasonic shears vs traditional suture ligature in vaginal hysterectomy. Randomized controlled trial (Canadian Task Force classification I). Gynecology units within a single health network, university hospital. Forty women requiring vaginal hysterectomy because of benign disease. Vaginal hysterectomy performed using either ultrasonically activated shears (USS) or traditional suture ligatures. Twenty-one patients were randomized to the USS arm, and 19 patients to the traditional suture ligature arm. Patient characteristics were comparable. Mean (SD) hysterectomy time was similar in both the USS and traditional arms, 28.66 (4.0) minutes vs 32.37 (3.18) minutes (p = .47), as was total operating time, 97.38 (8.9) minutes vs 91.63 (7.69) minutes (p = .63). Operative blood loss was significantly decreased in the USS group: 62.63 (12.46) mL vs 136.05 (21.54) mL (p = .006). There was, however, no significant difference in the change in hemoglobin concentration between the 2 groups: 19.53 (1.79) g/L vs -16.72 (2.5) g/L. There was no significant difference in mean oxycodone use: 9.29 (2.66) mg vs 8.06 (3.19) mg (p = .77). Length of hospital stay was similar in both groups: 58.98 (3.27) hours vs 60.05 (6.48) hours (p = .88). There was no significant difference in overall complication rates between the groups. Although the Harmonic scalpel system, compared with the traditional suture ligation method, seems to be a safe alternative for securing the pedicles in vaginal hysterectomy, it offers no benefit in terms of operative time, reduction in clinically significant blood loss, or analgesic requirements. Copyright © 2013 AAGL. Published by Elsevier Inc. All rights reserved.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, P.

    The majority of general-purpose low-temperature handheld radiation thermometers are severely affected by the size-of-source effect (SSE). Calibration of these instruments is pointless unless the SSE is accounted for in the calibration process. Traditional SSE measurement techniques, however, are costly and time consuming, and because the instruments are direct-reading in temperature, traditional SSE results are not easily interpretable, particularly by the general user. This paper describes a simplified method for measuring the SSE, suitable for second-tier calibration laboratories and requiring no additional equipment, and proposes a means of reporting SSE results on a calibration certificate that should be easily understood by the non-specialist user.

  7. Integration of Traditional and E-Learning Methods to Improve Learning Outcomes for Dental Students in Histopathology.

    PubMed

    Ariana, Armin; Amin, Moein; Pakneshan, Sahar; Dolan-Evans, Elliot; Lam, Alfred K

    2016-09-01

    Dental students require a basic ability to explain and apply general principles of pathology to systemic, dental, and oral pathology. Although there have been recent advances in electronic and online resources, the academic effectiveness of using self-directed e-learning tools in pathology courses for dental students is unclear. The aim of this study was to determine if blended learning combining e-learning with traditional learning methods of lectures and tutorials would improve students' scores and satisfaction over those who experienced traditional learning alone. Two consecutive cohorts of Bachelor of Dentistry and Oral Health students taking the general pathology course at Griffith University in Australia were compared. The control cohort experienced traditional methods only, while members of the study cohort were also offered self-directed learning materials including online resources and online microscopy classes. Final assessments for the course were used to compare the differences in effectiveness of the intervention, and students' satisfaction with the teaching format was evaluated using questionnaires. On the final course assessments, students in the study cohort had significantly higher scores than students in the control cohort (p<0.01). Analysis of questionnaire results showed improved student satisfaction with the course in the study cohort. These findings suggest that the use of e-learning tools such as virtual microscopy and interactive online resources for delivering pathology instruction can be an effective supplement for developing dental students' competence, confidence, and satisfaction.

  8. Integration of a Community Pharmacy Simulation Program into a Therapeutics Course.

    PubMed

    Shin, Jaekyu; Tabatabai, Daryush; Boscardin, Christy; Ferrone, Marcus; Brock, Tina

    2018-02-01

    Objective. To demonstrate the feasibility of integrating the computer simulation, MyDispense, into a therapeutics course and to measure its effects on student perception and learning. Methods. We conducted a prospective study with an experimental phase and an implementation phase. In the first phase, students were randomized to complete a therapeutics case using MyDispense or traditional paper methods in class. In the second phase, all students completed two therapeutic cases using MyDispense in class with the option to complete four additional outside-of-class cases using MyDispense. Students completed pre- and post-tests in class and three surveys. Results. In the experimental phase, mean test scores increased from pre- to post-test for both MyDispense and traditional paper groups, but the difference between the groups was not statistically significant. Students in the traditional paper group reported statistically significant gains in confidence compared to the MyDispense group. In the implementation phase, mean test scores again increased, however, student perception of the use of MyDispense for therapeutics was negative. Completing the optional outside-of-class cases, however, was positively and significantly correlated with the midterm and final examination scores. Conclusion. Implementation of MyDispense in therapeutics may be feasible and has positive effects (eg, correlation with exam scores, capacity for immediate feedback, and potential for effective self-study). With short-term use and in the absence of assessment methods that also require seeking information from patients, students prefer to learn via traditional paper cases.

  9. a New Approach for Accuracy Improvement of Pulsed LIDAR Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Huang, W.; Zhou, X.; He, C.; Li, X.; Huang, Y.; Zhang, L.

    2018-05-01

    In remote sensing applications, the accuracy of time interval measurement is one of the most important parameters affecting the quality of pulsed lidar data. The traditional time interval measurement technique has the disadvantages of low measurement accuracy, complicated circuit structure and large error. High-precision time interval data cannot be obtained with these traditional methods. In order to obtain higher quality remote sensing cloud images based on the time interval measurement, a higher accuracy time interval measurement method is proposed. The method is based on charging a capacitor and simultaneously sampling the change in capacitor voltage. Firstly, an approximate model of the capacitor voltage curve during the pulse's time of flight is fitted to the sampled data. Then, the whole charging time is obtained from the fitting function. In this method, only a high-speed A/D sampler and a capacitor are required in a single receiving channel, and the collected data are processed directly in the main control unit. The experimental results show that the proposed method can achieve an error of less than 3 ps. Compared with other methods, the proposed method improves the time interval accuracy by at least 20%.
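
    A rough sketch of the charging-curve idea: the sampled capacitor voltage during the pulse's time of flight is fitted to an RC charging model, and the fitted curve is then inverted to recover the full charging (flight) time. The model form and all numbers below are assumptions for illustration, not the published circuit's parameters.

        # Fit sampled voltages to v(t) = V0 * (1 - exp(-t/tau)), then invert
        # the fitted model to recover the elapsed charging time. Synthetic data.
        import numpy as np
        from scipy.optimize import curve_fit

        def charge(t, V0, tau):
            return V0 * (1.0 - np.exp(-t / tau))

        t_s = np.linspace(0, 2e-7, 40)                            # sample times (s)
        v_s = charge(t_s, 3.3, 1.5e-7)
        v_s += np.random.default_rng(3).normal(0, 1e-3, t_s.size)

        (V0, tau), _ = curve_fit(charge, t_s, v_s, p0=(3.0, 1e-7))
        v_stop = v_s[-1]                                          # voltage when charging stopped
        interval = -tau * np.log(1.0 - v_stop / V0)
        print(f"estimated interval: {interval * 1e9:.2f} ns")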

  10. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    NASA Astrophysics Data System (ADS)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents the reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem with parametric uncertainties in the entry dynamics for a Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, a modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method contributes to the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and to the efficient approximation of the trajectory solution. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve the computational efficiency. The cycle including SO, reliability assessment and constraint updating is repeated in the RBSO until the reliability requirements on constraint satisfaction are met. Finally, the RBSO is compared with the traditional DO and the traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and efficiency of the proposed method.

  11. Three dimensional scattering center imaging techniques

    NASA Technical Reports Server (NTRS)

    Younger, P. R.; Burnside, W. D.

    1991-01-01

    Two methods to image scattering centers in 3-D are presented. The first method uses 2-D images generated from Inverse Synthetic Aperture Radar (ISAR) measurements taken by two vertically offset antennas. This technique is shown to provide accurate 3-D imaging capability which can be added to an existing ISAR measurement system, requiring only the addition of a second antenna. The second technique uses target impulse responses generated from wideband radar measurements from three slightly different offset antennas. This technique is shown to identify the dominant scattering centers on a target in nearly real time. The number of measurements required to image a target using this technique is very small relative to traditional imaging techniques.

  12. Rapid quantification of vesicle concentration for DOPG/DOPC and Cardiolipin/DOPC mixed lipid systems of variable composition.

    PubMed

    Elmer-Dixon, Margaret M; Bowler, Bruce E

    2018-05-19

    A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time consuming, require large amounts of material and are destructive. We extend our recently described method for the quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input into a Matlab program to calculate the total vesicle concentration and the concentrations of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.

  13. Geomorphometric analysis of cave ceiling channels mapped with 3-D terrestrial laser scanning

    NASA Astrophysics Data System (ADS)

    Gallay, Michal; Hochmuth, Zdenko; Kaňuk, Ján; Hofierka, Jaroslav

    2016-05-01

    The change of hydrological conditions during the evolution of caves in carbonate rocks often results in a complex subterranean geomorphology, which comprises specific landforms such as ceiling channels, anastomosing half tubes, or speleothems organized vertically in different levels. Studying such complex environments traditionally requires tedious mapping; however, this is being replaced with terrestrial laser scanning technology. Laser scanning overcomes the problem of reaching high ceilings, providing new options to map underground landscapes with unprecedented level of detail and accuracy. The acquired point cloud can be handled conveniently with dedicated software, but applying traditional geomorphometry to analyse the cave surface is limited. This is because geomorphometry has been focused on parameterization and analysis of surficial terrain. The theoretical and methodological concept has been based on two-dimensional (2-D) scalar fields, which are sufficient for most cases of the surficial terrain. The terrain surface is modelled with a bivariate function of altitude (elevation) and represented by a raster digital elevation model. However, the cave is a 3-D entity; therefore, a different approach is required for geomorphometric analysis. In this paper, we demonstrate the benefits of high-resolution cave mapping and 3-D modelling to better understand the palaeohydrography of the Domica cave in Slovakia. This methodological approach adopted traditional geomorphometric methods in a unique manner and also new methods used in 3-D computer graphics, which can be applied to study other 3-D geomorphological forms.

  14. Clinic Design and Continuity in Internal Medicine Resident Clinics: Findings of the Educational Innovations Project Ambulatory Collaborative.

    PubMed

    Francis, Maureen D; Wieland, Mark L; Drake, Sean; Gwisdalla, Keri Lyn; Julian, Katherine A; Nabors, Christopher; Pereira, Anne; Rosenblum, Michael; Smith, Amy; Sweet, David; Thomas, Kris; Varney, Andrew; Warm, Eric; Wininger, David; Francis, Mark L

    2015-03-01

    Many internal medicine (IM) programs have reorganized their resident continuity clinics to improve trainees' ambulatory experience. Downstream effects on continuity of care and other clinical and educational metrics are unclear. This multi-institutional, cross-sectional study included 713 IM residents from 12 programs. Continuity was measured using the usual provider of care method (UPC) and the continuity for physician method (PHY). Three clinic models (traditional, block, and combination) were compared using analysis of covariance. Multivariable linear regression analysis was used to analyze the effect of practice metrics and clinic model on continuity. UPC, reflecting continuity from the patient perspective, was significantly different, and was highest in the block model, midrange in combination model, and lowest in the traditional model programs. PHY, reflecting continuity from the perspective of the resident provider, was significantly lower in the block model than in combination and traditional programs. Panel size, ambulatory workload, utilization, number of clinics attended in the study period, and clinic model together accounted for 62% of the variation found in UPC and 26% of the variation found in PHY. Clinic model appeared to have a significant effect on continuity measured from both the patient and resident perspectives. Continuity requires balance between provider availability and demand for services. Optimizing this balance to maximize resident education, and the health of the population served, will require consideration of relevant local factors and priorities in addition to the clinic model.
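
    The usual provider of care (UPC) index is conventionally computed per patient as the share of visits made to the provider seen most often, then averaged; a minimal sketch with invented visit data is shown below (the study's exact PHY formulation is not reproduced here).

        # Usual Provider of Care (UPC): per patient, the fraction of visits
        # made to the most frequently seen provider. Visit data are invented.
        from collections import Counter

        visits = {                     # patient -> providers seen at each visit
            "pt1": ["drA", "drA", "drB", "drA"],
            "pt2": ["drC", "drD", "drC"],
            "pt3": ["drB", "drB"],
        }

        def upc(providers):
            return Counter(providers).most_common(1)[0][1] / len(providers)

        mean_upc = sum(upc(v) for v in visits.values()) / len(visits)
        print(f"mean UPC = {mean_upc:.2f}")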

  15. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    PubMed

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R(2)=0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
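
    As a rough sketch of the landmark-driven idea (not the authors' pipeline, which subsequently refines the mesh onto the bone boundary), one can fit a least-squares affine transform from source landmarks to target landmarks and apply it to every node of the source FE mesh; published morphing methods often use thin-plate-spline or radial-basis-function warps instead. All coordinates below are hypothetical.

        import numpy as np

        def fit_affine(src_pts, dst_pts):
            # Least-squares affine transform mapping source landmarks onto target
            # landmarks (a simplification of more flexible warping schemes).
            n = src_pts.shape[0]
            A = np.hstack([src_pts, np.ones((n, 1))])        # homogeneous coordinates
            X, *_ = np.linalg.lstsq(A, dst_pts, rcond=None)  # (4 x 3) transform
            return X

        def apply_affine(X, pts):
            return np.hstack([pts, np.ones((pts.shape[0], 1))]) @ X

        # Hypothetical landmark sets digitized on the source and target CT scans.
        src = np.random.rand(12, 3)
        dst = src @ np.diag([1.1, 0.9, 1.05]) + np.array([2.0, -1.0, 0.5])
        X = fit_affine(src, dst)
        morphed_nodes = apply_affine(X, np.random.rand(5000, 3))  # all source mesh nodes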

  16. A New Calibration Method for Commercial RGB-D Sensors.

    PubMed

    Darwish, Walid; Tang, Shenjun; Li, Wenbin; Chen, Wu

    2017-05-24

    Commercial RGB-D sensors such as Kinect and Structure Sensors have been widely used in the game industry, where geometric fidelity is not of utmost importance. For applications in which high quality 3D is required, i.e., 3D building models of centimeter‑level accuracy, accurate and reliable calibrations of these sensors are required. This paper presents a new model for calibrating the depth measurements of RGB-D sensors based on the structured light concept. Additionally, a new automatic method is proposed for the calibration of all RGB-D parameters, including internal calibration parameters for all cameras, the baseline between the infrared and RGB cameras, and the depth error model. When compared with traditional calibration methods, this new model shows a significant improvement in depth precision for both near and far ranges.
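
    The depth-error modelling step can be pictured with a simplified global correction (the paper's model is built on the structured-light geometry and is more detailed): a low-order polynomial maps raw sensor depths onto reference depths from hypothetical calibration observations.

        import numpy as np

        # Hypothetical calibration observations: raw sensor depths vs. reference
        # depths of a surveyed target (metres).
        measured  = np.array([0.8, 1.2, 1.6, 2.0, 2.5, 3.0, 3.5])
        reference = np.array([0.81, 1.22, 1.64, 2.06, 2.59, 3.13, 3.66])

        # Simple global depth-error model: residual grows with range, so fit
        # d_ref ~ a*d^2 + b*d + c and use it to correct new readings.
        coeffs = np.polyfit(measured, reference, deg=2)
        correct = np.poly1d(coeffs)
        print(correct(2.2))   # corrected estimate for a 2.2 m raw reading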

  17. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files.

    PubMed

    Sun, Xiaobo; Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng; Qin, Zhaohui S

    2018-06-01

    Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrating examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems.
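
    The core keying-and-sorting idea behind such schemas can be sketched in a few lines of PySpark (paths and the minimal VCF parsing are placeholders; the paper's optimized Hadoop, HBase and Spark schemas are considerably more involved):

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("vcf-sorted-merge").getOrCreate()
        sc = spark.sparkContext

        def parse(line):
            # Key each variant record by (chromosome, position).
            chrom, pos, rest = line.split("\t", 2)
            return ((chrom, int(pos)), rest)

        records = (sc.textFile("hdfs:///vcf/sample*.vcf")      # many input VCFs
                     .filter(lambda l: not l.startswith("#"))  # drop headers
                     .map(parse))

        # Partition by key and sort within each partition; a custom range
        # partitioner (as in the paper's schemas) would additionally keep each
        # partition to a contiguous genomic interval.
        merged = records.repartitionAndSortWithinPartitions(numPartitions=64)
        merged.saveAsTextFile("hdfs:///vcf/merged")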

  18. A method of fast mosaic for massive UAV images

    NASA Astrophysics Data System (ADS)

    Xiang, Ren; Sun, Min; Jiang, Cheng; Liu, Lei; Zheng, Hui; Li, Xiaodong

    2014-11-01

    With the development of UAV technology, UAVs are widely used in multiple fields such as agriculture, forest protection, mineral exploration, natural disaster management and surveillance of public security events. In contrast to traditional manned aerial remote sensing platforms, UAVs are cheaper and more flexible to use, so users can obtain massive image data, but processing that data takes a long time; for example, Pix4UAV needs approximately 10 hours to process 1000 images on a high-performance PC. Disaster management and many other fields, however, require a quick response, which is hard to achieve with massive image data. To reduce the high time consumption and manual interaction, this article presents a fast UAV image-stitching solution. GPS and POS data are used to pre-process the original UAV images; flight belts and the relations between belts and images are recognized automatically by the program, and useless images are discarded at the same time. This speeds up the search for match points between images. The Levenberg-Marquardt algorithm is modified so that parallel computing can be applied, notably shortening the global optimization time. Besides the traditional mosaic result, the system can also generate a super-overlay result for Google Earth, providing a fast and easy way to display the output. To verify the feasibility of this method, a fully automated fast mosaic system for massive UAV images was developed that needs no manual interaction once the original images and GPS data are provided. A test using 800 images of the Kelan River in Xinjiang Province shows that this system reduces processing time by 35%-50% compared with traditional methods, considerably increasing the response speed of UAV image processing.
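
    One way GPS/POS pre-processing can cut matching time is by restricting feature matching to image pairs whose recorded positions are close; the sketch below (hypothetical coordinates and distance threshold) illustrates only that pruning step, not the system's actual algorithm.

        import numpy as np

        # Hypothetical image centres from the UAV GPS/POS log (easting, northing, m).
        centres = np.random.rand(800, 2) * 5000.0

        # Only attempt feature matching between images whose recorded positions are
        # closer than a chosen threshold, instead of trying all ~320,000 pairs.
        MAX_DIST = 120.0
        candidate_pairs = [
            (i, j)
            for i in range(len(centres))
            for j in range(i + 1, len(centres))
            if np.linalg.norm(centres[i] - centres[j]) < MAX_DIST
        ]
        print(len(candidate_pairs), "pairs kept for matching")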

  19. Safety in numbers 4: The relationship between exposure to authentic and didactic environments and nursing students' learning of medication dosage calculation problem solving knowledge and skills.

    PubMed

    Weeks, Keith W; Clochesy, John M; Hutton, B Meriel; Moseley, Laurie

    2013-03-01

    Advancing the art and science of education practice requires a robust evaluation of the relationship between students' exposure to learning and assessment environments and the development of their cognitive competence (knowing that and why) and functional competence (know-how and skills). Healthcare education translation research requires specific education technology assessments and evaluations that consist of quantitative analyses of empirical data and qualitative evaluations of the lived student experience of the education journey and schemata construction (Weeks et al., 2013a). This paper focuses on the outcomes of UK PhD and USA post-doctorate experimental research. We evaluated the relationship between exposure to traditional didactic methods of education, prototypes of an authentic medication dosage calculation problem-solving (MDC-PS) environment and nursing students' construction of conceptual and calculation competence in medication dosage calculation problem-solving skills. Empirical outcomes from both UK and USA programmes of research identified highly significant differences in the construction of conceptual and calculation competence in MDC-PS following exposure to the authentic learning environment to that following exposure to traditional didactic transmission methods of education (p < 0.001). This research highlighted that for many students exposure to authentic learning environments is an essential first step in the development of conceptual and calculation competence and relevant schemata construction (internal representations of the relationship between the features of authentic dosage problems and calculation functions); and how authentic environments more ably support all cognitive (learning) styles in mathematics than traditional didactic methods of education. Functional competence evaluations are addressed in Macdonald et al. (2013) and Weeks et al. (2013e). Copyright © 2012. Published by Elsevier Ltd.

  20. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files

    PubMed Central

    Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng

    2018-01-01

    Abstract Background Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. Findings In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrating examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)–based high-performance computing (HPC) implementation, and the popular VCFTools. Conclusions Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems. PMID:29762754

  1. Multipass haemodialysis: a novel dialysis modality

    PubMed Central

    Heaf, James Goya; Axelsen, Mette; Pedersen, Robert Smith

    2013-01-01

    Introduction Most home haemodialysis (HD) modalities are limited to home use since they are based on a single-pass (SP) technique, which requires preparation of large amounts of dialysate. We present a new dialysis method, which requires minimal dialysate volumes, continuously recycled during treatment [multipass HD (MPHD)]. Theoretical calculations suggest that MPHD performed six times weekly for 8 h/night, using a dialysate bath containing 50% of the calculated body water, will achieve urea clearances equivalent to conventional HD 4 h thrice weekly, and a substantial clearance of higher middle molecules. Methods Ten stable HD patients were dialyzed for 4 h using standard SPHD (dialysate flow 500 mL/min). Used dialysate was collected. One week later, an 8-h MPHD was performed. The dialysate volume was 50% of the calculated water volume, the dialysate inflow 500 mL/min−0.5 × ultrafiltration/min and the outflow 500 mL/min + 0.5 × ultrafiltration/min. Elimination rates of urea, creatinine, uric acid, phosphate and β2-microglobulin (B2M) and dialysate saturation were determined hourly. Results Three hours of MPHD removed 49, 54, 50, 51 and 57%, respectively, of the amounts of urea, creatinine, uric acid, phosphate and B2M that were removed by 4 h conventional HD. The corresponding figures after 8 h MPHD were 63, 78, 74, 78 and 111%. Conclusions Clearance of small molecules using MPHD 6 × 8 h/week will exceed traditional HD 3 × 4 h/week. Similarly, clearance of large molecules will significantly exceed traditional HD and HD 5 × 2.5 h/week. This modality will increase patients' freedom of movement compared with traditional home HD. The new method can also be used in the intensive care unit and for automated peritoneal dialysis. PMID:23136214

  2. Quick, easy, cheap, effective, rugged, and safe sample preparation approach for pesticide residue analysis using traditional detectors in chromatography: A review.

    PubMed

    Rahman, Md Musfiqur; Abd El-Aty, A M; Kim, Sung-Woo; Shin, Sung Chul; Shin, Ho-Chul; Shim, Jae-Han

    2017-01-01

    In pesticide residue analysis, relatively low-sensitivity traditional detectors, such as UV, diode array, electron-capture, flame photometric, and nitrogen-phosphorus detectors, have been used following classical sample preparation (liquid-liquid extraction and open glass column cleanup); however, the extraction method is laborious, time-consuming, and requires large volumes of toxic organic solvents. A quick, easy, cheap, effective, rugged, and safe method was introduced in 2003 and coupled with selective and sensitive mass detectors to overcome the aforementioned drawbacks. Compared to traditional detectors, mass spectrometers are still far more expensive and not available in most modestly equipped laboratories, owing to maintenance and cost-related issues. Even so, traditional detectors are still being used for analysis of residues in agricultural commodities. It is widely known that the quick, easy, cheap, effective, rugged, and safe method is incompatible with conventional detectors owing to matrix complexity and low sensitivity. Therefore, modifications using column/cartridge-based solid-phase extraction instead of dispersive solid-phase extraction for cleanup have been applied in most cases to compensate and enable the adaptation of the extraction method to conventional detectors. In gas chromatography, the matrix enhancement effect of some analytes has been observed, which lowers the limit of detection and, therefore, enables gas chromatography to be compatible with the quick, easy, cheap, effective, rugged, and safe extraction method. For liquid chromatography with a UV detector, a combination of column/cartridge-based solid-phase extraction and dispersive solid-phase extraction was found to reduce the matrix interference and increase the sensitivity. A suitable double-layer column/cartridge-based solid-phase extraction might be the perfect solution, instead of a time-consuming combination of column/cartridge-based solid-phase extraction and dispersive solid-phase extraction. Therefore, replacing dispersive solid-phase extraction with column/cartridge-based solid-phase extraction in the cleanup step can make the quick, easy, cheap, effective, rugged, and safe extraction method compatible with traditional detectors for more sensitive, effective, and green analysis. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Quantification of Protozoa and Viruses from Small Water Volumes

    PubMed Central

    Bonilla, J. Alfredo; Bonilla, Tonya D.; Abdelzaher, Amir M.; Scott, Troy M.; Lukasik, Jerzy; Solo-Gabriele, Helena M.; Palmer, Carol J.

    2015-01-01

    Large sample volumes are traditionally required for the analysis of waterborne pathogens. The need for large volumes greatly limits the number of samples that can be processed. The goals of this study were to compare extraction and detection procedures for quantifying protozoan parasites and viruses from small volumes of marine water. The intent was to evaluate a logistically simpler method of sample collection and processing that would facilitate direct pathogen measures as part of routine monitoring programs. Samples were collected simultaneously using a bilayer device with protozoa capture by size (top filter) and viruses capture by charge (bottom filter). Protozoan detection technologies utilized for recovery of Cryptosporidium spp. and Giardia spp. were qPCR and the more traditional immunomagnetic separation—IFA-microscopy, while virus (poliovirus) detection was based upon qPCR versus plaque assay. Filters were eluted using reagents consistent with the downstream detection technologies. Results showed higher mean recoveries using traditional detection methods over qPCR for Cryptosporidium (91% vs. 45%) and poliovirus (67% vs. 55%) whereas for Giardia the qPCR-based methods were characterized by higher mean recoveries (41% vs. 28%). Overall mean recoveries are considered high for all detection technologies. Results suggest that simultaneous filtration may be suitable for isolating different classes of pathogens from small marine water volumes. More research is needed to evaluate the suitability of this method for detecting pathogens at low ambient concentration levels. PMID:26114244

  4. Quantification of Protozoa and Viruses from Small Water Volumes.

    PubMed

    Bonilla, J Alfredo; Bonilla, Tonya D; Abdelzaher, Amir M; Scott, Troy M; Lukasik, Jerzy; Solo-Gabriele, Helena M; Palmer, Carol J

    2015-06-24

    Large sample volumes are traditionally required for the analysis of waterborne pathogens. The need for large volumes greatly limits the number of samples that can be processed. The aims of this study were to compare extraction and detection procedures for quantifying protozoan parasites and viruses from small volumes of marine water. The intent was to evaluate a logistically simpler method of sample collection and processing that would facilitate direct pathogen measures as part of routine monitoring programs. Samples were collected simultaneously using a bilayer device with protozoa capture by size (top filter) and viruses capture by charge (bottom filter). Protozoan detection technologies utilized for recovery of Cryptosporidium spp. and Giardia spp. were qPCR and the more traditional immunomagnetic separation-IFA-microscopy, while virus (poliovirus) detection was based upon qPCR versus plaque assay. Filters were eluted using reagents consistent with the downstream detection technologies. Results showed higher mean recoveries using traditional detection methods over qPCR for Cryptosporidium (91% vs. 45%) and poliovirus (67% vs. 55%) whereas for Giardia the qPCR-based methods were characterized by higher mean recoveries (41% vs. 28%). Overall mean recoveries are considered high for all detection technologies. Results suggest that simultaneous filtration may be suitable for isolating different classes of pathogens from small marine water volumes. More research is needed to evaluate the suitability of this method for detecting pathogens at low ambient concentration levels.

  5. High-efficiency non-uniformity correction for wide dynamic linear infrared radiometry system

    NASA Astrophysics Data System (ADS)

    Li, Zhou; Yu, Yi; Tian, Qi-Jie; Chang, Song-Tao; He, Feng-Yun; Yin, Yan-He; Qiao, Yan-Feng

    2017-09-01

    Several different integration times are always set for a wide-dynamic-range, linear, continuously variable integration time infrared radiometry system; traditional calibration-based non-uniformity corrections (NUC) are therefore usually conducted one integration time at a time and require several calibration sources, which makes calibration and NUC processing time-consuming. In this paper, the differences between NUC coefficients at different integration times are discussed, and a novel method called high-efficiency NUC, which builds on traditional calibration-based NUC, is proposed. It obtains the correction coefficients for all integration times over the whole linear dynamic range by recording only three different images of a standard blackbody. The mathematical procedure of the proposed method is first validated, and its performance is then demonstrated on a 400 mm diameter ground-based infrared radiometry system. Experimental results show that the mean Normalized Root Mean Square (NRMS) value is reduced from 3.78% to 0.24% by the proposed method. In addition, the results at 4 ms and 70 °C show that the method is more accurate than traditional calibration-based NUC, while a good correction effect is still obtained at other integration times and temperatures. Moreover, it greatly reduces the number of correction and temperature sampling points, offers good real-time performance, and is suitable for field measurement.
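
    For context, the classic two-point calibration-based NUC that the proposed method builds on can be written in a few lines; the blackbody frames and detector size below are hypothetical, and the paper's contribution is extending such coefficients across all integration times from only three blackbody images.

        import numpy as np

        # Two hypothetical blackbody frames recorded at different temperatures.
        low  = np.random.normal(1000, 20, (256, 320))   # frame at the cold blackbody
        high = np.random.normal(3000, 20, (256, 320))   # frame at the hot blackbody

        # Per-pixel linear correction: corrected = gain * raw + offset, chosen so
        # that every pixel maps the two blackbody levels onto their frame means.
        gain   = (high.mean() - low.mean()) / (high - low)
        offset = high.mean() - gain * high

        def correct(raw):
            # Apply the per-pixel linear correction to a raw frame.
            return gain * raw + offset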

  6. Enhancing healthcare process design with human factors engineering and reliability science, part 1: setting the context.

    PubMed

    Boston-Fleischhauer, Carol

    2008-01-01

    The design and implementation of efficient, effective, and safe processes are never-ending challenges in healthcare. Less than optimal performance levels and rising concerns about patient safety suggest that traditional process design methods are insufficient to meet design requirements. In this 2-part series, the author presents human factors engineering and reliability science as important knowledge to enhance existing operational and clinical process design methods in healthcare. An examination of these theories, application approaches, and examples are presented.

  7. A distributed agent architecture for real-time knowledge-based systems: Real-time expert systems project, phase 1

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel

    1990-01-01

    We propose a distributed agent architecture (DAA) that can support a variety of paradigms based on both traditional real-time computing and artificial intelligence. DAA consists of distributed agents that are classified into two categories: reactive and cognitive. Reactive agents can be implemented directly in Ada to meet hard real-time requirements and be deployed on on-board embedded processors. A traditional real-time computing methodology under consideration is the rate monotonic theory that can guarantee schedulability based on analytical methods. AI techniques under consideration for reactive agents are approximate or anytime reasoning that can be implemented using Bayesian belief networks as in Guardian. Cognitive agents are traditional expert systems that can be implemented in ART-Ada to meet soft real-time requirements. During the initial design of cognitive agents, it is critical to consider the migration path that would allow initial deployment on ground-based workstations with eventual deployment on on-board processors. ART-Ada technology enables this migration while Lisp-based technologies make it difficult if not impossible. In addition to reactive and cognitive agents, a meta-level agent would be needed to coordinate multiple agents and to provide meta-level control.
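
    The rate monotonic schedulability analysis mentioned for the reactive agents can be illustrated with the well-known Liu-Layland sufficient bound; the task parameters below are hypothetical.

        def rm_schedulable(tasks):
            # Sufficient Liu-Layland test for rate-monotonic scheduling:
            # tasks is a list of (worst_case_exec_time, period) pairs.
            n = len(tasks)
            utilization = sum(c / t for c, t in tasks)
            bound = n * (2 ** (1.0 / n) - 1)
            return utilization <= bound

        # Three hypothetical reactive-agent tasks (C, T) in milliseconds.
        print(rm_schedulable([(1, 10), (2, 20), (3, 40)]))   # U = 0.275 <= 0.780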

  8. Collaborative Decision Model on Stockpile Material of a Traditional Market Infrastructure using Value-Based HBU

    NASA Astrophysics Data System (ADS)

    Utomo, C.; Rahmawati, Y.; Pararta, D. L.; Ariesta, A.

    2017-11-01

    Infrastructure readiness is needed in the early phase of real estate development. To meet the need for retail property in the form of traditional markets, the Government is preparing to build 1,300 new units. Traditional market development requires infrastructure works, one of which is the preparation of approximately 200,000 m3 of sand embankment material. With a haul distance of 30 km, the sand can only be delivered to the project site by dump trucks limited to two trips per day, so the material is managed using a stockpile. Deciding the stockpile location involves multiple people and multiple criteria in a collaborative environment. Highest and best use (HBU) criteria were used to construct a value-based decision hierarchy. Decision makers from five stakeholders analyzed three candidate locations by giving their own preferences on development cost and HBU function. The Analytic Hierarchy Process (AHP), based on satisfying options and a cooperative game, was applied for agreement on options and coalition formation in the collaborative decision. The results indicate that not all solutions are feasible locations for the stockpile material and show the 'best fit' option process for all decision makers.
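
    The AHP step can be sketched as deriving a priority vector from a pairwise comparison matrix via its principal eigenvector; the matrix below is hypothetical and stands for one criterion's comparison of the three candidate locations, not the study's actual judgments.

        import numpy as np

        # Hypothetical pairwise comparisons of three candidate stockpile locations.
        A = np.array([[1.0,   3.0, 5.0],
                      [1/3.0, 1.0, 2.0],
                      [1/5.0, 1/2.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                       # AHP priority vector

        consistency_index = (eigvals.real[k] - 3) / (3 - 1)   # CI = (lmax - n)/(n - 1)
        print(weights, consistency_index)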

  9. Using an interlaboratory study to revise methods for conducting 10-d to 42-d water or sediment toxicity tests with Hyalella azteca

    USGS Publications Warehouse

    Ivey, Chris D.; Ingersoll, Christopher G.; Brumbaugh, William G.; Hammer, Edward J.; Mount, David R.; Hockett, J. Russell; Norberg-King, Teresa J.; Soucek, Dave; Taylor, Lisa

    2016-01-01

    Studies have been conducted to refine US Environmental Protection Agency, ASTM International, and Environment Canada standard methods for conducting 42-d reproduction tests with Hyalella azteca in water or in sediment. Modifications to the H. azteca method include better-defined ionic composition requirements for exposure water (i.e., >15 mg/L of chloride and >0.02 mg/L of bromide) and improved survival, growth, and reproduction with alternate diets provided as increased rations over time in water-only or whole-sediment toxicity tests. A total of 24 laboratories volunteered to participate in the present interlaboratory study evaluating the performance of H. azteca in 42-d studies in control sand or control sediment using the refined methods. Improved growth and reproduction of H. azteca was observed with 2 alternate diets of 1) ramped diatoms (Thalassiosira weissflogii) + ramped Tetramin or 2) yeast–cerophyll–trout chow (YCT) + ramped Tetramin, especially when compared with results from the traditional diet of 1.8 mg YCT/d. Laboratories were able to meet proposed test acceptability criteria and in most cases had lower variation in growth or reproduction compared with previous interlaboratory studies using the traditional YCT diet. Laboratory success in conducting 42-d H. azteca exposures benefited from adherence to several key requirements of the detailed testing, culturing, and handling methods. Results from the present interlaboratory study are being used to help revise standard methods for conducting 10-d to 42-d water or sediment toxicity exposures with H. azteca.

  10. [Frequency of Candida in root canals of teeth with primary and persistent endodontic infections].

    PubMed

    Bernal-Treviño, Angel; González-Amaro, Ana María; Méndez González, Verónica; Pozos-Guillen, Amaury

    Microbiological identification in endodontic infections has focused mainly on bacteria without giving much attention to yeasts, which, due to their virulence factors, can affect the outcomes of root canal treatment. The aims were to determine the frequency of Candida in anaerobic conditions in root canals with primary and persistent endodontic infection, as well as to evaluate a microbiological sampling method using aspiration compared to the traditional absorption method with paper points. Fifty microbiological samples were obtained from teeth of 47 patients requiring endodontic treatments, due to either primary or persistent infections. Two microbiological sampling methods were used: an aspiration method, and the traditional paper point absorption method. In each of these methods, two types of medium were used (M1-M4). Samples were cultured under anaerobic conditions until reaching 0.5 McFarland turbidity, and then inoculated on Sabouraud dextrose, as well as on anaerobic enriched blood agar plates. Macroscopic and microscopic observations of the colonies were performed. The germ-tube test, growth on CHROMagar, and biochemical identification were performed on the isolated yeasts. Fungal infection was found in 18 (36%) samples out of the 50 teeth evaluated. Of the 18 samples positive for fungal infection, 15 out of 36 (41.6%) teeth were from a primary infection, and 3 out of 14 (21.4%) from a persistent infection. The aspiration method using Sabouraud dextrose medium recovered a greater diversity of species. Yeast frequency was higher in teeth with primary infections compared to teeth with persistent infections. The predominant yeast species was Candida albicans. The aspiration sampling method was more efficient in the recovery of Candida isolates than the traditional absorption method. Copyright © 2018 Asociación Española de Micología. Published by Elsevier España, S.L.U. All rights reserved.

  11. Altering allergenicity of cow's milk by food processing for applications in infant formula.

    PubMed

    Golkar, Abdolkhalegh; Milani, Jafar M; Vasiljevic, Todor

    2018-04-16

    Cow's milk-based infant formulas have a long tradition in infant nutrition, although some infants are unable to use them due to the presence of several known allergens. Various processing methods capable of reducing cow's milk protein allergenicity have been identified, including thermal and non-thermal methods and their combinations. Heat treatment and enzymatic hydrolysis have been used in the production of hypoallergenic infant formulas. However, modulation of allergenic epitopes depends on the extent of heat treatment applied, which may also reduce the nutritional value of these proteins. In addition, enzymatic hydrolysis may not target allergenic epitopes, so allergenicity may persist; moreover, the released peptides may have a detrimental impact on the taste and functional properties of final products. Modulation of the allergenicity of milk proteins appears to require a concerted effort to minimize detrimental effects, as clinical studies conducted on commercial hypoallergenic formulas have demonstrated persistence of allergic symptoms. This article covers traditional and novel processing methods and their impact on reducing cow's milk allergenicity in milk-based infant formulas.

  12. Hybrid LES/RANS simulation of a turbulent boundary layer over a rectangular cavity

    NASA Astrophysics Data System (ADS)

    Zhang, Qi; Haering, Sigfried; Oliver, Todd; Moser, Robert

    2016-11-01

    We report numerical investigations of a turbulent boundary layer over a rectangular cavity using a new hybrid RANS/LES model and traditional Detached Eddy Simulation (DES). Our new hybrid method aims to address many of the shortcomings of traditional DES. In the new method, RANS/LES blending is controlled by a parameter that measures the ratio of the modeled subgrid kinetic energy to an estimate of the subgrid energy based on the resolved scales. The result is a hybrid method that automatically resolves as much turbulence as can be supported by the grid and transitions appropriately from RANS to LES without the need for ad hoc delaying functions that are often required for DES. Further, the new model is designed to improve upon DES by accounting for the effects of grid anisotropy and inhomogeneity in the LES region. We present comparisons of the flow features inside the cavity and of the pressure time history and spectra as computed using the new hybrid model and DES.

  13. A streaming multi-GPU implementation of image simulation algorithms for scanning transmission electron microscopy.

    PubMed

    Pryor, Alan; Ophus, Colin; Miao, Jianwei

    2017-01-01

    Simulation of atomic-resolution image formation in scanning transmission electron microscopy can require significant computation times using traditional methods. A recently developed method, termed plane-wave reciprocal-space interpolated scattering matrix (PRISM), demonstrates potential for significant acceleration of such simulations with negligible loss of accuracy. Here, we present a software package called Prismatic for parallelized simulation of image formation in scanning transmission electron microscopy (STEM) using both the PRISM and multislice methods. By distributing the workload between multiple CUDA-enabled GPUs and multicore processors, accelerations as high as 1000x for PRISM and 15x for multislice are achieved relative to traditional multislice implementations using a single 4-GPU machine. We demonstrate a potentially important application of Prismatic, using it to compute images for atomic electron tomography at sufficient speeds to include in the reconstruction pipeline. Prismatic is freely available both as an open-source CUDA/C++ package with a graphical user interface and as a Python package, PyPrismatic.

  14. Microdialysis as a New Technique for Extracting Phenolic Compounds from Extra Virgin Olive Oil.

    PubMed

    Bazzu, Gianfranco; Molinu, Maria Giovanna; Dore, Antonio; Serra, Pier Andrea

    2017-03-01

    The amount and composition of the phenolic components play a major role in determining the quality of olive oil. The traditional liquid-liquid extraction (LLE) method requires a time-consuming sample preparation to obtain the "phenolic profile" of extra virgin olive oil (EVOO). This study aimed to develop a microdialysis extraction (MDE) as an alternative to the LLE method to evaluate the phenolic components of EVOO. To this purpose, a microdialysis device and dialysis procedure were developed. "Dynamic-oil" microdialysis was performed using an extracting solution (80:20 methanol/water) flow rate of 2 μL/min and a constant EVOO stream of 4 μL/min. The results indicated a strong positive correlation between MDE and the LLE method, providing a phenolic profile very similar to that obtained with traditional LLE. In conclusion, the MDE approach, easier and quicker in comparison to LLE, provided a reliable procedure to determine the phenolic components used as a marker of the quality and traceability of EVOO.

  15. High-accuracy and real-time 3D positioning, tracking system for medical imaging applications based on 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Xue, Yuan; Cheng, Teng; Xu, Xiaohai; Gao, Zeren; Li, Qianqian; Liu, Xiaojing; Wang, Xing; Song, Rui; Ju, Xiangyang; Zhang, Qingchuan

    2017-01-01

    This paper presents a system for positioning markers and tracking the pose of a rigid object with 6 degrees of freedom in real time using 3D digital image correlation, with two examples for medical imaging applications. The traditional DIC method was improved to meet real-time requirements by simplifying the integer-pixel search computations. Experiments were carried out and the results indicated that the new method improved computational efficiency by about 4-10 times in comparison with the traditional DIC method. The system was aimed at orthognathic surgery navigation, in order to track the maxilla segment after LeFort I osteotomy. Experiments showed that the noise for a static point was at the level of 10^-3 mm and the measurement accuracy was 0.009 mm. The system was also demonstrated on skin surface shape evaluation of a hand during finger stretching exercises, which indicated great potential for tracking muscle and skin movements.
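
    Recovering a 6-degree-of-freedom pose from tracked 3D marker positions is commonly done with the Kabsch (Procrustes) solution sketched below; this is a generic illustration of the pose step, not necessarily the authors' exact formulation, and the marker arrays are assumed inputs.

        import numpy as np

        def rigid_pose(P, Q):
            # Best-fit rotation R and translation t with Q_i ~= R @ P_i + t,
            # for matched (N, 3) marker coordinates P (reference) and Q (tracked).
            p0, q0 = P.mean(axis=0), Q.mean(axis=0)
            H = (P - p0).T @ (Q - q0)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = q0 - R @ p0
            return R, t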

  16. Life cycle assessment and economic analysis of a low concentrating photovoltaic system.

    PubMed

    De Feo, G; Forni, M; Petito, F; Renno, C

    2016-10-01

    Many new photovoltaic (PV) applications, such as concentrating PV (CPV) systems, are appearing on the market. The main characteristic of CPV systems is to concentrate sunlight on a receiver by means of optical devices and thereby decrease the solar cell area required. A low CPV (LCPV) system allows the PV effect to be optimized, with a large increase in generated electric power as well as a decrease in active surface area. In this paper, an economic analysis and a life cycle assessment (LCA) study of a particular LCPV scheme are presented and its environmental impacts are compared with those of a traditional PV system. The LCA study was performed with the software tool SimaPro 8.0.2, using the ecoinvent 3.1 database. A functional unit of 1 kWh of electricity produced was chosen. Carbon Footprint, Ecological Footprint and ReCiPe 2008 were the methods used to assess the environmental impacts of the LCPV plant compared with a corresponding traditional system. All the methods demonstrated the environmental advantage of the LCPV system. The innovative system saved 16.9% of CO2 equivalent in comparison with the traditional PV plant. The environmental impact saving was 17% in terms of Ecological Footprint, and 15.8% with the ReCiPe method.

  17. Estimating the Natural Flow Regime of Rivers With Long-Standing Development: The Northern Branch of the Rio Grande

    NASA Astrophysics Data System (ADS)

    Blythe, Todd L.; Schmidt, John C.

    2018-02-01

    An estimate of a river's natural flow regime is useful for water resource planning and ecosystem rehabilitation by providing insight into the predisturbance form and function of a river. The natural flow regime of most rivers has been perturbed by development during the 20th century and in some cases, before stream gaging began. The temporal resolution of natural flows estimated using traditional methods is typically not sufficient to evaluate cues that drive native ecosystem function. Additionally, these traditional methods are watershed specific and require large amounts of data to produce accurate results. We present a mass balance method that estimates natural flows at daily time step resolution for the northern branch of the Rio Grande, upstream from the Rio Conchos, that relies only on easily obtained streamflow data. Using an analytical change point method, we identified periods of the measured flow regime during the 20th century for comparison with the estimated natural flows. Our results highlight the significant deviation from natural conditions that occurred during the 20th century. The total annual flow of the northern branch is 95% lower than it would be in the absence of human use. The current 2 year flood has decreased by more than 60%, is shorter in duration, and peaks later in the year. When compared to unregulated flows estimated using traditional mass balance accounting methods, our approach provides similar results.
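
    The "traditional mass balance accounting" used for comparison can be pictured as a daily ledger that adds human influences back onto the gaged record; the series and variable names below are purely illustrative, since the paper's own approach deliberately relies on streamflow records alone.

        import numpy as np

        # Schematic daily naturalized-flow accounting (all values hypothetical):
        #   natural = gaged + diversions - returns + change_in_storage + evaporation
        gaged      = np.array([12.0, 11.5, 10.8])   # m^3/s, measured at the gage
        diversions = np.array([ 4.0,  4.2,  4.1])   # withdrawn upstream
        returns    = np.array([ 1.0,  1.1,  1.0])   # irrigation return flows
        dstorage   = np.array([ 0.5, -0.3,  0.2])   # reservoir storage change
        evap       = np.array([ 0.2,  0.2,  0.2])   # net reservoir evaporation

        natural = gaged + diversions - returns + dstorage + evap
        print(natural)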

  18. The flip side of traditional nursing education: A literature review.

    PubMed

    Ward, Maria; Knowlton, Mary C; Laney, Candice W

    2018-03-01

    The flipped classroom (FC) andragogy purports an improvement of critical thinking and problem-solving skills in students. This literature review explores fourteen research studies and discusses outcome measures reported on the effectiveness of using this teaching modality. Students described the learning activities during the classroom meeting times as valuable and indicated the interaction and engagement were beneficial to their learning. Many students opined an increased comprehension of the subject matter. Overall, the FC required more work on the part of the students and the faculty, and the majority of students preferred the traditional classroom (TC) passive method of learning over the FC active learning andragogy as a result of the substantial time commitment required for preparation necessitated by the FC. Five of the fourteen studies evaluated student learning outcome measures; four studies showed an improvement in the FC environment compared to the TC and one reported the FC was at least as effective as the TC. Further studies with quantifiable outcome measures are required to determine the effectiveness of a FC on critical thinking and problem-solving skills of nursing students. Copyright © 2018. Published by Elsevier Ltd.

  19. Potential for yield improvement in combined rip-first and crosscut-first rough mill processing

    Treesearch

    Ed Thomas; Urs Buehlmann

    2016-01-01

    Traditionally, lumber cutting systems in rough mills have either first ripped lumber into wide strips and then crosscut the resulting strips into component lengths (rip-first), or first crosscut the lumber into component lengths, then ripped the segments to the required widths (crosscut-first). Each method has its advantages and disadvantages. Crosscut-first typically...

  20. Preparing a New Generation of Citizens and Scientists to Face Earth's Future

    ERIC Educational Resources Information Center

    Bralower, Timothy J.; Feiss, P. Geoffrey; Manduca, Cathryn A.

    2008-01-01

    As the research interests and the focus of traditional earth scientists are transformed, so too must education in earth system science at colleges and universities across the country change. The required change involves not only the methods used to teach this new science, but also the essential place of the earth sciences in the panoply of…

  1. Digital aerial sketchmapping and downlink communications: a new tool for fire managers

    Treesearch

    Everett Hinkley; Tom Zajkowski; Charlie Schrader-Patton

    2010-01-01

    Aerial sketchmapping is the geolocating of features that are seen on the ground below an aircraft and the subsequent recording of those features. Traditional aerial sketchmapping methods required hand-sketching on hardcopy maps or photos and the translation of that information to a digital file. In 1996, the U.S. Department of Agriculture (USDA) Forest Service embarked...

  2. Relationship of glucose values to sliding scale insulin (correctional insulin) dose delivery and meal time in acute care patients with diabetes mellitus.

    PubMed

    Trotter, Barbara; Conaway, Mark R; Burns, Suzanne M

    2013-01-01

    Findings of this study suggest the traditional sliding scale insulin (SSI) method does not improve target glucose values among adult medical inpatients. Timing of blood glucose (BC) measurement does affect the required SSI dose. BC measurement and insulin dose administration should be accomplished immediately prior to mealtime.

  3. Finding External Indicators of Load on a Web Server via Analysis of Black-Box Performance Measurements

    ERIC Educational Resources Information Center

    Chiarini, Marc A.

    2010-01-01

    Traditional methods for system performance analysis have long relied on a mix of queuing theory, detailed system knowledge, intuition, and trial-and-error. These approaches often require construction of incomplete gray-box models that can be costly to build and difficult to scale or generalize. In this thesis, we present a black-box analysis…

  4. Time series inversion of spectra from ground-based radiometers

    NASA Astrophysics Data System (ADS)

    Christensen, O. M.; Eriksson, P.

    2013-02-01

    Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is increased to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the OSO water vapour microwave radiometer confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
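
    The idea of folding the temporal domain into the MAP retrieval can be sketched with the standard optimal-estimation update, where the stacked a priori covariance carries an explicit temporal correlation; every dimension, correlation scale and the identity forward model below are illustrative only.

        import numpy as np

        # Stack several measurement times into one state vector and give the a
        # priori covariance an explicit temporal correlation, so the MAP solution
        # smooths over time where the measurements are weak.
        n_alt, n_t = 20, 6
        times = np.arange(n_t)
        S_time = np.exp(-np.abs(times[:, None] - times[None, :]) / 2.0)  # temporal correlation
        S_alt  = np.eye(n_alt)                                           # per-altitude variance
        Sa = np.kron(S_time, S_alt)           # a priori covariance of the stacked state
        Se = 0.1 * np.eye(n_alt * n_t)        # measurement noise covariance
        K  = np.eye(n_alt * n_t)              # toy forward-model Jacobian

        y = np.random.randn(n_alt * n_t)      # measurement departure from the a priori state
        xhat_minus_xa = Sa @ K.T @ np.linalg.solve(K @ Sa @ K.T + Se, y)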

  5. Ocean Wave Separation Using CEEMD-Wavelet in GPS Wave Measurement

    PubMed Central

    Wang, Junjie; He, Xiufeng; Ferreira, Vagner G.

    2015-01-01

    Monitoring ocean waves plays a crucial role in, for example, coastal environmental and protection studies. Traditional methods for measuring ocean waves are based on ultrasonic sensors and accelerometers. However, the Global Positioning System (GPS) has been introduced recently and has the advantage of being smaller, less expensive, and not requiring calibration in comparison with the traditional methods. Therefore, for accurately measuring ocean waves using GPS, further research on the separation of the wave signals from the vertical GPS-mounted carrier displacements is still necessary. In order to contribute to this topic, we present a novel method that combines complementary ensemble empirical mode decomposition (CEEMD) with a wavelet threshold denoising model (i.e., CEEMD-Wavelet). This method seeks to extract wave signals with less residual noise and without losing useful information. Compared with the wave parameters derived from the moving average skill, high pass filter and wave gauge, the results show that the accuracy of the wave parameters for the proposed method was improved with errors of about 2 cm and 0.2 s for mean wave height and mean period, respectively, verifying the validity of the proposed method. PMID:26262620
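
    Only the wavelet-threshold stage of the CEEMD-Wavelet scheme is easy to sketch compactly (the CEEMD decomposition itself needs a dedicated EMD implementation); the universal soft threshold used below is a common choice and the input series is hypothetical, standing in for one decomposed component of the GPS heave record.

        import numpy as np
        import pywt

        signal = np.random.randn(1024)                   # hypothetical heave component
        coeffs = pywt.wavedec(signal, 'db4', level=5)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate from finest details
        thresh = sigma * np.sqrt(2 * np.log(len(signal)))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode='soft') for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, 'db4')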

  6. Rotatable Small Permanent Magnet Array for Ultra-Low Field Nuclear Magnetic Resonance Instrumentation: A Concept Study

    PubMed Central

    Vegh, Viktor; Reutens, David C.

    2016-01-01

    Object We studied the feasibility of generating the variable magnetic fields required for ultra-low field nuclear magnetic resonance relaxometry with dynamically adjustable permanent magnets. Our motivation was to substitute traditional electromagnets by distributed permanent magnets, increasing system portability. Materials and Methods The finite element method (COMSOL®) was employed for the numerical study of a small permanent magnet array to calculate achievable magnetic field strength, homogeneity, switching time and magnetic forces. A manually operated prototype was simulated and constructed to validate the numerical approach and to verify the generated magnetic field. Results A concentric small permanent magnet array can be used to generate strong sample pre-polarisation and variable measurement fields for ultra-low field relaxometry via simple prescribed magnet rotations. Using the array, it is possible to achieve a pre-polarisation field strength above 100 mT and variable measurement fields ranging from 20–50 μT with 200 ppm absolute field homogeneity within a field-of-view of 5 x 5 x 5 cubic centimetres. Conclusions A dynamic small permanent magnet array can generate multiple highly homogeneous magnetic fields required in ultra-low field nuclear magnetic resonance (NMR) and magnetic resonance imaging (MRI) instruments. This design can significantly reduce the volume and energy requirements of traditional systems based on electromagnets, improving portability considerably. PMID:27271886

  7. Perceptions and experiences of allopathic health practitioners on collaboration with traditional health practitioners in post-apartheid South Africa

    PubMed Central

    Hendricks, Stephen J.; Mulaudzi, Mavis F.

    2016-01-01

    Background The indigenous health system was perceived to be a threat to the allopathic health system. It was associated with ‘witchcraft’, and actively discouraged, and repressed through prohibition laws. The introduction of the Traditional Health Practitioners Act No 22 of 2007 brought hope that those centuries of disrespect for traditional health systems would change. The study examined the perceptions and experiences of allopathic health practitioners on collaboration with traditional health practitioners in post-apartheid South Africa. Methods Qualitative descriptive research methodology was used to collect data from allopathic health practitioners employed by Limpopo’s Department of Health. In-depth focus group discussions and meetings were conducted between January and August 2014. Perceptions and experiences of working with traditional health practitioners were explored. Ethical clearance was obtained from the University of Pretoria and approval from the Department’s Research Committee. Results Dominant views were that the two health systems were not compatible with respect to the science involved and the source of knowledge. Overall, quality of health care will be compromised if traditional health practitioners are allowed to work in public health facilities. Conclusion Allopathic health practitioners do not appear ready to work with traditional health practitioners, citing challenges of quality of health care, differences regarding concept of sciences and source of knowledge; and lack of policy on collaboration. Lack of exposure to traditional medicine seems to impede opportunities to accept and work with traditional healers. Exposure and training at undergraduate level regarding the traditional health system is recommended. Policy guidelines on collaborations are urgently required. PMID:27380856

  8. Disruptive Technology: Saving Money and Inspiring Engagement in Professional Staff.

    PubMed

    McPherson, Penne; Talbot, Elizabeth

    Competent, efficient, and cost-effective delivery of professional development is a challenge in health care. Collaboration of teaching methodologies with academia and acute care offers fresh perspectives and delivery methods that can facilitate optimal outcomes. One multihospital system introduced the academic "flipped classroom" model to its acute care setting and integrated it into professional development requirements. The concept of the flipped classroom requires independent student engagement prior to classroom activities versus the traditional classroom lecture model. Results realized a cost savings in 2 years of $28,737 in addition to positive employee engagement.

  9. Utilisation of Pangolin (Manis sps) in traditional Yorubic medicine in Ijebu province, Ogun State, Nigeria

    PubMed Central

    2009-01-01

    Background Concern about the use of endangered and threatened species in traditional medicine has escalated as populations of many species plummeted because of poaching for the medicinal trade. Nigeria is known for a long and valued tradition of using wild animals and plants for medicinal purposes. Despite this, studies on medicinal animals are still scarce when compared to those focusing on medicinal plants. Utilisation of wild animals in traditional Yorubic medical practices has been indiscriminate, as it involves threatened species. By touting the medicinal properties of these species, traditional medicine fuels continuing demand, thereby subjecting such species to further threats. This paper examined the use and commercialisation of pangolins for traditional medicinal purposes amongst the Ijebus, South-western Nigeria, and the implications of this utilisation for the conservation of this species. Methods Traditional Yorubic medical practitioners (tymps) (16) and dealers in traditional medicinal ingredients (56) in public markets in Ijebu province, Nigeria, were interviewed using open-ended questionnaires. The dynamic stock movement of pangolins in the stalls of dealers was also monitored to determine the quantity of pangolins sold into traditional Yorubic medicinal practices. Specific conditions treated and the parts required were also documented. Results A total of 178 whole pangolin carcasses were sold into traditional medical practices. Over 55% of respondents had only primary education, over 90% were not aware of either the conservation status of this species or the existence of any legal machinery regulating its trade and utilisation, while 14% admitted to giving contracts to hunters to deliberately search for this animal when needed. More than 98% of respondents had no other means of livelihood. The trade was female dominated, while the healing practice had more males. Pangolins were used in various preparations to treat a total of 42 conditions. These include infertility, gastro-intestinal disorders, safe parturition, stomach ulcers, rheumatism and fibroid. Traditional Yorubic medicine also accommodated some situations that are outside the range of conventional medicine, such as boosting sales, conferring invisibility, removing bad luck, appeasing or warding off witches and evil forces, and money rituals. Some of these situations specifically require juvenile, or even pregnant female, animals. Conclusion Traditional Yorubic medical practices eat deep into the reproductive base of the species, presently listed in Appendix II of CITES and Schedule I of the Nigerian Decree 11 (1985), both of which recommend strict control of sales and utilisation of this species. Its numerous medicinal values, folk culture and the financial benefits of these activities are the main factors promoting the commercialisation and use of this species. Pharmacological studies on the various preparations are required to identify the bioactive compounds in them. There is a need for improved and urgent measures to conserve populations of this species in situ. Massive education and enlightenment are urgently needed for the populace to gain the necessary awareness and orientation about the conservation of this species.

  10. Fast super-resolution estimation of DOA and DOD in bistatic MIMO Radar with off-grid targets

    NASA Astrophysics Data System (ADS)

    Zhang, Dong; Zhang, Yongshun; Zheng, Guimei; Feng, Cunqian; Tang, Jun

    2018-05-01

    In this paper, we focus on the problem of joint DOA and DOD estimation in Bistatic MIMO Radar using sparse reconstruction method. In traditional ways, we usually convert the 2D parameter estimation problem into 1D parameter estimation problem by Kronecker product which will enlarge the scale of the parameter estimation problem and bring more computational burden. Furthermore, it requires that the targets must fall on the predefined grids. In this paper, a 2D-off-grid model is built which can solve the grid mismatch problem of 2D parameters estimation. Then in order to solve the joint 2D sparse reconstruction problem directly and efficiently, three kinds of fast joint sparse matrix reconstruction methods are proposed which are Joint-2D-OMP algorithm, Joint-2D-SL0 algorithm and Joint-2D-SOONE algorithm. Simulation results demonstrate that our methods not only can improve the 2D parameter estimation accuracy but also reduce the computational complexity compared with the traditional Kronecker Compressed Sensing method.
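
    For intuition about the greedy recovery step that Joint-2D-OMP generalizes, a plain one-dimensional orthogonal matching pursuit is sketched below; the dictionary and sparsity level are hypothetical, and the paper's joint matrix formulation is not reproduced here.

        import numpy as np

        def omp(A, y, k):
            # Greedily select k atoms of A that best explain y.
            residual, support = y.copy(), []
            x_s = np.zeros(0)
            for _ in range(k):
                idx = int(np.argmax(np.abs(A.T @ residual)))
                support.append(idx)
                x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
                residual = y - A[:, support] @ x_s
            x = np.zeros(A.shape[1])
            x[support] = x_s
            return x

        # Hypothetical 64 x 256 dictionary and a 3-sparse signal to recover.
        rng = np.random.default_rng(0)
        A = rng.standard_normal((64, 256))
        A /= np.linalg.norm(A, axis=0)
        x_true = np.zeros(256)
        x_true[[10, 40, 200]] = [1.0, -2.0, 0.5]
        x_hat = omp(A, A @ x_true, k=3)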

  11. Synthesis of silica aerogel monoliths with controlled specific surface areas and pore sizes

    NASA Astrophysics Data System (ADS)

    Gao, Bingying; Lu, Shaoxiang; Kalulu, Mulenga; Oderinde, Olayinka; Ren, Lili

    2017-07-01

    To replace traditional preparation methods of silica aerogels, the small molecule 1,2-epoxypropane (PO) was introduced into the preparation process instead of ammonia as the cross-linking agent, generating a lightweight, high-porosity, large-surface-area silica aerogel monolith. We put forward a simple solution route for the chemical synthesis of silica aerogels, which were characterized by scanning electron microscopy (SEM), TEM, XRD, FTIR, thermogravimetric analysis (TGA) and the Brunauer-Emmett-Teller (BET) method. In this paper, the effect of the amount of PO on the microstructure of silica aerogels is discussed. The BET surface areas and pore sizes of the resulting silica aerogels can be freely adjusted by changing the amount of PO, which will be helpful in promoting the development of silica aerogels to fabricate other porous materials with similar requirements. We also adopted a new organic solvent sublimation drying (OSSD) method to replace traditional expensive and dangerous drying methods such as critical point drying and freeze drying. This simple approach is easy to operate and has good repeatability, which will further facilitate the practical application of silica aerogels.

  12. On the Analytical Superiority of 1D NMR for Fingerprinting the Higher Order Structure of Protein Therapeutics Compared to Multidimensional NMR Methods.

    PubMed

    Poppe, Leszek; Jordan, John B; Rogers, Gary; Schnier, Paul D

    2015-06-02

    An important aspect in the analytical characterization of protein therapeutics is the comprehensive characterization of higher order structure (HOS). Nuclear magnetic resonance (NMR) is arguably the most sensitive method for fingerprinting HOS of a protein in solution. Traditionally, (1)H-(15)N or (1)H-(13)C correlation spectra are used as a "structural fingerprint" of HOS. Here, we demonstrate that protein fingerprint by line shape enhancement (PROFILE), a 1D (1)H NMR spectroscopy fingerprinting approach, is superior to traditional two-dimensional methods using monoclonal antibody samples and a heavily glycosylated protein therapeutic (Epoetin Alfa). PROFILE generates a high resolution structural fingerprint of a therapeutic protein in a fraction of the time required for a 2D NMR experiment. The cross-correlation analysis of PROFILE spectra allows one to distinguish contributions from HOS vs protein heterogeneity, which is difficult to accomplish by 2D NMR. We demonstrate that the major analytical limitation of two-dimensional methods is poor selectivity, which renders these approaches problematic for the purpose of fingerprinting large biological macromolecules.

  13. Wireless Zigbee strain gage sensor system for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Ide, Hiroshi; Abdi, Frank; Miraj, Rashid; Dang, Chau; Takahashi, Tatsuya; Sauer, Bruce

    2009-05-01

    A compact, cell-phone-sized radio frequency (ZigBee) wireless strain measurement sensor system was developed to measure structural strain deformation. The developed system provides an accurate strain measurement data stream to the Internet for further Diagnostic and Prognostic (DPS) correlation. Existing methods of structural measurement by strain sensors (gauges) do not fully address the problems posed by continuous structural health monitoring. The need for efficient health monitoring methods with real-time, bidirectional data flow from sensors to a commanding device is becoming critical for keeping our daily life safe. The use of full-field strain measurement techniques could reduce costly experimental programs through better understanding of material behavior. Wireless sensor-network technology is a monitoring method that is expected to grow rapidly, providing potential for cost savings over traditional wired sensors. Many of the currently available wireless monitoring methods have proactive, constant-data-rate streams rather than traditional reactive, event-driven data delivery, and mostly static node placement on structures with a limited number of nodes. Alpha STAR Electronics' wireless sensor network system, ASWN, addresses some of these deficiencies, making the system easier to operate. The ASWN strain measurement system utilizes off-the-shelf sensors, namely strain gauges, with an analog-to-digital converter/amplifier and ZigBee radio chips to keep costs low. Strain data is captured by the sensor, converted to digital form and delivered to the ZigBee radio chip, which in turn broadcasts the information using wireless protocols to a Personal Data Assistant (PDA) or laptop/desktop computer. From here, data is forwarded to remote computers for higher-level analysis and feedback using traditional cellular and satellite communication or the Ethernet infrastructure. This system offers compact size, lower cost, and temperature insensitivity for critical structural applications that require immediate monitoring and feedback.
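
    To make the sensor-side data path above concrete (bridge voltage, to digital strain value, to small radio payload), here is a minimal Python sketch of a quarter-bridge strain conversion and payload packing. The gauge factor, excitation voltage, ADC resolution and packet layout are generic illustrative assumptions, not the ASWN specification.

```python
import struct

GAUGE_FACTOR = 2.0       # typical foil strain gauge factor (assumed)
V_EXCITATION = 3.3       # bridge excitation voltage in volts (assumed)
ADC_FULL_SCALE = 4095    # 12-bit ADC (assumed)

def adc_to_strain(adc_counts, adc_counts_at_zero):
    """Convert raw ADC counts from a quarter-bridge circuit to strain (dimensionless)."""
    v_out = adc_counts / ADC_FULL_SCALE * V_EXCITATION
    v_zero = adc_counts_at_zero / ADC_FULL_SCALE * V_EXCITATION
    vr = (v_out - v_zero) / V_EXCITATION                   # normalized bridge imbalance
    return -4.0 * vr / (GAUGE_FACTOR * (1.0 + 2.0 * vr))   # standard quarter-bridge relation

def pack_sample(node_id, timestamp_ms, strain):
    """Pack one reading into a compact binary payload for the radio link (hypothetical layout)."""
    return struct.pack("<BIf", node_id, timestamp_ms, strain)

payload = pack_sample(7, 123456, adc_to_strain(2105, 2048))
print(len(payload), "bytes per sample")    # 9 bytes before any ZigBee framing
```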

  14. SU-F-T-504: Non-Divergent Planning Method for Craniospinal Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sperling, N; Bogue, J; Parsai, E

    2016-06-15

    Purpose: Traditional Craniospinal Irradiation (CSI) planning techniques require careful field placement to allow optimal divergence and field overlap at depth, and measurement of skin gap. The result of this is a necessary field overlap resulting in dose heterogeneity in the spinal canal. A novel, non-divergent field matching method has been developed to allow simple treatment planning and delivery without the need to measure skin gap. Methods: The CSI patient was simulated in the prone position, and a plan was developed. Bilateral cranial fields were designed with couch and collimator rotation to eliminate divergence with the upper spine field and minimize anterior divergence into the lenses. Spinal posterior-to-anterior fields were designed with the couch rotated to 90 degrees to allow gantry rotation to eliminate divergence at the match line, and the collimator rotated to 90 degrees to allow appropriate field blocking with the MLCs. A match line for the two spinal fields was placed and the gantry rotated to equal angles in opposite directions about the match line. Jaw positions were then defined to allow 1 mm overlap at the match line to avoid cold spots. A traditional CSI plan was generated using diverging spinal fields, and a comparison between the two techniques was generated. Results: The non-divergent treatment plan was able to deliver a highly uniform dose to the spinal cord with a cold spot of only 95% and a maximum point dose of 115.8%, as compared to traditional plan cold spots of 87% and hot spots of 132% of the prescription dose. Conclusion: A non-divergent method for planning CSI patients has been developed and clinically implemented. Planning requires some geometric manipulation in order to achieve an adequate dose distribution; however, it can help to manage cold spots and simplify the shifts needed between spinal fields.

  15. Relationships among video gaming proficiency and spatial orientation, laparoscopic, and traditional surgical skills of third-year veterinary students.

    PubMed

    Millard, Heather A Towle; Millard, Ralph P; Constable, Peter D; Freeman, Lyn J

    2014-02-01

    To determine the relationships among traditional and laparoscopic surgical skills, spatial analysis skills, and video gaming proficiency of third-year veterinary students. Prospective, randomized, controlled study. A convenience sample of 29 third-year veterinary students. The students had completed basic surgical skills training with inanimate objects but had no experience with soft tissue, orthopedic, or laparoscopic surgery; the spatial analysis test; or the video games that were used in the study. Scores for traditional surgical, laparoscopic, spatial analysis, and video gaming skills were determined, and associations among these were analyzed by means of Spearman's rank order correlation coefficient (rs). A significant positive association (rs = 0.40) was detected between summary scores for video game performance and laparoscopic skills, but not between video game performance and traditional surgical skills scores. Spatial analysis scores were positively (rs = 0.30) associated with video game performance scores; however, that result was not significant. Spatial analysis scores were not significantly associated with laparoscopic surgical skills scores. Traditional surgical skills scores were not significantly associated with laparoscopic skills or spatial analysis scores. Results of this study indicated video game performance of third-year veterinary students was predictive of laparoscopic but not traditional surgical skills, suggesting that laparoscopic performance may be improved with video gaming experience. Additional studies would be required to identify methods for improvement of traditional surgical skills.

  16. emcee: The MCMC Hammer

    NASA Astrophysics Data System (ADS)

    Foreman-Mackey, Daniel; Hogg, David W.; Lang, Dustin; Goodman, Jonathan

    2013-03-01

    We introduce a stable, well tested Python implementation of the affine-invariant ensemble sampler for Markov chain Monte Carlo (MCMC) proposed by Goodman & Weare (2010). The code is open source and has already been used in several published projects in the astrophysics literature. The algorithm behind emcee has several advantages over traditional MCMC sampling methods and it has excellent performance as measured by the autocorrelation time (or function calls per independent sample). One major advantage of the algorithm is that it requires hand-tuning of only 1 or 2 parameters compared to ~N² for a traditional algorithm in an N-dimensional parameter space. In this document, we describe the algorithm and the details of our implementation. Exploiting the parallelism of the ensemble method, emcee permits any user to take advantage of multiple CPU cores without extra effort. The code is available online at http://dan.iel.fm/emcee under the GNU General Public License v2.
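
    For readers who have not used the package, a minimal usage sketch follows; the 5-dimensional Gaussian target and the walker/step counts are arbitrary toy choices, and the calls shown are the documented emcee 3 interface.

```python
import numpy as np
import emcee

def log_prob(theta):
    """Log-probability of an isotropic 5-D Gaussian (toy target density)."""
    return -0.5 * np.sum(theta ** 2)

ndim, nwalkers, nsteps = 5, 32, 5000
p0 = 1e-3 * np.random.randn(nwalkers, ndim)      # small ball of initial walker positions

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, nsteps)

# Discard burn-in; the autocorrelation time is the performance metric the abstract quotes.
samples = sampler.get_chain(discard=500, flat=True)
print(samples.shape, sampler.get_autocorr_time(quiet=True))
```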

  17. Innovative Perspectives of Integrated Chinese Medicine on H. pylori.

    PubMed

    Ye, Hui; Shi, Zong-Ming; Chen, Yao; Yu, Jing; Zhang, Xue-Zhi

    2018-06-08

    Helicobacter pylori (H. pylori) treatment requires the development of more effective therapies, mainly owing to the challenges posed by the bacterial resistance to antibiotics. In China, critically high infection and antibiotic resistance rates have limited the application of classic H. pylori eradication therapies. Consequently, researchers are attempting to find new solutions by drawing from traditional medicine. This article reviews basic scientific and clinical progress in the use of integrated Chinese and Western medicine (IM) to treat H. pylori; describes the conflicting results between in vivo and in vitro studies in this regard; discusses the observed clinical effects of IM, with emphasis on traditional patent medicines; and proposes a role for IM in both the diagnosis and treatment of H. pylori, including the use of tongue manifestation as an early diagnostic method and capitalizing on IM's direct and indirect methods for enhancing antibiotic effect.

  18. Complementary approaches to diagnosing marine diseases: a union of the modern and the classic

    PubMed Central

    Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman; House, Marcia; Mydlarz, Laura D.; Prager, Katherine C.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca

    2016-01-01

    Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease. PMID:26880839

  19. Research on Rigid Body Motion Tracing in Space based on NX MCD

    NASA Astrophysics Data System (ADS)

    Wang, Junjie; Dai, Chunxiang; Shi, Karen; Qin, Rongkang

    2018-03-01

    In MCD (Mechatronics Concept Designer), a module of the SIEMENS industrial design software UG (Unigraphics NX), the user can define rigid bodies and kinematic joints to make objects move according to an existing plan in simulation. At this stage, the user may wish to see, intuitively, the path traced by selected points on the moving object. In response to this requirement, this paper computes the pose from the transformation matrix available from the solver engine and then fits the sampled points with a B-spline curve. Meanwhile, taking the actual constraints on the rigid bodies into account, the traditional equal-interval sampling strategy was optimized. The results show that this method satisfies the demand and makes up for the deficiencies of the traditional sampling method. The user can still edit and model on this 3D curve. The expected result has been achieved.
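
    The fit-the-sampled-points-with-a-B-spline step can be illustrated outside NX/MCD with SciPy; the trajectory below is synthetic and the smoothing settings are assumptions, so this is only a sketch of the curve-fitting idea, not the paper's implementation.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Synthetic positions of one tracked point on a moving rigid body, sampled over time.
t = np.linspace(0.0, 2.0 * np.pi, 40)
x, y, z = np.cos(t), np.sin(t), 0.1 * t

# Fit a cubic B-spline through the sampled points (s=0 interpolates them exactly).
tck, u = splprep([x, y, z], s=0.0, k=3)

# Evaluate the fitted curve on a finer parameter grid to obtain a smooth 3D path.
u_fine = np.linspace(0.0, 1.0, 400)
x_f, y_f, z_f = splev(u_fine, tck)
print(len(x_f), "points on the reconstructed trajectory")
```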

  20. Complementary approaches to diagnosing marine diseases: a union of the modern and the classic

    USGS Publications Warehouse

    Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman G.; House, Marcia; Lafferty, Kevin D.; Mydlarz, Laura D.; Prager, Katherine C.; Sutherland, Kathryn P.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca

    2016-01-01

    Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease.

  1. Steering optical comb frequencies by rotating the polarization state

    NASA Astrophysics Data System (ADS)

    Zhang, Yanyan; Zhang, Xiaofei; Yan, Lulu; Zhang, Pan; Rao, Bingjie; Han, Wei; Guo, Wenge; Zhang, Shougang; Jiang, Haifeng

    2017-12-01

    Optical frequency combs, with precise control of repetition rate and carrier-envelope-offset frequency, have revolutionized many fields, such as fine optical spectroscopy, optical frequency standards, ultra-fast science research, ultra-stable microwave generation and precise ranging measurement. However, existing high-bandwidth frequency control methods have a small dynamic range, requiring complex hybrid control techniques. To overcome this limitation, we develop a new approach in which a home-made intra-cavity electro-optic modulator tunes the polarization state of the laser signal, rather than only the optical length of the cavity, to steer the frequencies of a nonlinear-polarization-rotation mode-locked laser. By taking advantage of the birefringence of the whole cavity, this approach results in frequency control that is not only broadband but also has a relatively large dynamic range. Experimental results show that the frequency control dynamic range increases by at least one order of magnitude in comparison with the traditional intra-cavity electro-optic modulator technique. In addition, this technique exhibits fewer side effects than traditional frequency control methods.

  2. Rapid analysis of scattering from periodic dielectric structures using accelerated Cartesian expansions

    DOE PAGES

    Baczewski, Andrew David; Miller, Nicholas C.; Shanker, Balasubramaniam

    2012-03-22

    Here, the analysis of fields in periodic dielectric structures arises in numerous applications of recent interest, ranging from photonic bandgap structures and plasmonically active nanostructures to metamaterials. To achieve an accurate representation of the fields in these structures using numerical methods, dense spatial discretization is required. This, in turn, affects the cost of analysis, particularly for integral-equation-based methods, for which traditional iterative methods require O(N²) operations, N being the number of spatial degrees of freedom. In this paper, we introduce a method for the rapid solution of volumetric electric field integral equations used in the analysis of doubly periodic dielectric structures. The crux of our method is the accelerated Cartesian expansion algorithm, which is used to evaluate the requisite potentials in O(N) cost. Results are provided that corroborate our claims of acceleration without compromising accuracy, as well as the application of our method to a number of compelling photonics applications.

  3. Effects of expected-value information and display format on recognition of aircraft subsystem abnormalities

    NASA Technical Reports Server (NTRS)

    Palmer, Michael T.; Abbott, Kathy H.

    1994-01-01

    This study identifies improved methods to present system parameter information for detecting abnormal conditions and to identify system status. Two workstation experiments were conducted. The first experiment determined if including expected-value-range information in traditional parameter display formats affected subject performance. The second experiment determined if using a nontraditional parameter display format, which presented relative deviation from expected value, was better than traditional formats with expected-value ranges included. The inclusion of expected-value-range information onto traditional parameter formats was found to have essentially no effect. However, subjective results indicated support for including this information. The nontraditional column deviation parameter display format resulted in significantly fewer errors compared with traditional formats with expected-value-ranges included. In addition, error rates for the column deviation parameter display format remained stable as the scenario complexity increased, whereas error rates for the traditional parameter display formats with expected-value ranges increased. Subjective results also indicated that the subjects preferred this new format and thought that their performance was better with it. The column deviation parameter display format is recommended for display applications that require rapid recognition of out-of-tolerance conditions, especially for a large number of parameters.

  4. [Ancient methods of animal disease prevention in Belgium].

    PubMed

    Mammerickx, M

    1994-06-01

    The author describes traditional methods of animal disease control in Belgium and the evolution of these methods up to the present time. Evidence is drawn mainly from Belgian law. The principles of hygienic prophylaxis, which have required little modification over the passage of time, were set out at the beginning of the 18th century by Lancisi and Bates, physicians to Pope Clement XI and King George I of Great Britain, respectively. These principles were immediately incorporated into Belgian law. However, it was not until the second half of the 19th century that they were applied correctly and with success.

  5. Photobiomolecular deposition of metallic particles and films

    DOEpatents

    Hu, Zhong-Cheng

    2005-02-08

    The method of the invention is based on the unique electron-carrying function of a photocatalytic unit such as the photosynthesis system I (PSI) reaction center of the protein-chlorophyll complex isolated from chloroplasts. The method employs a photo-biomolecular metal deposition technique for precisely controlled nucleation and growth of metallic clusters/particles, e.g., platinum, palladium, and their alloys, etc., as well as for thin-film formation above the surface of a solid substrate. The photochemically mediated technique offers numerous advantages over traditional deposition methods including quantitative atom deposition control, high energy efficiency, and mild operating condition requirements.

  6. Photobiomolecular metallic particles and films

    DOEpatents

    Hu, Zhong-Cheng

    2003-05-06

    The method of the invention is based on the unique electron-carrying function of a photocatalytic unit such as the photosynthesis system I (PSI) reaction center of the protein-chlorophyll complex isolated from chloroplasts. The method employs a photo-biomolecular metal deposition technique for precisely controlled nucleation and growth of metallic clusters/particles, e.g., platinum, palladium, and their alloys, etc., as well as for thin-film formation above the surface of a solid substrate. The photochemically mediated technique offers numerous advantages over traditional deposition methods including quantitative atom deposition control, high energy efficiency, and mild operating condition requirements.

  7. Improved Linear Algebra Methods for Redshift Computation from Limited Spectrum Data - II

    NASA Technical Reports Server (NTRS)

    Foster, Leslie; Waagen, Alex; Aijaz, Nabella; Hurley, Michael; Luis, Apolo; Rinsky, Joel; Satyavolu, Chandrika; Gazis, Paul; Srivastava, Ashok; Way, Michael

    2008-01-01

    Given photometric broadband measurements of a galaxy, Gaussian processes may be used with a training set to solve the regression problem of approximating the redshift of this galaxy. However, in practice solving the traditional Gaussian processes equation is too slow and requires too much memory. We employed several methods to avoid this difficulty using algebraic manipulation and low-rank approximation, and were able to quickly approximate the redshifts in our testing data within 17 percent of the known true values using limited computational resources. The accuracy of one method, the V Formulation, is comparable to the accuracy of the best methods currently used for this problem.
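
    The low-rank idea can be illustrated with a standard Nyström / subset-of-regressors approximation to Gaussian process regression. This is a generic sketch with an RBF kernel and invented array sizes; it is not the paper's "V Formulation", only the kind of algebraic shortcut the abstract alludes to.

```python
import numpy as np

def rbf(A, B, length=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 5))                              # training photometry (toy data)
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=5000)    # toy "redshift" targets
X_star = rng.normal(size=(10, 5))                           # test galaxies
noise = 0.1 ** 2

# Subset of regressors: use m << n inducing points instead of the full n x n kernel matrix.
m = 200
X_m = X[rng.choice(len(X), size=m, replace=False)]
K_nm = rbf(X, X_m)                                          # n x m
K_mm = rbf(X_m, X_m)                                        # m x m
K_sm = rbf(X_star, X_m)                                     # n_test x m

# Predictive mean K_sm (K_mn K_nm + noise K_mm)^{-1} K_mn y: O(n m^2) work instead of O(n^3).
A = K_nm.T @ K_nm + noise * K_mm
mean = K_sm @ np.linalg.solve(A, K_nm.T @ y)
print(mean[:3])
```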

  8. Quantum lithography beyond the diffraction limit via Rabi-oscillations

    NASA Astrophysics Data System (ADS)

    Liao, Zeyang; Al-Amri, Mohammad; Zubairy, M. Suhail

    2011-03-01

    We propose a quantum optical method to perform sub-wavelength lithography. Our method is similar to traditional lithography but adds a critical step before dissociating the chemical bonds of the photoresist. The subwavelength pattern is achieved by inducing multiple Rabi oscillations between the two atomic levels. The proposed method requires neither multiphoton absorption nor entangled photons. This method is expected to be realizable using current technology. This work is supported by a grant from the Qatar National Research Fund (QNRF) under the NPRP project and a grant from the King Abdulaziz City for Science and Technology (KACST).

  9. Running and Metabolic Demands of Elite Rugby Union Assessed Using Traditional, Metabolic Power, and Heart Rate Monitoring Methods

    PubMed Central

    Dubois, Romain; Paillard, Thierry; Lyons, Mark; McGrath, David; Maurelli, Olivier; Prioux, Jacques

    2017-01-01

    The aims of this study were (1) to analyze elite rugby union game demands using 3 different approaches: traditional, metabolic and heart rate-based methods, (2) to explore the relationship between these methods and (3) to explore positional differences between the backs and forwards players. Time motion analysis and game demands of fourteen professional players (24.1 ± 3.4 y), over 5 European Challenge Cup games, were analyzed. Thresholds of 14.4 km·h-1, 20 W·kg-1 and 85% of maximal heart rate (HRmax) were set for high-intensity efforts across the three methods. The mean % of HRmax was 80.6 ± 4.3%, while 42.2 ± 16.5% of game time was spent above 85% of HRmax, with no significant differences between the forwards and the backs. Our findings also show that the backs cover greater distances at high speed than the forwards (% difference: +35.2 ± 6.6%; p<0.01), while the forwards cover more distance than the backs (+26.8 ± 5.7%; p<0.05) in the moderate-speed zone (10-14.4 km·h-1). However, no significant difference in high-metabolic-power distance was found between the backs and forwards. Indeed, the high-metabolic-power distances were greater than the high-speed running distances by 24.8 ± 17.1% for the backs and 53.4 ± 16.0% for the forwards, with a significant difference (+29.6 ± 6.0% for the forwards; p<0.001) between the two groups. Nevertheless, nearly perfect correlations were found between the total distance assessed using the traditional approach and the metabolic power approach (r = 0.98). Furthermore, there is a strong association (r = 0.93) between the high-speed running distance (assessed using the traditional approach) and the high-metabolic-power distance. The HR monitoring methods demonstrate clearly the high physiological demands of professional rugby games. The traditional and metabolic power approaches show a close correlation in their relative values; nevertheless, the difference in absolute values, especially for the high-intensity thresholds, demonstrates that the metabolic power approach may represent an interesting alternative to the traditional approaches used in evaluating the high-intensity running efforts required in rugby union games. Key points: Elite/professional rugby union players; Heart rate monitoring during official games; Metabolic power approach. PMID:28344455

  10. Running and Metabolic Demands of Elite Rugby Union Assessed Using Traditional, Metabolic Power, and Heart Rate Monitoring Methods.

    PubMed

    Dubois, Romain; Paillard, Thierry; Lyons, Mark; McGrath, David; Maurelli, Olivier; Prioux, Jacques

    2017-03-01

    The aims of this study were (1) to analyze elite rugby union game demands using 3 different approaches: traditional, metabolic and heart rate-based methods, (2) to explore the relationship between these methods and (3) to explore positional differences between the backs and forwards players. Time motion analysis and game demands of fourteen professional players (24.1 ± 3.4 y), over 5 European Challenge Cup games, were analyzed. Thresholds of 14.4 km·h-1, 20 W·kg-1 and 85% of maximal heart rate (HRmax) were set for high-intensity efforts across the three methods. The mean % of HRmax was 80.6 ± 4.3%, while 42.2 ± 16.5% of game time was spent above 85% of HRmax, with no significant differences between the forwards and the backs. Our findings also show that the backs cover greater distances at high speed than the forwards (% difference: +35.2 ± 6.6%; p<0.01), while the forwards cover more distance than the backs (+26.8 ± 5.7%; p<0.05) in the moderate-speed zone (10-14.4 km·h-1). However, no significant difference in high-metabolic-power distance was found between the backs and forwards. Indeed, the high-metabolic-power distances were greater than the high-speed running distances by 24.8 ± 17.1% for the backs and 53.4 ± 16.0% for the forwards, with a significant difference (+29.6 ± 6.0% for the forwards; p<0.001) between the two groups. Nevertheless, nearly perfect correlations were found between the total distance assessed using the traditional approach and the metabolic power approach (r = 0.98). Furthermore, there is a strong association (r = 0.93) between the high-speed running distance (assessed using the traditional approach) and the high-metabolic-power distance. The HR monitoring methods demonstrate clearly the high physiological demands of professional rugby games. The traditional and metabolic power approaches show a close correlation in their relative values; nevertheless, the difference in absolute values, especially for the high-intensity thresholds, demonstrates that the metabolic power approach may represent an interesting alternative to the traditional approaches used in evaluating the high-intensity running efforts required in rugby union games.

  11. Route to one-step microstructure mold fabrication for PDMS microfluidic chip

    NASA Astrophysics Data System (ADS)

    Lv, Xiaoqing; Geng, Zhaoxin; Fan, Zhiyuan; Wang, Shicai; Su, Yue; Fang, Weihao; Pei, Weihua; Chen, Hongda

    2018-04-01

    The fabrication of microstructure molds for PDMS microfluidic chips remains a complex and time-consuming process requiring special equipment and protocols: photolithography and etching. Thus, a rapid and cost-effective method is highly needed. Compared with the traditional microfluidic chip fabrication process based on micro-electromechanical systems (MEMS), this method is simple and easy to implement, and the whole fabrication process requires only 1-2 h. Microstructures of different sizes, from 100 to 1000 μm, were fabricated and used to culture four breast cancer cell lines. Cell viability and morphology were assessed when the cells were cultured in the micro straight channels, micro square holes and the bonded PDMS-glass microfluidic chip. The experimental results indicate that the microfluidic chip performs well and meets the experimental requirements. This method can greatly reduce the processing time and cost of the microfluidic chip, and provides a simple and effective route for structure design in the field of biological microfabrication and microfluidic chips.

  12. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    DOE PAGES

    Haefner, A.; Gunter, D.; Plimley, B.; ...

    2014-11-03

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge coupled device. While this method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to obtain energy and direction in gas-based systems that suffer from limited efficiency.

  13. Alternative Attitude Commanding and Control for Precise Spacecraft Landing

    NASA Technical Reports Server (NTRS)

    Singh, Gurkirpal

    2004-01-01

    A report proposes an alternative method of control for precision landing on a remote planet. In the traditional method, the attitude of a spacecraft is required to track a commanded translational acceleration vector, which is generated at each time step by solving a two-point boundary value problem. No requirement of continuity is imposed on the acceleration. The translational acceleration does not necessarily vary smoothly. Tracking of a non-smooth acceleration causes the vehicle attitude to exhibit undesirable transients and poor pointing stability behavior. In the alternative method, the two-point boundary value problem is not solved at each time step. A smooth reference position profile is computed. The profile is recomputed only when the control errors get sufficiently large. The nominal attitude is still required to track the smooth reference acceleration command. A steering logic is proposed that controls the position and velocity errors about the reference profile by perturbing the attitude slightly about the nominal attitude. The overall pointing behavior is therefore smooth, greatly reducing the degree of pointing instability.

  14. A New Calibration Method for Commercial RGB-D Sensors

    PubMed Central

    Darwish, Walid; Tang, Shenjun; Li, Wenbin; Chen, Wu

    2017-01-01

    Commercial RGB-D sensors such as Kinect and Structure Sensors have been widely used in the game industry, where geometric fidelity is not of utmost importance. For applications in which high quality 3D is required, i.e., 3D building models of centimeter-level accuracy, accurate and reliable calibrations of these sensors are required. This paper presents a new model for calibrating the depth measurements of RGB-D sensors based on the structured light concept. Additionally, a new automatic method is proposed for the calibration of all RGB-D parameters, including internal calibration parameters for all cameras, the baseline between the infrared and RGB cameras, and the depth error model. When compared with traditional calibration methods, this new model shows a significant improvement in depth precision for both near and far ranges. PMID:28538695

  15. Allopathic and traditional health practitioners: A reply to Nemutandani, Hendricks and Mulaudzi

    PubMed Central

    2017-01-01

    An earlier paper in this journal reported on the perception and experience of 77 allopathic health practitioners (AHPs) and health managers about working together with South African traditional health practitioners (THPs). The paper stated that the abolishment of the Witchcraft Suppression Act of 1957 and the introduction of the Traditional Health Practitioners Act No. 22 of 2007 is a milestone in the development of traditional health knowledge, and for the eventual incorporation thereof into modern health care practices. The authors also comment that a decolonisation of mindset and a change of attitude is required to change one’s perception of traditional healer practices and to develop them parallel to allopathic health practice. This opinion paper is a response to the paper, to negate its claims about the Witchcraft Suppression Act of 1957 and to provide clarity on the Traditional Health Practitioners Act No. 22 of 2007 and related policies and regulations. Although this Act recognises THP, the Act and other regulations actually require THP to conform to practices analogous to those of AHP. It is rather a systematic and scientific ‘mindset’ that is required to develop THP parallel to AHP. The Traditional Health Practitioners Act of 2007 and the Draft Policy on African Traditional Medicine (TM) for South Africa dictate that a substantial THP sectoral transformation is required before there can be a parallel system. Legislation and regulations have excluded THP and African TM from operating (present and future) in the same space as AHP. PMID:28470077

  16. Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions

    PubMed Central

    Barrett, Harrison H.; Dainty, Christopher; Lara, David

    2008-01-01

    Maximum-likelihood (ML) estimation in wavefront sensing requires careful attention to all noise sources and all factors that influence the sensor data. We present detailed probability density functions for the output of the image detector in a wavefront sensor, conditional not only on wavefront parameters but also on various nuisance parameters. Practical ways of dealing with nuisance parameters are described, and final expressions for likelihoods and Fisher information matrices are derived. The theory is illustrated by discussing Shack–Hartmann sensors, and computational requirements are discussed. Simulation results show that ML estimation can significantly increase the dynamic range of a Shack–Hartmann sensor with four detectors and that it can reduce the residual wavefront error when compared with traditional methods. PMID:17206255
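
    As a reminder of the generic ingredients named above (written here for pure Poisson photon noise, a simplification of the detector models treated in the paper): if g_k(θ) is the mean output of detector pixel k given wavefront parameters θ and d_k is the measured count, the log-likelihood to be maximized and the associated Fisher information matrix take the standard forms

```latex
\ln L(\boldsymbol{\theta}\mid\mathbf{d})
  = \sum_{k}\Bigl[d_k\,\ln g_k(\boldsymbol{\theta}) - g_k(\boldsymbol{\theta}) - \ln d_k!\Bigr],
\qquad
\hat{\boldsymbol{\theta}}_{\mathrm{ML}} = \arg\max_{\boldsymbol{\theta}} \ln L(\boldsymbol{\theta}\mid\mathbf{d}),

F_{ij}(\boldsymbol{\theta})
  = \sum_{k}\frac{1}{g_k(\boldsymbol{\theta})}\,
    \frac{\partial g_k(\boldsymbol{\theta})}{\partial\theta_i}\,
    \frac{\partial g_k(\boldsymbol{\theta})}{\partial\theta_j},
```

    with the inverse of F giving the Cramér-Rao lower bound on the covariance of any unbiased estimator; nuisance parameters such as gains and background enter by extending θ, which is the complication the paper addresses in detail.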

  17. A comparison of traditional and engaging lecture methods in a large, professional-level course.

    PubMed

    Miller, Cynthia J; McNear, Jacquee; Metz, Michael J

    2013-12-01

    In engaging lectures, also referred to as broken or interactive lectures, students are given short periods of lecture followed by "breaks" that can consist of 1-min papers, problem sets, brainstorming sessions, or open discussion. While many studies have shown positive effects when engaging lectures are used in undergraduate settings, the literature surrounding use of the learning technique for professional students is inconclusive. The novelty of this study design allowed a direct comparison of engaging physiology lectures versus didactic lecture formats in the same cohort of 120 first-year School of Dentistry DMD students. All students were taught five physiological systems using traditional lecture methods and six physiological systems using engaging lecture methods. The use of engaging lectures led to a statistically significant higher average on unit exams compared with traditional didactic lectures (8.6% higher, P < 0.05). Furthermore, students demonstrated an improved long-term retention of information via higher scores on the comprehensive final exam (22.9% higher in engaging lecture sections, P < 0.05). Many qualitative improvements were also indicated via student surveys and evaluations, including an increased perceived effectiveness of lectures, decrease in distractions during lecture, and increased confidence with the material. The development of engaging lecture activities requires a significant amount of instructor preparation and limits the time available to provide traditional lectures. However, the positive results of this study suggest the need for a restructuring of the physiology curriculum to incorporate more engaging lectures to improve both the qualitative experiences and performance levels of professional students.

  18. Minimal residual method provides optimal regularization parameter for diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Jagannath, Ravi Prasad K.; Yalavarthy, Phaneendra K.

    2012-10-01

    The inverse problem in the diffuse optical tomography is known to be nonlinear, ill-posed, and sometimes under-determined, requiring regularization to obtain meaningful results, with Tikhonov-type regularization being the most popular one. The choice of this regularization parameter dictates the reconstructed optical image quality and is typically chosen empirically or based on prior experience. An automated method for optimal selection of regularization parameter that is based on regularized minimal residual method (MRM) is proposed and is compared with the traditional generalized cross-validation method. The results obtained using numerical and gelatin phantom data indicate that the MRM-based method is capable of providing the optimal regularization parameter.

  19. Minimal residual method provides optimal regularization parameter for diffuse optical tomography.

    PubMed

    Jagannath, Ravi Prasad K; Yalavarthy, Phaneendra K

    2012-10-01

    The inverse problem in the diffuse optical tomography is known to be nonlinear, ill-posed, and sometimes under-determined, requiring regularization to obtain meaningful results, with Tikhonov-type regularization being the most popular one. The choice of this regularization parameter dictates the reconstructed optical image quality and is typically chosen empirically or based on prior experience. An automated method for optimal selection of regularization parameter that is based on regularized minimal residual method (MRM) is proposed and is compared with the traditional generalized cross-validation method. The results obtained using numerical and gelatin phantom data indicate that the MRM-based method is capable of providing the optimal regularization parameter.
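
    For context on the regularization parameter these two records discuss, a minimal linearized sketch follows: a Tikhonov-regularized solve and a naive scan over candidate λ values recording the data-fit residual. The Jacobian and data are random stand-ins, not a diffuse optical tomography forward model, and the sweep shown is only a baseline, not the paper's regularized MRM criterion.

```python
import numpy as np

rng = np.random.default_rng(1)
J = rng.normal(size=(120, 400))                   # ill-posed sensitivity (Jacobian) matrix
x_true = np.zeros(400); x_true[180:200] = 1.0     # synthetic absorber update
y = J @ x_true + 0.01 * rng.normal(size=120)      # noisy measurements

def tikhonov(J, y, lam):
    """Solve min ||J x - y||^2 + lam ||x||^2 via the regularized normal equations."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ y)

# Scan candidate regularization parameters and record the data-fit residual for each.
lams = np.logspace(-6, 2, 25)
residuals = [np.linalg.norm(J @ tikhonov(J, y, lam) - y) for lam in lams]
for lam, r in list(zip(lams, residuals))[::6]:
    print(f"lambda={lam:9.2e}   residual={r:.4f}")
```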

  20. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    PubMed Central

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-01-01

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis. PMID:28029121

  1. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    PubMed

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
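
    A minimal numerical sketch of the baseline idea in these two records: baseline lengths are computed between feature points within each scan's own coordinate frame and then compared across epochs, so no registration between the scans is needed. The point names and coordinates below are invented for illustration.

```python
import numpy as np
from itertools import combinations

# Feature points (e.g. brick centres or targets) extracted from each scan, each in the
# scan's own local coordinate frame -- note the two frames need not be aligned.
epoch1 = {"T1": (0.00, 0.00, 0.00), "T2": (4.00, 0.00, 0.00), "T3": (4.00, 3.00, 0.00)}
epoch2 = {"T1": (10.0, 5.00, 1.00), "T2": (14.0, 5.00, 1.00), "T3": (14.0, 7.97, 1.00)}

def baselines(points):
    """Length of every baseline (pairwise distance) within one scan."""
    return {tuple(sorted(p)): np.linalg.norm(np.subtract(points[p[0]], points[p[1]]))
            for p in combinations(points, 2)}

b1, b2 = baselines(epoch1), baselines(epoch2)
for pair in b1:
    change = b2[pair] - b1[pair]
    print(f"{pair}: {b1[pair]:.3f} m -> {b2[pair]:.3f} m   change = {change * 1000:+.1f} mm")
```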

  2. Generalized query-based active learning to identify differentially methylated regions in DNA.

    PubMed

    Haque, Md Muksitul; Holder, Lawrence B; Skinner, Michael K; Cook, Diane J

    2013-01-01

    Active learning is a supervised learning technique that reduces the number of examples required for building a successful classifier, because it can choose the data it learns from. This technique holds promise for many biological domains in which classified examples are expensive and time-consuming to obtain. Most traditional active learning methods ask very specific queries of the Oracle (e.g., a human expert) to label an unlabeled example. The example may consist of numerous features, many of which are irrelevant. Removing such features creates a shorter query with only relevant features, which is easier for the Oracle to answer. We propose a generalized query-based active learning (GQAL) approach that constructs generalized queries based on multiple instances. By constructing appropriately generalized queries, we can achieve higher accuracy compared to traditional active learning methods. We apply our active learning method to find differentially methylated DNA regions (DMRs). DMRs are DNA locations in the genome that are known to be involved in tissue differentiation, epigenetic regulation, and disease. We also apply our method on 13 other data sets and show that it outperforms another popular active learning technique.
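
    For contrast with the generalized-query approach described above, the sketch below shows the traditional pool-based loop it improves on: plain uncertainty sampling with a scikit-learn classifier on synthetic data. Nothing here is the authors' GQAL code; the dataset, model and query budget are arbitrary assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=30, n_informative=8, random_state=0)
rng = np.random.default_rng(0)

labeled = list(rng.choice(len(X), 20, replace=False))        # small initial labeled set
pool = [i for i in range(len(X)) if i not in labeled]

clf = LogisticRegression(max_iter=1000)
for _ in range(50):                                           # 50 queries to the oracle
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])
    # Query the pool example the model is least certain about (probability closest to 0.5).
    query = pool[int(np.argmin(np.abs(proba[:, 1] - 0.5)))]
    labeled.append(query)                                     # the oracle supplies y[query]
    pool.remove(query)

print("accuracy on the remaining pool:", clf.score(X[pool], y[pool]))
```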

  3. PCR-mediated site-directed mutagenesis.

    PubMed

    Carey, Michael F; Peterson, Craig L; Smale, Stephen T

    2013-08-01

    Unlike traditional site-directed mutagenesis, this protocol requires only a single PCR step using full plasmid amplification to generate point mutants. The method can introduce small mutations into promoter sites and is even better suited for introducing single or double mutations into proteins. It is elegant in its simplicity and can be applied quite easily in any laboratory using standard protein expression vectors and commercially available reagents.

  4. Is a Team-based Learning Approach to Anatomy Teaching Superior to Didactic Lecturing?

    PubMed

    Ghorbani, Naghme; Karbalay-Doust, Saied; Noorafshan, Ali

    2014-02-01

    Team-based learning (TBL) is used in the medical field to implement interactive learning in small groups. The learning of anatomy and its subsequent application requires the students to recall a great deal of factual content. The aims of this study were to evaluate the students' satisfaction, engagement and knowledge gain in anatomy through the medium of TBL in comparison to the traditional lecture method. This study, carried out from February to June 2012, included 30 physical therapy students of the Shiraz University of Medical Science, School of Rehabilitation Sciences. Classic TBL techniques were modified to cover lower limb anatomy topics in the first year of the physical therapy curriculum. Anatomy lectures were replaced with TBL, which required the preparation of assigned content, specific discussion topics, an individual self-assessment test (IRAT) and the analysis of discussion topics. The teams then subsequently retook the assessment test as a group (GRAT). The first eight weeks of the curriculum were taught using traditional didactic lecturing, while during the second eight weeks the modified TBL method was used. The students evaluated these sessions through a questionnaire. The impact of TBL on student engagement and educational achievement was determined using numerical data, including the IRAT, GRAT and final examination scores. Students had a higher satisfaction rate with the TBL teaching according to the Likert scale. Additionally, higher scores were obtained in the TBL-based final examination in comparison to the lecture-based midterm exam. The students' responses showed that the TBL technique could be used alone or in conjunction with traditional didactic lecturing in order to teach anatomy more effectively.

  5. Using Facebook ads with traditional paper mailings to recruit adolescent girls for a clinical trial

    PubMed Central

    Schwinn, Traci; Hopkins, Jessica; Schinke, Steven P; Liu, Xiang

    2016-01-01

    Introduction Clinical trials require sufficient samples recruited within limited time and budget constraints. Trials with minors are additionally burdened by the requirement for youth assent and parental permission. This paper details the use of Facebook ads and traditional paper mailings to enroll 797 adolescent girls for a longitudinal, web-based, drug abuse prevention trial. Data on sample representativeness and retention are also provided. Methods Facebook ads appeared on the pages of females aged 13 or 14 years who reside in the U.S. Ads linked girls to a recruitment website. Girls who wanted more information submitted contact information and were mailed information packets to their homes containing, among other things, youth assent and parent permission forms. Returned forms were verified for accuracy and validity. Results The Facebook ad campaign reached 2,267,848 girls and had a unique click-through rate of 3.0%. The campaign cost $41,202.37 with an average cost of $51.70 per enrolled girl. Information packets were mailed to 1,873 girls. Approximately one-half of girls returned the forms, and 797 girls were enrolled. The Facebook campaign's success varied by ad type, month, and day of the week. Baseline data revealed comparability to national data on demographic and substance use variables. Conclusions Results suggest that Facebook ads provide a useful initial point of access to unparalleled numbers of adolescents. Clinical trials may benefit from a two-fold recruitment strategy that uses online ads to attract interested adolescents followed by traditional recruitment methods to communicate detailed information to adolescents and parents. PMID:27835860

  6. Oral anatomy laboratory examinations in a physical therapy program.

    PubMed

    Fabrizio, Philip A

    2013-01-01

    The process of creating and administering traditional tagged anatomy laboratory examinations is time consuming for instructors and limits laboratory access for students. Depending on class size and the number of class sections, creating, administering, and breaking down a tagged laboratory examination may involve one to two eight-hour days. During the time that a tagged examination is being created, student productivity may be reduced because the anatomy laboratory is inaccessible to students. Further, the type of questions that can be asked in a tagged laboratory examination may limit student assessment to lower-level cognitive abilities and may limit the instructors' ability to assess the students' understanding of anatomical and clinical concepts. Anatomy is a foundational science in the Physical Therapy curriculum, and a thorough understanding of anatomy is necessary to progress through the subsequent clinical courses. Physical therapy curricula have evolved to reflect the changing role of physical therapists as primary caregivers by introducing a greater scope of clinical courses earlier in the curriculum. Physical therapy students must have a thorough understanding of clinical anatomy early in the education process. However, traditional anatomy examination methods may not be reflective of the clinical thought processes required of physical therapy students. Traditional laboratory examination methods also reduce student productivity by limiting access during examination set-up and breakdown. To provide a greater complexity of questions and reduce the overall laboratory time required for examinations, the Physical Therapy Program at Mercer University has introduced oral laboratory examinations for the gross anatomy course series. © 2012 American Association of Anatomists.

  7. MindEdit: A P300-based text editor for mobile devices.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2017-01-01

    Practical application of Brain-Computer Interfaces (BCIs) requires that the whole BCI system be portable. The mobility of BCI systems involves two aspects: making the electroencephalography (EEG) recording devices portable, and developing software applications with low computational complexity to be able to run on low computational-power devices such as tablets and smartphones. This paper addresses the development of MindEdit; a P300-based text editor for Android-based devices. Given the limited resources of mobile devices and their limited computational power, a novel ensemble classifier is utilized that uses Principal Component Analysis (PCA) features to identify P300 evoked potentials from EEG recordings. PCA computations in the proposed method are channel-based as opposed to concatenating all channels as in traditional feature extraction methods; thus, this method has less computational complexity compared to traditional P300 detection methods. The performance of the method is demonstrated on data recorded from MindEdit on an Android tablet using the Emotiv wireless neuroheadset. Results demonstrate the capability of the introduced PCA ensemble classifier to classify P300 data with maximum average accuracy of 78.37±16.09% for cross-validation data and 77.5±19.69% for online test data using only 10 trials per symbol and a 33-character training dataset. Our analysis indicates that the introduced method outperforms traditional feature extraction methods. For a faster operation of MindEdit, a variable number of trials scheme is introduced that resulted in an online average accuracy of 64.17±19.6% and a maximum bitrate of 6.25bit/min. These results demonstrate the efficacy of using the developed BCI application with mobile devices. Copyright © 2016 Elsevier Ltd. All rights reserved.
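
    A minimal sketch of the channel-wise PCA feature idea mentioned above: one PCA is fitted per EEG channel and the per-channel projections are concatenated before classification. The epoch dimensions, component count and the LDA back-end are illustrative assumptions rather than the MindEdit implementation (which uses an ensemble classifier); the data here are random, so the accuracy printed is only a sanity check of the pipeline shape.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_epochs, n_channels, n_samples = 300, 14, 128        # e.g. 1-second epochs at 128 Hz (assumed)
epochs = rng.normal(size=(n_epochs, n_channels, n_samples))
labels = rng.integers(0, 2, n_epochs)                  # 1 = target flash (P300), 0 = non-target

train, test = slice(0, 250), slice(250, None)

# Fit one PCA per channel on the training epochs, then project both splits and concatenate.
pcas = [PCA(n_components=5).fit(epochs[train, ch, :]) for ch in range(n_channels)]
X_train = np.hstack([p.transform(epochs[train, ch, :]) for ch, p in enumerate(pcas)])
X_test = np.hstack([p.transform(epochs[test, ch, :]) for ch, p in enumerate(pcas)])

clf = LinearDiscriminantAnalysis().fit(X_train, labels[train])
print("held-out accuracy (random data, ~0.5 expected):", clf.score(X_test, labels[test]))
```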

  8. The Effects of Problem-Solving Teaching on Creative Thinking among District 2 High School Students in Sari City.

    PubMed

    Nozari, Ali Yazdanpanah; Siamian, Hasan

    2014-12-01

    Nowadays, regarding learners' needs and social conditions, there is an obvious need to revise and reconsider traditional methods and approaches in teaching. The problem-solving approach is one of the new ways in the teaching and learning process. This study aimed at examining the effect of the problem-solving approach on the creative thinking of high school female students. An experimental method was used for this research. In this research, 342 out of 3047 female students from Sari high schools were randomly selected. These 342 students were divided into two groups (experimental and control), comprising seven classrooms; the total number of students in each group was about 171. After testing them with the Jamal Abedi creativity test, it was revealed that the two groups were equal in creativity score. The tests were done through Requirements. The experimental group was taught by the problem-solving method for three months while the control group was taught by the traditional method. The research results, based on descriptive indices and a t-test for two independent samples, showed that the problem-solving teaching method had an effect on creativity level in comparison with the traditional method used in the control group. Considering the results of this study, the application of problem-solving teaching methods increased creativity and its components (fluidity, expansion, originality and flexibility) in learners; therefore, it is recommended that students be encouraged to give frequent responses on various topics (variability), draw attention to different issues, and expand their analysis of elements in particular courses such as art (expansion). To enhance learners' mental flexibility and attention to various aspects, they are encouraged to provide a variety of responses.

  9. Validation of Milliflex® Quantum for Bioburden Testing of Pharmaceutical Products.

    PubMed

    Gordon, Oliver; Goverde, Marcel; Staerk, Alexandra; Roesti, David

    2017-01-01

    This article reports the validation strategy used to demonstrate that the Milliflex ® Quantum yielded non-inferior results to the traditional bioburden method. It was validated according to USP <1223>, European Pharmacopoeia 5.1.6, and Parenteral Drug Association Technical Report No. 33 and comprised the validation parameters robustness, ruggedness, repeatability, specificity, limit of detection and quantification, accuracy, precision, linearity, range, and equivalence in routine operation. For the validation, a combination of pharmacopeial ATCC strains as well as a broad selection of in-house isolates were used. In-house isolates were used in stressed state. Results were statistically evaluated regarding the pharmacopeial acceptance criterion of ≥70% recovery compared to the traditional method. Post-hoc test power calculations verified the appropriateness of the used sample size to detect such a difference. Furthermore, equivalence tests verified non-inferiority of the rapid method as compared to the traditional method. In conclusion, the rapid bioburden on basis of the Milliflex ® Quantum was successfully validated as alternative method to the traditional bioburden test. LAY ABSTRACT: Pharmaceutical drug products must fulfill specified quality criteria regarding their microbial content in order to ensure patient safety. Drugs that are delivered into the body via injection, infusion, or implantation must be sterile (i.e., devoid of living microorganisms). Bioburden testing measures the levels of microbes present in the bulk solution of a drug before sterilization, and thus it provides important information for manufacturing a safe product. In general, bioburden testing has to be performed using the methods described in the pharmacopoeias (membrane filtration or plate count). These methods are well established and validated regarding their effectiveness; however, the incubation time required to visually identify microbial colonies is long. Thus, alternative methods that detect microbial contamination faster will improve control over the manufacturing process and speed up product release. Before alternative methods may be used, they must undergo a side-by-side comparison with pharmacopeial methods. In this comparison, referred to as validation, it must be shown in a statistically verified manner that the effectiveness of the alternative method is at least equivalent to that of the pharmacopeial methods. Here we describe the successful validation of an alternative bioburden testing method based on fluorescent staining of growing microorganisms applying the Milliflex ® Quantum system by MilliporeSigma. © PDA, Inc. 2017.

  10. Single-Site Nissen Fundoplication Versus Laparoscopic Nissen Fundoplication

    PubMed Central

    Sharp, Nicole E.; Vassaur, John

    2014-01-01

    Background: Advances in minimally invasive surgery have led to the emergence of single-incision laparoscopic surgery (SILS). The purpose of this study is to assess the feasibility of SILS Nissen fundoplication and compare its outcomes with traditional laparoscopic Nissen fundoplication. Methods: This is a retrospective study of 33 patients who underwent Nissen fundoplication between January 2009 and September 2010. Results: There were 15 SILS and 18 traditional laparoscopic Nissen fundoplication procedures performed. The mean operative time was 129 and 182 minutes in the traditional laparoscopic and single-incision groups, respectively (P = .019). There were no conversions in the traditional laparoscopic group, whereas 6 of the 15 patients in the SILS group required conversion by insertion of 2 to 4 additional ports (P = .0004). At short-term follow-up, recurrence rates were similar between both groups. To date, there have been no reoperations. Conclusions: SILS Nissen fundoplication is both safe and feasible. Short-term outcomes are comparable with standard laparoscopic Nissen fundoplication. Challenges related to the single-incision Nissen fundoplication include overcoming the lengthy learning curve and decreasing the need for additional trocars. PMID:25392613

  11. Uncertainty propagation for statistical impact prediction of space debris

    NASA Astrophysics Data System (ADS)

    Hoogendoorn, R.; Mooij, E.; Geul, J.

    2018-01-01

    Predictions of the impact time and location of space debris in a decaying trajectory are highly influenced by uncertainties. The traditional Monte Carlo (MC) method can be used to perform accurate statistical impact predictions, but requires a large computational effort. A method is investigated that directly propagates a Probability Density Function (PDF) in time, which has the potential to obtain more accurate results with less computational effort. The decaying trajectory of Delta-K rocket stages was used to test the methods using a six degrees-of-freedom state model. The PDF of the state of the body was propagated in time to obtain impact-time distributions. This Direct PDF Propagation (DPP) method results in a multi-dimensional scattered dataset of the PDF of the state, which is highly challenging to process. No accurate results could be obtained, because of the structure of the DPP data and the high dimensionality. Therefore, the DPP method is less suitable for practical uncontrolled entry problems and the traditional MC method remains superior. Additionally, the MC method was used with two improved uncertainty models to obtain impact-time distributions, which were validated using observations of true impacts. For one of the two uncertainty models, statistically more valid impact-time distributions were obtained than in previous research.
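
    For readers unfamiliar with the traditional MC approach that the study retains, the sketch below shows only its statistical skeleton: uncertain initial conditions are sampled and each sample is mapped to an impact time, from which the impact-time distribution is built. The decay model here is a deliberately crude placeholder, not the six degrees-of-freedom propagator used in the paper, and the input uncertainties are assumed values.

    ```python
    # Minimal Monte Carlo impact-time sketch (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_decay_time(altitude_km, ballistic_coeff):
        # Placeholder "propagator": impact time grows with altitude and ballistic
        # coefficient. NOT a physical model; it only stands in for the expensive
        # trajectory integration of each Monte Carlo sample.
        return 0.05 * altitude_km * ballistic_coeff / 100.0

    n_samples = 10_000
    altitude = rng.normal(180.0, 5.0, n_samples)   # initial altitude [km], assumed spread
    bc = rng.normal(120.0, 15.0, n_samples)        # ballistic coefficient [kg/m^2], assumed

    impact_times = toy_decay_time(altitude, bc)    # one impact time per sample [h]
    print(f"median impact time: {np.median(impact_times):.1f} h")
    print(f"95% interval: {np.percentile(impact_times, [2.5, 97.5])} h")
    ```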

  12. Perspectives in musculoskeletal injury management by traditional bone setters in Ashanti, Ghana

    PubMed Central

    Owusu-Ansah, Frances E.; Dogbe, Joslin A.; Morgan, Julia; Sarpong, Kofi

    2015-01-01

    Background The popularity of the services of traditional bone setters (TBS) in Ghana as an alternative form of health care requires exploration and documentation of the perspectives of providers and users. Objective To explore and document the perspectives of providers and users of the services of TBS in the management of musculoskeletal injuries in the Ashanti region, Ghana. Methods From the social constructivist and qualitative approach, in-depth interviews were used to explore the perspectives of eight TBS and 16 users of their services, selected purposively through snowballing. Thematic content analysis (TCA) was employed. Results A high recovery rate, warm reception, prompt attention, and relatively lower charges are reported to motivate the patronage of the services of TBS for the management of fractures of the legs, arms and ribs, joint dislocations, and waist and spinal cord problems. The TBS combined traditional and orthodox procedures, using plant and animal-based materials, beliefs, spirituality (God-given) and physical therapy in the management of musculoskeletal injuries. No adverse experience was reported by either the providers or users of the traditional management methods. Conclusion With plant and animal-based materials, TBS are observed to combine traditional and orthodox procedures to confidently manage musculoskeletal injuries to the satisfaction of their highly motivated patrons. Although over 60% of the TBS attribute the healing power behind their practice to God, the rest do not discount the role of spiritual therapy. Further studies expanded to include the perspectives of non-users of the services of the TBS will authenticate the findings of this study. PMID:28730018

  13. Darkfield Adapter for Whole Slide Imaging: Adapting a Darkfield Internal Reflection Illumination System to Extend WSI Applications

    PubMed Central

    Kawano, Yoshihiro; Higgins, Christopher; Yamamoto, Yasuhito; Nyhus, Julie; Bernard, Amy; Dong, Hong-Wei; Karten, Harvey J.; Schilling, Tobias

    2013-01-01

    We present a new method for whole slide darkfield imaging. Whole Slide Imaging (WSI), also sometimes called virtual slide or virtual microscopy technology, produces images that simultaneously provide high resolution and a wide field of observation that can encompass the entire section, extending far beyond any single field of view. For example, a brain slice can be imaged so that both overall morphology and individual neuronal detail can be seen. We extended the capabilities of traditional whole slide systems and developed a prototype system for darkfield internal reflection illumination (DIRI). Our darkfield system uses an ultra-thin light-emitting diode (LED) light source to illuminate slide specimens from the edge of the slide. We used a new type of side illumination, a variation on the internal reflection method, to illuminate the specimen and create a darkfield image. This system has four main advantages over traditional darkfield: (1) no oil condenser is required for high-resolution imaging; (2) there is less scatter from dust and dirt on the slide specimen; (3) there is less halo, providing a more natural darkfield contrast image; and (4) the motorized system produces darkfield, brightfield and fluorescence images. The WSI method sometimes allows us to image using fewer stains. For instance, diaminobenzidine (DAB) and fluorescent staining are helpful tools for observing protein localization and volume in tissues. However, these methods usually require counter-staining in order to visualize tissue structure, limiting the accuracy of localization of labeled cells within the complex multiple regions of typical neurohistological preparations. Darkfield imaging works on the basis of light scattering from refractive index mismatches in the sample. It is a label-free method of producing contrast in a sample. We propose that adapting darkfield imaging to WSI is very useful, particularly when researchers require additional structural information without the use of further staining. PMID:23520500

  14. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and alpha-spectrometry.

    PubMed

    Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J

    1997-03-01

    Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.

  15. Assessment of a 1H high-resolution magic angle spinning NMR spectroscopy procedure for free sugars quantification in intact plant tissue.

    PubMed

    Delgado-Goñi, Teresa; Campo, Sonia; Martín-Sitjar, Juana; Cabañas, Miquel E; San Segundo, Blanca; Arús, Carles

    2013-08-01

    In most plants, sucrose is the primary product of photosynthesis, the transport form of assimilated carbon, and also one of the main factors determining sweetness in fresh fruits. Traditional methods for sugar quantification (mainly sucrose, glucose and fructose) require obtaining crude plant extracts, which sometimes involve substantial sample manipulation, making the process time-consuming and increasing the risk of sample degradation. Here, we describe and validate a fast method to determine sugar content in intact plant tissue by using high-resolution magic angle spinning nuclear magnetic resonance spectroscopy (HR-MAS NMR). The HR-MAS NMR method was used for quantifying sucrose, glucose and fructose in mesocarp tissues from melon fruits (Cucumis melo var. reticulatus and Cucumis melo var. cantalupensis). The resulting sugar content varied among individual melons, ranging from 1.4 to 7.3 g of sucrose, 0.4 to 2.5 g of glucose, and 0.73 to 2.83 g of fructose (values per 100 g fw). These values were in agreement with those described in the literature for melon fruit tissue, and no significant differences were found when comparing them with those obtained using the traditional, enzymatic procedure on melon tissue extracts. The HR-MAS NMR method offers a fast (usually <30 min) and sensitive approach to sugar quantification in intact plant tissues; it requires a small amount of tissue (typically 50 mg fw) and avoids the interferences and risks associated with obtaining plant extracts. Furthermore, this method might also allow the quantification of additional metabolites detectable in the plant tissue NMR spectrum.

  16. Ubiquitous Total Station Development using Smartphone, RSSI and Laser Sensor providing service to Ubi-GIS

    NASA Astrophysics Data System (ADS)

    Shoushtari, M. A.; Sadeghi-Niaraki, H.

    2014-10-01

    The growing trend in technological advances and Micro Electro Mechanical Systems (MEMS) is aimed at making human life more intelligent. Accordingly, the Ubiquitous Computing approach was proposed by Mark Weiser. This paper proposes a ubiquitous surveying solution for the geomatics and surveying field. Ubiquitous Surveying provides cost-effective, smart and readily available surveying techniques, whereas traditional surveying equipment is expensive and has limited availability, especially for indoor and day-to-day surveying jobs. In order to obtain a smart surveying instrument, different information technology methods and tools, such as the triangle method, the Received Signal Strength Indicator (RSSI) method and a laser sensor, are used. Combined with surveying equations, these introduce a modern surveying instrument called the Ubi-Total Station, which also employs the different sensors embedded in a smartphone and mobile stand. RSSI-based localization and the triangle method are simple and well-known techniques for predicting the position of an unknown node in indoor environments, although additional measures are required for sufficient accuracy. The main goal of this paper is to introduce the Ubiquitous Total Station as a development in smart and ubiquitous GIS. To make the surveying equipment usable by the general public, the instrument was designed and implemented. A conceptual model of the smartphone-based system was designed for this study and, based on this model, an Android application was developed as a first sample. Finally, the evaluation shows that the absolute errors in the X and Y calculations are 0.028 and 0.057 meter, respectively, and an RMSE of 0.26 was calculated for distance measurement with the RSSI method. The high price of traditional equipment and its requirement for professional surveyors have given way to intelligent surveying. In the suggested system, smartphones can be used as tools for positioning and for coordinating the geometric information of objects.
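
    To make the RSSI ranging step concrete, the sketch below uses the common log-distance path-loss model for converting a received signal strength to a distance; the calibration constants are assumptions, since the abstract does not give the instrument's actual parameters.

    ```python
    # Hedged sketch of RSSI-based distance estimation with the log-distance
    # path-loss model, a common choice for indoor ranging.
    def rssi_to_distance(rssi_dbm, rssi_at_1m=-45.0, path_loss_exponent=2.2):
        # d = 10 ** ((RSSI_1m - RSSI) / (10 * n)); constants are assumed values.
        return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

    print(round(rssi_to_distance(-60.0), 2))  # estimated range in metres
    ```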

  17. Prevalence of depressive symptoms among medical students taught using problem-based learning versus traditional methods.

    PubMed

    Aragão, José Aderval; Freire, Marianna Ribeiro de Menezes; Nolasco Farias, Lucas Guimarães; Diniz, Sarah Santana; Sant'anna Aragão, Felipe Matheus; Sant'anna Aragão, Iapunira Catarina; Lima, Tarcisio Brandão; Reis, Francisco Prado

    2018-06-01

    To compare depressive symptoms among medical students taught using problem-based learning (PBL) and the traditional method. Beck's Depression Inventory was applied to 215 medical students. The prevalence of depression was calculated as the number of individuals with depression divided by the total number in the sample from each course, with 95% confidence intervals. The statistical significance level used was 5% (p ≤ .05). Among the 215 students, 52.1% were male and 47.9% were female; and 51.6% were being taught using PBL methodology and 48.4% using traditional methods. The prevalence of depression was 29.73% with PBL and 22.12% with traditional methods. There was higher prevalence among females: 32.8% with PBL and 23.1% with traditional methods. The prevalence of depression with PBL among students up to 21 years of age was 29.4% and among those over 21 years, 32.1%. With traditional methods among students up to 21 years of age, it was 16.7%, and among those over 21 years, 30.1%. The prevalence of depression with PBL was highest among students in the second semester and with traditional methods, in the eighth. Depressive symptoms were highly prevalent among students taught both with PBL and with traditional methods.
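
    The prevalence estimate with a 95% confidence interval described above reduces to a short calculation; the sketch below uses a normal-approximation interval and hypothetical counts, not the study's raw data.

    ```python
    # Sketch of a prevalence calculation with a normal-approximation 95% CI.
    import math

    def prevalence_ci(cases, n, z=1.96):
        p = cases / n
        se = math.sqrt(p * (1 - p) / n)
        return p, (p - z * se, p + z * se)

    p, (lo, hi) = prevalence_ci(cases=33, n=111)   # hypothetical counts
    print(f"prevalence {p:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
    ```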

  18. Home blood-pressure monitoring in a hypertensive pregnant population: cost minimisation study.

    PubMed

    Xydopoulos, G; Perry, H; Sheehan, E; Thilaganathan, B; Fordham, R; Khalil, A

    2018-03-08

    Traditional monitoring of blood pressure in hypertensive pregnant women requires frequent visits to the maternity outpatient services. Home blood-pressure monitoring (HBPM) could offer a cost-saving alternative that is acceptable to patients. The main objective of this study was to undertake a health economic analysis of HBPM compared with traditional monitoring in hypertensive pregnant women. This was a case-control study. Cases were pregnant women with hypertension who had HBPM with or without the adjunct of a smartphone app, via a specially designed pathway. The control group were managed as per existing hospital guidelines. Specific outcome measures were the number of outpatient visits, inpatient bed stays and investigations performed. Maternal, fetal and neonatal adverse outcomes were also recorded. Health economic analysis was performed using two methods: direct cost comparison of the study dataset and process scenario modelling. There were 108 women in the HBPM group, of whom 29 recorded their results on the smartphone app (App-HBPM) and 79 in their notes (Non-app HBPM). The control group comprised 58 patients. There were significantly more women with chronic hypertension in the HBPM group (49.1% vs 25.9%, P = 0.004). The HBPM group had a significantly longer duration of monitoring (9 weeks vs 5 weeks, P = 0.004) and started monitoring from an earlier gestation (30 weeks vs 33.6 weeks, P = 0.001). Despite these differences, the mean saving per week for HBPM compared with the control group was £200.69. For the App-HBPM cohort, the saving per week compared with the control group was £286.53. The process modelling method predicted savings of between £98.32 and £245.80 per week using HBPM compared with traditional monitoring. HBPM in hypertensive pregnancies appears to be cost-saving compared with traditional monitoring, without compromising maternal, fetal or neonatal safety. Larger studies are required to confirm these findings. This article is protected by copyright. All rights reserved.

  19. Results of a Formal Methods Demonstration Project

    NASA Technical Reports Server (NTRS)

    Kelly, J.; Covington, R.; Hamilton, D.

    1994-01-01

    This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.

  20. Is traditional contraceptive use in Moldova associated with poverty and isolation?

    PubMed

    Lyons-Amos, Mark J; Durrant, Gabriele B; Padmadas, Sabu S

    2011-05-01

    This study investigates the correlates of traditional contraceptive use in Moldova, a poor country in Europe with one of the highest proportions of traditional contraceptive method users. The high reliance on traditional methods, particularly in the context of sub-replacement level fertility rate, has not been systematically evaluated in demographic research. Using cross-sectional data on a sub-sample of 6039 sexually experienced women from the 2005 Moldovan Demographic and Health Survey, this study hypothesizes that (a) economic and spatial disadvantages increase the likelihood of traditional method use, and (b) high exposure to family planning/reproductive health (FP/RH) programmes increases the propensity to modern method use. Multilevel multinomial models are used to examine the correlates of traditional method use controlling for exposure to sexual activity, socioeconomic and demographic characteristics and data structure. The results show that economic disadvantage increases the probability of traditional method use, but the overall effect is small. Although higher family planning media exposure decreases the reliance on traditional methods among younger women, it has only a marginal effect in increasing modern method use among older women. Family planning programmes designed to encourage women to switch from traditional to modern methods have had some success, although the effect is considerably reduced in regions outside of the capital Chisinau. The study concludes that FP/RH efforts directed towards the poorest may have limited impact, but interventions targeted at older women could reduce the burden of unwanted pregnancies and abortions. Addressing differentials in accessing modern methods could improve uptake in rural areas.

  1. Holonic Rationale and Bio-inspiration on Design of Complex Emergent and Evolvable Systems

    NASA Astrophysics Data System (ADS)

    Leitao, Paulo

    Traditional centralized and rigid control structures are proving too inflexible to meet the requirements of reconfigurability, responsiveness and robustness imposed by customer demands in the current global economy. The Holonic Manufacturing Systems (HMS) paradigm, which has been identified as a suitable solution to these requirements, translates concepts inherited from social organizations and biology to the manufacturing world. It offers an alternative way of designing adaptive systems where the traditional centralized control is replaced by decentralization over distributed and autonomous entities organized in hierarchical structures formed by intermediate stable forms. In spite of its enormous potential, methods regarding the self-adaptation and self-organization of complex systems are still missing. This paper discusses how the insights from biology in connection with new fields of computer science can be useful to enhance the holonic design aiming to achieve more self-adaptive and evolvable systems. Special attention is devoted to the discussion of emergent behavior and self-organization concepts, and the way they can be combined with the holonic rationale.

  2. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    PubMed Central

    López-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-01-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min with a minimized amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from the bacteria Shewanella oneidensis, and mouse plasma, as well as 18O labeling of such complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, rapid, and thus well-suited for automation. PMID:19555078

  3. The outlook for precipitation measurements from space

    NASA Technical Reports Server (NTRS)

    Atlas, D.; Eckerman, J.; Meneghini, R.; Moore, R. K.

    1981-01-01

    To provide useful precipitation measurements from space, two requirements must be met: adequate spatial and temporal sampling of the storm and sufficient accuracy in the estimate of precipitation intensity. Although presently no single instrument or method completely satisfies both requirements, the visible/IR, microwave radiometer and radar methods can be used in a complementary manner. Visible/IR instruments provide good temporal sampling and rain area depiction, but recourse must be made to microwave measurements for quantitative rainfall estimates. The inadequacy of microwave radiometer measurements over land suggests, in turn, the use of radar. Several recently developed attenuating-wavelength radar methods are discussed in terms of their accuracy, dynamic range and system implementation. Traditionally, the requirements of high resolution and adequate dynamic range led to fairly costly and complex radar systems. Some simplifications and cost reductions can be made, however, by using K-band wavelengths, which have the advantages of greater sensitivity at low rain rates and higher resolution capabilities. Several recently proposed methods of this kind are reviewed in terms of accuracy and system implementation. Finally, an adaptive-pointing multi-sensor instrument is described that would exploit certain advantages of the IR, radiometric and radar methods.

  4. An evaluation of semi-automated methods for collecting ecosystem-level data in temperate marine systems.

    PubMed

    Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L

    2017-07-01

    Historically, marine ecologists have lacked efficient tools that are capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas in the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m2 of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling, and better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than has been collected in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.

  5. Requirements controlled design: A method for discovery of discontinuous system boundaries in the requirements hyperspace

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Peter Michael

    The drive toward robust systems design, especially with respect to system affordability throughout the system life-cycle, has led to the development of several advanced design methods. While these methods have been extremely successful in satisfying the needs for which they have been developed, they inherently leave a critical area unaddressed. None of them fully considers the effect of requirements on the selection of solution systems. The goal of all current modern design methodologies is to bring knowledge forward in the design process to the regions where more design freedom is available and design changes cost less. Therefore, it seems reasonable to consider the point in the design process where the greatest restrictions are placed on the final design, the point at which the system-level requirements are set. Historically the requirements have been treated as something handed down from above. However, neither the customer nor the solution provider completely understands all of the options available in the broader requirements space. If a method were developed that provided the ability to understand the full scope of the requirements space, it would allow for a better comparison of potential solution systems with respect to both the current and potential future requirements. The key to a requirements conscious method is to treat requirements differently from the traditional approach. The method proposed herein is known as Requirements Controlled Design (RCD). By treating the requirements as a set of variables that control the behavior of the system, instead of variables that only define the response of the system, it is possible to determine a priori which portions of the requirements space any given system is capable of satisfying. Additionally, it should be possible to identify which systems can satisfy a given set of requirements and the locations where a small change in one or more requirements poses a significant risk to a design program. This thesis puts forth the theory and methodology to enable RCD, and details and validates a specific method called the Modified Strength Pareto Evolutionary Algorithm (MSPEA).

  6. Method and apparatus for energy efficient self-aeration in chemical, biochemical, and wastewater treatment processes

    DOEpatents

    Gao, Johnway [Richland, WA; Skeen, Rodney S [Pendleton, OR

    2002-05-28

    The present invention is a pulse spilling self-aerator (PSSA) that has the potential to greatly lower the installation, operation, and maintenance cost associated with aerating and mixing aqueous solutions. Currently, large quantities of low-pressure air are required in aeration systems to support many biochemical production processes and wastewater treatment plants. Oxygen is traditionally supplied and mixed by a compressor or blower and a mechanical agitator. These systems have high-energy requirements and high installation and maintenance costs. The PSSA provides a mixing and aeration capability that can increase operational efficiency and reduce overall cost.

  7. Traditional Galactagogue Foods and Their Connection to Human Milk Volume in Thai Breastfeeding Mothers.

    PubMed

    Buntuchai, Ganokwun; Pavadhgul, Patcharanee; Kittipichai, Wirin; Satheannoppakao, Warapone

    2017-08-01

    Thai traditional galactagogue consumption is still observed today. However, there are few scientific studies that describe this practice. Research aim: The aim of this study was to describe the connection between traditional galactagogue consumption and human milk volume. Self-reported maternal surveys (N = 36) were conducted among exclusively breastfeeding mothers and their infants. The mothers were interviewed about traditional galactagogue consumption and intake of protein-rich foods using a semiquantitative food-frequency questionnaire. They were also assessed for energy and nutrient intake using the 24-hr dietary recall method. Their infants were between 1 and 3 months of age and were test weighed for 24 hr to measure their mother's own milk volume. Partial correlation was used to test the relationship between galactagogue consumption and milk volume by controlling the infants' birth weight, weight-for-age, maternal energy, and carbohydrate intake. The results revealed that consumption of some traditional galactagogues was significantly correlated to human milk volume, including banana flower, lemon basil, Thai basil, bottle gourd, and pumpkin (p < .05). Furthermore, there were significant correlations between consumption of some kinds of protein and milk volume, including egg tofu, chicken, fish, and seafood (p < .05). Maternal energy and carbohydrate intake were related to milk volume (p < .05), but protein intake was not. Certain kinds of traditional galactagogues and proteins are associated with human milk volume. However, studies related to the active ingredients in these galactagogues are required to secure a recommendation about use of traditional galactagogues among breastfeeding mothers.

  8. Mapping seagrass and colonized hard bottom in Springs Coast, Florida using WorldView-2 satellite imagery

    NASA Astrophysics Data System (ADS)

    Baumstark, René; Duffey, Renee; Pu, Ruiliang

    2016-11-01

    The offshore extent of seagrass habitat along the West Florida (USA) coast represents an important corridor for inshore-offshore migration of economically important fish and shellfish. Surviving at the fringe of light requirements, offshore seagrass beds are sensitive to changes in water clarity. Beyond and intermingled with the offshore seagrass areas are large swaths of colonized hard bottom. These offshore habitats of the West Florida coast have lacked mapping efforts needed for status and trends monitoring. The objective of this study was to propose an object-based classification method for mapping offshore habitats and to compare results to traditional photo-interpreted maps. Benthic maps were created from WorldView-2 satellite imagery using an Object Based Image Analysis (OBIA) method and a visual photo-interpretation method. A logistic regression analysis identified depth and distance from shore as significant parameters for discriminating spectrally similar seagrass and colonized hard bottom features. Seagrass, colonized hard bottom and unconsolidated sediment (sand) were mapped with 78% overall accuracy using the OBIA method compared to 71% overall accuracy using the photo-interpretation method. This study suggests an alternative for mapping deeper, offshore habitats capable of producing higher thematic and spatial resolution maps compared to those created with the traditional photo-interpretation method.
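
    The ancillary logistic regression mentioned above, with depth and distance from shore as discriminating parameters, can be sketched as follows; the training data here are synthetic placeholders rather than the study's imagery-derived samples.

    ```python
    # Hedged sketch: logistic regression separating spectrally similar classes
    # (seagrass vs colonized hard bottom) using depth and distance from shore.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 200
    depth = rng.uniform(2, 12, n)        # metres (synthetic)
    distance = rng.uniform(1, 20, n)     # km from shore (synthetic)
    # Synthetic labelling rule: deeper / farther sites more likely hard bottom.
    is_hard_bottom = (0.3 * depth + 0.1 * distance + rng.normal(0, 1, n)) > 4

    model = LogisticRegression().fit(np.column_stack([depth, distance]), is_hard_bottom)
    print(model.coef_, model.intercept_)  # fitted weights for depth and distance
    ```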

  9. Time series inversion of spectra from ground-based radiometers

    NASA Astrophysics Data System (ADS)

    Christensen, O. M.; Eriksson, P.

    2013-07-01

    Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument, which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is increased to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the Onsala Space Observatory (OSO) water vapour microwave radiometer confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
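
    A minimal sketch of the covariance construction implied above: the state vector spans several measurement times, and the a priori covariance carries an explicit temporal correlation in addition to the usual altitude correlation. The grid sizes, correlation lengths and Kronecker structure below are illustrative assumptions, not the paper's exact setup.

    ```python
    # Sketch: a priori covariance S_a for a time-extended state vector, built as
    # (temporal correlation) x (altitude correlation) with assumed length scales.
    import numpy as np

    n_alt, n_time = 30, 8
    alt = np.linspace(40, 90, n_alt)       # altitude grid [km]
    t = np.arange(n_time)                   # measurement index (e.g. hours)

    corr_alt = np.exp(-np.abs(alt[:, None] - alt[None, :]) / 5.0)   # 5 km length
    corr_time = np.exp(-np.abs(t[:, None] - t[None, :]) / 3.0)      # 3-step length

    sigma = 1.0                             # assumed a priori standard deviation
    S_a = sigma**2 * np.kron(corr_time, corr_alt)   # (n_time*n_alt) square matrix
    print(S_a.shape)                        # (240, 240)
    ```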

  10. Comparative study of the "Misgav Ladach" and traditional Pfannenstiel surgical techniques for cesarean section.

    PubMed

    Belci, D; Kos, M; Zoricić, D; Kuharić, L; Slivar, A; Begić-Razem, E; Grdinić, I

    2007-06-01

    The aim of this study was to evaluate the advantages of the Misgav Ladach surgical technique compared to traditional cesarean section. A prospective randomized trial of 111 women undergoing cesarean section was carried out in the Pula General Hospital. Forty-nine operations were performed using the Pfannenstiel method of cesarean section, 55 by the Misgav Ladach method and 7 by lower midline laparotomy. Cases in which the Misgav Ladach method was used, compared with the Pfannenstiel method, showed a significantly shorter delivery/extraction and operative time (P=0.0009), significantly lower incision pain on the second postoperative day (P=0.021), a quicker time to standing and walking (P=0.013), significantly fewer analgesic injections and a shorter duration of analgesia (P=0.0009), and earlier restoration of normal bowel function (P=0.001). The Misgav Ladach method of cesarean section has advantages over the Pfannenstiel method insofar as it is significantly quicker to perform, with diminished postoperative pain and less use of postoperative analgesics. The recovery of physiologic function is faster. No differences were found in intraoperative bleeding, maternal morbidity, scar appearance, postoperative uterine involution or the assessment of the inflammatory response to the operative technique.

  11. Localized diabatization applied to excitons in molecular crystals

    NASA Astrophysics Data System (ADS)

    Jin, Zuxin; Subotnik, Joseph E.

    2017-06-01

    Traditional ab initio electronic structure calculations of periodic systems yield delocalized eigenstates that should be understood as adiabatic states. For example, excitons are bands of extended states which superimpose localized excitations on every lattice site. However, in general, in order to study the effects of nuclear motion on exciton transport, it is standard to work with a localized description of excitons, especially in a hopping regime; even in a band regime, a localized description can be helpful. To extract localized excitons from a band requires essentially a diabatization procedure. In this paper, three distinct methods are proposed for such localized diabatization: (i) a simple projection method, (ii) a more general Pipek-Mezey localization scheme, and (iii) a variant of Boys diabatization. Approaches (i) and (ii) require localized, single-particle Wannier orbitals, while approach (iii) has no such dependence. These methods should be very useful for studying energy transfer through solids with ab initio calculations.

  12. A flow-cytometry-based method to simplify the analysis and quantification of protein association to chromatin in mammalian cells

    PubMed Central

    Forment, Josep V.; Jackson, Stephen P.

    2016-01-01

    Protein accumulation on chromatin has traditionally been studied using immunofluorescence microscopy or biochemical cellular fractionation followed by western immunoblot analysis. As a way to improve the reproducibility of this kind of analysis, make it easier to quantify and allow a streamlined application in high-throughput screens, we recently combined a classical immunofluorescence microscopy detection technique with flow cytometry [1]. In addition to the features described above, and by combining it with detection of both DNA content and DNA replication, this method allows unequivocal and direct assignment of cell-cycle distribution of protein association to chromatin without the need for cell culture synchronization. Furthermore, it is relatively quick (no more than a working day from sample collection to quantification), requires less starting material compared to standard biochemical fractionation methods and overcomes the need for flat, adherent cell types that are required for immunofluorescence microscopy. PMID:26226461

  13. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
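
    The quality figure quoted above, a mean coefficient of variation across replicate extractions, reduces to a short calculation; the replicate values in this sketch are hypothetical, not SAMBED data.

    ```python
    # Sketch: mean coefficient of variation (CV) across replicate extractions.
    import numpy as np

    def mean_cv(replicates):
        # replicates: array of shape (n_metabolites, n_replicates)
        arr = np.asarray(replicates, dtype=float)
        return float(np.mean(arr.std(axis=1, ddof=1) / arr.mean(axis=1)))

    device_runs = [[10.2, 9.8, 11.0], [5.1, 5.4, 4.9]]   # hypothetical intensities
    manual_runs = [[10.5, 8.9, 12.1], [5.6, 4.7, 5.0]]
    print(mean_cv(device_runs), mean_cv(manual_runs))
    ```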

  14. Are the classic diagnostic methods in mycology still state of the art?

    PubMed

    Wiegand, Cornelia; Bauer, Andrea; Brasch, Jochen; Nenoff, Pietro; Schaller, Martin; Mayser, Peter; Hipler, Uta-Christina; Elsner, Peter

    2016-05-01

    The diagnostic workup of cutaneous fungal infections is traditionally based on microscopic KOH preparations as well as culturing of the causative organism from sample material. Another possible option is the detection of fungal elements by dermatohistology. If performed correctly, these methods are generally suitable for the diagnosis of mycoses. However, the advent of personalized medicine and the tasks arising therefrom require new procedures marked by simplicity, specificity, and swiftness. The additional use of DNA-based molecular techniques further enhances sensitivity and diagnostic specificity, and reduces the diagnostic interval to 24-48 hours, compared to weeks required for conventional mycological methods. Given the steady evolution in the field of personalized medicine, simple analytical PCR-based systems are conceivable, which allow for instant diagnosis of dermatophytes in the dermatology office (point-of-care tests). © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  15. Utility and potential of rapid epidemic intelligence from internet-based sources.

    PubMed

    Yan, S J; Chughtai, A A; Macintyre, C R

    2017-10-01

    Rapid epidemic detection is an important objective of surveillance to enable timely intervention, but traditional validated surveillance data may not be available in the required timeframe for acute epidemic control. Increasing volumes of data on the Internet have prompted interest in methods that could use unstructured sources to enhance traditional disease surveillance and gain rapid epidemic intelligence. We aimed to summarise Internet-based methods that use freely-accessible, unstructured data for epidemic surveillance and explore their timeliness and accuracy outcomes. Steps outlined in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist were used to guide a systematic review of research related to the use of informal or unstructured data by Internet-based intelligence methods for surveillance. We identified 84 articles published between 2006 and 2016 relating to Internet-based public health surveillance methods. Studies used search queries, social media posts and approaches derived from existing Internet-based systems for early epidemic alerts and real-time monitoring. Most studies noted improved timeliness compared to official reporting, such as in the 2014 Ebola epidemic where epidemic alerts were generated first from ProMED-mail. Internet-based methods showed variable correlation strength with official datasets, with some methods showing reasonable accuracy. The proliferation of publicly available information on the Internet provided a new avenue for epidemic intelligence. Methodologies have been developed to collect Internet data and some systems are already used to enhance the timeliness of traditional surveillance systems. To improve the utility of Internet-based systems, the key attributes of timeliness and data accuracy should be included in future evaluations of surveillance systems. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Trace gas retrieval for limb DOAS under changing atmospheric conditions: The X-gas scaling method vs optimal estimation

    NASA Astrophysics Data System (ADS)

    Hueneke, Tilman; Grossmann, Katja; Knecht, Matthias; Raecke, Rasmus; Stutz, Jochen; Werner, Bodo; Pfeilsticker, Klaus

    2016-04-01

    Changing atmospheric conditions during DOAS measurements from fast-moving aircraft platforms pose a challenge for trace gas retrievals. Traditional inversion techniques to retrieve trace gas concentrations from limb scattered UV/vis spectroscopy, like optimal estimation, require a-priori information on Mie extinction (e.g., aerosol concentration and cloud cover) and albedo, which determine the atmospheric radiative transfer. In contrast to satellite applications, cloud filters cannot be applied because they would strongly reduce the usable amount of expensively gathered measurement data. In contrast to ground-based MAX-DOAS applications, an aerosol retrieval based on O4 is not able to constrain the radiative transfer in airborne applications due to the rapidly decreasing amount of O4 with altitude. Furthermore, the assumption of a constant cloud cover is not valid for fast-moving aircraft, thus requiring 2D or even 3D treatment of the radiative transfer. Therefore, traditional techniques are not applicable for most of the data gathered by fast-moving aircraft platforms. In order to circumvent these limitations, we have been developing the so-called X-gas scaling method. By utilising a proxy gas X (e.g. O3, O4, …), whose concentration is either known a priori or measured in situ simultaneously as well as remotely, an effective absorption length for the target gas is inferred. In this presentation, we discuss the strengths and weaknesses of the novel approach along with some sample cases. A particular strength of the X-gas scaling method is its insensitivity towards the aerosol abundance and cloud cover as well as wavelength dependent effects, whereas its sensitivity towards the profiles of both gases requires a priori information on their shapes.
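
    One plausible formalisation of the scaling idea, stated here as an assumption on our part since the abstract does not spell out the retrieval equations: the proxy gas X, with known or in-situ measured concentration, supplies an effective absorption path, which is then applied to the target gas T.

    ```latex
    % Hedged reading of the X-gas scaling: slant column densities (SCD) of the
    % proxy gas X and the target gas T share the effective light path L_eff.
    L_{\mathrm{eff}} = \frac{\mathrm{SCD}_X}{[X]_{\mathrm{in\,situ}}},
    \qquad
    [T] \approx \frac{\mathrm{SCD}_T}{L_{\mathrm{eff}}}
              = \mathrm{SCD}_T \, \frac{[X]_{\mathrm{in\,situ}}}{\mathrm{SCD}_X}
    ```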

  17. Development of a fast PCR protocol enabling rapid generation of AmpFℓSTR® Identifiler® profiles for genotyping of human DNA

    PubMed Central

    2012-01-01

    Background Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Methods Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. Results The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. Conclusions The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations. PMID:22394458

  18. Evaluation of the use of nodal methods for MTR neutronic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reitsma, F.; Mueller, E.Z.

    1997-08-01

    Although modern nodal methods are used extensively in the nuclear power industry, their use for research reactor analysis has been very limited. The suitability of nodal methods for material testing reactor analysis is investigated with the emphasis on the modelling of the core region (fuel assemblies). The nodal approach's performance is compared with that of the traditional finite-difference fine mesh approach. The advantages of using nodal methods coupled with integrated cross section generation systems are highlighted, especially with respect to data preparation, simplicity of use and the possibility of performing a great variety of reactor calculations subject to strict time limitations such as are required for the RERTR program.

  19. Modeling of biological intelligence for SCM system optimization.

    PubMed

    Chen, Shengyong; Zheng, Yujun; Cattani, Carlo; Wang, Wanliang

    2012-01-01

    This article summarizes some methods from biological intelligence for modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open self-organizing, which is maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, which covers the most widely used genetic algorithms and other evolutionary algorithms.
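
    As a minimal illustration of the most widely used of the methods listed above, the sketch below applies a toy genetic/evolutionary loop to a made-up supply-chain cost function; the encoding, operators and parameters are illustrative assumptions, not drawn from the article.

    ```python
    # Toy evolutionary optimization of an order quantity: holding cost plus a
    # stock-out penalty around a demand of 50 units (all values are made up).
    import numpy as np

    rng = np.random.default_rng(42)

    def cost(order_qty):
        return 0.2 * order_qty + 5.0 * np.maximum(0, 50 - order_qty)

    pop = rng.uniform(0, 100, size=20)                       # initial population
    for _ in range(100):
        fitness = -cost(pop)
        parents = pop[np.argsort(fitness)][-10:]             # keep best half
        children = parents + rng.normal(0, 2.0, parents.shape)  # mutate
        pop = np.concatenate([parents, children])

    best = pop[np.argmin(cost(pop))]
    print(f"best order quantity: {best:.1f}")                # converges near 50
    ```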

  20. Fabrication of Josephson Junction without shadow evaporation

    NASA Astrophysics Data System (ADS)

    Wu, Xian; Ku, Hsiangsheng; Long, Junling; Pappas, David

    We developed a new method of fabricating Josephson junctions (Al/AlOx/Al) without shadow evaporation. Statistics from room-temperature junction resistance and measurements of qubits are presented. Unlike the traditional "Dolan bridge" technique, this method requires two individual lithographies and straight evaporations of Al. Argon RF plasma is used to remove the native AlOx after the first evaporation, followed by oxidation and a second Al evaporation. Junction resistance measured at room temperature shows a linear dependence on Pox (oxidation pressure) and on √tox (the square root of the oxidation time), and is inversely proportional to junction area. We have seen 100% yield of qubits made with this method. This method is promising because it eliminates angle dependence during junction fabrication and facilitates large-scale qubit fabrication.
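
    Read together, the reported dependences suggest a scaling of roughly the following form; combining them under a single proportionality constant is our assumption for illustration, not a fit reported in the abstract.

    ```latex
    % Hedged combined reading of the reported dependences (assumed constant k).
    R_{\mathrm{RT}} \;\approx\; k \,\frac{P_{\mathrm{ox}}\,\sqrt{t_{\mathrm{ox}}}}{A_{\mathrm{junction}}}
    ```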

  1. Modeling of Biological Intelligence for SCM System Optimization

    PubMed Central

    Chen, Shengyong; Zheng, Yujun; Cattani, Carlo; Wang, Wanliang

    2012-01-01

    This article summarizes some methods from biological intelligence for modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open self-organizing, which is maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, which covers the most widely used genetic algorithms and other evolutionary algorithms. PMID:22162724

  2. Succession planning and leadership development: critical business strategies for healthcare organizations.

    PubMed

    Collins, Sandra K; Collins, Kevin S

    2007-01-01

    As labor shortages intensify, succession planning and leadership development have become strategic initiatives requiring rigorous consideration. Traditional methods of replacing personnel will not accommodate the vacancies expected to plague healthcare organizations. Managers should focus on identifying potential gaps of key personnel and adapting programs to accommodate organizational need. Attention should be placed on capturing the intellectual capital existent in the organization and developing diverse groups of leadership candidates.

  3. Design, experiments and simulation of voltage transformers on the basis of a differential input D-dot sensor.

    PubMed

    Wang, Jingang; Gao, Can; Yang, Jie

    2014-07-17

    Currently available traditional electromagnetic voltage sensors fail to meet the measurement requirements of the smart grid, because of low accuracy in the static and dynamic ranges and the occurrence of ferromagnetic resonance attributed to overvoltage and output short circuit. This work develops a new non-contact high-bandwidth voltage measurement system for power equipment. This system aims at the miniaturization and non-contact measurement of the smart grid. After traditional D-dot voltage probe analysis, an improved method is proposed. For the sensor to work in a self-integrating pattern, the differential input pattern is adopted for circuit design, and grounding is removed. To prove the structure design, circuit component parameters, and insulation characteristics, Ansoft Maxwell software is used for the simulation. Moreover, the new probe was tested on a 10 kV high-voltage test platform for steady-state error and transient behavior. Experimental results ascertain that the root mean square values of measured voltage are precise and that the phase error is small. The D-dot voltage sensor not only meets the requirement of high accuracy but also exhibits satisfactory transient response. This sensor can meet the intelligence, miniaturization, and convenience requirements of the smart grid.

  4. A new method for noninvasive measurement of pulmonary gas exchange using expired gas.

    PubMed

    West, John B; Prisk, G Kim

    2018-01-01

    Measurement of the gas exchange efficiency of the lung is often required in the practice of pulmonary medicine and in other settings. The traditional standard is the values of the PO2, PCO2, and pH of arterial blood. However arterial puncture requires technical expertise, is invasive, uncomfortable for the patient, and expensive. Here we describe how the composition of expired gas can be used in conjunction with pulse oximetry to obtain useful measures of gas exchange efficiency. The new procedure is noninvasive, well tolerated by the patient, and takes only a few minutes. It could be particularly useful when repeated measurements of pulmonary gas exchange are required. One product of the procedure is the difference between the PO2 of end-tidal alveolar gas and the calculated PO2 of arterial blood. This measurement is related to the classical alveolar-arterial PO2 difference based on ideal alveolar gas. However that traditional index is heavily influenced by lung units with low ventilation-perfusion ratios, whereas the new index has a broader physiological basis because it includes contributions from the whole lung. Copyright © 2017 Elsevier B.V. All rights reserved.
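
    The end-tidal minus calculated arterial PO2 index described above can be sketched as follows; the conversion from pulse-oximetry saturation to PO2 here uses a textbook Hill-type dissociation curve (P50 = 26.6 mmHg, n = 2.7), which is an assumption and not necessarily the authors' exact calculation, and the input values are examples.

    ```python
    # Sketch: end-tidal PO2 minus an arterial PO2 estimated from SpO2 via a
    # simple Hill-type oxygen dissociation curve (approximate constants).
    def spo2_to_po2(spo2, p50=26.6, n=2.7):
        s = min(max(spo2, 0.01), 0.99)            # avoid division by zero
        return p50 * (s / (1.0 - s)) ** (1.0 / n)

    pet_o2 = 102.0                                 # end-tidal PO2 [mmHg], example
    pa_o2 = spo2_to_po2(0.95)                      # estimated arterial PO2 [mmHg]
    print(f"end-tidal minus calculated arterial PO2: {pet_o2 - pa_o2:.1f} mmHg")
    ```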

  5. Combined preoperative traction with instrumented posterior occipitocervical fusion for severe ventral brainstem compression secondary to displaced os odontoideum: technical report of 2 cases.

    PubMed

    Abd-El-Barr, Muhammad M; Snyder, Brian D; Emans, John B; Proctor, Mark R; Hedequist, Daniel

    2016-12-01

    Severe os odontoideum causing ventral brainstem compression is a rare and difficult entity to treat. It is generally accepted that severe os odontoideum causing ventral brainstem compression and neurological deficits warrants surgical treatment. This often requires both anterior and posterior procedures. Anterior approaches to the craniocervical junction are fraught with complications, including infection and risk of injury to neurovascular structures. External traction systems traditionally require long-term bedrest. The authors report 2 cases of severe ventral brainstem compression secondary to displaced os odontoideum and describe their use of extended preoperative halo vest traction to reduce the severe kyphosis and improve neurological function, followed by posterior occipitocervical fusion. Postoperatively both patients showed remarkable improvements in their neurological function and kyphotic deformity. Preoperative halo vest traction combined with posterior occipitocervical fusion appears to be a safe and effective method to treat brainstem compression by severe os odontoideum. It allows for adequate decompression of ventral neural structures and improvement of neurological function, but it is not hindered by the risks of anterior surgical approaches and does not restrict patients to strict bedrest as traditional traction systems. This method of halo vest traction and posterior-only approaches may be transferable to other cervical instability issues with both anterior and posterior pathologies.

  6. Quantitative DIC microscopy using an off-axis self-interference approach.

    PubMed

    Fu, Dan; Oh, Seungeun; Choi, Wonshik; Yamauchi, Toyohiko; Dorn, August; Yaqoob, Zahid; Dasari, Ramachandra R; Feld, Michael S

    2010-07-15

    Traditional Nomarski differential interference contrast (DIC) microscopy is a very powerful method for imaging nonstained biological samples. However, one of its major limitations is the nonquantitative nature of the imaging. To overcome this problem, we developed a quantitative DIC microscopy method based on off-axis sample self-interference. The digital holography algorithm is applied to obtain quantitative phase gradients in orthogonal directions, which leads to a quantitative phase image through a spiral integration of the phase gradients. This method is practically simple to implement on any standard microscope without stringent requirements on polarization optics. Optical sectioning can be obtained through enlarged illumination NA.
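
    To show what recovering a phase map from two orthogonal gradient images involves, the sketch below uses a standard Fourier-domain least-squares integration rather than the spiral integration the authors describe; that substitution and the synthetic test data are assumptions for illustration.

    ```python
    # Sketch: least-squares (Frankot-Chellappa-style) integration of orthogonal
    # phase gradients in the Fourier domain; returns a zero-mean phase map.
    import numpy as np

    def integrate_gradients(gx, gy):
        ny, nx = gx.shape
        fx = np.fft.fftfreq(nx)[None, :]
        fy = np.fft.fftfreq(ny)[:, None]
        gx_f, gy_f = np.fft.fft2(gx), np.fft.fft2(gy)
        denom = (2 * np.pi) ** 2 * (fx ** 2 + fy ** 2)
        denom[0, 0] = 1.0                       # avoid division by zero at DC
        phi_f = -1j * 2 * np.pi * (fx * gx_f + fy * gy_f) / denom
        phi_f[0, 0] = 0.0                       # fix the unknown constant offset
        return np.real(np.fft.ifft2(phi_f))

    # Synthetic test: a smooth phase bump and its numerical gradients.
    y, x = np.mgrid[0:64, 0:64]
    phi = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 200.0)
    gy, gx = np.gradient(phi)                   # gradients along rows, columns
    rec = integrate_gradients(gx, gy)
    print(np.max(np.abs(rec - (phi - phi.mean()))))   # small reconstruction error
    ```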

  7. Multidisciplinary Design Techniques Applied to Conceptual Aerospace Vehicle Design. Ph.D. Thesis Final Technical Report

    NASA Technical Reports Server (NTRS)

    Olds, John Robert; Walberg, Gerald D.

    1993-01-01

    Multidisciplinary design optimization (MDO) is an emerging discipline within aerospace engineering. Its goal is to bring structure and efficiency to the complex design process associated with advanced aerospace launch vehicles. Aerospace vehicles generally require input from a variety of traditional aerospace disciplines - aerodynamics, structures, performance, etc. As such, traditional optimization methods cannot always be applied. Several multidisciplinary techniques and methods were proposed as potentially applicable to this class of design problem. Among the candidate options are calculus-based (or gradient-based) optimization schemes and parametric schemes based on design of experiments theory. A brief overview of several applicable multidisciplinary design optimization methods is included. Methods from the calculus-based class and the parametric class are reviewed, but the research application reported focuses on methods from the parametric class. A vehicle of current interest was chosen as a test application for this research. The rocket-based combined-cycle (RBCC) single-stage-to-orbit (SSTO) launch vehicle combines elements of rocket and airbreathing propulsion in an attempt to produce an attractive option for launching medium-sized payloads into low earth orbit. The RBCC SSTO presents a particularly difficult problem for traditional one-variable-at-a-time optimization methods because of the lack of an adequate experience base and the highly coupled nature of the design variables. MDO, however, with its structured approach to design, is well suited to this problem. The results of the application of Taguchi methods, central composite designs, and response surface methods to the design optimization of the RBCC SSTO are presented. Attention is given to the aspect of Taguchi methods that attempts to locate a 'robust' design - that is, a design that is least sensitive to uncontrollable influences on the design. Near-optimum minimum dry weight solutions are determined for the vehicle. A summary and evaluation of the various parametric MDO methods employed in the research are included. Recommendations for additional research are provided.

  8. Bioaccumulation Using Surrogate Samplers (Bass): Evaluation Of A Passive Sampler As An Alternative Monitoring Tool For Environmental Contaminants At The Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paller, M.; Knox, A.; Kuhne, W.

    2015-10-15

    DOE sites conduct traditional environmental monitoring programs that require collecting, processing, and analyzing water, sediment, and fish samples. However, recently developed passive sampling technologies, such as Diffusive Gradient in Thin films (DGT), may measure the chemical phases that are available and toxic to organisms (the bioavailable fraction), thereby producing more accurate and economical results than traditional methods. Our laboratory study showed that dissolved copper concentrations measured by DGT probes were strongly correlated with the uptake of copper by Lumbriculus variegatus, an aquatic worm, and with concentrations of copper measured by conventional methods. Dissolved copper concentrations in DGT probes increased with time of exposure, paralleling the increase in copper with time that occurred in Lumbriculus. Additional studies with a combination of seven dissolved metals showed similar results. These findings support the use of DGT as a biomimetic monitoring tool and provide a basis for refinement of these methods for cost-effective environmental monitoring at DOE sites.

  9. System identification of a small low-cost unmanned aerial vehicle using flight data from low-cost sensors

    NASA Astrophysics Data System (ADS)

    Hoffer, Nathan Von

    Remote sensing has traditionally been done with satellites and manned aircraft. While these methods can yield useful scientific data, satellites and manned aircraft have limitations in data frequency, process time, and real-time re-tasking. Small low-cost unmanned aerial vehicles (UAVs) provide greater possibilities for personal scientific research than traditional remote sensing platforms. Precision aerial data requires an accurate vehicle dynamics model for controller development, robust flight characteristics, and fault tolerance. One method of developing a model is system identification (system ID). In this thesis, system ID of a small low-cost fixed-wing T-tail UAV is conducted. The linearized longitudinal equations of motion are derived from first principles. Foundations of Recursive Least Squares (RLS) are presented along with RLS with an Error Filtering Online Learning scheme (EFOL). Sensors, data collection, data consistency checking, and data processing are described. Batch least squares (BLS) and BLS with EFOL are used to identify aerodynamic coefficients of the UAV. Results of these two methods with flight data are discussed.
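
    As a hedged illustration of the batch least-squares step mentioned above (the regressors, synthetic data, and coefficient names below are placeholders, not values from the thesis; the filtering and data-consistency stages are omitted), a minimal sketch stacks flight-data regressors and solves for the aerodynamic coefficients in one shot:

        import numpy as np

        # illustrative flight-data regressors: angle of attack, pitch rate, elevator deflection
        rng = np.random.default_rng(1)
        n = 500
        alpha, q, delta_e = rng.normal(size=(3, n))
        X = np.column_stack([np.ones(n), alpha, q, delta_e])                    # regressor matrix
        Cm = 0.02 - 0.8*alpha - 3.0*q - 1.1*delta_e + 0.01*rng.normal(size=n)   # "measured" pitching moment

        theta, *_ = np.linalg.lstsq(X, Cm, rcond=None)                          # batch least squares
        Cm0, Cm_alpha, Cm_q, Cm_delta_e = theta                                 # identified coefficients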

  10. Scaling images using their background ratio. An application in statistical comparisons of images.

    PubMed

    Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J

    2003-06-07

    Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques, which, under certain circumstances, may contribute a significant amount of bias in quantitative applications. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases while the traditional technique resulted in significant degradation of sensitivity in certain cases.
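
    A minimal numerical sketch of the scaling idea described above (array names, the bin count, and the background assumption are illustrative; the paper's gamma-camera data are not reproduced here):

        import numpy as np

        def background_ratio_scale(img_a, img_b, bins=256, eps=1e-6):
            """Scale factor for img_b so it matches img_a, taken from the peak
            of the pixelwise ratio histogram (assumed to come from background)."""
            mask = img_b > eps                        # avoid division by zero
            ratio = img_a[mask] / img_b[mask]         # pixelwise ratio
            counts, edges = np.histogram(ratio, bins=bins)
            peak = np.argmax(counts)                  # most frequent ratio value
            return 0.5 * (edges[peak] + edges[peak + 1])

        # usage before a pixelwise statistical comparison:
        # factor = background_ratio_scale(img_a, img_b); img_b_scaled = img_b * factor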

  11. A Design Problem of Assembly Line Systems using Genetic Algorithm under the BTO Environment

    NASA Astrophysics Data System (ADS)

    Abe, Kazuaki; Yamada, Tetsuo; Matsui, Masayuki

    Under the BTO environment, stochastic assembly lines require design methods which shorten not only the production lead time but also the ready time for the line design. We propose a design method for the Assembly Line Systems (ALS) of Yamada et al. (2001) using a Genetic Algorithm (GA) and an Adam-Eve GA, in which all design variables are determined in consideration of constraints such as the line length related to the production lead time. First, an ALS model with a line length constraint is introduced, and an optimal design problem is set to maximize the net reward under a shorter lead time. Next, a simulation optimization method is developed using the Adam-Eve GA and a traditional GA. Finally, an optimal design example is shown and discussed by comparing the 2-stage design of Yamada et al. (2001) and both GA designs. It is shown that the Adam-Eve GA is superior to the traditional GA design in terms of computational time, though there is only a slight difference in terms of net reward.

  12. Quantification of uncertainties in the performance of smart composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1993-01-01

    A composite wing with spars, bulkheads, and built-in control devices is evaluated using a method for the probabilistic assessment of smart composite structures. Structural responses (such as change in angle of attack, vertical displacements, and stresses in regular plies with traditional materials and in control plies with mixed traditional and actuation materials) are probabilistically assessed to quantify their respective scatter. Probabilistic sensitivity factors are computed to identify those parameters that have a significant influence on a specific structural response. Results show that the uncertainties in the responses of smart composite structures can be quantified. Responses such as structural deformation, ply stresses, frequencies, and buckling loads in the presence of defects can be reliably controlled to satisfy specified design requirements.

  13. Implicit Kalman filtering

    NASA Technical Reports Server (NTRS)

    Skliar, M.; Ramirez, W. F.

    1997-01-01

    For an implicitly defined discrete system, a new algorithm for Kalman filtering is developed and an efficient numerical implementation scheme is proposed. Unlike the traditional explicit approach, the implicit filter can be readily applied to ill-conditioned systems and allows for generalization to descriptor systems. The implementation of the implicit filter depends on the solution of the congruence matrix equation A1 Px A1^T = Py. We develop a general iterative method for the solution of this equation, and prove necessary and sufficient conditions for convergence. It is shown that when the system matrices of an implicit system are sparse, the implicit Kalman filter requires significantly less computer time and storage to implement as compared to the traditional explicit Kalman filter. Simulation results are presented to illustrate and substantiate the theoretical developments.
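
    Only as a hedged illustration (this is not the authors' iterative scheme), a least-squares solution of the congruence equation A1 Px A1^T = Py can be written with pseudoinverses when A1 and Py are known:

        import numpy as np

        def solve_congruence(A1, Py):
            """Minimum-norm least-squares Px satisfying A1 @ Px @ A1.T = Py.
            Illustrative shortcut only; the paper develops an iterative method."""
            A1_pinv = np.linalg.pinv(A1)
            return A1_pinv @ Py @ A1_pinv.T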

  14. Determination of aflatoxins and ochratoxin A in high-sugar-content traditional Turkish foods by affinity column cleanup and LC fluorescence detection.

    PubMed

    Senyuva, Hamide Z; Cimen, Dilek; Gilbert, John

    2009-01-01

    The effectiveness of an affinity column cleanup procedure followed by LC with fluorescence detection was established for the determination of aflatoxins and ochratoxin A in high-sugar-content traditional Turkish foods. Traditional foods, such as baklava (finely layered pastry filled with nuts and steeped in syrup), halvah (containing sesame paste and pistachios), cevizli sucuk (a confection made of grape juice boiled and dried on strings of nuts), Turkish delight (containing hazelnuts, pistachios, or walnuts), and pişmaniye (candy made of sugar, butter, and flour), were tested, and the performance of the method was established with spiked samples. To examine the robustness of the methodology, baklava was prepared from raw materials and spiked at the initial stage of dry ingredients and through subsequent stages of preparation of dough, after cooking, and after addition of syrup and nuts. For all products, the analytical method required grinding the composite foodstuff under liquid nitrogen to form a fine powder, which was then thoroughly mixed before subsampling. After vortex extraction into methanol-water (aflatoxins) and aqueous sodium bicarbonate (ochratoxin A), the sample was filtered, diluted with phosphate-buffered saline, and then passed through either an aflatoxin or ochratoxin A affinity column before HPLC analysis with fluorescence detection (using post-column bromination for the aflatoxins). In all the traditional Turkish products, the recovery of aflatoxin B1 ranged from 77 to 98%, and LODs were <0.1 microg/kg. For ochratoxin A, the recoveries were from 88 to 93% and LODs were similarly <0.1 microg/kg. Despite the complex nature of these traditional Turkish foods, which frequently contain products from sugar caramelization, there was no evidence of any interfering co-extractives, and the method has proved to be robust enough to be used for food control purposes.

  15. Nursing benefits of using an automated injection system for ictal brain single photon emission computed tomography.

    PubMed

    Vonhofen, Geraldine; Evangelista, Tonya; Lordeon, Patricia

    2012-04-01

    The traditional method of administering radioactive isotopes to pediatric patients undergoing ictal brain single photon emission computed tomography testing has been by manual injections. This method presents certain challenges for nursing, including time requirements and safety risks. This quality improvement project discusses the implementation of an automated injection system for isotope administration and its impact on staffing, safety, and nursing satisfaction. It was conducted in an epilepsy monitoring unit at a large urban pediatric facility. Results of this project showed a decrease in the number of nurses exposed to radiation and improved nursing satisfaction with the use of the automated injection system. In addition, there was a decrease in the number of nursing hours required during ictal brain single photon emission computed tomography testing.

  16. Medical students’ attitudes and perspectives regarding novel computer-based practical spot tests compared to traditional practical spot tests

    PubMed Central

    Wijerathne, Buddhika; Rathnayake, Geetha

    2013-01-01

    Background Most universities currently practice traditional practical spot tests to evaluate students. However, traditional methods have several disadvantages. Computer-based examination techniques are becoming more popular among medical educators worldwide. Therefore incorporating the computer interface in practical spot testing is a novel concept that may minimize the shortcomings of traditional methods. Assessing students’ attitudes and perspectives is vital in understanding how students perceive the novel method. Methods One hundred and sixty medical students were randomly allocated to either a computer-based spot test (n=80) or a traditional spot test (n=80). The students rated their attitudes and perspectives regarding the spot test method soon after the test. The results were described comparatively. Results Students had higher positive attitudes towards the computer-based practical spot test compared to the traditional spot test. Their recommendations to introduce the novel practical spot test method for future exams and to other universities were statistically significantly higher. Conclusions The computer-based practical spot test is viewed as more acceptable to students than the traditional spot test. PMID:26451213

  17. Physico-Chemical and Structural Characteristics of Vegetables Cooked Under Sous-Vide, Cook-Vide, and Conventional Boiling.

    PubMed

    Iborra-Bernad, C; García-Segovia, P; Martínez-Monzó, J

    2015-08-01

    In this paper, physico-chemical and structural properties of cut and cooked purple-flesh potato, green bean pods, and carrots have been studied. Three different cooking methods have been applied: traditional cooking (boiling water at 100 °C), cook-vide (at 80 and 90 °C) and sous-vide (at 80 °C and 90 °C). Similar firmness was obtained in potato when applying the same cooking time with traditional cooking (100 °C) and with cook-vide and sous-vide at 90 °C, while in green beans and carrots the application of sous-vide (90 °C) required longer cooking times than cook-vide (90 °C) and traditional cooking (100 °C). Losses in anthocyanins (for purple-flesh potatoes) and ascorbic acid (for green beans) were higher when applying traditional cooking. β-Carotene extraction increased in carrots with traditional cooking and cook-vide (P < 0.05). Cryo-SEM micrographs suggested higher swelling pressure of starch in potato cells cooked in contact with water, as in traditional cooking and cook-vide. Traditional cooking was the most aggressive treatment in green beans because the secondary walls were reduced compared with sous-vide and cook-vide. Sous-vide preserved organelles in the carrot cells, which could explain the lower extraction of β-carotene compared with cook-vide and traditional cooking. Sous-vide cooking of purple-flesh potato is recommended to maintain its high anthocyanin content. Traditional boiling could be recommended for carrots because it increases β-carotene availability. For green beans, cook-vide and sous-vide provided products with higher ascorbic acid content. © 2015 Institute of Food Technologists®

  18. Effective methods of teaching and learning in anatomy as a basic science: A BEME systematic review: BEME guide no. 44.

    PubMed

    Losco, C Dominique; Grant, William D; Armson, Anthony; Meyer, Amanda J; Walker, Bruce F

    2017-03-01

    Anatomy is a subject essential to medical practice, yet time committed to teaching is on the decline, and the resources required to teach anatomy are costly, particularly dissection. Advances in technology are a potential solution to the problem, while maintaining the quality of teaching required for eventual clinical application. To identify methods used to teach anatomy, including those demonstrated to enhance knowledge acquisition and retention. PubMed, CINAHL, ERIC, Academic OneFile, ProQuest, SAGE journals and Scopus were searched from the earliest entry of each database to 31 August 2015. All included articles were assessed for methodological quality, and low-quality articles were excluded from the study. Studies were evaluated by assessment scores; qualitative outcomes were included, as well as a modified Kirkpatrick model. A total of 17,820 articles were initially identified, with 29 included in the review. The review found a wide variety of teaching interventions represented in the range of studies, with CAI/CAL studies predominating in terms of teaching interventions, followed by simulation. In addition to this, CAI/CAL and simulation studies demonstrated better results overall compared to traditional teaching methods, and there is evidence to support CAI/CAL as a partial replacement for dissection or a valuable tool in conjunction with dissection. This review provides evidence in support of the use of alternatives to traditional teaching methods in anatomy, in particular, the use of CAI/CAL, with a number of high quality, low risk of bias studies supporting this.

  19. Pain Experience and Behavior Management in Pediatric Dentistry: A Comparison between Traditional Local Anesthesia and the Wand Computerized Delivery System.

    PubMed

    Garret-Bernardin, Annelyse; Cantile, Tiziana; D'Antò, Vincenzo; Galanakis, Alexandros; Fauxpoint, Gabriel; Ferrazzano, Gianmaria Fabrizio; De Rosa, Sara; Vallogini, Giulia; Romeo, Umberto; Galeotti, Angela

    2017-01-01

    Aim. To evaluate the pain experience and behavior during dental injection, using the Wand computerized delivery system versus conventional local anesthesia in children and adolescents. Methods. An observational crossover split mouth study was performed on 67 patients (aged 7 to 15 years), requiring local anesthesia for dental treatments in both sides of the dental arch. Patients received both types of injections in two separate appointments, one with the use of a Computer Delivery System (the Wand STA system) and one with the traditional syringe. The following data were recorded: pain rating; changes in heart rate; level of collaboration; patient satisfaction. The data were analyzed using ANOVA for quantitative outcomes and nonparametric analysis (Kruskal-Wallis) for qualitative parameters. Results. The use of the Wand system determined significantly lower pain ratings and lower increase of heart rate than the traditional syringe. During injection, the number of patients showing a relaxed behavior was higher with the Wand than with the traditional local anesthesia. The patient level of satisfaction was higher with the Wand compared to the conventional local anesthesia. Conclusions. The Wand system may provide a less painful injection when compared to the conventional local anesthesia and it seemed to be better tolerated with respect to a traditional syringe.

  20. Pain Experience and Behavior Management in Pediatric Dentistry: A Comparison between Traditional Local Anesthesia and the Wand Computerized Delivery System

    PubMed Central

    D'Antò, Vincenzo; Fauxpoint, Gabriel; De Rosa, Sara; Vallogini, Giulia

    2017-01-01

    Aim. To evaluate the pain experience and behavior during dental injection, using the Wand computerized delivery system versus conventional local anesthesia in children and adolescents. Methods. An observational crossover split mouth study was performed on 67 patients (aged 7 to 15 years), requiring local anesthesia for dental treatments in both sides of the dental arch. Patients received both types of injections in two separate appointments, one with the use of a Computer Delivery System (the Wand STA system) and one with the traditional syringe. The following data were recorded: pain rating; changes in heart rate; level of collaboration; patient satisfaction. The data were analyzed using ANOVA for quantitative outcomes and nonparametric analysis (Kruskal–Wallis) for qualitative parameters. Results. The use of the Wand system determined significantly lower pain ratings and lower increase of heart rate than the traditional syringe. During injection, the number of patients showing a relaxed behavior was higher with the Wand than with the traditional local anesthesia. The patient level of satisfaction was higher with the Wand compared to the conventional local anesthesia. Conclusions. The Wand system may provide a less painful injection when compared to the conventional local anesthesia and it seemed to be better tolerated with respect to a traditional syringe. PMID:28293129

  1. Josephson frequency meter for millimeter and submillimeter wavelengths

    NASA Technical Reports Server (NTRS)

    Anischenko, S. E.; Larkin, S. Y.; Chaikovsky, V. I.; Kabayev, P. V.; Kamyshin, V. V.

    1995-01-01

    Frequency measurements of electromagnetic oscillations in the millimeter and submillimeter wavebands become more and more difficult as frequency grows, for a number of reasons. First, these frequencies are considered to be cutoffs for semiconductor converting devices, and one has to use optical measurement methods instead of traditional ones with frequency transfer. Second, resonance measurement methods are characterized by relatively narrow bands, and optical ones are limited in frequency and time resolution by the limited range and velocity of movement of their mechanical elements; the efficiency of these optical techniques also decreases with increasing wavelength due to diffraction losses. That requires a priori information on the radiation frequency band of the source involved. A method of measuring the frequency of harmonic microwave signals in the millimeter and submillimeter wavebands based on the ac Josephson effect in superconducting contacts is devoid of all the above drawbacks. This approach offers a number of major advantages over the more traditional measurement methods, that is, those based on frequency conversion, resonance, and interferometric techniques. It can be characterized by high potential accuracy, a wide range of measured frequencies, prompt measurement, and the opportunity to obtain a panoramic display of the results as well as full automation of the measuring process.
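
    For context, the ac Josephson relation that links the junction voltage to the measured frequency (the numerical value is the standard Josephson constant 2e/h, not a figure quoted in this abstract) is

        f = \frac{2e}{h}\,V \approx \left(483.6\ \mathrm{GHz/mV}\right) V .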

  2. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials) the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
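
    As a concrete illustration of the limit-state idea (a textbook special case, not a result from the article), a linear limit state g = R - S with independent, normally distributed strength R and stress S has a closed-form reliability index and failure probability:

        \beta = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^{2} + \sigma_S^{2}}}, \qquad P_f = \Phi(-\beta) .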

  3. Efficient measurement of large light source near-field color and luminance distributions for optical design and simulation

    NASA Astrophysics Data System (ADS)

    Kostal, Hubert; Kreysar, Douglas; Rykowski, Ronald

    2009-08-01

    The color and luminance distributions of large light sources are difficult to measure because of the size of the source and the physical space required for the measurement. We describe a method for the measurement of large light sources in a limited space that efficiently overcomes the physical limitations of traditional far-field measurement techniques. This method uses a calibrated, high dynamic range imaging colorimeter and a goniometric system to move the light source through an automated measurement sequence in the imaging colorimeter's field-of-view. The measurement is performed from within the near-field of the light source, enabling a compact measurement set-up. This method generates a detailed near-field color and luminance distribution model that can be directly converted to ray sets for optical design and that can be extrapolated to far-field distributions for illumination design. The measurements obtained show excellent correlation to traditional imaging colorimeter and photogoniometer measurement methods. The near-field goniometer approach that we describe is broadly applicable to general lighting systems, can be deployed in a compact laboratory space, and provides full near-field data for optical design and simulation.

  4. Abundance and diversity of microbial inhabitants in European spacecraft-associated clean rooms.

    PubMed

    Stieglmeier, Michaela; Rettberg, Petra; Barczyk, Simon; Bohmeier, Maria; Pukall, Rüdiger; Wirth, Reinhard; Moissl-Eichinger, Christine

    2012-06-01

    The determination of the microbial load of a spacecraft en route to interesting extraterrestrial environments is mandatory and currently based on the culturable, heat-shock-surviving portion of microbial contaminants. Our study compared these classical bioburden measurements as required by NASA's and ESA's guidelines for the microbial examination of flight hardware, with molecular analysis methods (16S rRNA gene cloning and quantitative PCR) to further develop our understanding of the diversity and abundance of the microbial communities of spacecraft-associated clean rooms. Three samplings of the Herschel Space Observatory and its surrounding clean rooms were performed in two different European facilities. Molecular analyses detected a broad diversity of microbes typically found in the human microbiome with three bacterial genera (Staphylococcus, Propionibacterium, and Brevundimonas) common to all three locations. Bioburden measurements revealed a low, but heterogeneous, abundance of spore-forming and other heat-resistant microorganisms. Total cell numbers estimated by quantitative real-time PCR were typically 3 orders of magnitude greater than those determined by viable counts, which indicates a tendency for traditional methods to underestimate the extent of clean room bioburden. Furthermore, the molecular methods allowed the detection of a much broader diversity than traditional culture-based methods.

  5. MFAHP: A novel method on the performance evaluation of the industrial wireless networked control system

    NASA Astrophysics Data System (ADS)

    Wu, Linqin; Xu, Sheng; Jiang, Dezhi

    2015-12-01

    Industrial wireless networked control systems have been widely used, and how to evaluate the performance of the wireless network is of great significance. In this paper, considering the shortcomings of existing performance evaluation methods, a comprehensive network performance evaluation method, the multi-index fuzzy analytic hierarchy process (MFAHP), which combines fuzzy mathematics with the traditional analytic hierarchy process (AHP), is presented. The method overcomes the lack of comprehensiveness and the subjectivity of existing performance evaluations. Experiments show that the method can reflect the network performance under real conditions. It has a direct guiding role in protocol selection, network cabling, and node setting, and can meet the requirements of different occasions by modifying the underlying parameters.
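
    A minimal sketch of the crisp AHP step that MFAHP builds on (the comparison matrix, index names, and scores below are illustrative; the fuzzy extension is not shown): index weights are taken from the principal eigenvector of a pairwise comparison matrix and combined into a single score.

        import numpy as np

        def ahp_weights(pairwise):
            """Priority weights from a reciprocal pairwise comparison matrix
            (classical AHP principal-eigenvector method)."""
            vals, vecs = np.linalg.eig(pairwise)
            w = np.real(vecs[:, np.argmax(np.real(vals))])
            return w / w.sum()

        # illustrative comparison of three network indexes: delay, packet loss, throughput
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        weights = ahp_weights(A)
        scores = np.array([0.8, 0.6, 0.9])        # normalized per-index scores (made up)
        overall = float(weights @ scores)          # weighted overall performance score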

  6. Tuberculosis patients' knowledge and beliefs about tuberculosis: a mixed methods study from the Pacific Island nation of Vanuatu.

    PubMed

    Viney, Kerri A; Johnson, Penelope; Tagaro, Markleen; Fanai, Saen; Linh, Nguyen N; Kelly, Paul; Harley, David; Sleigh, Adrian

    2014-05-17

    The setting for this study was the Pacific island nation of Vanuatu, an archipelago of 82 islands, located in the South Pacific Ocean. Our objective was to assess the knowledge, attitudes and practices of tuberculosis (TB) patients towards TB. This was a descriptive study using qualitative and quantitative methods. Quantitative analysis was based on the responses provided to closed questions, and we present frequencies to describe the TB patients' knowledge, attitudes and practice relating to TB. Qualitative analysis was based on open questions permitting fuller explanations. We used thematic analysis and developed a posteriori inductive categories to draw conclusions. Thirty five TB patients were interviewed; 22 (63%) were male. They attributed TB to cigarettes, kava, alcohol, contaminated food, sharing eating utensils and "kastom" (the local term for the traditional way of life, but also for sorcery). Most (94%) did not attribute TB to a bacterial cause. However, almost all TB patients (89%) thought that TB was best treated at a hospital with antibiotics. Three quarters (74%) experienced stigma after their TB diagnosis. Seeking health care from a traditional healer was common; 54% of TB patients stated that they would first consult a traditional healer for any illness. When seeking a diagnosis for signs and symptoms of TB, 34% first consulted a traditional healer. Patients cited cost, distance and beliefs about TB causation as reasons for first consulting a traditional healer or going to the hospital. Of the TB patients who consulted a traditional healer first, there was an average of two weeks delay before they consulted the health service. In some cases, however, the delay was up to six years. The majority of the TB patients interviewed did not attribute TB to a bacterial cause. Consulting a traditional healer for health care, including while seeking a diagnosis for TB symptoms, was common and may have delayed diagnosis. People require better information about TB to correct commonly held misperceptions about the disease. Traditional healers could also be engaged with the national TB programme, in order to refer people with signs and symptoms of TB to the nearest health service.

  7. Efficient statistically accurate algorithms for the Fokker-Planck equation in large dimensions

    NASA Astrophysics Data System (ADS)

    Chen, Nan; Majda, Andrew J.

    2018-02-01

    Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat-tailed, highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. Particularly, the parametric method provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace and is therefore computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems. It is shown in a stringent set of test problems that the method only requires on the order of O(100) ensembles to successfully recover the highly non-Gaussian transient PDFs in up to 6 dimensions with only small errors.

  8. Probability techniques for reliability analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Wetherhold, Robert C.; Ucci, Anthony M.

    1994-01-01

    Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservativism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
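
    A minimal Monte Carlo sketch of the reliability formulation described above (the distributions and their parameters are placeholders, not data from the report): the limit state g = strength - stress is sampled and the reliability is the probability that g stays positive.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        strength = rng.weibull(10.0, size=n) * 1200.0        # MPa, illustrative Weibull strength
        stress = rng.normal(loc=600.0, scale=80.0, size=n)   # MPa, illustrative applied stress
        g = strength - stress                                # limit state function
        reliability = np.mean(g > 0.0)                       # estimated P(no failure)
        failure_prob = 1.0 - reliability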

  9. Comparative analysis of cryopreservation methods in Chlamydomonas reinhardtii.

    PubMed

    Scarbrough, Chasity; Wirschell, Maureen

    2016-10-01

    Chlamydomonas is a model organism used for studies of many important biological processes. Traditionally, strains have been propagated on solid agar, which requires routine passaging for long-term maintenance. Cryopreservation of Chlamydomonas is possible, yet long-term viability is highly variable. Thus, improved cryopreservation methods for Chlamydomonas are an important requirement for sustained study of genetically defined strains. Here, we tested a commercial cryopreservation kit and directly compared its effectiveness to a methanol-based method. We also tested thaw-back procedures comparing the growth of cells in liquid culture or on solid agar media. We demonstrated that methanol was the superior cryopreservation method for Chlamydomonas compared to the commercial kit and that post-thaw culture conditions dramatically affect viability. We also demonstrated that cryopreserved cells could be successfully thawed and plated directly onto solid agar plates. Our findings have important implications for the long-term storage of Chlamydomonas that can likely be extended to other algal species. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Welding methods for joining thermoplastic polymers for the hermetic enclosure of medical devices.

    PubMed

    Amanat, Negin; James, Natalie L; McKenzie, David R

    2010-09-01

    New high performance polymers have been developed that challenge traditional encapsulation materials for permanent active medical implants. The gold standard for hermetic encapsulation for implants is a titanium enclosure which is sealed using laser welding. Polymers may be an alternative encapsulation material. Although many polymers are biocompatible, and permeability of polymers may be reduced to acceptable levels, the ability to create a hermetic join with an extended life remains the barrier to widespread acceptance of polymers for this application. This article provides an overview of the current techniques used for direct bonding of polymers, with a focus on thermoplastics. Thermal bonding methods are feasible, but some take too long and/or require two stage processing. Some methods are not suitable because of excessive heat load which may be delivered to sensitive components within the capsule. Laser welding is presented as the method of choice; however the establishment of suitable laser process parameters will require significant research. 2010. Published by Elsevier Ltd.

  11. Rapid determination of thermodynamic parameters from one-dimensional programmed-temperature gas chromatography for use in retention time prediction in comprehensive multidimensional chromatography.

    PubMed

    McGinitie, Teague M; Ebrahimi-Najafabadi, Heshmatollah; Harynuk, James J

    2014-01-17

    A new method for estimating the thermodynamic parameters of ΔH(T0), ΔS(T0), and ΔCP for use in thermodynamic modeling of GC×GC separations has been developed. The method is an alternative to the traditional isothermal separations required to fit a three-parameter thermodynamic model to retention data. Herein, a non-linear optimization technique is used to estimate the parameters from a series of temperature-programmed separations using the Nelder-Mead simplex algorithm. With this method, the time required to obtain estimates of thermodynamic parameters for a series of analytes is significantly reduced. This new method allows for precise predictions of retention time, with the average error being only 0.2 s for 1D separations. Predictions for GC×GC separations were also in agreement with experimental measurements, having an average relative error of 0.37% for (1)tr and 2.1% for (2)tr. Copyright © 2013 Elsevier B.V. All rights reserved.
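
    A schematic of the optimization step (the dummy retention model, data, and starting values below are placeholders; the real thermodynamic retention model integrates the holdup-time relation over the temperature program and is not reproduced here):

        import numpy as np
        from scipy.optimize import minimize

        def predict_tr_dummy(params, ramp_rate):
            """Stand-in for a thermodynamic retention model -- NOT the real GC
            calculation, just a smooth function of the parameters for demonstration."""
            dH, dS, dCp = params
            return abs(dH) / (1e3 * ramp_rate) + abs(dS) / 10.0 + abs(dCp) / (1e2 * ramp_rate)

        ramp_rates = np.array([5.0, 10.0, 20.0])       # degC/min, illustrative programs
        observed_tr = np.array([9.2, 5.1, 3.0])        # min, synthetic "measurements"

        def sse(params):                               # squared retention-time residuals
            pred = np.array([predict_tr_dummy(params, r) for r in ramp_rates])
            return np.sum((pred - observed_tr) ** 2)

        result = minimize(sse, x0=[40.0e3, 80.0, 50.0], method='Nelder-Mead')
        dH_est, dS_est, dCp_est = result.x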

  12. Simplified Dynamic Analysis of Grinders Spindle Node

    NASA Astrophysics Data System (ADS)

    Demec, Peter

    2014-12-01

    The contribution deals with the simplified dynamic analysis of a surface grinding machine spindle node. The dynamic analysis is based on the transfer matrix method, which is essentially a matrix form of the method of initial parameters. The advantage of the described method, despite the seemingly complex mathematical apparatus, is primarily that it does not require costly commercial finite element software to solve the problem. All calculations can be made, for example, in MS Excel, which is advantageous especially in the initial stages of constructing a spindle node for a rapid assessment of the suitability of its design. After detailing the entire structure of the spindle node it is then also necessary to perform a refined dynamic analysis in an FEM environment, which requires the necessary skills and experience and is therefore economically demanding. This work was developed within grant project KEGA No. 023TUKE-4/2012 Creation of a comprehensive educational - teaching material for the article Production technique using a combination of traditional and modern information technology and e-learning.
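
    A minimal sketch of the transfer-matrix idea referred to above (one common sign convention for a massless Euler-Bernoulli segment; the segment data are illustrative, and a full dynamic analysis would additionally insert point matrices carrying mass and the vibration frequency): the state vector [deflection, slope, moment, shear] is propagated by multiplying the field matrices of successive shaft segments.

        import numpy as np

        def field_matrix(l, EI):
            """Field (transfer) matrix of a massless elastic beam segment of length l
            and bending stiffness EI acting on the state [w, phi, M, Q]; sign
            conventions differ between textbooks."""
            return np.array([
                [1.0, l,   l**2 / (2*EI), l**3 / (6*EI)],
                [0.0, 1.0, l / EI,        l**2 / (2*EI)],
                [0.0, 0.0, 1.0,           l],
                [0.0, 0.0, 0.0,           1.0],
            ])

        # chain the segments of a stepped shaft (lengths in m, EI in N*m^2, illustrative)
        segments = [(0.05, 2.0e4), (0.08, 3.5e4), (0.05, 2.0e4)]
        T = np.eye(4)
        for l, EI in segments:
            T = field_matrix(l, EI) @ T     # overall transfer matrix
        # boundary conditions at both ends are then applied to find the unknown initial parameters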

  13. Assimilating concentration observations for transport and dispersion modeling in a meandering wind field

    NASA Astrophysics Data System (ADS)

    Haupt, Sue Ellen; Beyer-Lout, Anke; Long, Kerrie J.; Young, George S.

    Assimilating concentration data into an atmospheric transport and dispersion model can provide information to improve downwind concentration forecasts. The forecast model is typically a one-way coupled set of equations: the meteorological equations impact the concentration, but the concentration does not generally affect the meteorological field. Thus, indirect methods of using concentration data to influence the meteorological variables are required. The problem studied here involves a simple wind field forcing Gaussian dispersion. Two methods of assimilating concentration data to infer the wind direction are demonstrated. The first method is Lagrangian in nature and treats the puff as an entity using feature extraction coupled with nudging. The second method is an Eulerian field approach akin to traditional variational approaches, but minimizes the error by using a genetic algorithm (GA) to directly optimize the match between observations and predictions. Both methods show success at inferring the wind field. The GA-variational method, however, is more accurate but requires more computational time. Dynamic assimilation of a continuous release modeled by a Gaussian plume is also demonstrated using the genetic algorithm approach.

  14. Interactive searching of facial image databases

    NASA Astrophysics Data System (ADS)

    Nicholls, Robert A.; Shepherd, John W.; Shepherd, Jean

    1995-09-01

    A set of psychological facial descriptors has been devised to enable computerized searching of criminal photograph albums. The descriptors have been used to encode image databases of up to twelve thousand images. Using a system called FACES, the databases are searched by translating a witness' verbal description into corresponding facial descriptors. Trials of FACES have shown that this coding scheme is more productive and efficient than searching traditional photograph albums. An alternative method of searching the encoded database using a genetic algorithm is currently being tested. The genetic search method does not require the witness to verbalize a description of the target but merely to indicate a degree of similarity between the target and a limited selection of images from the database. The major drawback of FACES is that it requires manual encoding of images. Research is being undertaken to automate the process; however, it will require an algorithm which can predict human descriptive values. Alternatives to human-derived coding schemes exist using statistical classifications of images. Since databases encoded using statistical classifiers do not have an obvious direct mapping to human-derived descriptors, a search method which does not require the entry of human descriptors is required. A genetic search algorithm is being tested for such a purpose.

  15. A scoping review to explore the suitability of interactive voice response to conduct automated performance measurement of the patient's experience in primary care.

    PubMed

    Falconi, Michael; Johnston, Sharon; Hogg, William

    2016-05-01

    Practice-based performance measurement is fundamental for improvement and accountability in primary care. Traditional performance measurement of the patient's experience is often too costly and cumbersome for most practices. This scoping review explores the literature on the use of interactive voice response (IVR) telephone surveys to identify lessons for its use for collecting data on patient-reported outcome measures at the primary care practice level. The literature suggests IVR could potentially increase the capacity to reach more representative patient samples and those traditionally most difficult to engage. There is potential for long-term cost effectiveness and significant decrease of the burden on practices involved in collecting patient survey data. Challenges such as low response rates, mode effects, high initial set-up costs and maintenance fees, are also reported and require careful attention. This review suggests IVR may be a feasible alternative to traditional patient data collection methods, which should be further explored.

  16. Analysis and application of intelligence network based on FTTH

    NASA Astrophysics Data System (ADS)

    Feng, Xiancheng; Yun, Xiang

    2008-12-01

    With the continued rapid growth of the Internet, new network services emerge in an endless stream, especially the growth of network gaming, conference TV, video on demand, etc. The bandwidth requirement increases continuously, and network and optical device technologies are developing rapidly. FTTH supports all present and future services with enormous bandwidth, including traditional telecommunication services, traditional data services and traditional TV services, as well as future digital TV and VOD. With its huge bandwidth, FTTH is regarded as the ultimate solution for broadband networks and has become the final goal of optical access network development. First, the main services supported by FTTH are introduced, and key technologies such as FTTH system composition, topological structure, multiplexing, and optical cables and devices are analyzed, focusing on two realization methods - PON and P2P technology. Then an FTTH solution supporting comprehensive access (services such as broadband data, voice, video and narrowband private lines) is proposed. Finally, the engineering application of FTTH in residential districts and buildings is shown, which brings considerable economic and social benefits.

  17. Corning HYPERFlask® for viral amplification and production of diagnostic reagents.

    PubMed

    Kearney, Brian J; Voorhees, Matthew A; Williams, Priscilla L; Olschner, Scott P; Rossi, Cynthia A; Schoepp, Randal J

    2017-04-01

    Viral preparations are essential components in diagnostic research and development. The production of large quantities of virus traditionally is done by infecting numerous tissue culture flasks or roller bottles, which require large incubators and/or roller bottle racks. The Corning HYPERFlask ® is a multilayer flask that uses a gas permeable film to provide gas exchange between the cells and culture medium and the atmospheric environment. This study evaluated the suitability of the HYPERFlask for production of Lassa, Ebola, Bundibugyo, Reston, and Marburg viruses and compared it to more traditional methods using tissue culture flasks and roller bottles. The HYPERFlask produced cultures were equivalent in virus titer and indistinguishable in immunodiagnostic assays. The use of the Corning HYPERFlask for viral production is a viable alternative to traditional tissue culture flasks and roller bottles. HYPERFlasks allow for large volumes of virus to be produced in a small space without specialized equipment. Copyright © 2016. Published by Elsevier B.V.

  18. Possibility of reconstruction of dental plaster cast from 3D digital study models

    PubMed Central

    2013-01-01

    Objectives To compare traditional plaster casts, digital models and 3D printed copies of dental plaster casts based on various criteria. To determine whether 3D printed copies obtained using the open-source RepRap system can replace traditional plaster casts in dental practice. To compare and contrast the qualities of two possible 3D printing options – the open-source RepRap system and commercially available 3D printing. Design and settings A method comparison study on 10 dental plaster casts from the Orthodontic department, Department of Stomatology, 2nd Medical Faculty, Charles University Prague, Czech Republic. Material and methods Each of the 10 plaster casts was scanned with an inEos Blue scanner and then printed on a RepRap 3D printer [10 models] and a ProJet HD3000 3D printer [1 model]. Linear measurements between selected points on the dental arches of the upper and lower jaws were recorded on the plaster casts and their 3D copies and statistically analyzed. Results 3D printed copies have many advantages over traditional plaster casts. The precision and accuracy of the RepRap 3D printed copies of plaster casts were confirmed by the statistical analysis. Although commercially available 3D printing can reproduce more detail than the RepRap system, it is expensive, and for the purpose of clinical use it can be replaced by the cheaper prints obtained from the RepRap system. Conclusions Scanning of traditional plaster casts to obtain a digital model offers a pragmatic approach. The scans can subsequently be used as a template to print the plaster casts as required. Using 3D printers can replace traditional plaster casts primarily due to their accuracy and price. PMID:23721330

  19. Randomized evaluation of a web based interview process for urology resident selection.

    PubMed

    Shah, Satyan K; Arora, Sanjeev; Skipper, Betty; Kalishman, Summers; Timm, T Craig; Smith, Anthony Y

    2012-04-01

    We determined whether a web based interview process for resident selection could effectively replace the traditional on-site interview. For the 2010 to 2011 match cycle, applicants to the University of New Mexico urology residency program were randomized to participate in a web based interview process via Skype or a traditional on-site interview process. Both methods included interviews with the faculty, a tour of facilities and the opportunity to ask current residents any questions. To maintain fairness the applicants were then reinterviewed via the opposite process several weeks later. We assessed comparative effectiveness, cost, convenience and satisfaction using anonymous surveys largely scored on a 5-point Likert scale. Of 39 total participants (33 applicants and 6 faculty) 95% completed the surveys. The web based interview was less costly to applicants (mean $171 vs $364, p=0.05) and required less time away from school (10% missing 1 or more days vs 30%, p=0.04) compared to traditional on-site interview. However, applicants perceived the web based interview process as less effective than traditional on-site interview, with a mean 6-item summative effectiveness score of 21.3 vs 25.6 (p=0.003). Applicants and faculty favored continuing the web based interview process in the future as an adjunct to on-site interviews. Residency interviews can be successfully conducted via the Internet. The web based interview process reduced costs and improved convenience. The findings of this study support the use of videoconferencing as an adjunct to traditional interview methods rather than as a replacement. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  20. Traditional measures of normal anal sphincter function using high-resolution anorectal manometry (HRAM) in 115 healthy volunteers.

    PubMed

    Carrington, E V; Brokjaer, A; Craven, H; Zarate, N; Horrocks, E J; Palit, S; Jackson, W; Duthie, G S; Knowles, C H; Lunniss, P J; Scott, S M

    2014-05-01

    High-resolution anorectal manometry (HRAM) is a relatively new method for collection and interpretation of data relevant to sphincteric function, and for the first time allows a global appreciation of the anorectum as a functional unit. Historically, traditional anal manometry has been plagued by lack of standardization and healthy volunteer data of variable quality. The aims of this study were: (i) to obtain normative data sets for traditional measures of anorectal function using HRAM in healthy subjects and; (ii) to qualitatively describe novel physiological phenomena, which may be of future relevance when this method is applied to patients. 115 healthy subjects (96 female) underwent HRAM using a 10 channel, 12F solid-state catheter. Measurements were performed during rest, squeeze, cough, and simulated defecation (push). Data were displayed as color contour plots and analysed using a commercially available manometric system (Solar GI HRM v9.1, Medical Measurement Systems). Associations between age, gender and parity were subsequently explored. HRAM color contour plots provided clear delineation of the high-pressure zone within the anal canal and showed recruitment during maneuvers that altered intra-anal pressures. Automated analysis produced quantitative data, which have been presented on the basis of gender and parity due to the effect of these covariates on some sphincter functions. In line with traditional manometry, some age and gender differences were seen. Males had a greater functional anal canal length and anal pressures during the cough maneuver. Parity in females was associated with reduced squeeze increments. The study provides a large healthy volunteer dataset and parameters of traditional measures of anorectal function. A number of novel phenomena are appreciated, the significance of which will require further analysis and comparisons with patient populations. © 2014 John Wiley & Sons Ltd.

  1. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications besides the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information which is necessary to perform any meaningful seismic hazard analysis. The method is based on the probabilistic risk analysis approach common for applications in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA lies in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion to formulate different risk goals. The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.

  2. A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.

    PubMed

    Faya, Paul; Stamey, James D; Seaman, John W

    2017-01-01

    For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known D T , z , and F o values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
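
    A minimal grid-posterior sketch of the survivor-curve idea (all data, the noise level, and the flat prior are illustrative assumptions, not values from the paper), using the log-linear model log10 N(t) = log10 N0 - t/D:

        import numpy as np

        # synthetic survivor-curve data: exposure times (min) and surviving counts (log10 CFU)
        t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
        log10_counts = np.array([6.0, 4.9, 3.8, 2.6, 1.7])

        N0_log10 = 6.0
        sigma = 0.2                                    # assumed measurement sd (log10 units)
        D_grid = np.linspace(0.5, 5.0, 2000)           # candidate D values (min), flat prior

        pred = N0_log10 - np.outer(1.0 / D_grid, t)    # model predictions, one row per candidate D
        loglik = -0.5 * np.sum(((log10_counts - pred) / sigma) ** 2, axis=1)
        post = np.exp(loglik - loglik.max())
        post /= post.sum()

        D_mean = float(np.sum(D_grid * post))                               # posterior mean of D
        D_ci = D_grid[np.searchsorted(np.cumsum(post), [0.025, 0.975])]     # ~95% credible interval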

  3. Reflectance Prediction Modelling for Residual-Based Hyperspectral Image Coding

    PubMed Central

    Xiao, Rui; Gao, Junbin; Bossomaier, Terry

    2016-01-01

    A Hyperspectral (HS) image provides observational powers beyond human vision capability but represents more than 100 times the data of a traditional image. To transmit and store the huge volume of an HS image, we argue that a fundamental shift is required from the existing “original pixel intensity”-based coding approaches using traditional image coders (e.g., JPEG2000) to “residual”-based approaches using a video coder, for better compression performance. A modified video coder is required to exploit spatial-spectral redundancy using pixel-level reflectance modelling, because HS images differ from traditional videos in their spectral characteristics and in the shape domain of their panchromatic imagery. In this paper a novel coding framework using Reflectance Prediction Modelling (RPM) within the latest video coding standard, High Efficiency Video Coding (HEVC), is proposed for HS images. An HS image presents a wealth of data where every pixel is considered a vector over the spectral bands. By quantitative comparison and analysis of the pixel vector distribution along spectral bands, we conclude that modelling can predict the distribution and correlation of the pixel vectors across bands. To exploit the distribution of the known pixel vectors, we estimate a predicted current spectral band from the previous bands using Gaussian mixture-based modelling. The predicted band is used as an additional reference band, together with the immediately previous band, when we apply the HEVC. Every spectral band of an HS image is treated as an individual frame of a video. In this paper, we compare the proposed method with mainstream encoders. The experimental results are validated on three types of HS datasets with different wavelength ranges. The proposed method outperforms the existing mainstream HS encoders in terms of rate-distortion performance of HS image compression. PMID:27695102
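    A minimal stand-in for the band-prediction step described above: here the current band is predicted from the two previous bands with a global least-squares fit rather than the paper's Gaussian mixture model, and random data stands in for a real HS cube; only the mechanics of forming a predicted reference band and a residual are illustrated.

```python
import numpy as np

# Simplified stand-in for reflectance prediction: predict band k from the two
# previous bands by global least squares (not the paper's GMM).  Random data
# stands in for a real hyperspectral cube.
rng = np.random.default_rng(0)
cube = rng.random((8, 64, 64))             # (bands, rows, cols)

k = 5                                      # band to predict
X = np.stack([cube[k - 2].ravel(), cube[k - 1].ravel(), np.ones(64 * 64)], axis=1)
y = cube[k].ravel()
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
predicted = (X @ coef).reshape(64, 64)     # extra reference "band" for the video coder

residual = cube[k] - predicted             # what a residual-based coder would encode
print(float(np.abs(residual).mean()))
```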

  4. The economic costs of routine INR monitoring in infants and children--examining point-of-care devices used within the home setting compared to traditional anticoagulation clinic monitoring.

    PubMed

    Gaw, James R; Crowley, Steven; Monagle, Paul; Jones, Sophie; Newall, Fiona

    2013-07-01

    The use of point-of-care (POC) devices within the home for routine INR monitoring has demonstrated reliability, safety and effectiveness in the management of infants and children requiring long-term warfarin therapy. However, a comprehensive cost analysis of this method of management, compared with attending anticoagulation clinics, has not been reported. The aim of this study was to compare the estimated societal costs of attending anticoagulation clinics for routine INR monitoring with those of using a POC test in the home. This study used a comparative before-and-after design that included 60 infants and children managed via the Haematology department at a tertiary paediatric centre. Each participant was exposed to both modes of management at various times for a period of ≥3 months. A questionnaire consisting of 25 questions was sent to families to complete and return. Data collected included the frequency of monitoring, mode of travel to and from clinics, total time consumed, and the primary carer's income level. The home monitoring cohort saved a total of 1 hour 19 minutes per INR test compared with attending anticoagulation clinics and had a cost saving to society of $66.83 (AUD) per INR test compared with traditional care, incorporating health sector costs, travel expenses and lost time. The traditional model of care requires a considerable investment of time per test from both child and carer. Home INR monitoring in infants and children provides greater societal economic benefits compared with traditional models. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Traditional and Cognitive Job Analyses as Tools for Understanding the Skills Gap.

    ERIC Educational Resources Information Center

    Hanser, Lawrence M.

    Traditional methods of job and task analysis may be categorized as worker-oriented methods focusing on general human behaviors performed by workers in jobs or as job-oriented methods focusing on the technologies involved in jobs. The ability of both types of traditional methods to identify, understand, and communicate the skills needed in high…

  6. 21 CFR 11.2 - Implementation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; ELECTRONIC SIGNATURES General Provisions § 11.2 Implementation. (a) For records required to be maintained but... signatures in lieu of traditional signatures, in whole or in part, provided that the requirements of this... paper records or electronic signatures in lieu of traditional signatures, in whole or in part, provided...

  7. Change detection technique for muscle tone during static stretching by continuous muscle viscoelasticity monitoring using wearable indentation tester.

    PubMed

    Okamura, Naomi; Kobayashi, Yo; Sugano, Shigeki; Fujie, Masakatsu G

    2017-07-01

    Static stretching is widely performed to decrease muscle tone as a part of rehabilitation protocols. Finding the optimal duration of static stretching is important to minimize the time required for rehabilitation therapy, and it would help maintain the patient's motivation for daily rehabilitation tasks. Several studies have evaluated static stretching; however, the recommended duration of static stretching varies widely, generally between 15 and 30 s, because the traditional methods for the assessment of muscle tone do not monitor the continuous change in the target muscle's state. We have developed a method to monitor the viscoelasticity of one muscle continuously during static stretching, using a wearable indentation tester. In this study, we investigated a suitable signal processing method to detect the time required to change the muscle tone, utilizing the data collected with the wearable indentation tester. By calculating a viscoelastic index over a fixed time window, we confirmed that the stretching duration required to bring about a decrease in muscle tone could be obtained with an accuracy on the order of 1 s.
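    A sketch of the windowed change-detection idea, under assumed values: the sampling rate, window length, drop threshold and the synthetic viscoelastic-index signal are all illustrative, not the study's processing pipeline.

```python
import numpy as np

# Windowed change detection on a synthetic "viscoelastic index" signal.
# Sampling rate, window length and threshold are illustrative assumptions.
fs = 10.0                                   # samples per second
t = np.arange(0, 60, 1 / fs)                # 60 s of static stretching
index = 1.0 - 0.3 * (1 - np.exp(-t / 20)) + 0.02 * np.random.randn(t.size)

win = int(5 * fs)                           # 5 s moving window
windowed = np.convolve(index, np.ones(win) / win, mode="valid")

baseline = windowed[0]
threshold = 0.9 * baseline                  # declare a change at a 10 % drop
change_idx = np.argmax(windowed < threshold)
print("change detected after %.1f s" % (change_idx / fs))
```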

  8. The 3D dynamics of the Cosserat rod as applied to continuum robotics

    NASA Astrophysics Data System (ADS)

    Jones, Charles Rees

    2011-12-01

    In the effort to simulate the biologically inspired continuum robot's dynamic capabilities, researchers have been faced with the daunting task of simulating, in real time, the complete three-dimensional dynamics of the "beam-like" structure, which include the three "stiff" degrees of freedom of transverse and dilational shear. Therefore, researchers have traditionally limited the difficulty of the problem with simplifying assumptions. This study, however, puts forward a solution that makes no simplifying assumptions and trades off only the real-time requirement of the desired solution. The solution is a Finite Difference Time Domain method employing an explicit single-step method with cheap right-hand sides. The cheap right-hand sides are the result of a rather ingenious formulation of the classical beam, called the Cosserat rod, due first to the Cosserat brothers and later to Stuart S. Antman, which results in five nonlinear but uncoupled equations that require only multiplication and addition. The method is therefore suitable for hardware implementation, thus moving the real-time requirement from a software solution to a hardware solution.
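    To illustrate the "explicit single step with cheap right-hand sides" pattern, the sketch below advances a 1D wave equation with a leapfrog update; it is only a schematic stand-in, not the five Cosserat-rod equations discussed in the abstract.

```python
import numpy as np

# Schematic explicit finite-difference time-domain update with a cheap
# right-hand side, shown for a 1D wave equation as a stand-in for the
# Cosserat-rod equations (which this is not).
nx, nt = 200, 1000
dx, dt, c = 1.0 / nx, 0.5 / nx, 1.0        # CFL number c*dt/dx = 0.5

u_prev = np.exp(-((np.linspace(0, 1, nx) - 0.5) ** 2) / 0.01)  # initial pulse
u = u_prev.copy()

for _ in range(nt):
    rhs = np.zeros(nx)
    # cheap RHS: a few multiplications and additions per interior node
    rhs[1:-1] = (c * dt / dx) ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u_next = 2 * u - u_prev + rhs           # explicit single-step (leapfrog) update
    u_prev, u = u, u_next

print(float(np.abs(u).max()))
```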

  9. SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, C; Wessels, B; Hamilton, H

    2014-06-01

    Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) and traditional QA requirements, a workflow for a proton therapy treatment course is set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of a QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each subprocess, the personnel involved, the equipment needed and the computer hardware/software as well as network requirements are defined by a team of clinical staff, administrators and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two controlled by the CT computer, two require department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirement. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant efforts are involved in the development of the workflow in a PT treatment course. Our hybrid model of combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and designing of a QA program in PT.

  10. Multitasking the Davidson algorithm for the large, sparse eigenvalue problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Umar, V.M.; Fischer, C.F.

    1989-01-01

    The authors report how the Davidson algorithm, developed for handling the eigenvalue problem for large and sparse matrices arising in quantum chemistry, was modified for use in atomic structure calculations. To date these calculations have used traditional eigenvalue methods, which limit the range of feasible calculations because of their excessive memory requirements and unsatisfactory performance attributed to time-consuming and costly processing of zero-valued elements. The replacement of a traditional matrix eigenvalue method by the Davidson algorithm reduced these limitations. Significant speedup was found, which varied with the size of the underlying problem and its sparsity. Furthermore, the range of matrix sizes that can be manipulated efficiently was expanded by more than one order of magnitude. On the CRAY X-MP the code was vectorized and the importance of gather/scatter analyzed. A parallelized version of the algorithm obtained an additional 35% reduction in execution time. Speedup due to vectorization and concurrency was also measured on the Alliant FX/8.
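    A minimal sketch of a Davidson iteration for the lowest eigenvalue of a large, diagonally dominant symmetric matrix, the regime the abstract targets; the matrix size, diagonal preconditioner and convergence settings are illustrative, and none of the vectorization or multitasking discussed in the paper is shown.

```python
import numpy as np

# Minimal Davidson iteration for the lowest eigenvalue of a diagonally
# dominant symmetric matrix.  Sizes and tolerances are illustrative.
n = 500
rng = np.random.default_rng(1)
noise = rng.random((n, n))
A = np.diag(np.arange(1.0, n + 1)) + 1e-3 * (noise + noise.T) / 2

V = np.zeros((n, 1)); V[0, 0] = 1.0                         # initial guess vector
diag = np.diag(A)
for _ in range(30):
    H = V.T @ A @ V                                         # Rayleigh-Ritz step
    w, s = np.linalg.eigh(H)
    theta, y = w[0], s[:, 0]
    x = V @ y
    r = A @ x - theta * x                                   # residual
    if np.linalg.norm(r) < 1e-8:
        break
    denom = theta - diag
    denom[np.abs(denom) < 1e-6] = 1e-6                      # avoid division by ~0
    t = r / denom                                           # diagonal preconditioner
    t -= V @ (V.T @ t)                                      # orthogonalize against subspace
    t_norm = np.linalg.norm(t)
    if t_norm < 1e-12:
        break
    V = np.hstack([V, (t / t_norm).reshape(-1, 1)])         # expand subspace

print(float(theta))   # approximates the smallest eigenvalue (~1.0)
```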

  11. A Concept of Thermographic Method for Non-Destructive Testing of Polymeric Composite Structures Using Self-Heating Effect

    PubMed Central

    2017-01-01

    Traditional techniques of active thermography require an external source of energy for excitation, usually in the form of high-power lamps or ultrasonic devices. In this paper, the author presents an alternative approach based on the self-heating effect observable in polymer-based structures during cyclic loading. The presented approach is based, firstly, on determination of the bending resonance frequencies of a tested structure and, then, on excitation of the structure with a multi-harmonic signal constructed from harmonics at the frequencies of the determined resonances. Following this, heating-up of the tested structure occurs in the locations of stress concentration and mechanical energy dissipation due to the viscoelastic response of the structure. By applying a multi-harmonic signal, one ensures coverage of the structure by such heated regions. The concept is verified experimentally on artificially damaged composite specimens. The results demonstrate the presented approach and indicate its potential, especially when traditional methods of excitation with an external source for thermographic inspection cannot be applied. PMID:29283430

  12. Adaptive tracking control of leader-following linear multi-agent systems with external disturbances

    NASA Astrophysics Data System (ADS)

    Lin, Hanquan; Wei, Qinglai; Liu, Derong; Ma, Hongwen

    2016-10-01

    In this paper, the consensus problem for leader-following linear multi-agent systems with external disturbances is investigated. Brownian motions are used to describe exogenous disturbances. A distributed tracking controller based on Riccati inequalities with an adaptive law for adjusting coupling weights between neighbouring agents is designed for leader-following multi-agent systems under fixed and switching topologies. In traditional distributed static controllers, the coupling weights depend on the communication graph. However, coupling weights associated with the feedback gain matrix in our method are updated by state errors between neighbouring agents. We further present the stability analysis of leader-following multi-agent systems with stochastic disturbances under switching topology. Most traditional literature requires the graph to be connected all the time, while the communication graph is only assumed to be jointly connected in this paper. The design technique is based on Riccati inequalities and algebraic graph theory. Finally, simulations are given to show the validity of our method.

  13. A Concept of Thermographic Method for Non-Destructive Testing of Polymeric Composite Structures Using Self-Heating Effect.

    PubMed

    Katunin, Andrzej

    2017-12-28

    Traditional techniques of active thermography require an external source of energy for excitation, usually in the form of high-power lamps or ultrasonic devices. In this paper, the author presents an alternative approach based on the self-heating effect observable in polymer-based structures during cyclic loading. The presented approach is based, firstly, on determination of the bending resonance frequencies of a tested structure and, then, on excitation of the structure with a multi-harmonic signal constructed from harmonics at the frequencies of the determined resonances. Following this, heating-up of the tested structure occurs in the locations of stress concentration and mechanical energy dissipation due to the viscoelastic response of the structure. By applying a multi-harmonic signal, one ensures coverage of the structure by such heated regions. The concept is verified experimentally on artificially damaged composite specimens. The results demonstrate the presented approach and indicate its potential, especially when traditional methods of excitation with an external source for thermographic inspection cannot be applied.

  14. Adjacent slice prostate cancer prediction to inform MALDI imaging biomarker analysis

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Cazares, Lisa; Nyalwidhe, Julius; Troyer, Dean; Semmes, O. John; Li, Jiang; McKenzie, Frederic D.

    2010-03-01

    Prostate cancer is the second most common type of cancer among men in the US [1]. Traditionally, prostate cancer diagnosis is made by the analysis of prostate-specific antigen (PSA) levels and of histopathological images of biopsy samples under the microscope. Proteomic biomarkers can improve upon these methods. MALDI molecular spectra imaging is used to visualize protein/peptide concentrations across biopsy samples to search for biomarker candidates. Unfortunately, traditional processing methods require histopathological examination of one slice of a biopsy sample while the adjacent slice is subjected to the tissue-destroying desorption and ionization processes of MALDI. The highest-confidence tumor regions gained from the histopathological analysis are then mapped to the MALDI spectra data to estimate the regions for biomarker identification from the MALDI imaging. This paper describes a process that provides a significantly better estimate of the cancer tumor to be mapped onto the MALDI imaging spectra coordinates, using the high-confidence region to predict the true area of the tumor on the adjacent MALDI-imaged slice.

  15. Neural Net Gains Estimation Based on an Equivalent Model

    PubMed Central

    Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Fernández Muñoz, José Luis; Esmeralda Vigueras Velázquez, Midory

    2016-01-01

    A model of an Equivalent Artificial Neural Net (EANN) describes the gains set, viewed as parameters in a layer, and this consideration is a reproducible process, applicable to a neuron in a neural net (NN). The EANN helps to estimate the NN gains or parameters, so we propose two methods to determine them. The first considers a fuzzy inference combined with the traditional Kalman filter, obtaining the equivalent model and estimating in a fuzzy sense the gains matrix A and the proper gain K in the traditional filter identification. The second develops a direct estimation in state space, describing an EANN using the expected value and the recursive description of the gains estimation. Finally, a comparison of both descriptions is performed, highlighting that the analytical method describes the neural net coefficients in a direct form, whereas the other technique requires selecting from the Knowledge Base (KB) the factors based on the functional error and the reference signal built with the past information of the system. PMID:27366146

  16. Neural Net Gains Estimation Based on an Equivalent Model.

    PubMed

    Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Fernández Muñoz, José Luis; Esmeralda Vigueras Velázquez, Midory

    2016-01-01

    A model of an Equivalent Artificial Neural Net (EANN) describes the gains set, viewed as parameters in a layer, and this consideration is a reproducible process, applicable to a neuron in a neural net (NN). The EANN helps to estimate the NN gains or parameters, so we propose two methods to determine them. The first considers a fuzzy inference combined with the traditional Kalman filter, obtaining the equivalent model and estimating in a fuzzy sense the gains matrix A and the proper gain K in the traditional filter identification. The second develops a direct estimation in state space, describing an EANN using the expected value and the recursive description of the gains estimation. Finally, a comparison of both descriptions is performed, highlighting that the analytical method describes the neural net coefficients in a direct form, whereas the other technique requires selecting from the Knowledge Base (KB) the factors based on the functional error and the reference signal built with the past information of the system.

  17. Zero-fringe demodulation method based on location-dependent birefringence dispersion in polarized low-coherence interferometry.

    PubMed

    Wang, Shuang; Liu, Tiegen; Jiang, Junfeng; Liu, Kun; Yin, Jinde; Qin, Zunqi; Zou, Shengliang

    2014-04-01

    We present a high precision and fast speed demodulation method for a polarized low-coherence interferometer with location-dependent birefringence dispersion. Based on the characteristics of location-dependent birefringence dispersion and five-step phase-shifting technology, the method accurately retrieves the peak position of zero-fringe at the central wavelength, which avoids the fringe order ambiguity. The method processes data only in the spatial domain and reduces the computational load greatly. We successfully demonstrated the effectiveness of the proposed method in an optical fiber Fabry-Perot barometric pressure sensing experiment system. Measurement precision of 0.091 kPa was realized in the pressure range of 160 kPa, and computation time was improved by 10 times compared to the traditional phase-based method that requires Fourier transform operation.
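    For orientation, the snippet below applies the generic five-step (Hariharan) phase-shifting formula to five synthetic intensity samples; the paper's specific variant and its handling of location-dependent birefringence dispersion are not reproduced.

```python
import numpy as np

# Generic five-step (Hariharan) phase-shifting formula on synthetic samples;
# phase step, contrast and the phase to recover are illustrative.
delta = np.pi / 2                                   # nominal phase step
phi_true = 1.2                                      # illustrative phase to recover
frames = [1.0 + 0.8 * np.cos(phi_true + k * delta) for k in range(-2, 3)]
I1, I2, I3, I4, I5 = frames

phi = np.arctan2(2.0 * (I2 - I4), 2.0 * I3 - I1 - I5)
print(phi)   # ~1.2 rad (wrapped to (-pi, pi])
```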

  18. Astronomical Distance Determination in the Space Age. Secondary Distance Indicators

    NASA Astrophysics Data System (ADS)

    Czerny, Bożena; Beaton, Rachael; Bejger, Michał; Cackett, Edward; Dall'Ora, Massimo; Holanda, R. F. L.; Jensen, Joseph B.; Jha, Saurabh W.; Lusso, Elisabeta; Minezaki, Takeo; Risaliti, Guido; Salaris, Maurizio; Toonen, Silvia; Yoshii, Yuzuru

    2018-02-01

    The formal division of distance indicators into primary and secondary leads to difficulties in describing methods that can actually be used in two ways: with and without the support of other methods for scaling. Thus, instead of concentrating on the scaling requirement, we concentrate on all methods of distance determination to extragalactic sources that are intended, at least formally, for use on individual sources. Among those, Type Ia supernovae are clearly the leader due to their enormous success in the determination of the expansion rate of the Universe. However, new methods are developing rapidly, and there is also progress in the more traditional methods. We give a general overview of the methods but mostly concentrate on the most recent developments in each field and on future expectations.

  19. A finite element: Boundary integral method for electromagnetic scattering. Ph.D. Thesis Technical Report, Feb. - Sep. 1992

    NASA Technical Reports Server (NTRS)

    Collins, J. D.; Volakis, John L.

    1992-01-01

    A method that combines the finite element and boundary integral techniques for the numerical solution of electromagnetic scattering problems is presented. The finite element method is well known for requiring a low order storage and for its capability to model inhomogeneous structures. Of particular emphasis in this work is the reduction of the storage requirement by terminating the finite element mesh on a boundary in a fashion which renders the boundary integrals in convolutional form. The fast Fourier transform is then used to evaluate these integrals in a conjugate gradient solver, without a need to generate the actual matrix. This method has a marked advantage over traditional integral equation approaches with respect to the storage requirement of highly inhomogeneous structures. Rectangular, circular, and ogival mesh termination boundaries are examined for two-dimensional scattering. In the case of axially symmetric structures, the boundary integral matrix storage is reduced by exploiting matrix symmetries and solving the resulting system via the conjugate gradient method. In each case several results are presented for various scatterers aimed at validating the method and providing an assessment of its capabilities. Important in methods incorporating boundary integral equations is the issue of internal resonance. A method is implemented for their removal, and is shown to be effective in the two-dimensional and three-dimensional applications.
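    The storage-saving idea of evaluating a convolutional boundary operator with FFTs inside a conjugate-gradient solver can be sketched as below; the circulant kernel and right-hand side are illustrative assumptions, chosen only so that the operator is symmetric positive definite, and the dense matrix is never formed.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Matrix-free matvec of a circulant (convolutional) operator via FFT,
# used inside a conjugate-gradient solve.  Kernel and RHS are illustrative.
n = 1024
dist = np.minimum(np.arange(n), n - np.arange(n))
kernel = np.exp(-0.5 * dist)          # symmetric, rapidly decaying off-diagonal part
kernel[0] += 3.0                      # strengthen the diagonal so the operator is SPD
kernel_hat = np.fft.fft(kernel)

def matvec(x):
    # circulant matrix-vector product in O(n log n) via the convolution theorem
    return np.real(np.fft.ifft(kernel_hat * np.fft.fft(x)))

A = LinearOperator((n, n), matvec=matvec, dtype=float)
b = np.ones(n)
x, info = cg(A, b)
print(info, float(np.linalg.norm(matvec(x) - b)))
```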

  20. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  1. Mathematical decision theory applied to land capability: a case study in the community of madrid.

    PubMed

    Antón, J M; Saa-Requejo, A; Grau, J B; Gallardo, J; Díaz, M C; Andina, Diego; Sanchez, M E; Tarquis, A M

    2014-03-01

    In land evaluation science, a standard data set is obtained for each land unit to determine the land capability class for various uses, such as different farming systems, forestry, or the conservation or suitability of a specific crop. In this study, we used mathematical decision theory (MDT) methods to address this task. Mathematical decision theory has been used in areas such as management, finance, industrial design, rural development, the environment, and projects for future welfare to study quality and aptness problems using several criteria. We also review MDT applications in soil science and discuss the suitability of MDT methods for dealing simultaneously with a number of problems. The aim of the work was to show how MDT can be used to obtain a valid land quality index and to compare this with a traditional land capability method. Therefore, an additive classification method was applied to obtain a land quality index for 122 land units that were compiled for a case study of the Community of Madrid, Spain, and the results were compared with a previously assigned land capability class using traditional methods based on the minimum requirements for land attributes. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
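    A minimal sketch of an additive (weighted-sum) land quality index of the kind compared in the study; the attribute names, weights, normalization bounds and the two example land units are hypothetical.

```python
# Additive (weighted-sum) land quality index; all attributes, weights,
# bounds and example units are hypothetical illustrations.
weights = {"soil_depth_cm": 0.3, "slope_pct": 0.3, "organic_matter_pct": 0.2, "ph": 0.2}
bounds = {"soil_depth_cm": (0, 150), "slope_pct": (0, 30), "organic_matter_pct": (0, 6), "ph": (4, 8)}
higher_is_better = {"soil_depth_cm": True, "slope_pct": False, "organic_matter_pct": True, "ph": True}

def quality_index(unit):
    score = 0.0
    for attr, w in weights.items():
        lo, hi = bounds[attr]
        x = (unit[attr] - lo) / (hi - lo)          # normalize to [0, 1]
        score += w * (x if higher_is_better[attr] else 1.0 - x)
    return score

unit_a = {"soil_depth_cm": 120, "slope_pct": 3, "organic_matter_pct": 2.5, "ph": 6.8}
unit_b = {"soil_depth_cm": 40, "slope_pct": 18, "organic_matter_pct": 1.0, "ph": 5.0}
print(quality_index(unit_a), quality_index(unit_b))
```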

  2. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    PubMed

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which takes plant abundance as an auxiliary variable, was explored in an experimental study on a 50 m×50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; secondly, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results The SOPA method proposed in this study had the smallest absolute error, 0.2138; the traditional systematic sampling method gave the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.
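    As an illustration of allocating survey points across plant-abundance strata, the sketch below uses Neyman allocation as a generic stand-in for the Hammond-McCullagh calculation used in the paper; stratum sizes and standard deviations are made-up numbers.

```python
import numpy as np

# Neyman allocation of sampling points across plant-abundance strata
# (a generic stand-in, not the paper's Hammond-McCullagh calculation).
N_h = np.array([400, 900, 700, 350, 150])      # number of grid cells per stratum
S_h = np.array([0.4, 0.9, 1.5, 2.2, 3.0])      # snail-density std. dev. per stratum
n_total = 120                                   # total sampling points available

n_h = n_total * (N_h * S_h) / np.sum(N_h * S_h)
print(np.round(n_h).astype(int))                # points assigned to each of 5 strata
```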

  3. A practical deconvolution algorithm in multi-fiber spectra extraction

    NASA Astrophysics Data System (ADS)

    Zhang, Haotong; Li, Guangwei; Bai, Zhongrui

    2015-08-01

    The deconvolution algorithm is a very promising method in multi-fiber spectroscopy data reduction: it can extract spectra down to the photon noise level as well as improve the spectral resolution, but, as mentioned in Bolton & Schlegel (2010), it is limited by its huge computation requirement and thus cannot be implemented directly in actual data reduction. We develop a practical algorithm to solve the computation problem. The new algorithm can deconvolve a 2D fiber spectral image of any size with actual PSFs, which may vary with position. We further consider the influence of noise, which is an intrinsic ill-posed problem in deconvolution algorithms, and modify our method with a Tikhonov regularization term to suppress the method-induced noise. A series of simulations based on LAMOST data are carried out to test our method under more realistic situations with Poisson noise and extreme cross-talk, i.e., where the fiber-to-fiber distance is comparable to the FWHM of the fiber profile. Compared with the results of traditional extraction methods, i.e., the Aperture Extraction Method and the Profile Fitting Method, our method shows both higher S/N and higher spectral resolution. The computation time for a noise-added image with 250 fibers and 4k pixels in the wavelength direction is about 2 hours when the fiber cross-talk is not extreme and 3.5 hours in the extreme cross-talk case. We finally apply our method to real LAMOST data. We find that the 1D spectrum extracted by our method has both higher SNR and resolution than the traditional methods, but there are still some suspicious weak features around the strong emission lines, possibly caused by the noise sensitivity of the method. How to further attenuate the noise influence will be the topic of our future work. As we have demonstrated, multi-fiber spectra extracted by our method will have higher resolution and signal-to-noise ratio and thus will provide more accurate information (such as higher radial velocity and metallicity measurement accuracy in stellar physics) to astronomers than traditional methods.
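    A small-scale sketch of the Tikhonov-regularized extraction idea: a cross-dispersion cut is modelled as b = A x + noise, with each column of A a fibre profile, and the regularized normal equations are solved directly. Profile widths, fibre spacing and the regularization weight are illustrative, and the full 2D, position-dependent PSF treatment of the paper is not reproduced.

```python
import numpy as np

# Tikhonov-regularized fibre flux extraction on a 1D cross-dispersion cut;
# profiles, spacing, noise level and regularization weight are illustrative.
npix, nfib = 200, 10
centers = np.linspace(15, 185, nfib)
x_axis = np.arange(npix)
A = np.exp(-0.5 * ((x_axis[:, None] - centers[None, :]) / 3.0) ** 2)  # fibre profiles

rng = np.random.default_rng(2)
flux_true = rng.uniform(50, 200, nfib)
b = A @ flux_true + rng.normal(0, 2.0, npix)    # observed cross-dispersion cut

lam = 1.0                                        # Tikhonov regularization weight
flux = np.linalg.solve(A.T @ A + lam * np.eye(nfib), A.T @ b)
print(np.round(flux_true, 1))
print(np.round(flux, 1))
```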

  4. Crystallization of the Large Membrane Protein Complex Photosystem I in a Microfluidic Channel

    PubMed Central

    Abdallah, Bahige G.; Kupitz, Christopher; Fromme, Petra; Ros, Alexandra

    2014-01-01

    Traditional macroscale protein crystallization is accomplished non-trivially by exploring a range of protein concentrations and buffers in solution until a suitable combination is attained. This methodology is time consuming and resource intensive, hindering protein structure determination. Even more difficulties arise when crystallizing large membrane protein complexes such as photosystem I (PSI) due to their large unit cells dominated by solvent and complex characteristics that call for even stricter buffer requirements. Structure determination techniques tailored for these ‘difficult to crystallize’ proteins such as femtosecond nanocrystallography are being developed, yet still need specific crystal characteristics. Here, we demonstrate a simple and robust method to screen protein crystallization conditions at low ionic strength in a microfluidic device. This is realized in one microfluidic experiment using low sample amounts, unlike traditional methods where each solution condition is set up separately. Second harmonic generation microscopy via Second Order Nonlinear Imaging of Chiral Crystals (SONICC) was applied for the detection of nanometer and micrometer sized PSI crystals within microchannels. To develop a crystallization phase diagram, crystals imaged with SONICC at specific channel locations were correlated to protein and salt concentrations determined by numerical simulations of the time-dependent diffusion process along the channel. Our method demonstrated that a portion of the PSI crystallization phase diagram could be reconstructed in excellent agreement with crystallization conditions determined by traditional methods. We postulate that this approach could be utilized to efficiently study and optimize crystallization conditions for a wide range of proteins that are poorly understood to date. PMID:24191698

  5. Drifting Apart or Converging? Grades among Non-Traditional and Traditional Students over the Course of Their Studies: A Case Study from Germany

    ERIC Educational Resources Information Center

    Brändle, Tobias; Lengfeld, Holger

    2017-01-01

    Since 2009, German universities were opened by law to freshmen who do not possess the traditional graduation certificate required for entry into University, but who are rather vocationally qualified. In this article, we track the grades of these so-called non-traditional students and compare them to those of traditional students using a…

  6. Surface characterization of nanomaterials and nanoparticles: Important needs and challenging opportunities

    PubMed Central

    Baer, Donald R.; Engelhard, Mark H.; Johnson, Grant E.; Laskin, Julia; Lai, Jinfeng; Mueller, Karl; Munusamy, Prabhakaran; Thevuthasan, Suntharampillai; Wang, Hongfei; Washton, Nancy; Elder, Alison; Baisch, Brittany L.; Karakoti, Ajay; Kuchibhatla, Satyanarayana V. N. T.; Moon, DaeWon

    2013-01-01

    This review examines characterization challenges inherently associated with understanding nanomaterials and the roles surface and interface characterization methods can play in meeting some of the challenges. In parts of the research community, there is growing recognition that studies and published reports on the properties and behaviors of nanomaterials often have reported inadequate or incomplete characterization. As a consequence, the true value of the data in these reports is, at best, uncertain. With the increasing importance of nanomaterials in fundamental research and technological applications, it is desirable that researchers from the wide variety of disciplines involved recognize the nature of these often unexpected challenges associated with reproducible synthesis and characterization of nanomaterials, including the difficulties of maintaining desired materials properties during handling and processing due to their dynamic nature. It is equally valuable for researchers to understand how characterization approaches (surface and otherwise) can help to minimize synthesis surprises and to determine how (and how quickly) materials and properties change in different environments. Appropriate application of traditional surface sensitive analysis methods (including x-ray photoelectron and Auger electron spectroscopies, scanning probe microscopy, and secondary ion mass spectroscopy) can provide information that helps address several of the analysis needs. In many circumstances, extensions of traditional data analysis can provide considerably more information than normally obtained from the data collected. Less common or evolving methods with surface selectivity (e.g., some variations of nuclear magnetic resonance, sum frequency generation, and low and medium energy ion scattering) can provide information about surfaces or interfaces in working environments (operando or in situ) or information not provided by more traditional methods. Although these methods may require instrumentation or expertise not generally available, they can be particularly useful in addressing specific questions, and examples of their use in nanomaterial research are presented. PMID:24482557

  7. Surface characterization of nanomaterials and nanoparticles: Important needs and challenging opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Donald R.; Engelhard, Mark H.; Johnson, Grant E.

    2013-09-15

    This review examines characterization challenges inherently associated with understanding nanomaterials and the roles surface and interface characterization methods can play in meeting some of the challenges. In parts of the research community, there is growing recognition that studies and published reports on the properties and behaviors of nanomaterials often have reported inadequate or incomplete characterization. As a consequence, the true value of the data in these reports is, at best, uncertain. With the increasing importance of nanomaterials in fundamental research and technological applications, it is desirable that researchers from the wide variety of disciplines involved recognize the nature of these often unexpected challenges associated with reproducible synthesis and characterization of nanomaterials, including the difficulties of maintaining desired materials properties during handling and processing due to their dynamic nature. It is equally valuable for researchers to understand how characterization approaches (surface and otherwise) can help to minimize synthesis surprises and to determine how (and how quickly) materials and properties change in different environments. Appropriate application of traditional surface sensitive analysis methods (including x-ray photoelectron and Auger electron spectroscopies, scanning probe microscopy, and secondary ion mass spectroscopy) can provide information that helps address several of the analysis needs. In many circumstances, extensions of traditional data analysis can provide considerably more information than normally obtained from the data collected. Less common or evolving methods with surface selectivity (e.g., some variations of nuclear magnetic resonance, sum frequency generation, and low and medium energy ion scattering) can provide information about surfaces or interfaces in working environments (operando or in situ) or information not provided by more traditional methods. Although these methods may require instrumentation or expertise not generally available, they can be particularly useful in addressing specific questions, and examples of their use in nanomaterial research are presented.

  8. Online selective kernel-based temporal difference learning.

    PubMed

    Chen, Xingguo; Gao, Yang; Wang, Ruili

    2013-12-01

    In this paper, an online selective kernel-based temporal difference (OSKTD) learning algorithm is proposed to deal with large scale and/or continuous reinforcement learning problems. OSKTD includes two online procedures: online sparsification and parameter updating for the selective kernel-based value function. A new sparsification method (i.e., a kernel distance-based online sparsification method) is proposed based on selective ensemble learning, which is computationally less complex compared with other sparsification methods. With the proposed sparsification method, the sparsified dictionary of samples is constructed online by checking if a sample needs to be added to the sparsified dictionary. In addition, based on local validity, a selective kernel-based value function is proposed to select the best samples from the sample dictionary for the selective kernel-based value function approximator. The parameters of the selective kernel-based value function are iteratively updated by using the temporal difference (TD) learning algorithm combined with the gradient descent technique. The complexity of the online sparsification procedure in the OSKTD algorithm is O(n). In addition, two typical experiments (Maze and Mountain Car) are used to compare with both traditional and up-to-date O(n) algorithms (GTD, GTD2, and TDC using the kernel-based value function), and the results demonstrate the effectiveness of our proposed algorithm. In the Maze problem, OSKTD converges to an optimal policy and converges faster than both traditional and up-to-date algorithms. In the Mountain Car problem, OSKTD converges, requires less computation time compared with other sparsification methods, reaches a better local optimum than the traditional algorithms, and converges much faster than the up-to-date algorithms. In addition, OSKTD can reach a competitive ultimate optimum compared with the up-to-date algorithms.
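    A sketch of a kernel distance-based online sparsification rule of the kind described above: a new sample enters the dictionary only if its feature-space distance to every stored sample exceeds a threshold. The RBF kernel, its width, the threshold and the random state stream are illustrative assumptions.

```python
import numpy as np

# Kernel distance-based online sparsification of a stream of observed states;
# kernel width, threshold and the random stream are illustrative.
def rbf(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_distance_sq(x, d):
    # squared distance in feature space: k(x,x) + k(d,d) - 2 k(x,d)
    return rbf(x, x) + rbf(d, d) - 2.0 * rbf(x, d)

threshold = 0.5
dictionary = []
rng = np.random.default_rng(3)
for _ in range(1000):                      # stream of observed states
    s = rng.random(2)
    if not dictionary or min(kernel_distance_sq(s, d) for d in dictionary) > threshold:
        dictionary.append(s)

print(len(dictionary))                     # sparsified dictionary size << 1000
```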

  9. A comparison of two prospective risk analysis methods: Traditional FMEA and a modified healthcare FMEA.

    PubMed

    Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya

    2016-12-01

    To examine the abilities of a traditional failure mode and effects analysis (FMEA) and a modified healthcare FMEA (m-HFMEA) scoring method by comparing the degree of congruence in identifying high-risk failures. The authors applied two prospective quality management methods to surface image-guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is the product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they comprised. The two methods were independently compared to determine if the results and rated risks matched. The authors' results showed an agreement of 85% between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA-RPN scores. In the m-HFMEA analysis, when the risk score is determined, the failure mode should be more thoroughly investigated on the basis of the established HFMEA Decision Tree™. m-HFMEA is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of high risks and mitigation measures. It is therefore a useful tool for prospective risk analysis in radiotherapy.
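    A toy comparison of the two scoring schemes: the traditional RPN as a product of three indices versus a two-index severity and frequency risk matrix. The index scales, matrix cut-offs and the example failure mode are illustrative, not the clinical values used in the study.

```python
# Toy comparison of FMEA and m-HFMEA style scoring; scales, cut-offs and the
# example failure mode are illustrative, not the study's clinical values.
def fmea_rpn(occurrence, severity, detectability):
    # traditional FMEA: risk priority number on 1-10 scales
    return occurrence * severity * detectability

def m_hfmea_class(severity, frequency):
    # modified healthcare FMEA: two-index risk matrix on 1-4 scales
    score = severity * frequency
    if score >= 12:
        return "very high"
    if score >= 8:
        return "high"
    if score >= 4:
        return "low"
    return "very low"

failure_mode = {
    "occurrence": 3, "severity_10": 8, "detectability": 6,   # FMEA indices (1-10)
    "severity_4": 3, "frequency_4": 2,                        # m-HFMEA indices (1-4)
}
print(fmea_rpn(failure_mode["occurrence"], failure_mode["severity_10"], failure_mode["detectability"]))
print(m_hfmea_class(failure_mode["severity_4"], failure_mode["frequency_4"]))
```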

  10. Formulation of an aloe-based product according to Iranian traditional medicine and development of its analysis method.

    PubMed

    Moein, Elham; Hajimehdipoor, Homa; Toliyat, Tayebeh; Choopani, Rasool; Hamzeloo-Moghadam, Maryam

    2017-08-29

    Currently, people are increasingly interested in traditional medicine. Traditional formulations should be converted to modern drug delivery systems to be more acceptable to patients. In the present investigation, a polyherbal medicine, "Ayarij-e-Faiqra" (AF), based on Iranian traditional medicine (ITM), has been formulated and its quality control parameters have been developed. The main ingredients of AF, including the barks of Cinnamomum zeylanicum Blume and Cinnamomum cassia J. Presl, the rhizomes of Nardostachys jatamansi DC., the fruits of Piper cubeba L.f., the flowers of Rosa damascena Herrm., the oleo gum resin of Pistacia terebinthus L. and Aloe spp. dried juice, were powdered and used to prepare seven tablet formulations of the herbal mixture. The flowability of the different formulated powders was examined and the best formulations were selected (F6 and F7). Tablets were prepared from the selected formulations and compared according to their physical characteristics; finally, F7 was selected and coated. Physicochemical characteristics of core and coated AF tablets were determined, and the HPLC method for quantitation of aloin as a marker of the tablets was selected and verified according to selectivity, linearity, precision, recovery, LOD and LOQ. The results showed that core and coated AF tablets were in agreement with USP requirements for herbal drugs. They had acceptable appearance, disintegration time, friability, hardness, dissolution behavior, weight variation and content uniformity. The amount of aloin in the tablets was found to be 123.1 mg/tablet. The HPLC method for aloin determination in AF tablets was verified according to selectivity, linearity (5-500 μg/ml, r²: 0.9999), precision (RSD: 1.62%), recovery (108.0%), and LOD and LOQ (0.0053 and 0.0161 μg/ml). The formulated tablets could be a good substitute for the powder and capsules of AF in ITM clinics, with a feasible and precise method for their quality control.

  11. Evaluation of antioxidant, antibacterial, and antidiabetic potential of two traditional medicinal plants of India: Swertia cordata and Swertia chirayita

    PubMed Central

    Roy, Priyanka; Abdulsalam, Fatima I.; Pandey, D. K.; Bhattacharjee, Aniruddha; Eruvaram, Naveen Reddy; Malik, Tabarak

    2015-01-01

    Background: Swertia cordata and Swertia chirayita are temperate Himalayan medicinal plants used as potent herbal drugs in Indian traditional systems of medicine (Ayurvedic, Unani and Siddha). Objective: Assessment of the antioxidant, antibacterial, and antidiabetic potential of Swertia cordata and Swertia chirayita. Materials and Methods: The phytochemicals of methanolic and aqueous extracts of the two Swertia species were analyzed. The antioxidant potential of all the extracts was assessed by measuring total phenolic content and total flavonoid content, free radical scavenging potential was assessed by the 1,1-diphenyl-2-picrylhydrazyl (DPPH) assay, antibacterial activity was assessed against various pathogenic and nonpathogenic bacteria in vitro by the Kirby-Bauer agar well diffusion method, and antidiabetic activity was assessed by α-amylase inhibition. Results: Methanolic leaf extracts of both Swertia species showed significant antibacterial as well as antidiabetic potential, whereas methanolic root extracts of both species were found to have potent antioxidant activity. However, Swertia chirayita showed better activities than Swertia cordata, although both species have a good reputation in traditional Indian medicine. Conclusion: Both species have high medicinal potential in terms of their antioxidant, antibacterial and antidiabetic activities. Further studies are required to elucidate the antioxidant, antidiabetic and antibacterial potentials using various in vitro and in vivo biochemical and molecular biology techniques. PMID:26109789

  12. Emerging importance of geographical indications and designations of origin - authenticating geo-authentic botanicals and implications for phytotherapy.

    PubMed

    Brinckmann, J A

    2013-11-01

    Pharmacopoeial monographs providing specifications for composition, identity, purity, quality, and strength of a botanical are developed based on analysis of presumably authenticated botanical reference materials. The specimens should represent the quality traditionally specified for the intended use, which may require different standards for medicinal versus food use. Development of quality standards monographs may occur through collaboration between a sponsor company or industry association and a pharmacopoeial expert committee. The sponsor may base proposed standards and methods on their own preferred botanical supply which may, or may not, be geo-authentic and/or correspond to qualities defined in traditional medicine formularies and pharmacopoeias. Geo-authentic botanicals are those with specific germplasm, cultivated or collected in their traditional production regions, of a specified biological age at maturity, with specific production techniques and processing methods. Consequences of developing new monographs that specify characteristics of an 'introduced' cultivated species or of a material obtained from one unique origin could lead to exclusion of geo-authentic herbs and may have therapeutic implications for clinical practice. In this review, specifications of selected medicinal plants with either a geo-authentic or geographical indication designation are discussed and compared against official pharmacopoeial standards for same genus and species regardless of origin. Copyright © 2012 John Wiley & Sons, Ltd.

  13. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-07-31

    real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As a system specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. As requirements specification languages, we introduced temporal logics with clock variables for expressing timing constraints.

  14. Reducing covert self-injurious behavior maintained by automatic reinforcement through a variable momentary DRO procedure.

    PubMed

    Toussaint, Karen A; Tiger, Jeffrey H

    2012-01-01

    Covert self-injurious behavior (i.e., behavior that occurs in the absence of other people) can be difficult to treat. Traditional treatments typically have involved sophisticated methods of observation and often have employed positive punishment procedures. The current study evaluated the effectiveness of a variable momentary differential reinforcement contingency in the treatment of covert self-injury. Neither positive punishment nor extinction was required to produce decreased skin picking.

  15. Comparison of cast materials for the treatment of congenital idiopathic clubfoot using the Ponseti method: a prospective randomized controlled trial

    PubMed Central

    Hui, Catherine; Joughin, Elaine; Nettel-Aguirre, Alberto; Goldstein, Simon; Harder, James; Kiefer, Gerhard; Parsons, David; Brauer, Carmen; Howard, Jason

    2014-01-01

    Background The Ponseti method of congenital idiopathic clubfoot correction has traditionally specified plaster of Paris (POP) as the cast material of choice; however, there are negative aspects to using POP. We sought to determine the influence of cast material (POP v. semirigid fibreglass [SRF]) on clubfoot correction using the Ponseti method. Methods Patients were randomized to POP or SRF before undergoing the Ponseti method. The primary outcome measure was the number of casts required for clubfoot correction. Secondary outcome measures included the number of casts by severity, ease of cast removal, need for Achilles tenotomy, brace compliance, deformity relapse, need for repeat casting and need for ancillary surgical procedures. Results We enrolled 30 patients: 12 randomized to POP and 18 to SRF. There was no difference in the number of casts required for clubfoot correction between the groups (p = 0.13). According to parents, removal of POP was more difficult (p < 0.001), more time consuming (p < 0.001) and required more than 1 method (p < 0.001). At a final follow-up of 30.8 months, the mean times to deformity relapse requiring repeat casting, surgery or both were 18.7 and 16.4 months for the SRF and POP groups, respectively. Conclusion There was no significant difference in the number of casts required for correction of clubfoot between the 2 materials, but SRF resulted in a more favourable parental experience, which cannot be ignored as it may have a positive impact on psychological well-being despite the associated increased cost. PMID:25078929

  16. VOLATILE CONSTITUENTS OF GINGER OIL PREPARED ACCORDING TO IRANIAN TRADITIONAL MEDICINE AND CONVENTIONAL METHOD: A COMPARATIVE STUDY.

    PubMed

    Shirooye, Pantea; Mokaberinejad, Roshanak; Ara, Leila; Hamzeloo-Moghadam, Maryam

    2016-01-01

    In Iranian Traditional Medicine (ITM), herbal medicines formulated as oils were believed to possess more powerful effects than their original plants. One of the popular oils suggested for the treatment of various indications was ginger oil. In the present study, to suggest a more convenient method of oil preparation (compared with the traditional method), ginger oil was prepared according to both the traditional and the conventional maceration methods and the volatile oil constituents were compared. Ginger oil was obtained in sesame oil according to both the traditional way and the conventional (maceration) method. The volatile oil of dried ginger and of both oils was obtained by hydro-distillation and analyzed by gas chromatography/mass spectroscopy. Fifty-five, fifty-nine, and fifty-one components, comprising 94%, 94% and 98% of the total compounds, were identified in the volatile oils of ginger, the traditional oil and the conventional oil, respectively. The most dominant compounds of the traditional and conventional oils were almost similar; however, they differed from the ginger essential oil, which was also found to possess limited amounts of anti-inflammatory components. It was concluded that ginger oil could be prepared through the maceration method and used for the indications mentioned in ITM.

  17. Comparison of traditional trigger tool to data warehouse based screening for identifying hospital adverse events.

    PubMed

    O'Leary, Kevin J; Devisetty, Vikram K; Patel, Amitkumar R; Malkenson, David; Sama, Pradeep; Thompson, William K; Landler, Matthew P; Barnard, Cynthia; Williams, Mark V

    2013-02-01

    Research supports medical record review using screening triggers as the optimal method to detect hospital adverse events (AE), yet the method is labour-intensive. This study compared a traditional trigger tool with an enterprise data warehouse (EDW) based screening method to detect AEs. We created 51 automated queries based on 33 traditional triggers from prior research, and then applied them to 250 randomly selected medical patients hospitalised between 1 September 2009 and 31 August 2010. Two physicians each abstracted records from half the patients using a traditional trigger tool and then performed targeted abstractions for patients with positive EDW queries in the complementary half of the sample. A third physician confirmed presence of AEs and assessed preventability and severity. Traditional trigger tool and EDW based screening identified 54 (22%) and 53 (21%) patients with one or more AE. Overall, 140 (56%) patients had one or more positive EDW screens (total 366 positive screens). Of the 137 AEs detected by at least one method, 86 (63%) were detected by a traditional trigger tool, 97 (71%) by EDW based screening and 46 (34%) by both methods. Of the 11 total preventable AEs, 6 (55%) were detected by traditional trigger tool, 7 (64%) by EDW based screening and 2 (18%) by both methods. Of the 43 total serious AEs, 28 (65%) were detected by traditional trigger tool, 29 (67%) by EDW based screening and 14 (33%) by both. We found relatively poor agreement between traditional trigger tool and EDW based screening with only approximately a third of all AEs detected by both methods. A combination of complementary methods is the optimal approach to detecting AEs among hospitalised patients.

  18. Remote sensing as a source of land cover information utilized in the universal soil loss equation

    NASA Technical Reports Server (NTRS)

    Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.; Scarpace, F. L.

    1979-01-01

    In this study, methods for gathering the land use/land cover information required by the USLE were investigated with medium altitude, multi-date color and color infrared 70-mm positive transparencies using human and computer-based interpretation techniques. Successful results, which compare favorably with traditional field study methods, were obtained within the test site watershed with airphoto data sources and human airphoto interpretation techniques. Computer-based interpretation techniques were not capable of identifying soil conservation practices but were successful to varying degrees in gathering other types of desired land use/land cover information.

  19. Patterns of behavior in online homework for introductory physics

    NASA Astrophysics Data System (ADS)

    Fredericks, Colin

    Student activity in online homework was obtained from courses in physics in 2003 and 2005. This data was analyzed through a variety of methods, including principal component analysis, Pearson's r correlation, and comparison to performance measures such as detailed exam scores. Through this analysis it was determined which measured homework behaviors were associated with high exam scores and course grades. It was also determined that homework problems requiring analysis can have an impact on certain types of exam problems where traditional homework does not. Suggestions are given for future research and possible use of these methods in other contexts.

  20. Identifying Issue Frames in Text

    PubMed Central

    Sagi, Eyal; Diermeier, Daniel; Kaufmann, Stefan

    2013-01-01

    Framing, the effect of context on cognitive processes, is a prominent topic of research in psychology and public opinion research. Research on framing has traditionally relied on controlled experiments and manually annotated document collections. In this paper we present a method that allows for quantifying the relative strengths of competing linguistic frames based on corpus analysis. This method requires little human intervention and can therefore be efficiently applied to large bodies of text. We demonstrate its effectiveness by tracking changes in the framing of terror over time and comparing the framing of abortion by Democrats and Republicans in the U.S. PMID:23874909

  1. [Methodology of determination of the time of death and outlooks for the further development].

    PubMed

    Novikov, P I; Vlasov, A Iu; Shved, E F; Natsentov, E O; Korshunov, N V; Belykh, S A

    2004-01-01

    A methodological analysis of diagnosing the prescription of death coming (PDC), i.e. the time elapsed since death, is described in the paper. Key philosophical fundamentals for further, novel and more effective methods of PDC determination are elucidated. The main requirements applicable to postmortem diagnosis are defined. Different methods of modeling the postmortem process are demonstrated using the example of cadaver cooling, i.e. in real time, by analogue computer systems and by mathematical modeling. The traditional empirical and the adaptive approaches to modeling the postmortem processes for PDC diagnosis are comparatively analyzed. A variety of promising directions for further related research is outlined.

  2. [Closed-loop management model of clinical investigational product for new drug of traditional Chinese medicine].

    PubMed

    Wu, Ping; Zhang, Jian-Wu

    2013-09-01

    This paper discusses the management regulations and technical requirements for clinical investigational products for new drugs of traditional Chinese medicine, analyzes some common problems in their management, and proposes the establishment of a closed-loop management model together with management requirements for the various aspects involved.

  3. Extra-Required Service: A Radical Shift in Frontline Geriatric Caregiving

    ERIC Educational Resources Information Center

    Clarke, Egerton

    2011-01-01

    Much research examines the professional nursing practices of traditional and modern caregivers, but it remains unclear whether the delivery of extra-required services is diminished as the caregiver moves from traditional to modern community. Building on the classical works of sociologists Ferdinand Tonnies, Max Weber, and Emile Durkheim, this…

  4. Cost-effectiveness of home telemedical cardiotocography compared with traditional outpatient monitoring.

    PubMed

    Török, M; Kovács, F; Doszpod, J

    2000-01-01

    We compared the cost of passive sensor telemedical non-stress cardiotocography performed at home and the same test performed by traditional equipment in an outpatient clinic in the Budapest area. The costs were calculated using two years' registered budget data from the home monitoring service in Budapest and the outpatient clinic of the department of obstetrics and gynaecology at the Haynal Imre University of Health Sciences. The traditional test at the university outpatient clinic cost 3652 forint for the health-care service and 1000 forint in additional expenses for the patient (travel and time off work). This means that the total cost for each test in the clinic was 4652 forint. The cost of home telemedical cardiotocography was 1500 forint per test, but each test took 2.1 times as long. For a more realistic comparison between the two methods, we adjusted the cost to take account of the extra length of time that home monitoring required. The adjusted cost for home care was 3150 forint, some 32% lower than in the clinic. Passive sensor telemedical non-stress cardiotocography at home was therefore less expensive than the same test performed in the traditional way in an outpatient clinic.
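
    The adjusted comparison reduces to simple arithmetic on the figures quoted above (amounts in forints):

      # Worked check of the cost comparison quoted above.
      clinic_total = 3652 + 1000        # health-care cost + patient's travel/time
      home_per_test = 1500
      time_factor = 2.1                 # home tests took 2.1 times as long
      home_adjusted = home_per_test * time_factor
      saving = 1 - home_adjusted / clinic_total
      print(clinic_total, home_adjusted, f"{saving:.0%}")   # 4652, 3150.0, ~32%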

  5. Seaweed cultivation: Traditional way and its reformation

    NASA Astrophysics Data System (ADS)

    Fei, Xiu-Geng; Bao, Ying; Lu, Shan

    1999-09-01

    Seaweed cultivation, or phycoculture, has developed rapidly in recent years. The total production of cultivated seaweed at present is about 6250×10³ tons fresh weight, the total cultivation area is estimated at 200×10³ hectares, and the annual total value of cultivated seaweeds has been estimated at more than 3 billion US dollars. Phycoculture provides many job opportunities for people in coastal regions, has the potential to improve marine environments, and may thus even influence global change. All traditional cultivation methods and techniques are based on, or start from, the individual plant or the cultivated seaweed population. Modern biological science and biotechnology have benefited agriculture greatly, but traditional seaweed cultivation has not changed much since its beginnings: it has been conservative for a long period and has accumulated many problems requiring solution. Four main problems appear to be the most universal ones holding back further development of the industry. New ways of seaweed cultivation must be developed, new techniques perfected, and new problems solved. This paper discusses the main problems of traditional seaweed cultivation at present and its possible further development and reform in the future.

  6. Exploring Non-Traditional Learning Methods in Virtual and Real-World Environments

    ERIC Educational Resources Information Center

    Lukman, Rebeka; Krajnc, Majda

    2012-01-01

    This paper identifies the commonalities and differences within non-traditional learning methods regarding virtual and real-world environments. The non-traditional learning methods in real-world have been introduced within the following courses: Process Balances, Process Calculation, and Process Synthesis, and within the virtual environment through…

  7. Improving Nursing Students' Learning Outcomes in Fundamentals of Nursing Course through Combination of Traditional and e-Learning Methods

    PubMed Central

    Sheikhaboumasoudi, Rouhollah; Bagheri, Maryam; Hosseini, Sayed Abbas; Ashouri, Elaheh; Elahi, Nasrin

    2018-01-01

    Background: The fundamentals of nursing course is a prerequisite to providing comprehensive nursing care. Despite the development of technology in nursing education, the effectiveness of e-learning methods for the fundamentals of nursing course in the clinical skills laboratory is unclear for nursing students. The aim of this study was to compare the effect of blended learning (combining e-learning with traditional learning methods) with traditional learning alone on nursing students' scores. Materials and Methods: A two-group post-test experimental study was administered from February 2014 to February 2015. Two groups of nursing students who were taking the fundamentals of nursing course in Iran were compared. Sixty nursing students were selected as the control group (traditional learning methods only) and the experimental group (combining e-learning with traditional learning methods) over two consecutive semesters. Both groups participated in an Objective Structured Clinical Examination (OSCE) and were evaluated in the same way using a prepared checklist and a satisfaction questionnaire. Statistical analysis was conducted with SPSS software version 16. Results: The findings of this study showed that the mean midterm (t = 2.00, p = 0.04) and final scores (t = 2.50, p = 0.01) of the intervention group (combining e-learning with traditional learning methods) were significantly higher than those of the control group (traditional learning methods). The satisfaction of male students in the intervention group was higher than that of female students (t = 2.60, p = 0.01). Conclusions: Based on the findings, this study suggests that combining traditional learning methods with e-learning methods, such as an educational website and interactive online resources, for fundamentals of nursing course instruction can be an effective supplement for improving nursing students' clinical skills. PMID:29861761

  8. Traditional fertility regulation among the Yoruba of southwestern Nigeria. I. A study of prevalence, attitudes, practice and methods.

    PubMed

    Jinadu, M K; Olusi, S O; Ajuwon, B

    1997-03-01

    This study was conducted among Yoruba women and traditional healers with the aim of identifying and describing the practice, preparation, and administration of traditional contraceptives. The data were obtained in 1990 from a random sample of 1,400 women of childbearing age and 42 traditional healers in Nigeria's Oranmiyan area, using questionnaires and in-depth interviews. Findings revealed that knowledge of the traditional contraceptives is nearly universal among the Yoruba population, and the traditional contraceptive prevalence rate is 7.1 percent. The use of traditional contraceptives was significantly more common among uneducated women and among women aged 20 to 29 years. Findings also revealed the existence of four main varieties of traditional contraceptive devices, the methods of preparation of the traditional contraceptives, varieties of herbal and animal products used, methods of administration, and taboos against usage. The easy accessibility of traditional medical practitioners and the belief that traditional contraceptive devices are devoid of complications, especially among those experienced with modern contraceptive devices, were the main reasons women cited for patronizing the traditional practitioners. The paper concludes with policy implications for family planning programmers in Nigeria.

  9. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG1-1999, the Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results show that these methods not only meet the precision requirements of the verification regulation but also improve the reliability and efficiency of the verification system.

  10. Comparison of spike-sorting algorithms for future hardware implementation.

    PubMed

    Gibson, Sarah; Judy, Jack W; Markovic, Dejan

    2008-01-01

    Applications such as brain-machine interfaces require hardware spike sorting in order to (1) obtain single-unit activity and (2) perform data reduction for wireless transmission of data. Such systems must be low-power, low-area, high-accuracy, automatic, and able to operate in real time. Several detection and feature extraction algorithms for spike sorting are described briefly and evaluated in terms of accuracy versus computational complexity. The nonlinear energy operator method is chosen as the optimal spike detection algorithm, being most robust over noise and relatively simple. The discrete derivatives method [1] is chosen as the optimal feature extraction method, maintaining high accuracy across SNRs with a complexity orders of magnitude less than that of traditional methods such as PCA.
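
    A minimal sketch of the two algorithms named above, assuming a 1-D extracellular voltage trace; the threshold, smoothing window and lags are illustrative choices rather than the paper's settings:

      # Nonlinear energy operator (detection) and discrete derivatives (features).
      import numpy as np

      def neo(x):
          """Nonlinear energy operator: psi[n] = x[n]**2 - x[n-1]*x[n+1]."""
          psi = np.zeros_like(x, dtype=float)
          psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
          return psi

      def detect_spikes(x, k=8.0):
          """Indices where smoothed NEO exceeds k times its mean (a common heuristic)."""
          psi = np.convolve(neo(x), np.bartlett(7), mode="same")
          return np.where(psi > k * psi.mean())[0]

      def discrete_derivative_features(window, lags=(1, 3, 7)):
          """Differences x[n] - x[n-lag] over a spike window, concatenated as a feature vector."""
          return np.concatenate([window[lag:] - window[:-lag] for lag in lags])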

  11. Infrared coagulation: a new treatment for hemorrhoids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leicester, R.J.; Nicholls, R.J.; Mann, C.V.

    Many methods, which have effectively reduced the number of patients requiring hospital admission, have been described for the outpatient treatment of hemorrhoids. However, complications have been reported, and the methods are often associated with unpleasant side effects. In 1977 Neiger et al. described a new method that used infrared coagulation, which produced minimal side effects. The authors have conducted a prospective, randomized trial to evaluate infrared coagulation compared with more traditional methods of treatment. The authors' results show that it may be more effective than injection sclerotherapy in treating non-prolapsing hemorrhoids and that it compares favorably with rubber band ligation in most prolapsing hemorrhoids. No complications occurred, and significantly fewer patients experienced pain after infrared coagulation (P < 0.001).

  12. Off-pump repair of a post-infarct ventricular septal defect: the 'Hamburger procedure'

    PubMed Central

    Barker, Thomas A; Ng, Alexander; Morgan, Ian S

    2006-01-01

    We report a novel off-pump technique for the surgical closure of post-infarct ventricular septal defects (VSDs). The case report describes the peri-operative management of a 76 year old lady who underwent the 'Hamburger procedure' for closure of her apical VSD. Refractory cardiogenic shock meant that traditional patch repairs requiring cardiopulmonary bypass would be poorly tolerated. We show that echocardiography guided off-pump posterior-anterior septal plication is a safe, effective method for closing post-infarct VSDs in unstable patients. More experience is required to ascertain whether this technique will become an accepted alternative to patch repairs. PMID:16722552

  13. The Integration of Voice and Dance Techniques in Musical Theatre: Anatomical Considerations.

    PubMed

    Morton, Jennie

    2015-06-01

    Musical theatre performers are required to be proficient in the three artistic disciplines of dancing, singing, and acting, although in today's modern productions, there is often a requirement to incorporate other skills such as acrobatics and the playing of an instrument. This article focuses on the issues faced by performers when dancing and voicing simultaneously, as it is between these two disciplines where we see the greatest pedagogical divide in terms of breath management and muscle recruitment patterns. The traditional teaching methods of dance and voice techniques are examined, areas of conflict highlighted, and solutions proposed through an exploration of the relevant anatomy.

  14. Raymond's Paragraph System: an alternative format for the organization of gross pathology reports and its implementation in an academic teaching hospital.

    PubMed

    Dayton, Annette S; Ro, Jae Y; Schwartz, Mary R; Ayala, Alberto G; Raymond, A Kevin

    2009-02-01

    Traditionally organized gross pathology reports, which are widely used in pathology resident and pathologists' assistant training programs, may not offer the most efficient method of communicating pertinent information to treating physicians. Instructional materials for teaching gross pathology dictation are limited and the teaching methods used are inconsistent. Raymond's Paragraph System, a gross pathology report formatting system, was developed for use at a cancer center and has been implemented at The Methodist Hospital, Houston, Tex, an academic medical center. Unlike traditionally organized reports in which everything is normally dictated in 1 long paragraph, this system separates the dictation into multiple paragraphs creating an organized and comprehensible report. Recent literature regarding formatting of pathology reports focuses primarily on the organization of specimen diagnoses and overall report layout. However, little literature is available that highlights organization of the specimen gross descriptions. To provide instruction to pathologists, pathology residents and fellows, and pathologists' assistant students about an alternative method of organizing gross pathology reports. Review of pertinent literature relating to preparation of gross pathology reports, report formatting, and pathology laboratory credentialing requirements. The paragraph system offers a viable alternative to traditionally organized pathology reports. Primarily, it provides a working model for medical professionals-in-training. It helps create user-friendly pathology reports by giving precise and concise information in a standardized format. This article provides an overview of the system and discusses our experience in its implementation.

  15. A Diffusive Gradient-in-Thin-Film Technique for Evaluation of the Bioavailability of Cd in Soil Contaminated with Cd and Pb

    PubMed Central

    Wang, Peifang; Wang, Teng; Yao, Yu; Wang, Chao; Liu, Cui; Yuan, Ye

    2016-01-01

    Management of heavy metal contamination requires accurate information about the distribution of bioavailable fractions, and about exchange between the solid and solution phases. In this study, we employed diffusive gradients in thin-films (DGT) and traditional chemical extraction methods (soil solution, HOAc, EDTA, CaCl2, and NaOAc) to determine the Cd bioavailability in Cd-contaminated soil with the addition of Pb. Two typical terrestrial species (wheat, Bainong AK58; maize, Zhengdan 958) were selected as the accumulation plants. The results showed that the added Pb may enhance the efficiency of Cd phytoextraction, as indicated by the increasing concentration of Cd accumulating in the plant tissues. The Cd concentrations measured by DGT and by all of the selected traditional extractants increased with increasing concentration of added Pb, mirroring the trends of the Cd concentrations accumulated in plant tissues. Moreover, the Pearson correlation coefficients between the Cd concentrations obtained by the different indicators and the Cd concentrations taken up by the plants indicated significant correlations (p < 0.01). However, the values of the correlation coefficients showed the merits of DGT, CaCl2, and Csol over the other three methods. Consequently, the in situ DGT measurement and the ex situ traditional methods could all reflect the inhibition effects between Cd and Pb. Owing to its dynamic measurement capability, DGT could be a robust tool to predict Cd bioavailability in complex contaminated soil. PMID:27271644

  16. Generalization of the Poincare sphere to process 2D displacement signals

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Lamberti, Luciano

    2017-06-01

    Traditionally the multiple phase method has been considered an essential tool for phase information recovery. The in-quadrature phase method, which in theory is an alternative pathway to the same goal, has failed in actual applications. In a previous paper dealing with 1D signals, the authors showed that, properly implemented, the in-quadrature method yields phase values with the same accuracy as the multiple phase method. The present paper extends the methodology developed in 1D to 2D. This extension is not a straightforward process and requires the introduction of a number of additional concepts and developments. The concept of the monogenic function provides the tools required for the extension process. The monogenic function has a graphic representation through the Poincare sphere, familiar in the field of photoelasticity and, through the developments introduced in this paper, connected to the analysis of displacement fringe patterns. The paper is illustrated with examples of application showing that the multiple phase method and the in-quadrature method are two aspects of the same basic theoretical model.

  17. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and α-spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, S.J.; Hensley, C.A.; Armenta, C.E.

    1997-03-01

    Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for α-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of 'real' environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of ≈2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously. 24 refs., 2 figs., 2 tabs.

  18. Statistical and Machine Learning forecasting methods: Concerns and ways forward

    PubMed Central

    Makridakis, Spyros; Assimakopoulos, Vassilios

    2018-01-01

    Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784
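
    As a hedged illustration of a per-horizon accuracy comparison of the kind described, the sketch below uses sMAPE (one measure commonly applied to the M3 series); the forecast values are placeholders, not competition data:

      # Per-horizon sMAPE comparison of two hypothetical forecasting methods.
      import numpy as np

      def smape(actual, forecast):
          """Symmetric MAPE in percent."""
          actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
          return 100 * np.mean(2 * np.abs(forecast - actual) /
                               (np.abs(actual) + np.abs(forecast)))

      actual = np.array([112, 118, 132, 129, 121, 135])    # 6 future months (placeholder)
      stat_fc = np.array([110, 115, 128, 130, 124, 131])   # e.g. an exponential smoothing forecast
      ml_fc = np.array([105, 120, 140, 122, 118, 142])     # e.g. a neural network forecast

      for h in range(1, len(actual) + 1):                  # accuracy up to each horizon
          print(h, round(smape(actual[:h], stat_fc[:h]), 2),
                   round(smape(actual[:h], ml_fc[:h]), 2))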

  19. An Application of the Quadrature-Free Discontinuous Galerkin Method

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Atkins, Harold L.

    2000-01-01

    The process of generating a block-structured mesh with the smoothness required for high-accuracy schemes is still a time-consuming process often measured in weeks or months. Unstructured grids about complex geometries are more easily generated, and for this reason, methods using unstructured grids have gained favor for aerodynamic analyses. The discontinuous Galerkin (DG) method is a compact finite-element projection method that provides a practical framework for the development of a high-order method using unstructured grids. Higher-order accuracy is obtained by representing the solution as a high-degree polynomial whose time evolution is governed by a local Galerkin projection. The traditional implementation of the discontinuous Galerkin method uses quadrature for the evaluation of the integral projections and is prohibitively expensive. Atkins and Shu introduced the quadrature-free formulation in which the integrals are evaluated a priori and exactly for a similarity element. The approach has been demonstrated to possess the accuracy required for acoustics even in cases where the grid is not smooth. Other issues such as boundary conditions and the treatment of non-linear fluxes have also been studied in earlier work. This paper describes the application of the quadrature-free discontinuous Galerkin method to a two-dimensional shear layer problem. First, a brief description of the method is given. Next, the problem is described and the solution is presented. Finally, the resources required to perform the calculations are given.

  20. Impact of agile methodologies on team capacity in automotive radio-navigation projects

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Hutanu, A.; Volker, S.

    2017-01-01

    The development processes used in automotive radio-navigation projects are constantly under pressure to adapt. While the software development models are based on automotive production processes, the integration of peripheral components into an automotive system triggers a high number of requirement modifications. The use of traditional development models in the automotive industry pushes a team's development capacity to its limits. The root cause lies in the inflexibility of current processes and their limited adaptability. This paper addresses a new project management approach for the development of radio-navigation projects. Understanding the weaknesses of currently used models helped us develop and integrate agile methodologies into the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of change requests. Established change management risk analysis processes enable project management to judge the impact of a requirement change and also give the project time to implement some changes. However, in large automotive radio-navigation projects the time saved is not enough to implement the large number of changes submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to demonstrate the need for process adaptation in order to resolve project team capacity bottlenecks.

  1. Using Mosix for Wide-Area Computational Resources

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  2. Capsule endoscopy in Crohn’s disease: Are we seeing any better?

    PubMed Central

    Hudesman, David; Mazurek, Jonathan; Swaminath, Arun

    2014-01-01

    Crohn’s disease (CD) is a complex, immune-mediated disorder that often requires a multi-modality approach for optimal diagnosis and management. While traditional methods include ileocolonoscopy and radiologic modalities, increasingly, capsule endoscopy (CE) has been incorporated into the algorithm for both the diagnosis and monitoring of CD. Multiple studies have examined the utility of this emerging technology in the management of CD, and have compared it to other available modalities. CE offers a noninvasive approach to evaluate areas of the small bowel that are difficult to reach with traditional endoscopy. Furthermore, CE may be favored in specific subgroups of patients with inflammatory bowel disease (IBD), such as those with IBD unclassified (IBD-U), pediatric patients and patients with CD who have previously undergone surgery. PMID:25278698

  3. [Research and development strategies in classical herbal formulae].

    PubMed

    Chen, Chang; Cheng, Jin-Tang; Liu, An

    2017-05-01

    As outstanding representatives of traditional Chinese medicine prescriptions, classical herbal formulae are the essence of the great treasure of traditional Chinese medicine. To support the development of classical herbal formulae, the state and relevant administrative departments have successively promulgated supportive policies. However, some key issues in the development of classical herbal formulae have not yet reached a unified consensus or standard, and these problems are discussed in depth here. The authors discuss the registration requirements for classical herbal formulae and propose specific screening indicators for classical herbal formulae, the basis for determining prescription and dosage, the method for screening the production process, and the basic principles of clinical positioning, in order to offer considered opinions and provide a reference for the development of classical herbal formulae and for policy formulation. Copyright© by the Chinese Pharmaceutical Association.

  4. Traditional Chinese medicine: potential approaches from modern dynamical complexity theories.

    PubMed

    Ma, Yan; Zhou, Kehua; Fan, Jing; Sun, Shuchen

    2016-03-01

    Despite the widespread use of traditional Chinese medicine (TCM) in clinical settings, proving its effectiveness via scientific trials is still a challenge. TCM views the human body as a complex dynamical system, and focuses on the balance of the human body, both internally and with its external environment. Such fundamental concepts require investigations using system-level quantification approaches, which are beyond conventional reductionism. Only methods that quantify dynamical complexity can bring new insights into the evaluation of TCM. In a previous article, we briefly introduced the potential value of Multiscale Entropy (MSE) analysis in TCM. This article aims to explain the existing challenges in TCM quantification, to introduce the consistency of dynamical complexity theories and TCM theories, and to inspire future system-level research on health and disease.

  5. Intrusion detection system using Online Sequence Extreme Learning Machine (OS-ELM) in advanced metering infrastructure of smart grid.

    PubMed

    Li, Yuancheng; Qiu, Rixuan; Jing, Sitong

    2018-01-01

    Advanced Metering Infrastructure (AMI), a core component of the smart grid, realizes two-way communication of electricity data by interconnecting with a computer network. At the same time, it brings many new security threats, and traditional intrusion detection methods cannot satisfy the security requirements of AMI. In this paper, an intrusion detection system based on the Online Sequence Extreme Learning Machine (OS-ELM) is established, which is used to detect attacks in AMI and is compared with other algorithms. Simulation results show that, compared with other intrusion detection methods, the OS-ELM based intrusion detection method is superior in detection speed and accuracy.
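
    A hedged numpy sketch of the OS-ELM recursion itself (random hidden layer, an initial batch solved in closed form, then recursive least-squares updates per chunk); the feature construction and labels for AMI traffic are placeholders, not the authors' configuration:

      # Hedged OS-ELM sketch: random hidden layer, batch initialisation, then
      # recursive least-squares updates. Inputs/labels for AMI are placeholders.
      import numpy as np

      class OSELM:
          def __init__(self, n_in, n_hidden, seed=0):
              rng = np.random.default_rng(seed)
              self.W = rng.normal(size=(n_in, n_hidden))
              self.b = rng.normal(size=n_hidden)

          def _H(self, X):
              return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))   # sigmoid hidden layer

          def fit_initial(self, X, T):
              H = self._H(X)
              # small ridge term keeps the inversion stable
              self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
              self.beta = self.P @ H.T @ T
              return self

          def partial_fit(self, X, T):
              H = self._H(X)
              K = np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
              self.P -= self.P @ H.T @ K @ H @ self.P
              self.beta += self.P @ H.T @ (T - H @ self.beta)
              return self

          def predict(self, X):
              return (self._H(X) @ self.beta > 0.5).astype(int)     # 1 = attack, 0 = normal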

  6. New method for stock-tank oil compositional analysis.

    PubMed

    McAndrews, Kristine; Nighswander, John; Kotzakoulakis, Konstantin; Ross, Paul; Schroeder, Helmut

    2009-01-01

    A new method for accurately determining stock-tank oil composition to normal pentatriacontane using gas chromatography is developed and validated. The new method addresses the potential errors associated with the traditional equipment and technique employed for extended hydrocarbon gas chromatography outside a controlled laboratory environment, such as on an offshore oil platform. In particular, the experimental measurement of stock-tank oil molecular weight with the freezing point depression technique and the use of an internal standard to find the unrecovered sample fraction are replaced with correlations for estimating these properties. The use of correlations reduces the number of necessary experimental steps in completing the required sample preparation and analysis, resulting in reduced uncertainty in the analysis.

  7. An Evaluation of Teaching Introductory Geomorphology Using Computer-based Tools.

    ERIC Educational Resources Information Center

    Wentz, Elizabeth A.; Vender, Joann C.; Brewer, Cynthia A.

    1999-01-01

    Compares student reactions to traditional teaching methods and an approach where computer-based tools (GEODe CD-ROM and GIS-based exercises) were either integrated with or replaced the traditional methods. Reveals that the students found both of these tools valuable forms of instruction when used in combination with the traditional methods. (CMK)

  8. Improving Nursing Students' Learning Outcomes in Fundamentals of Nursing Course through Combination of Traditional and e-Learning Methods.

    PubMed

    Sheikhaboumasoudi, Rouhollah; Bagheri, Maryam; Hosseini, Sayed Abbas; Ashouri, Elaheh; Elahi, Nasrin

    2018-01-01

    The fundamentals of nursing course is a prerequisite to providing comprehensive nursing care. Despite the development of technology in nursing education, the effectiveness of e-learning methods for the fundamentals of nursing course in the clinical skills laboratory is unclear for nursing students. The aim of this study was to compare the effect of blended learning (combining e-learning with traditional learning methods) with traditional learning alone on nursing students' scores. A two-group post-test experimental study was administered from February 2014 to February 2015. Two groups of nursing students who were taking the fundamentals of nursing course in Iran were compared. Sixty nursing students were selected as the control group (traditional learning methods only) and the experimental group (combining e-learning with traditional learning methods) over two consecutive semesters. Both groups participated in an Objective Structured Clinical Examination (OSCE) and were evaluated in the same way using a prepared checklist and a satisfaction questionnaire. Statistical analysis was conducted with SPSS software version 16. The findings of this study showed that the mean midterm (t = 2.00, p = 0.04) and final scores (t = 2.50, p = 0.01) of the intervention group (combining e-learning with traditional learning methods) were significantly higher than those of the control group (traditional learning methods). The satisfaction of male students in the intervention group was higher than that of female students (t = 2.60, p = 0.01). Based on the findings, this study suggests that combining traditional learning methods with e-learning methods, such as an educational website and interactive online resources, for fundamentals of nursing course instruction can be an effective supplement for improving nursing students' clinical skills.

  9. Mechanical and Tear Properties of Fabric/Film Laminates

    NASA Technical Reports Server (NTRS)

    Said, Magdi A.

    1998-01-01

    Films reinforced with woven fabrics are being considered for the development of a material suitable for long duration scientific balloons under a program managed by the National Aeronautics and Space Administration (NASA). Recently developed woven fabrics provide a relatively high strength to weight ratio compared to standard homogenous films. Woven fabrics also have better crack propagation resistance and rip-stop capabilities when compared to homogenous lightweight, high strength polymeric films such as polyester and nylon. If joining is required, as in the case of scientific balloons, woven fabrics have the advantage over polymeric thin films of being joinable by traditional textile methods as well as by other techniques, including hot sealing, adhesion, and ultrasonic means. Woven fabrics, however, lack the barrier properties needed for helium-filled scientific balloons; therefore lamination with homogenous films is required to provide the gas barrier capabilities these applications demand.

  10. Managing Complex IT Security Processes with Value Based Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2009-01-01

    Current trends indicate that IT security measures will need to greatly expand to counter the increasingly sophisticated, well-funded and/or economically motivated threat space. Traditional risk management approaches provide an effective method for guiding courses of action for assessment and mitigation investments. However, such approaches, no matter how popular, demand very detailed knowledge about the IT security domain and the enterprise/cyber architectural context. Typically, the critical nature and/or high stakes require careful consideration and adaptation of a balanced approach that provides reliable and consistent methods for rating vulnerabilities. As reported in earlier works, the Cyberspace Security Econometrics System provides a comprehensive measure of the reliability, security and safety of a system that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. This paper advocates a dependability measure that acknowledges the aggregate structure of complex system specifications, and accounts for variations by stakeholder, by specification components, and by verification and validation impact.

  11. Towards field malaria diagnosis using surface enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Chen, Keren; Xiong, Aoli; Yuen, Clement; Preiser, Peter; Liu, Quan

    2016-04-01

    We report three strategies of surface enhanced Raman spectroscopy (SERS) for β-hematin and hemozoin detection in malaria-infected human blood, which can potentially be developed for field malaria diagnosis. In the first strategy, we used silver-coated magnetic nanoparticles (Fe3O4@Ag) in combination with an external magnetic field to enhance the Raman signal of β-hematin. We then developed two SERS methods for diagnosing malaria infection that do not require a magnetic field. In Method 1, silver nanoparticles were synthesized separately and then mixed with lysed blood, as in traditional SERS measurements; in Method 2, we developed an ultrasensitive SERS method by synthesizing silver nanoparticles directly inside Plasmodium falciparum parasites. Method 2 can also be used to detect single parasites in the ring stage.

  12. Intelligent Detection of Structure from Remote Sensing Images Based on Deep Learning Method

    NASA Astrophysics Data System (ADS)

    Xin, L.

    2018-04-01

    Utilizing high-resolution remote sensing images for earth observation has become a common method of land use monitoring. Traditional image interpretation requires substantial human participation, which is inefficient and makes accuracy difficult to guarantee. At present, artificial intelligence methods such as deep learning have many advantages for image recognition. By means of a large number of remote sensing image samples and deep neural network models, objects of interest such as buildings can be deciphered rapidly. In terms of both efficiency and accuracy, the deep learning method is advantageous. This paper describes research on the deep learning method using a large number of remote sensing image samples and verifies the feasibility of building extraction through experiments.
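
    As an illustrative sketch only (the paper does not specify its network), a tiny fully convolutional PyTorch model mapping an RGB tile to per-pixel building logits might look like this:

      # Illustrative placeholder architecture for per-pixel building extraction.
      import torch
      import torch.nn as nn

      class TinyBuildingNet(nn.Module):
          def __init__(self):
              super().__init__()
              self.encode = nn.Sequential(
                  nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
              )
              self.decode = nn.Sequential(
                  nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
                  nn.Conv2d(32, 1, 1),               # 1-channel building logit
              )

          def forward(self, x):
              return self.decode(self.encode(x))

      tile = torch.rand(1, 3, 256, 256)              # one 256x256 RGB tile
      mask_logits = TinyBuildingNet()(tile)          # shape (1, 1, 256, 256)
      print(mask_logits.shape)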

  13. A Comparison of Four Different Beach Profiling Techniques at St Leonards, Victoria - An Example of a Collaborative Stakeholder Research Project

    NASA Astrophysics Data System (ADS)

    Cox, N. L.; Miner, A. S.; Wynn, N.; Turner, D.

    2015-12-01

    Many beaches in Australia are under attack from shoreline erosion due to rising sea levels and the action of waves. The beach at St Leonards, a tourist town on the Victorian coastline, is of concern because of this destructive erosion and the threat it poses to the economic stability of the town. The major cause of erosion in this area is waves created by strong to gale-force north to north-easterly winds. This in turn produces a northerly longshore current along with sediment suspension, leading to a negative sediment budget. Ongoing and systematic monitoring of shoreline movement is important to ensure the coast is understood and effectively managed now and into the future. Coastal land managers and agencies are required to find 'cost-effective' and 'fit-for-purpose' coastal monitoring methodologies which are affordable and efficient. This project forges a collaboration of stakeholders from academia, a public sector land manager, local government and the private sector to compare four different methods of obtaining beach profiles. The four methods used for comparison are: 1. a traditional survey method along transects using a total station theodolite, 2. a traditional survey method along transects using a builder's grade laser level, 3. a small multi-rotor unmanned aerial vehicle (UAV) to produce a full 3D digital surface model of the study area, and 4. an experimental stationary device to produce a limited 3D model along a designated transect using a terrestrial photogrammetric approach via a small GPS-enabled camera. Assessment is made by comparing each method's precision, spatial coverage, expertise and equipment requirements/costs, preparation time, field acquisition time, number of people required in the field, post-acquisition processing time, and applicability for community use. Whilst it must be very clearly stated that all methods proved to be successful, the preliminary results of the "workflow and resourcing" assessment ranked the methods in the following order from highest ranking: 1. Laser levelling, 2. Terrestrial Photogrammetry, 3. Total Station, and 4. UAV aerial survey. The outcomes of this collaboration will be of great use to coastal management organisations grappling with the effectiveness of the different types of methods which are available for measuring beach profiles.

  14. Mapping Nearshore Seagrass and Colonized Hard Bottom Spatial Distribution and Percent Biological Cover in Florida, USA Using Object Based Image Analysis of WorldView-2 Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Baumstark, R. D.; Duffey, R.; Pu, R.

    2016-12-01

    The offshore extent of seagrass habitat along the West Florida (USA) coast represents an important corridor for inshore-offshore migration of economically important fish and shellfish. Surviving at the fringe of light requirements, offshore seagrass beds are sensitive to changes in water clarity. Beyond and intermingled with the offshore seagrass areas are large swaths of colonized hard bottom. These offshore habitats of the West Florida coast have lacked mapping efforts needed for status and trends monitoring. The objective of this study was to propose an object-based classification method for mapping offshore habitats and to compare results to traditional photo-interpreted maps. Benthic maps depicting the spatial distribution and percent biological cover were created from WorldView-2 satellite imagery using Object Based Image Analysis (OBIA) method and a visual photo-interpretation method. A logistic regression analysis identified depth and distance from shore as significant parameters for discriminating spectrally similar seagrass and colonized hard bottom features. Seagrass, colonized hard bottom and unconsolidated sediment (sand) were mapped with 78% overall accuracy using the OBIA method compared to 71% overall accuracy using the photo-interpretation method. This study presents an alternative for mapping deeper, offshore habitats capable of producing higher thematic (percent biological cover) and spatial resolution maps compared to those created with the traditional photo-interpretation method.
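
    A hedged sketch of the logistic-regression step described above, with depth and distance from shore as predictors of hard bottom versus seagrass; the data are synthetic, not the study's:

      # Logistic regression on depth and distance from shore (synthetic data).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 300
      depth = rng.uniform(1, 12, n)              # metres (illustrative range)
      distance = rng.uniform(0.1, 8, n)          # km from shore (illustrative range)
      # pretend hard bottom becomes more likely with depth and distance
      p_hard = 1 / (1 + np.exp(-(0.5 * depth + 0.4 * distance - 5)))
      y = (rng.uniform(size=n) < p_hard).astype(int)   # 1 = hard bottom, 0 = seagrass

      X = np.column_stack([depth, distance])
      model = LogisticRegression().fit(X, y)
      print(model.coef_, model.intercept_)
      print(model.predict_proba([[3.0, 1.0]]))   # shallow, nearshore pixel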

  15. Error compensation of single-antenna attitude determination using GNSS for Low-dynamic applications

    NASA Astrophysics Data System (ADS)

    Chen, Wen; Yu, Chao; Cai, Miaomiao

    2017-04-01

    The GNSS-based single-antenna pseudo-attitude determination method has attracted increasing attention in the field of high-dynamic navigation due to its low cost, low system complexity, and freedom from temporally accumulated errors. Related research indicates that this method can be an important complement, or even an alternative, to traditional sensors for general accuracy requirements (such as small UAV navigation). The application of the single-antenna attitude determination method to low-dynamic carriers has only just begun. Unlike the traditional multi-antenna attitude measurement technique, the pseudo-attitude determination method calculates the rotation angle of the carrier trajectory relative to the earth. Thus it inevitably contains some deviations compared with the real attitude angle. In low-dynamic applications, these deviations are particularly noticeable and may not be ignored. The causes of the deviations can be roughly classified into three categories: measurement error, offset error, and lateral error. Empirical correction strategies for the former two errors have been proposed in previous studies, but they lack theoretical support. In this paper, we provide a quantitative description of the three types of errors and discuss the related error compensation methods. Vehicle and shipborne experiments were carried out to verify the feasibility of the proposed correction methods. Keywords: Error compensation; Single-antenna; GNSS; Attitude determination; Low-dynamic
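
    For orientation, the basic pseudo-attitude quantities are simply trajectory angles derived from the single-antenna GNSS velocity in the local east-north-up frame; the sketch below omits the measurement, offset and lateral corrections that the paper addresses:

      # Trajectory heading and flight-path (pitch-like) angle from an ENU velocity.
      import math

      def pseudo_attitude(v_e, v_n, v_u):
          """Return (heading_deg, flight_path_deg) from an ENU velocity in m/s."""
          heading = math.degrees(math.atan2(v_e, v_n)) % 360.0        # 0 deg = north
          flight_path = math.degrees(math.atan2(v_u, math.hypot(v_e, v_n)))
          return heading, flight_path

      print(pseudo_attitude(1.0, 5.0, 0.2))   # slow, low-dynamic carrier example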

  16. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    PubMed

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicine (TCM) formulae, which contain enormous amounts of information, constitutes a complex component system. Applying mathematical statistics methods to research on the compatibility of TCM formulae is of great significance for promoting the modernization of traditional Chinese medicines and for improving the clinical efficacy and optimization of formulae. As tools for quantitative analysis, data inference, and the exploration of inherent rules of substances, mathematical statistics methods can be used to reveal, both qualitatively and quantitatively, the working mechanisms underlying the compatibility of TCM formulae. By reviewing studies that apply mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, and changes in chemical components, as well as the rules of incompatibility and contraindication of formulae, and provides a reference for further studying and revealing the working mechanisms and connotations of traditional Chinese medicines.

  17. Issues and Strategies in Solving Multidisciplinary Optimization Problems

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya

    2013-01-01

    Optimization research at NASA Glenn Research Center has addressed the design of structures, aircraft and airbreathing propulsion engines. The accumulated multidisciplinary design activity is collected under a testbed entitled COMETBOARDS. Several issues were encountered during the solution of the problems. Four issues and the strategies adapted for their resolution are discussed. This is followed by a discussion on analytical methods that is limited to structural design applications. An optimization process can lead to an inefficient local solution. This deficiency was encountered during the design of an engine component. The limitation was overcome through an augmentation of animation into optimization. Optimum solutions obtained were infeasible for aircraft and airbreathing propulsion engine problems. Alleviation of this deficiency required a cascading of multiple algorithms. Profile optimization of a beam produced an irregular shape. Engineering intuition restored the regular shape for the beam. The solution obtained for a cylindrical shell by a subproblem strategy converged to a design that can be difficult to manufacture. Resolution of this issue remains a challenge. The issues and resolutions are illustrated through a set of problems: design of an engine component, synthesis of a subsonic aircraft, operation optimization of a supersonic engine, design of a wave-rotor-topping device, profile optimization of a cantilever beam, and design of a cylindrical shell. This chapter provides a cursory account of the issues. Cited references provide detailed discussion of the topics. The design of a structure can also be generated by the traditional method and by the stochastic design concept. Merits and limitations of the three methods (the traditional method, the optimization method and the stochastic concept) are illustrated. In the traditional method, the constraints are manipulated to obtain the design and the weight is back-calculated. In design optimization, the weight of a structure becomes the merit function, constraints are imposed on failure modes, and an optimization algorithm is used to generate the solution. The stochastic design concept accounts for uncertainties in loads, material properties, and other parameters, and the solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions can be produced by all three methods. The variation in the weight calculated by the methods was found to be modest. Some variation was noticed in the designs calculated by the methods. The variation may be attributed to structural indeterminacy. It is prudent to develop the design by all three methods prior to fabrication. The traditional design method can be improved when simplified sensitivities of the behavior constraints are used. Such sensitivities can reduce design calculations and may have the potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to the mean-valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure. Weight can be reduced to a small value for the most failure-prone design. Probabilistic modeling of loads and material properties remains a challenge.
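
    As a toy illustration of the traditional versus optimization routes contrasted above, consider sizing a single bar in tension (values illustrative, not from the chapter): the traditional route satisfies the stress constraint directly and back-calculates weight, while the optimization route minimizes weight subject to the same constraint:

      # Toy bar-sizing example: traditional back-calculation vs. design optimization.
      from scipy.optimize import minimize

      L, rho = 1.0, 2700.0           # bar length (m), density (kg/m^3), aluminium-like
      F, sigma_allow = 50e3, 150e6   # axial load (N), allowable stress (Pa)

      # Traditional route: satisfy the stress constraint, then back-calculate weight.
      A_trad = F / sigma_allow
      print("traditional:", A_trad, rho * A_trad * L)

      # Optimization route: weight is the merit function, stress is a constraint.
      res = minimize(lambda A: rho * A[0] * L, x0=[1e-3],
                     constraints=[{"type": "ineq",
                                   "fun": lambda A: sigma_allow - F / A[0]}],
                     bounds=[(1e-6, None)])
      print("optimized:", res.x[0], res.fun)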

  18. Untangling cultural inheritance: language diversity and long-house architecture on the Pacific northwest coast

    PubMed Central

    Jordan, Peter; O'Neill, Sean

    2010-01-01

    Many recent studies of cultural inheritance have focused on small-scale craft traditions practised by single individuals, which do not require coordinated participation by larger social collectives. In this paper, we address this gap in the cultural transmission literature by investigating diversity in the vernacular architecture of the Pacific northwest coast, where communities of hunter–fisher–gatherers constructed immense wooden long-houses at their main winter villages. Quantitative analyses of long-house styles along the coastline draw on a range of models and methods from the biological sciences and are employed to test hypotheses relating to basic patterns of macro-scale cultural diversification, and the degree to which the transmission of housing traits has been constrained by the region's numerous linguistic boundaries. The results indicate relatively strong branching patterns of cultural inheritance and also close associations between regional language history and housing styles, pointing to the potentially crucial role played by language boundaries in structuring large-scale patterns of cultural diversification, especially in relation to ‘collective’ cultural traditions like housing that require substantial inputs of coordinated labour. PMID:21041212

  19. Presurgical mapping of basal cell carcinoma or squamous cell carcinoma by confocal laser endomicroscopy compared to traditional micrographic surgery: a single-centre prospective feasibility study.

    PubMed

    Schulz, Alexandra; Daali, Samira; Javed, Mehreen; Fuchs, Paul Christian; Brockmann, Michael; Igressa, Alhadi; Charalampaki, Patra

    2016-12-01

    At present, no ideal diagnostic tools exist in the market to excise cancer tissue with the required safety margins and to achieve optimal aesthetic results using tissue-conserving techniques. In this prospective study, confocal laser endomicroscopy (CLE) and the traditional gold standard of magnifying glasses (MG) were compared regarding the boundaries of in vivo basal cell carcinoma and squamous cell carcinoma. Tumour diameters defined by both methods were measured and compared with those determined by histopathological examination. Nineteen patients were included in the study. The CLE technique was found to be superior to excisional margins based on MG only. Re-excision was required in 68% of the cases following excision based on MG evaluation, but only in 27% of the cases for whom excision margins were based on CLE. Our results are promising regarding the distinction between tumour and healthy surrounding tissue, and indicate that presurgical mapping of basal cell carcinoma and squamous cell carcinoma is possible. The tool itself should be developed further with special attention to early detection of skin cancer.

  20. Microwave assisted synthesis and characterization of magnesium substituted calcium phosphate bioceramics.

    PubMed

    Khan, Nida Iqbal; Ijaz, Kashif; Zahid, Muniza; Khan, Abdul S; Abdul Kadir, Mohammed Rafiq; Hussain, Rafaqat; Anis-Ur-Rehman; Darr, Jawwad A; Ihtesham-Ur-Rehman; Chaudhry, Aqif A

    2015-11-01

    Hydroxyapatite is used extensively in hard tissue repair due to its biocompatibility and similarity to biological apatite, the mineral component of bone. It differs subtly in composition from biological apatite, which contains other ions such as magnesium, zinc, carbonate and silicon (believed to play biological roles). Traditional methods of hydroxyapatite synthesis are time-consuming and require strict control of reaction parameters. This paper outlines the synthesis of magnesium-substituted hydroxyapatite using simple microwave irradiation of precipitated suspensions. Microwave irradiation resulted in a drastic decrease in the ageing times of amorphous apatitic phases. The time taken to synthesize hydroxyapatite (which remained stable upon heat treatment at 900°C for 1 h) was reduced twelve-fold (to 2 h) compared to traditionally required times. The effects of increasing magnesium concentration in the precursors on particle size, surface area, phase purity, agglomeration and thermal stability were observed using scanning electron microscopy, BET surface area analysis, X-ray diffraction and photoacoustic Fourier transform infrared spectroscopy. Porous agglomerates were obtained after a brief heat treatment (1 h) at 900°C. Copyright © 2015 Elsevier B.V. All rights reserved.
