ERIC Educational Resources Information Center
Feng, Xiaoying; Lu, Guangxin; Yao, Zhihong
2015-01-01
Curriculum development for distance education (DE) practitioners is focusing more and more on practical requirements and competence development. Delphi and DACUM methods have been used at some universities. However, in the competency-based development area, these methods have been taken over by professional-task-based development in the last…
ERIC Educational Resources Information Center
Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse
2015-01-01
The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…
ERIC Educational Resources Information Center
Mattord, Herbert J.
2012-01-01
Organizations continue to rely on password-based authentication methods to control access to many Web-based systems. This research study developed a benchmarking instrument intended to assess authentication methods used in Web-based information systems (IS). It developed an Authentication Method System Index (AMSI) to analyze collected data from…
Kishikawa, Naoya
2010-10-01
Quinones are compounds that play various roles, as biological electron transporters, industrial products, and harmful environmental pollutants. Therefore, an effective determination method for quinones is required in many fields. This review describes the development of sensitive and selective determination methods for quinones based on several detection principles and their application to analyses of environmental, pharmaceutical and biological samples. Firstly, a fluorescence method was developed based on fluorogenic derivatization of quinones and applied to environmental analysis. Secondly, a luminol chemiluminescence method was developed based on the generation of reactive oxygen species through the redox cycle of quinones and applied to pharmaceutical analysis. Thirdly, a photo-induced chemiluminescence method was developed based on the formation of reactive oxygen species and of a fluorophore or chemiluminescence enhancer by the photoreaction of quinones, and applied to biological and environmental analyses.
Changes in Teaching Efficacy during a Professional Development School-Based Science Methods Course
ERIC Educational Resources Information Center
Swars, Susan L.; Dooley, Caitlin McMunn
2010-01-01
This mixed methods study offers a theoretically grounded description of a field-based science methods course within a Professional Development School (PDS) model (i.e., PDS-based course). The preservice teachers' (n = 21) experiences within the PDS-based course prompted significant changes in their personal teaching efficacy, with the…
ERIC Educational Resources Information Center
Davis, Eric J.; Pauls, Steve; Dick, Jonathan
2017-01-01
Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…
Development of performance-based evaluation methods and specifications for roadside maintenance.
DOT National Transportation Integrated Search
2011-01-01
This report documents the work performed during Project 0-6387, Performance Based Roadside Maintenance Specifications. Quality assurance methods and specifications for roadside performance-based maintenance contracts (PBMCs) were developed ...
The Development of a Robot-Based Learning Companion: A User-Centered Design Approach
ERIC Educational Resources Information Center
Hsieh, Yi-Zeng; Su, Mu-Chun; Chen, Sherry Y.; Chen, Gow-Dong
2015-01-01
A computer-vision-based method is widely employed to support the development of a variety of applications. In this vein, this study uses a computer-vision-based method to develop a playful learning system, which is a robot-based learning companion named RobotTell. Unlike existing playful learning systems, a user-centered design (UCD) approach is…
Development of gas chromatographic methods for the analyses of organic carbonate-based electrolytes
NASA Astrophysics Data System (ADS)
Terborg, Lydia; Weber, Sascha; Passerini, Stefano; Winter, Martin; Karst, Uwe; Nowak, Sascha
2014-01-01
In this work, novel methods based on gas chromatography (GC) for the investigation of common organic carbonate-based electrolyte systems used in lithium ion batteries are presented. The methods were developed for flame ionization detection (FID) and mass spectrometric detection (MS). Further, headspace (HS) sampling for the investigation of solid samples such as electrodes is reported. Limits of detection are reported for FID. Finally, the developed methods were applied to the electrolyte system of commercially available lithium ion batteries as well as to in-house assembled cells.
[Analysis and Control of in Vivo Kinetics of Exosomes for the Development of Exosome-based DDS].
Takahashi, Yuki; Nishikawa, Makiya; Takakura, Yoshinobu
2016-01-01
Exosomes are secretory membrane vesicles containing lipids, proteins, and nucleic acids. They act as intercellular transporters by delivering their components to exosome-recipient cells. Based on their endogenous delivery-system properties, exosomes are expected to become drug delivery systems (DDS) for various molecules such as nucleic acid-based drugs. Important factors such as drug loading into exosomes, and the production and pharmacokinetics of exosomes, need to be considered for the development of exosome-based DDS. Of these, the pharmacokinetics of exosomes have rarely been studied, probably because of the lack of quantitative methods for evaluating in vivo exosomal pharmacokinetics. We selected lactadherin as an exosome-tropic protein and developed it as a fusion protein with Gaussia luciferase to label exosomes for in vivo imaging. In addition, a fusion protein of lactadherin and streptavidin was developed, and the tissue distribution of exosomes was quantitatively evaluated by radiolabeling the exosomes using (125)I-labeled biotin. Using labeled exosomes, we found that intravenously injected exosomes were rapidly cleared from the systemic circulation by macrophages. In addition, the exosomes were mainly distributed to the liver, lung, and spleen. We also examined the effect of exosome isolation methods on their physicochemical and pharmacokinetic properties. We found that exosomes collected by the ultracentrifugation-based density-gradient method were more dispersed than exosomes collected by other methods, including the ultracentrifugation-based pelleting method. The gradient method is more time-consuming than the others; therefore, the development of a more efficient method for exosome isolation will advance the development of exosome-based DDS.
Drowos, Joanna; Baker, Suzanne; Harrison, Suzanne Leonard; Minor, Suzanne; Chessman, Alexander W; Baker, Dennis
2017-08-01
Community-based faculty play a large role in training medical students nationwide and require faculty development. The authors hypothesized that positive relationships exist between clerkships paying preceptors and requiring faculty development, and between protected clerkship directors' time and delivering face-to-face preceptor training, as well as with the number or length of community-based preceptor visits. Through understanding the quantity, delivery methods, barriers, and institutional support for faculty development provided to community-based preceptors teaching in family medicine clerkships, best practices can be developed. Data from the 2015 Council of Academic Family Medicine's Educational Research Alliance survey of Family Medicine Clerkship Directors were analyzed. The cross-sectional survey of clerkship directors is distributed annually to institutional representatives of U.S. and Canadian accredited medical schools. Survey questions focused on the requirements, delivery methods, barriers, and institutional support available for providing faculty development to community-based preceptors. Paying community-based preceptors was positively correlated with requiring faculty development in family medicine clerkships. The greatest barrier to providing faculty development was community-based preceptors' time availability; however, face-to-face methods remain the most common delivery strategy. Many family medicine clerkship directors perform informal or no needs assessment in developing faculty development topics for community-based faculty. Providing payment to community preceptors may allow schools to enhance faculty development program activities and effectiveness. Medical schools could benefit from constructing a formal curriculum for faculty development, including formal preceptor needs assessment and program evaluation. Clerkship directors may consider recruiting and retaining community-based faculty by employing innovative faculty development delivery methods.
Stefanović, Stefica Cerjan; Bolanča, Tomislav; Luša, Melita; Ukić, Sime; Rogošić, Marko
2012-02-24
This paper describes the development of an ad hoc methodology for the determination of inorganic anions in oilfield waters, whose composition often differs significantly from the average (in component concentrations and/or matrix). Fast and reliable method development therefore has to be performed in order to ensure the monitoring of the desired properties under new conditions. The method development was based on a computer-assisted multi-criteria decision-making strategy. The criteria used were: the maximal value of the objective functions used, maximal robustness of the separation method, minimal analysis time, and maximal retention distance between the two nearest components. Artificial neural networks were used for modeling of anion retention. The reliability of the developed method was extensively tested by validation of its performance characteristics. Based on the validation results, the developed method shows satisfactory performance characteristics, proving the successful application of the computer-assisted methodology in the described case study. Copyright © 2011 Elsevier B.V. All rights reserved.
Agapova, Maria; Bresnahan, Brian B; Higashi, Mitchell; Kessler, Larry; Garrison, Louis P; Devine, Beth
2017-02-01
The American College of Radiology develops evidence-based practice guidelines to aid appropriate utilization of radiological procedures. Panel members use expert opinion to weight trade-offs and consensus methods to rate appropriateness of imaging tests. These ratings include an equivocal range, assigned when there is disagreement about a technology's appropriateness and the evidence base is weak or for special circumstances. It is not clear how expert consensus merges with the evidence base to arrive at an equivocal rating. Quantitative benefit-risk assessment (QBRA) methods may assist decision makers in this capacity. However, many methods exist and it is not clear which methods are best suited for this application. We perform a critical appraisal of QBRA methods and propose several steps that may aid in making transparent areas of weak evidence and barriers to consensus in guideline development. We identify QBRA methods with potential to facilitate decision making in guideline development and build a decision aid for selecting among these methods. This study identified 2 families of QBRA methods suited to guideline development when expert opinion is expected to contribute substantially to decision making. Key steps to deciding among QBRA methods involve identifying specific benefit-risk criteria and developing a state-of-evidence matrix. For equivocal ratings assigned for reasons other than disagreement or weak evidence base, QBRA may not be needed. In the presence of disagreement but the absence of a weak evidence base, multicriteria decision analysis approaches are recommended; and in the presence of weak evidence base and the absence of disagreement, incremental net health benefit alone or combined with multicriteria decision analysis is recommended. Our critical appraisal further extends investigation of the strengths and limitations of select QBRA methods in facilitating diagnostic radiology clinical guideline development. The process of using the decision aid exposes and makes transparent areas of weak evidence and barriers to consensus. © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Zhao, Hui; Qu, Weilu; Qiu, Weiting
2018-03-01
In order to evaluate the sustainable development level of resource-based cities, an evaluation method based on Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed and the importance of each attribute is calculated based on the maximum Shapley entropy principle; then, the Choquet integral is introduced to calculate the comprehensive evaluation value of each city from the bottom up; finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, which provides theoretical support for the sustainable development path and reform direction of resource-based cities.
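As a concrete illustration of the aggregation step described above, the sketch below computes a discrete Choquet integral of criterion scores with respect to a capacity (fuzzy measure); the three-criterion capacity and the scores are hypothetical stand-ins, not the paper's index system or data.

```python
import numpy as np

def choquet_integral(scores, capacity):
    """Discrete Choquet integral of criterion scores w.r.t. a capacity.

    capacity maps frozensets of criterion indices to [0, 1], with
    capacity[frozenset()] == 0 and capacity of the full set == 1.
    """
    order = np.argsort(scores)               # indices, ascending by score
    x = np.asarray(scores, dtype=float)[order]
    total, prev = 0.0, 0.0
    for k in range(len(order)):
        a_k = frozenset(order[k:].tolist())  # criteria at or above rank k
        total += (x[k] - prev) * capacity[a_k]
        prev = x[k]
    return total

# Hypothetical 3-criterion capacity allowing interaction between criteria
cap = {frozenset(): 0.0, frozenset({0}): 0.3, frozenset({1}): 0.4,
       frozenset({2}): 0.2, frozenset({0, 1}): 0.8, frozenset({0, 2}): 0.5,
       frozenset({1, 2}): 0.6, frozenset({0, 1, 2}): 1.0}
print(choquet_integral([0.6, 0.9, 0.4], cap))  # 0.68 for these stand-in scores
```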
ERIC Educational Resources Information Center
Ugwu, Romanus Iroabuchi
2012-01-01
The purpose of this mixed-methods study was to describe the perceptions of elementary teachers from an urban school district in Southern California regarding their inquiry-based science instructional practices, assessment methods and professional development. The district's inquiry professional development called the California Mathematics and…
Method Engineering: A Service-Oriented Approach
NASA Astrophysics Data System (ADS)
Cauvet, Corine
In the past, a large variety of methods have been published, ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering. The meta-modeling approach provides means for building methods by instantiation; the component-based approach aims at supporting the development of methods by using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed for processing a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of dynamic service composition support method construction and method adaptation to different development contexts.
Sassen, Barbara; Kok, Gerjo; Mesters, Ilse; Crutzen, Rik; Cremers, Anita; Vanhees, Luc
2012-12-14
Patients with cardiovascular risk factors can reduce their risk of cardiovascular disease by increasing their physical activity and their physical fitness. According to the guidelines for cardiovascular risk management, health professionals should encourage their patients to engage in physical activity. In this paper, we provide insight regarding the systematic development of a Web-based intervention for both health professionals and patients with cardiovascular risk factors using the development method Intervention Mapping. The different steps of Intervention Mapping are described to open up the "black box" of Web-based intervention development and to support future Web-based intervention development. The development of the Professional and Patient Intention and Behavior Intervention (PIB2 intervention) was initiated with a needs assessment for both health professionals (ie, physiotherapy and nursing) and their patients. We formulated performance and change objectives and, subsequently, theory- and evidence-based intervention methods and strategies were selected that were thought to affect the intention and behavior of health professionals and patients. The rationale of the intervention was based on different behavioral change methods that allowed us to describe the scope and sequence of the intervention and produced the Web-based intervention components. The Web-based intervention consisted of 5 modules, including individualized messages and self-completion forms, and charts and tables. The systematic and planned development of the PIB2 intervention resulted in an Internet-delivered behavior change intervention. The intervention was not developed as a substitute for face-to-face contact between professionals and patients, but as an application to complement and optimize health services. The focus of the Web-based intervention was to extend professional behavior of health care professionals, as well as to improve the risk-reduction behavior of patients with cardiovascular risk factors. The Intervention Mapping protocol provided a systematic method for developing the intervention and each intervention design choice was carefully thought-out and justified. Although it was not a rapid or an easy method for developing an intervention, the protocol guided and directed the development process. The application of evidence-based behavior change methods used in our intervention offers insight regarding how an intervention may change intention and health behavior. The Web-based intervention appeared feasible and was implemented. Further research will test the effectiveness of the PIB2 intervention. Dutch Trial Register, Trial ID: ECP-92.
An XML-based method for astronomy software designing
NASA Astrophysics Data System (ADS)
Liao, Mingxue; Aili, Yusupu; Zhang, Jin
An XML-based method for the standardization of software design is introduced and analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock, FT206, in the antenna control program is introduced. With FT206, the need to compute how many centuries have passed since a certain day with sophisticated formulas is eliminated, and it is no longer necessary to set the UT time on the computer controlling the antenna, because the information about year, month, and day is deduced from the Julian day stored in FT206 rather than from the computer time. With an XML-based method and standard for software design, various existing design methods are unified, communication and collaboration between developers are facilitated, and an Internet-based mode of software development thus becomes possible. The trend of development of the XML-based design method is predicted.
A novel energy conversion based method for velocity correction in molecular dynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Hanhui; Collaborative Innovation Center of Advanced Aero-Engine, Hangzhou 310027; Liu, Ningning
2017-05-01
Molecular dynamics (MD) simulation has become an important tool for studying micro- or nano-scale dynamics and the statistical properties of fluids and solids. In MD simulations, there are mainly two approaches: equilibrium and non-equilibrium molecular dynamics (EMD and NEMD). In this paper, a new energy conversion based correction (ECBC) method for MD is developed. Unlike the traditional systematic correction based on macroscopic parameters, the ECBC method is developed strictly based on the physical interaction processes between pairs of molecules or atoms. The developed ECBC method can be applied to EMD and NEMD directly. While using MD with this method, the difference between EMD and NEMD is eliminated, and no macroscopic parameters such as externally imposed potentials or coefficients are needed. With this method, many limits on using MD are lifted. The application scope of MD is greatly extended.
An index-based method is developed that ranks the subwatersheds of a watershed based on their relative impacts on watershed response to anticipated land developments, and it is then applied to an urbanizing watershed in Eastern Pennsylvania. Simulations with a semi-distributed hydrolo...
Multirate sampled-data yaw-damper and modal suppression system design
NASA Technical Reports Server (NTRS)
Berg, Martin C.; Mason, Gregory S.
1990-01-01
A multirate control law synthesis algorithm based on an infinite-time quadratic cost function was developed, along with a method for analyzing the robustness of multirate systems. A generalized multirate sampled-data control law structure (GMCLS) was introduced. A new infinite-time-based parameter-optimization multirate sampled-data control law synthesis method and solution algorithm were developed. A singular-value-based method for determining gain and phase margins for multirate systems was also developed. The finite-time-based parameter-optimization multirate sampled-data control law synthesis algorithm originally intended to be applied to the aircraft problem was instead demonstrated by application to a simpler problem involving the control of the tip position of a two-link robot arm. The GMCLS, the infinite-time-based parameter-optimization multirate control law synthesis method and solution algorithm, and the singular-value-based method for determining gain and phase margins were all demonstrated by application to the aircraft control problem originally proposed for this project.
NASA Astrophysics Data System (ADS)
Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang
2018-06-01
In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two-dimensional steady-state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of adjoint consistency for three different direct discontinuous Galerkin discretizations: the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)) and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows that the extra interface correction term adopted in the DDG(IC) and SDDG methods plays a key role in preserving adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) and SDDG methods can be adjoint consistent with appropriate treatment of boundary conditions and correct modifications of the underlying output functionals. The performance of these three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential in applications of adjoint-based adaptation for simulating compressible flows.
A method for data base management and analysis for wind tunnel data
NASA Technical Reports Server (NTRS)
Biser, Aileen O.
1987-01-01
To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.
Chemical Entity Recognition and Resolution to ChEBI
Grego, Tiago; Pesquita, Catia; Bastos, Hugo P.; Couto, Francisco M.
2012-01-01
Chemical entities are ubiquitous throughout the biomedical literature, and the development of text-mining systems that can efficiently identify those entities is required. Due to the lack of available corpora and data resources, the community has focused its efforts on the development of gene and protein named entity recognition systems, but with the release of ChEBI and the availability of an annotated corpus, this task can now be addressed. We developed a machine-learning-based method for chemical entity recognition and a lexical-similarity-based method for chemical entity resolution and compared them with Whatizit, a popular dictionary-based method. Our methods outperformed the dictionary-based method in all tasks, yielding an improvement in F-measure of 20% for the entity recognition task, 2–5% for the entity-resolution task, and 15% for the combined entity recognition and resolution task. PMID:25937941
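For readers unfamiliar with the F-measure figures quoted above, the following minimal helper shows how precision, recall, and F1 are combined in an entity-recognition evaluation; the counts passed in are purely illustrative, not the paper's results.

```python
def f_measure(tp: int, fp: int, fn: int) -> float:
    """F1: harmonic mean of precision and recall over recognized entities."""
    precision = tp / (tp + fp)   # fraction of predicted entities that are correct
    recall = tp / (tp + fn)      # fraction of true entities that were found
    return 2 * precision * recall / (precision + recall)

print(f_measure(tp=850, fp=120, fn=230))  # illustrative counts only
```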
ERIC Educational Resources Information Center
Komalasari, Kokom; Saripudin, Didin
2018-01-01
This study aims to develop and examine a civic education textbook model based on living values education in order to foster the development of junior high school students' characters. This research employs Research and Development approach with an explorative method being used at model development stage and experiment method at model testing…
Implementing Expertise-Based Training Methods to Accelerate the Development of Peer Academic Coaches
ERIC Educational Resources Information Center
Blair, Lisa
2016-01-01
The field of expertise studies offers several models from which to develop training programs that accelerate the development of novice performers in a variety of domains. This research study implemented two methods of expertise-based training in a course to develop undergraduate peer academic coaches through a ten-week program. An existing…
MURAHASHI, Shun-Ichi
2011-01-01
This review focuses on the development of ruthenium and flavin catalysts for environmentally benign oxidation reactions based on mimicking the functions of cytochrome P-450 and flavoenzymes, and low valent transition-metal catalysts that replace conventional acids and bases. Several new concepts and new types of catalytic reactions based on these concepts are described. PMID:21558760
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina
Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).
Development and Evaluation of the Method with an Affective Interface for Promoting Employees' Morale
NASA Astrophysics Data System (ADS)
Fujino, Hidenori; Ishii, Hirotake; Shimoda, Hiroshi; Yoshikawa, Hidekazu
For a sustainable society, organization management is required that is not based on mass production and mass consumption but has the flexibility to meet various social needs precisely. Realizing such management requires employees' work morale. Recently, however, employees' work morale has tended to decrease. Therefore, in this study, the authors developed a model of a method for promoting and keeping employees' work morale effectively and efficiently. In particular, the authors regarded "work morale" as an "attitude toward the work". Based on this idea, the theory of persuasion psychology and various persuasion techniques could be applied. A model of the method employing a character agent was therefore developed based on forced compliance, one of the persuasion techniques derived from the theory of cognitive dissonance. An evaluation experiment using human subjects confirmed that the developed method could improve workers' work morale effectively.
DOT National Transportation Integrated Search
2005-01-01
This report describes work to develop non-destructive testing methods for concrete pavements. Two methods, for pavement thickness and in-place strength estimation, respectively, were developed and evaluated. The thickness estimation method is based o...
Developing and Assessing Teachers' Knowledge of Game-Based Learning
ERIC Educational Resources Information Center
Shah, Mamta; Foster, Aroutis
2015-01-01
Research focusing on the development and assessment of teacher knowledge in game-based learning is in its infancy. A mixed-methods study was undertaken to educate pre-service teachers in game-based learning using the Game Network Analysis (GaNA) framework. Fourteen pre-service teachers completed a methods course, which prepared them in game…
Law, Jodi Woan-Fei; Ab Mutalib, Nurul-Syakima; Chan, Kok-Gan; Lee, Learn-Han
2015-01-01
The incidence of foodborne diseases has increased over the years, resulting in a major global public health problem. Foodborne pathogens can be found in various foods, and it is important to detect them to provide a safe food supply and to prevent foodborne diseases. The conventional methods used to detect foodborne pathogens are time-consuming and laborious. Hence, a variety of methods have been developed for the rapid detection of foodborne pathogens, as required in many food analyses. Rapid detection methods can be categorized into nucleic acid-based, biosensor-based and immunological-based methods. This review emphasizes the principles and applications of recent rapid methods for the detection of foodborne bacterial pathogens. The detection methods included are simple polymerase chain reaction (PCR), multiplex PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), loop-mediated isothermal amplification (LAMP) and oligonucleotide DNA microarrays, classified as nucleic acid-based methods; optical, electrochemical and mass-based biosensors, classified as biosensor-based methods; and enzyme-linked immunosorbent assay (ELISA) and lateral flow immunoassay, classified as immunological-based methods. In general, rapid detection methods are time-efficient, sensitive, specific and labor-saving. The development of rapid detection methods is vital to the prevention and treatment of foodborne diseases. PMID:25628612
A review on detection methods used for foodborne pathogens
Priyanka, B.; Patil, Rajashekhar K.; Dwarakanath, Sulatha
2016-01-01
Foodborne pathogens have caused a large number of diseases worldwide, and more so in developing countries, with a major economic impact. It is important to contain them, and to do so, early detection is crucial. Detection and diagnostics initially relied on culture-based methods and have developed in the recent past, in parallel, towards immunological methods such as enzyme-linked immunosorbent assays (ELISA) and molecular biology-based methods such as polymerase chain reaction (PCR). The aim has always been to find a rapid, sensitive, specific and cost-effective method. Ranging from the culturing of microbes to futuristic biosensor technology, the methods have shared this common goal. This review summarizes the recent trends and brings together methods that have been developed over the years. PMID:28139531
Developing a Competency-Based Pan-European Accreditation Framework for Health Promotion
ERIC Educational Resources Information Center
Battel-Kirk, Barbara; Van der Zanden, Gerard; Schipperen, Marielle; Contu, Paolo; Gallardo, Carmen; Martinez, Ana; Garcia de Sola, Silvia; Sotgiu, Alessandra; Zaagsma, Miriam; Barry, Margaret M.
2012-01-01
Background: The CompHP Pan-European Accreditation Framework for Health Promotion was developed as part of the CompHP Project that aimed to develop competency-based standards and an accreditation system for health promotion practice, education, and training in Europe. Method: A phased, multiple-method approach was employed to facilitate consensus…
An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle
NASA Astrophysics Data System (ADS)
Wang, Yue; Gao, Dan; Mao, Xuming
2018-03-01
A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail in the study, with the hope of providing this experience to other civil jet product designs.
Blake, Phillipa; Durão, Solange; Naude, Celeste E; Bero, Lisa
2018-01-01
Abstract Evidence-informed guideline development methods underpinned by systematic reviews ensure that guidelines are transparently developed, free from overt bias, and based on the best available evidence. Only recently has the nutrition field begun using these methods to develop public health nutrition guidelines. Given the importance of following an evidence-informed approach and recent advances in related methods, this study sought to describe the methods used to synthesize evidence, rate evidence quality, grade recommendations, and manage conflicts of interest (COIs) in national food-based dietary guidelines (FBDGs). The Food and Agriculture Organization’s FBDGs database was searched to identify the latest versions of FBDGs published from 2010 onward. Relevant data from 32 FBDGs were extracted, and the findings are presented narratively. This study shows that despite advances in evidence-informed methods for developing dietary guidelines, there are variations and deficiencies in methods used to review evidence, rate evidence quality, and grade recommendations. Dietary guidelines should follow systematic and transparent methods and be informed by the best available evidence, while considering important contextual factors and managing conflicts of interest. PMID:29425371
NASA Technical Reports Server (NTRS)
Roth, Don J.; Hendricks, J. Lynne; Whalen, Mike F.; Bodis, James R.; Martin, Katherine
1996-01-01
This article describes the commercial implementation of ultrasonic velocity imaging methods developed and refined at NASA Lewis Research Center on the Sonix c-scan inspection system. Two velocity imaging methods were implemented: thickness-based and non-thickness-based reflector plate methods. The article demonstrates capabilities of the commercial implementation and gives the detailed operating procedures required for Sonix customers to achieve optimum velocity imaging results. This commercial implementation of velocity imaging provides a 100x speed increase in scanning and processing over the lab-based methods developed at LeRC. The significance of this cooperative effort is that the aerospace and other materials development-intensive industries which use extensive ultrasonic inspection for process control and failure analysis will now have an alternative, highly accurate imaging method commercially available.
NASA Astrophysics Data System (ADS)
Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun
2017-11-01
In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems.
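Since the study names the Ogata-Banks solution as its physically based data model, a minimal sketch of that solution may help; the parameter values in the usage line are assumptions for illustration, not the study's calibrated values.

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0):
    """Ogata-Banks solution of the 1-D advection-dispersion equation
    for a continuous source of concentration c0 at x = 0, with
    velocity v and dispersion coefficient D."""
    a = 2.0 * np.sqrt(D * t)
    return 0.5 * c0 * (erfc((x - v * t) / a)
                       + np.exp(v * x / D) * erfc((x + v * t) / a))

# Illustrative breakthrough curve at a monitoring point 5 m downstream
c = ogata_banks(x=5.0, t=np.linspace(1.0, 100.0, 50), v=0.3, D=0.8, c0=1.0)
```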
Hoef, A M; Kok, E J; Bouw, E; Kuiper, H A; Keijer, J
1998-10-01
A method has been developed to distinguish between traditional soy beans and transgenic Roundup Ready soy beans, i.e. the glyphosate ('Roundup') resistant soy bean variety developed by Monsanto Company. Glyphosate resistance results from the incorporation of an Agrobacterium-derived 5-enol-pyruvyl-shikimate-3-phosphatesynthase (EPSPS) gene. The detection method developed is based on a nested Polymerase Chain Reaction (PCR) procedure. Ten femtograms of soy bean DNA can be detected, while, starting from whole soy beans, Roundup Ready DNA can be detected at a level of 1 Roundup Ready soy bean in 5000 non-GM soy beans (0.02% Roundup Ready soy bean). The method has been applied to samples of soy bean, soy-meal pellets and soy bean flour, as well as a number of processed complex products such as infant formula based on soy, tofu, tempeh, soy-based desserts, bakery products and complex meat and meat-replacing products. The results obtained are discussed with respect to practical application of the detection method developed.
ERIC Educational Resources Information Center
Lee, Young-Jin
2012-01-01
This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…
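The abstract does not spell out the estimation procedure, so the following is only a generic sketch of approximating a posterior over student ability from logged responses, using a grid-based Rasch model with assumed item difficulties; none of the values come from the paper.

```python
import numpy as np

theta = np.linspace(-4, 4, 161)             # ability grid
prior = np.exp(-theta ** 2 / 2)             # standard-normal prior (unnormalized)
b = np.array([-1.0, 0.0, 0.5, 1.5])         # assumed item difficulties
u = np.array([1, 1, 0, 1])                  # responses recovered from a log file

p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b)))      # Rasch success probabilities
likelihood = np.prod(p ** u * (1 - p) ** (1 - u), axis=1)
post = prior * likelihood
post /= post.sum()                          # normalize on the uniform grid
eap = float((theta * post).sum())           # expected a posteriori ability
```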
Computer Simulation as an Aid for Management of an Information System.
ERIC Educational Resources Information Center
Simmonds, W. H.; And Others
The aim of this study was to develop methods, based upon computer simulation, of designing information systems and illustrate the use of these methods by application to an information service. The method developed is based upon Monte Carlo and discrete event simulation techniques and is described in an earlier report - Sira report R412 Organizing…
Fish genome manipulation and directional breeding.
Ye, Ding; Zhu, ZuoYan; Sun, YongHua
2015-02-01
Aquaculture is one of the fastest developing agricultural industries worldwide. One of the most important factors for sustainable aquaculture is the development of high performing culture strains. Genome manipulation offers a powerful method to achieve rapid and directional breeding in fish. We review the history of fish breeding methods based on classical genome manipulation, including polyploidy breeding and nuclear transfer. Then, we discuss the advances and applications of fish directional breeding based on transgenic technology and recently developed genome editing technologies. These methods offer increased efficiency, precision and predictability in genetic improvement over traditional methods.
Laser-based methods for the analysis of low molecular weight compounds in biological matrices.
Kiss, András; Hopfgartner, Gérard
2016-07-15
Laser-based desorption and/or ionization methods play an important role in the analysis of low molecular weight compounds (LMWCs) because they allow direct analysis with high-throughput capabilities. In recent years there have been several improvements in ionization methods, with the emergence of novel atmospheric ion sources such as laser ablation electrospray ionization or laser diode thermal desorption with atmospheric pressure chemical ionization, and in sample preparation methods, with the development of new matrix compounds for matrix-assisted laser desorption/ionization (MALDI). Also, the combination of ion mobility separation with laser-based ionization methods is starting to gain popularity with access to commercial systems. These developments have been driven mainly by the emergence of new application fields such as MS imaging and non-chromatographic analytical approaches for quantification. This review aims to present these new developments in laser-based methods for the analysis of low molecular weight compounds by MS and several potential applications. Copyright © 2016 Elsevier Inc. All rights reserved.
2012-01-01
Background Optimization of the clinical care process by integration of evidence-based knowledge is one of the active components in care pathways. When studying the impact of a care pathway by using a cluster-randomized design, standardization of the care pathway intervention is crucial. This methodology paper describes the development of the clinical content of an evidence-based care pathway for in-hospital management of chronic obstructive pulmonary disease (COPD) exacerbation in the context of a cluster-randomized controlled trial (cRCT) on care pathway effectiveness. Methods The clinical content of a care pathway for COPD exacerbation was developed based on recognized process design and guideline development methods. Subsequently, based on the COPD case study, a generalized eight-step method was designed to support the development of the clinical content of an evidence-based care pathway. Results A set of 38 evidence-based key interventions and a set of 24 process and 15 outcome indicators were developed in eight different steps. Nine Belgian multidisciplinary teams piloted both the set of key interventions and indicators. The key intervention set was judged by the teams as being valid and clinically applicable. In addition, the pilot study showed that the indicators were feasible for the involved clinicians and patients. Conclusions The set of 38 key interventions and the set of process and outcome indicators were found to be appropriate for the development and standardization of the clinical content of the COPD care pathway in the context of a cRCT on pathway effectiveness. The developed eight-step method may facilitate multidisciplinary teams caring for other patient populations in designing the clinical content of their future care pathways. PMID:23190552
Li, C T; Shi, C H; Wu, J G; Xu, H M; Zhang, H Z; Ren, Y L
2004-04-01
The selection of an appropriate sampling strategy and a clustering method is important in the construction of core collections based on predicted genotypic values in order to retain the greatest degree of genetic diversity of the initial collection. In this study, methods of developing rice core collections were evaluated based on the predicted genotypic values for 992 rice varieties with 13 quantitative traits. The genotypic values of the traits were predicted by the adjusted unbiased prediction (AUP) method. Based on the predicted genotypic values, Mahalanobis distances were calculated and employed to measure the genetic similarities among the rice varieties. Six hierarchical clustering methods, including the single linkage, median linkage, centroid, unweighted pair-group average, weighted pair-group average and flexible-beta methods, were combined with random, preferred and deviation sampling to develop 18 core collections of rice germplasm. The results show that the deviation sampling strategy in combination with the unweighted pair-group average method of hierarchical clustering retains the greatest degree of genetic diversities of the initial collection. The core collections sampled using predicted genotypic values had more genetic diversity than those based on phenotypic values.
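A minimal sketch of the distance and clustering machinery the study describes, Mahalanobis distances on predicted genotypic values followed by unweighted pair-group average (UPGMA) clustering; the random matrix is only a placeholder for the actual 992 x 13 table of predicted values.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Placeholder for the (varieties x traits) matrix of predicted genotypic values
G = np.random.default_rng(0).normal(size=(100, 13))

VI = np.linalg.inv(np.cov(G, rowvar=False))          # inverse trait covariance
d = pdist(G, metric='mahalanobis', VI=VI)            # pairwise Mahalanobis distances
Z = linkage(d, method='average')                     # unweighted pair-group average (UPGMA)
clusters = fcluster(Z, t=20, criterion='maxclust')   # groups from which a core is sampled
```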
Optimization of the gypsum-based materials by the sequential simplex method
NASA Astrophysics Data System (ADS)
Doleželová, Magdalena; Vimmrová, Alena
2017-11-01
The application of the sequential simplex optimization method to the design of gypsum-based materials is described. The principles of the simplex method are explained, and several examples of the method's usage for the optimization of lightweight gypsum and ternary gypsum-based materials are given. By this method, lightweight gypsum-based materials with the desired properties and a ternary gypsum-based material with higher strength (16 MPa) were successfully developed. The simplex method is a useful tool for optimizing gypsum-based materials, but the objective of the optimization has to be formulated appropriately.
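As an illustration of simplex-type optimization, the sketch below uses the Nelder-Mead method, a variable-size descendant of the sequential simplex; the objective function is a made-up stand-in, since in a real sequential-simplex run each evaluation corresponds to preparing and testing a trial gypsum mix.

```python
from scipy.optimize import minimize

def objective(mix):
    """Stand-in for a measured response of a trial gypsum mix; coefficients
    are illustrative, not fitted to any real material data."""
    water_ratio, filler_ratio = mix
    density = 1.4 - 0.8 * water_ratio + 0.3 * filler_ratio
    strength = 18.0 * (1.0 - water_ratio) - 5.0 * filler_ratio ** 2
    # Penalize deviation from a target density while rewarding strength
    return (density - 0.9) ** 2 - 0.05 * strength

res = minimize(objective, x0=[0.6, 0.2], method='Nelder-Mead')
print(res.x, res.fun)   # optimized mix proportions and objective value
```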
Assessment of sustainable urban transport development based on entropy and unascertained measure.
Li, Yancang; Yang, Jing; Shi, Huawang; Li, Yijie
2017-01-01
To find a more effective method for the assessment of sustainable urban transport development, a comprehensive assessment model of sustainable urban transport development was established based on the unascertained measure. On the basis of the factors influencing urban transport development, the comprehensive assessment indexes were selected, including urban economic development, transport demand, environmental quality and energy consumption, and an assessment system of sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated through the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to the actual conditions. Then, the grade was obtained by using the credible degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. Application practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development.
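A minimal sketch of the entropy weight coefficient method mentioned above, under its standard formulation; the score matrix is illustrative, not the paper's index data.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight coefficients for an (alternatives x indicators)
    matrix of non-negative scores; indicators with more dispersion
    across alternatives receive larger weights."""
    P = X / X.sum(axis=0)                          # column-wise proportions
    n = X.shape[0]
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)        # entropy of each indicator
    d = 1.0 - e                                    # degree of diversification
    return d / d.sum()                             # normalized weights

X = np.array([[0.7, 120.0, 3.1],                   # illustrative city scores
              [0.4, 300.0, 2.2],
              [0.9,  80.0, 4.0]])
print(entropy_weights(X))
```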
Ground State and Finite Temperature Lanczos Methods
NASA Astrophysics Data System (ADS)
Prelovšek, P.; Bonča, J.
The present review focuses on recent developments of exact-diagonalization (ED) methods that use the Lanczos algorithm to transform large sparse matrices into tridiagonal form. We begin with a review of the basic principles of the Lanczos method for computing ground-state static as well as dynamical properties. Next, the generalization to finite temperatures in the form of the well-established finite-temperature Lanczos method is described; the latter allows for the evaluation of static and dynamic quantities at temperatures T>0 within various correlated models. Several extensions and modifications of the method introduced more recently are analysed, in particular the low-temperature Lanczos method and the microcanonical Lanczos method, which is especially applicable in the high-T regime. To overcome the problem of exponentially growing Hilbert spaces that prevents ED calculations on larger lattices, different approaches based on Lanczos diagonalization within a reduced basis have been developed. In this context, the recently developed method based on ED within a limited functional space is reviewed. Finally, we briefly discuss the real-time evolution of correlated systems far from equilibrium, which can be simulated using ED and Lanczos-based methods, as well as approaches based on diagonalization in a reduced basis.
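For orientation, a bare-bones Lanczos tridiagonalization (without the reorthogonalization or finite-temperature machinery the review discusses) might look like the sketch below; the dense random matrix is a toy stand-in for a large sparse Hamiltonian.

```python
import numpy as np

def lanczos(H, v0, m):
    """m-step Lanczos reduction of a symmetric matrix H to tridiagonal
    form (basic scheme, no reorthogonalization)."""
    V = np.zeros((m, len(v0)))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    V[0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = H @ V[j]
        alpha[j] = V[j] @ w
        w -= alpha[j] * V[j]
        if j > 0:
            w -= beta[j - 1] * V[j - 1]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[j + 1] = w / beta[j]
    return alpha, beta

rng = np.random.default_rng(1)
A = rng.normal(size=(200, 200))
H = (A + A.T) / 2                         # toy symmetric "Hamiltonian"
a, b = lanczos(H, rng.normal(size=200), m=40)
T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
# Extremal Ritz values approximate the extremal eigenvalues of H
print(np.linalg.eigvalsh(T)[0], np.linalg.eigvalsh(H)[0])
```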
Recent developments in detection and enumeration of waterborne bacteria: a retrospective minireview.
Deshmukh, Rehan A; Joshi, Kopal; Bhand, Sunil; Roy, Utpal
2016-12-01
Waterborne diseases have emerged as global health problems, and their rapid and sensitive detection in environmental water samples is of great importance. Bacterial identification and enumeration in water samples is significant as it helps to maintain safe drinking water for public consumption. Culture-based methods are laborious, time-consuming, and yield false-positive results, and viable but nonculturable (VBNC) microorganisms cannot be recovered. Hence, numerous methods have been developed for rapid detection and quantification of waterborne pathogenic bacteria in water. These rapid methods can be classified into nucleic acid-based, immunology-based, and biosensor-based detection methods. This review summarizes the principles and current state of rapid methods for the monitoring and detection of waterborne bacterial pathogens. Rapid methods outlined are polymerase chain reaction (PCR), digital droplet PCR, real-time PCR, multiplex PCR, DNA microarray, next-generation sequencing (pyrosequencing, Illumina technology and genomics), and fluorescence in situ hybridization, which are categorized as nucleic acid-based methods; enzyme-linked immunosorbent assay (ELISA) and immunofluorescence, which are classified as immunology-based methods; and optical, electrochemical, and mass-based biosensors, which are grouped into biosensor-based methods. Overall, these methods are sensitive, specific, time-effective, and important in the prevention and diagnosis of waterborne bacterial diseases. © 2016 The Authors. MicrobiologyOpen published by John Wiley & Sons Ltd.
Recommendations for Developing Alternative Test Methods for Developmental Neurotoxicity
There is great interest in developing alternative methods for developmental neurotoxicity testing (DNT) that are cost-efficient, use fewer animals and are based on current scientific knowledge of the developing nervous system. Alternative methods will require demonstration of the...
A method to estimate weight and dimensions of large and small gas turbine engines
NASA Technical Reports Server (NTRS)
Onat, E.; Klees, G. W.
1979-01-01
A computerized method was developed to estimate the weight and envelope dimensions of large and small gas turbine engines within ±5% to 10%. The method is based on correlations of component weight and design features of 29 data-base engines. Rotating components were estimated by a preliminary design procedure which is sensitive to blade geometry, operating conditions, material properties, shaft speed, hub-to-tip ratio, etc. The development and justification of the method selected and the various methods of analysis are discussed.
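The report's actual correlations are not reproduced in this abstract; as a generic sketch of a correlation-based weight estimate, the snippet below fits a power law W = a·D^b on illustrative stand-in data, a common functional form for such component correlations.

```python
import numpy as np

# Illustrative stand-in data: component weight W vs. a size parameter D
# (e.g. fan diameter); not the report's 29-engine data base.
D = np.array([1.0, 1.3, 1.8, 2.2, 2.9])             # diameter, m
W = np.array([120.0, 210.0, 420.0, 640.0, 1150.0])  # weight, kg

b, log_a = np.polyfit(np.log(D), np.log(W), 1)      # fit W = a * D**b on log-log axes
a = np.exp(log_a)
print(a * 2.5 ** b)                                 # predicted weight at D = 2.5 m
```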
Hofmann, Bjørn
2017-04-01
The objective was to develop a method for exposing and elucidating ethical issues with human cognitive enhancement (HCE). The intended use of the method is to support and facilitate open and transparent deliberation and decision making with respect to this emerging technology, which has great potential formative implications for individuals and society. A literature search was conducted to identify relevant approaches, followed by conventional content analysis of the identified papers and methods in order to assess their suitability for assessing HCE according to four selection criteria; the method was then developed and amended after pilot testing on smart-glasses. Based on three existing approaches in health technology assessment, a method for exposing and elucidating ethical issues in the assessment of HCE technologies was developed; based on the pilot test on smart-glasses, the method was amended. The method consists of six steps and a guiding list of 43 questions. It provides the groundwork for context-specific ethical assessment and analysis. Widespread use, amendments, and further developments of the method are encouraged.
Development of quadruped walking locomotion gait generator using a hybrid method
NASA Astrophysics Data System (ADS)
Jasni, F.; Shafie, A. A.
2013-12-01
In many areas, the earth is hardly reachable by wheeled or tracked locomotion systems. Thus, walking locomotion is becoming a favoured option for mobile robots, because of its ability to move on rugged and unlevel terrain. However, developing a walking gait for a robot is not a simple task. The Central Pattern Generator (CPG) method is a biologically inspired method recently introduced to develop gaits for walking robots, tackling the issues faced by the conventional pre-designed-trajectory-based method. However, research shows that even the CPG method has some limitations. Thus, in this paper, a hybrid method that combines the CPG and pre-designed-trajectory-based methods is introduced to develop a walking gait for a quadruped walking robot. The 3-D foot trajectories and joint angle trajectories developed using the proposed method are compared with data obtained via the conventional pre-designed-trajectory method to confirm its performance.
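The abstract does not detail the hybrid generator itself, so the following is only a generic sketch of the CPG half of such a hybrid: a single Hopf oscillator, a common CPG building block whose stable limit cycle provides a smooth rhythmic joint trajectory.

```python
import numpy as np

def hopf_cpg(steps=2000, dt=0.01, mu=1.0, omega=2.0 * np.pi):
    """Single Hopf oscillator integrated with forward Euler; it converges
    to a limit cycle of radius sqrt(mu) and angular frequency omega."""
    x, y = 0.1, 0.0
    out = np.empty(steps)
    for k in range(steps):
        r2 = x * x + y * y
        dx = (mu - r2) * x - omega * y
        dy = (mu - r2) * y + omega * x
        x, y = x + dx * dt, y + dy * dt
        out[k] = x
    return out

trajectory = hopf_cpg()   # rhythmic signal usable as one joint's reference
```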
Tafti, Nahid; Karimlou, Masoud; Mardani, Mohammad Ali; Jafarpisheh, Amir Salar; Aminian, Gholam Reza; Safari, Reza
2018-04-20
The objectives of the current study were to (a) assess similarities and relationships between anatomical-landmark-based angles and distances of the lower limbs in unilateral transtibial amputees and (b) develop and evaluate a new anatomically based static prosthetic alignment method. The first sub-study assessed the anthropometric differences and relationships between the lower limbs in photographs taken of amputees. Data were analysed via paired t-tests and regression analysis. The results show no significant differences in the frontal and transverse planes. In the sagittal plane, the anthropometric parameters of the amputated limb were significantly correlated with the corresponding variables of the sound limb. These results served as the basis for the development of a new prosthetic alignment method. The method was evaluated in a single-subject study: prosthetic alignment carried out by an experienced prosthetist was compared with alignment adjusted by an inexperienced prosthetist using the developed method. In the sagittal and frontal planes, the socket angle was tuned with respect to the shin angle, and the position of the prosthetic foot was tuned in relation to the pelvic landmarks. Further study is needed to assess the proposed method on a larger sample of amputees and prosthetists.
Methods for the guideline-based development of quality indicators--a systematic review
2012-01-01
Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067
Linhart, S. Mike; Nania, Jon F.; Christiansen, Daniel E.; Hutchinson, Kasey J.; Sanders, Curtis L.; Archfield, Stacey A.
2013-01-01
A variety of individuals, from water resource managers to recreational users, need streamflow information for planning and decision making at locations where there are no streamgages. To address this problem, two statistically based methods, the Flow Duration Curve Transfer method and the Flow Anywhere method, were developed for statewide application, and two physically based models, the Precipitation-Runoff Modeling System and the Soil and Water Assessment Tool, were developed only for the Cedar River Basin. Observed and estimated streamflows for the two methods and two models were compared for goodness of fit at 13 streamgages modeled in the Cedar River Basin by using Nash-Sutcliffe and percent-bias efficiency values. Based on median and mean Nash-Sutcliffe values for the 13 streamgages, the Precipitation-Runoff Modeling System and Soil and Water Assessment Tool models appear to have performed similarly and better than the Flow Duration Curve Transfer and Flow Anywhere methods. Based on median and mean percent-bias values, the Soil and Water Assessment Tool model appears to have generally overestimated daily mean streamflows, whereas the Precipitation-Runoff Modeling System model and the statistical methods appear to have underestimated daily mean streamflows. The Flow Duration Curve Transfer method produced the lowest median and mean percent-bias values and appears to perform better than the other models.
What Touched Your Heart? Collaborative Story Analysis Emerging From an Apsáalooke Cultural Context.
Hallett, John; Held, Suzanne; McCormick, Alma Knows His Gun; Simonds, Vanessa; Real Bird, Sloane; Martin, Christine; Simpson, Colleen; Schure, Mark; Turnsplenty, Nicole; Trottier, Coleen
2017-07-01
Community-based participatory research and decolonizing research share some recommendations for best practices for conducting research. One commonality is partnering on all stages of research; co-developing methods of data analysis is one stage with a deficit of partnering examples. We present a novel community-based and developed method for analyzing qualitative data within an Indigenous health study and explain incompatibilities of existing methods for our purposes and community needs. We describe how we explored available literature, received counsel from community Elders and experts in the field, and collaboratively developed a data analysis method consonant with community values. The method of analysis, in which interview/story remained intact, team members received story, made meaning through discussion, and generated a conceptual framework to inform intervention development, is detailed. We offer the development process and method as an example for researchers working with communities who want to keep stories intact during qualitative data analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2010 CFR
2010-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
New Methods of Low-Field Magnetic Resonance Imaging for Application to Traumatic Brain Injury
2013-02-01
magnet based), the development of novel high-speed parallel imaging detection systems, and work on advanced adaptive reconstruction methods ... signal many times within the acquisition time. We present here a new method for 3D OMRI based on b-SSFP at a constant field of 6.5 mT that provides up ... developing injury-sensitive MRI based on the detection of free radicals associated with injury using the Overhauser effect and subsequently imaging that
A knowledge-driven approach to biomedical document conceptualization.
Zheng, Hai-Tao; Borchert, Charles; Jiang, Yong
2010-06-01
Biomedical document conceptualization is the process of clustering biomedical documents based on ontology-represented domain knowledge. The result of this process is the representation of the biomedical documents by a set of key concepts and their relationships. Most clustering methods cluster documents based on invariant domain knowledge. The objective of this work is to develop an effective method to cluster biomedical documents based on various user-specified ontologies, so that users can exploit the concept structures of documents more effectively. We develop a flexible framework to allow users to specify the knowledge bases, in the form of ontologies. Based on the user-specified ontologies, we develop a key concept induction algorithm, which uses latent semantic analysis to identify key concepts and cluster documents. A corpus-related ontology generation algorithm is developed to generate the concept structures of documents. Based on two biomedical datasets, we evaluate the proposed method and five other clustering algorithms. The clustering results of the proposed method outperform the five other algorithms in terms of key concept identification. On the first biomedical dataset, our method achieved F-measure values of 0.7294 and 0.5294 based on the MeSH ontology and the Gene Ontology (GO), respectively. On the second biomedical dataset, our method achieved F-measure values of 0.6751 and 0.6746 based on the MeSH ontology and GO, respectively. Both results outperform the five other algorithms in terms of F-measure. Based on the MeSH ontology and GO, the generated corpus-related ontologies show informative conceptual structures. The proposed method enables users to specify the domain knowledge to exploit the conceptual structures of biomedical document collections. In addition, the proposed method is able to extract the key concepts and cluster the documents with relatively high precision. Copyright 2010 Elsevier B.V. All rights reserved.
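As a rough illustration of the key concept induction step, the hedged sketch below applies latent semantic analysis (truncated SVD) to a toy concept-document matrix; the ontology mapping, matrix weighting, and topic count are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Rows: ontology concepts, columns: documents; entries: concept frequencies
# (mapping text to ontology concepts, e.g. MeSH terms, is assumed done).
C = np.random.default_rng(0).poisson(1.0, size=(50, 20)).astype(float)

# Latent semantic analysis: truncated SVD of the concept-document matrix.
U, s, Vt = np.linalg.svd(C, full_matrices=False)
k = 5                                        # number of latent topics (assumed)
doc_coords = (np.diag(s[:k]) @ Vt[:k]).T     # documents in latent space;
print(doc_coords.shape)                      # these rows could feed k-means

# Key concepts per topic: concepts with the largest loadings in U.
for t in range(k):
    top = np.argsort(-np.abs(U[:, t]))[:3]
    print(f"topic {t}: key concept indices {top.tolist()}")
```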
Won, Jonghun; Lee, Gyu Rie; Park, Hahnbeom; Seok, Chaok
2018-06-07
The second extracellular loops (ECL2s) of G-protein-coupled receptors (GPCRs) are often involved in GPCR functions, and their structures have important implications in drug discovery. However, structure prediction of ECL2 is difficult because of its long length and the structural diversity among different GPCRs. In this study, a new ECL2 conformational sampling method involving both template-based and ab initio sampling was developed. Inspired by the observation of similar ECL2 structures of closely related GPCRs, a template-based sampling method employing loop structure templates selected from the structure database was developed. A new metric for evaluating similarity of the target loop to templates was introduced for template selection. An ab initio loop sampling method was also developed to treat cases without highly similar templates. The ab initio method is based on the previously developed fragment assembly and loop closure method. A new sampling component that takes advantage of secondary structure prediction was added. In addition, a conserved disulfide bridge restraining ECL2 conformation was predicted and analytically incorporated into sampling, reducing the effective dimension of the conformational search space. The sampling method was combined with an existing energy function for comparison with previously reported loop structure prediction methods, and the benchmark test demonstrated outstanding performance.
Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-01-01
Background: This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. Methods: The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. Results: The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. Conclusion: The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. PMID:27815414
Culturally Adaptive Walking Intervention for Korean-Chinese Female Migrant Workers.
Cho, Sunghye; Lee, Hyeonkyeong; Kim, Jung Hee; Lee, Meenhye; Lee, Young-Me
2017-05-01
Although the literature has commonly cited the development of culturally adaptive interventions as key to improving the health outcomes of culturally and linguistically diverse populations, there have been few culturally adaptive walking interventions specific to Korean-Chinese (KC) migrants. The objective of this study is to describe the process of developing a culturally adaptive walking intervention for KC female migrant workers, using the intervention mapping (IM) method. The culturally adaptive walking intervention was developed using the IM method, a stepwise theory- and evidence-based approach for planning interventions. The IM process has six steps: needs assessment, formulation of change objectives, selection of theory-based methods and practical strategies, development of an intervention program, development of an adoption and implementation plan, and development of an evaluation design. The determinants of walking behavior, including knowledge, self-efficacy, social support, and acculturation, were identified through an extensive literature review, community leader interviews, and a survey of female KC migrant workers. Appropriate intervention methods and strategies were identified based on relevant theories. Acculturation was a determinant of exercise behavior, and various methods to improve cultural adaptation were identified in the context of the lifestyles and working environments of the target population. The IM method provided a foundation for creating a health intervention for KC female migrant workers and could readily be used by health care providers working with other groups.
Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun
2017-11-01
In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., the Ogata-Banks solution) is found to be most representative of the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of the earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems. Copyright © 2017 Elsevier B.V. All rights reserved.
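The Ogata-Banks solution adopted as the data model has the standard closed form shown below; the sketch uses illustrative parameter values, since the site-specific values fitted at the EIT site are not given in the abstract.

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0):
    """1-D advection-dispersion solution for steady flow (Ogata-Banks).

    x: distance from source, t: time, v: seepage velocity,
    D: dispersion coefficient, c0: constant source concentration.
    """
    a = 2.0 * np.sqrt(D * t)
    return 0.5 * c0 * (erfc((x - v * t) / a)
                       + np.exp(v * x / D) * erfc((x + v * t) / a))

# Illustrative values only; fitted site parameters are not in the abstract.
print(ogata_banks(x=10.0, t=30.0, v=0.5, D=1.2, c0=100.0))
```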
de Sena, Rodrigo Caciano; Soares, Matheus; Pereira, Maria Luiza Oliveira; da Silva, Rogério Cruz Domingues; do Rosário, Francisca Ferreira; da Silva, Joao Francisco Cajaiba
2011-01-01
The development of a simple, rapid and low-cost method based on video image analysis and aimed at the detection of low concentrations of precipitated barium sulfate is described. The proposed system is basically composed of a webcam with a CCD sensor and a conventional dichroic lamp. For this purpose, software for processing and analyzing the digital images based on the RGB (red, green and blue) color system was developed. The proposed method showed very good repeatability and linearity and also presented higher sensitivity than the standard turbidimetric method. The developed method is presented as a simple alternative for future applications in the study of precipitation of inorganic salts and also for detecting the crystallization of organic compounds. PMID:22346607
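As a hedged illustration of this kind of RGB-based quantification, the sketch below averages pixel intensities over a region of interest and fits a linear calibration; the concentrations, signal values, and ROI are invented, not values from the study.

```python
import numpy as np

def mean_rgb_intensity(frame, roi):
    """Mean R, G, B values in a region of interest of a webcam frame.

    frame: HxWx3 uint8 array (e.g. one grabbed video frame);
    roi: (row0, row1, col0, col1) window over the illuminated cell.
    """
    r0, r1, c0, c1 = roi
    patch = frame[r0:r1, c0:c1].astype(float)
    return patch.reshape(-1, 3).mean(axis=0)

frame = np.random.default_rng(2).integers(0, 256, (480, 640, 3), dtype=np.uint8)
print(mean_rgb_intensity(frame, (200, 280, 300, 380)))

# Hypothetical calibration: scattered-light intensity vs BaSO4 standards.
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])    # mg/L, invented values
signal = np.array([12.0, 30.0, 47.0, 84.0, 158.0])
slope, intercept = np.polyfit(conc, signal, 1)
print(f"sensitivity = {slope:.2f} intensity units per mg/L")
```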
Developing a Blended Learning-Based Method for Problem-Solving in Capability Learning
ERIC Educational Resources Information Center
Dwiyogo, Wasis D.
2018-01-01
The main objectives of the study were to develop and investigate the implementation of blended learning based method for problem-solving. Three experts were involved in the study and all three had stated that the model was ready to be applied in the classroom. The implementation of the blended learning-based design for problem-solving was…
Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.
2012-01-01
Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and a brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study develops quantification tools, including MR-based AC, for quantification in combined MR/PET brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered subsets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET with [11C]PIB was acquired using a high-resolution research tomograph (HRRT). MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR- and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET. PMID:23039679
Research and development of LANDSAT-based crop inventory techniques
NASA Technical Reports Server (NTRS)
Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)
1982-01-01
A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian based through-the-season methods, estimation technology based on analytical profile fitting methods, and expert-based computer aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina was considered.
The Pixon Method for Data Compression Image Classification, and Image Reconstruction
NASA Technical Reports Server (NTRS)
Puetter, Richard; Yahil, Amos
2002-01-01
As initially proposed, this program had three goals: (1) continue to develop the highly successful Pixon method for image reconstruction and support other scientist in implementing this technique for their applications; (2) develop image compression techniques based on the Pixon method; and (3) develop artificial intelligence algorithms for image classification based on the Pixon approach for simplifying neural networks. Subsequent to proposal review the scope of the program was greatly reduced and it was decided to investigate the ability of the Pixon method to provide superior restorations of images compressed with standard image compression schemes, specifically JPEG-compressed images.
District nursing workforce planning: a review of the methods.
Reid, Bernie; Kane, Kay; Curran, Carol
2008-11-01
District nursing services in Northern Ireland face increasing demands and challenges which may be responded to by effective and efficient workforce planning and development. The aim of this paper is to critically analyse district nursing workforce planning and development methods, in an attempt to find a suitable method for Northern Ireland. A systematic analysis of the literature reveals four methods: professional judgement; population-based health needs; caseload analysis and dependency-acuity. Each method has strengths and weaknesses. Professional judgement offers a 'belt and braces' approach but lacks sensitivity to fluctuating patient numbers. Population-based health needs methods develop staffing algorithms that reflect deprivation and geographical spread, but are poorly understood by district nurses. Caseload analysis promotes equitable workloads but poorly performing district nursing localities may continue if benchmarking processes only consider local data. Dependency-acuity methods provide a means of equalizing and prioritizing workload but are prone to district nurses overstating factors in patient dependency or understating carers' capability. In summary a mixed method approach is advocated to evaluate and adjust the size and mix of district nursing teams using empirically determined patient dependency and activity-based variables based on the population's health needs.
Liang, Sai; Qu, Shen; Xu, Ming
2016-02-02
To develop industry-specific policies for mitigating environmental pressures, previous studies primarily focus on identifying sectors that directly generate large amounts of environmental pressures (a.k.a. production-based method) or indirectly drive large amounts of environmental pressures through supply chains (e.g., consumption-based method). In addition to those sectors as important environmental pressure producers or drivers, there exist sectors that are also important to environmental pressure mitigation as transmission centers. Economy-wide environmental pressure mitigation might be achieved by improving production efficiency of these key transmission sectors, that is, using less upstream inputs to produce unitary output. We develop a betweenness-based method to measure the importance of transmission sectors, borrowing the betweenness concept from network analysis. We quantify the betweenness of sectors by examining supply chain paths extracted from structural path analysis that pass through a particular sector. We take China as an example and find that those critical transmission sectors identified by betweenness-based method are not always identifiable by existing methods. This indicates that betweenness-based method can provide additional insights that cannot be obtained with existing methods on the roles individual sectors play in generating economy-wide environmental pressures. Betweenness-based method proposed here can therefore complement existing methods for guiding sector-level environmental pressure mitigation strategies.
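A minimal sketch of the betweenness idea follows: supply chain paths are enumerated from the power series of the technical coefficient matrix, and a sector's betweenness accumulates the value of every path on which it appears as an intermediate node. The toy matrices, the path-order cutoff, and the brute-force enumeration are assumptions for illustration; the paper's structural path analysis would prune paths more selectively.

```python
import numpy as np
from itertools import product

def sector_betweenness(A, f, y, max_order=3):
    """Betweenness of sectors from structural-path-style enumeration.

    A: technical coefficient matrix, f: direct pressure intensities,
    y: final demand. A path s0 -> ... -> st contributes
    f[s0] * A[s0,s1] * ... * A[s(t-1),st] * y[st]; a sector's betweenness
    is the summed value of paths on which it is an intermediate node.
    """
    n = len(y)
    btw = np.zeros(n)
    for order in range(2, max_order + 1):        # need >= 1 intermediate node
        for path in product(range(n), repeat=order + 1):
            val = f[path[0]] * y[path[-1]]
            for i in range(order):
                val *= A[path[i], path[i + 1]]
            for m in path[1:-1]:                 # intermediate nodes only
                btw[m] += val
    return btw

rng = np.random.default_rng(1)
A = rng.uniform(0, 0.2, (4, 4))                  # toy 4-sector economy
print(sector_betweenness(A, f=rng.uniform(0, 1, 4), y=rng.uniform(0, 1, 4)))
```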
Fast ADC based multichannel acquisition system for the GEM detector
NASA Astrophysics Data System (ADS)
Kasprowicz, G.; Czarski, T.; Chernyshova, M.; Dominik, W.; Jakubowska, K.; Karpinski, L.; Kierzkowski, K.; Pozniak, K.; Rzadkiewicz, J.; Scholz, M.; Zabolotny, W.
2012-05-01
A novel approach to the Gas Electron Multiplier (GEM) detector readout is presented. Unlike commonly used methods based on discriminators and analogue FIFOs, the developed method uses simultaneously sampling high-speed ADCs and advanced FPGA-based processing logic to estimate the energy of every single photon. This processing is applied to every GEM strip signal. It is especially useful in the case of crystal-based spectrometers for soft X-rays, where higher-order reflections need to be identified and rejected. For the purpose of the detector readout, a novel concept for the measurement platform was developed.
NASA Astrophysics Data System (ADS)
Sizov, Gennadi Y.
In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as, winding flux linkages and voltages, average, cogging and ripple torques, stator core flux densities, core losses, efficiencies and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
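As a hedged sketch of the optimization layer only, the snippet below runs SciPy's differential evolution over a cheap analytic surrogate objective; in the dissertation the objective is evaluated by the finite element-based solver instead, and the design variables, bounds, and cost terms here are invented for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Stand-in objective: each candidate design would really be scored by a
# finite-element solver; here a cheap analytic surrogate of
# "losses + torque-ripple penalty" over two geometry variables is used.
def design_cost(x):
    magnet_width, slot_depth = x
    losses = (magnet_width - 0.6) ** 2 + 0.5 * (slot_depth - 1.2) ** 2
    ripple_penalty = 0.1 * np.sin(8 * magnet_width) ** 2
    return losses + ripple_penalty

bounds = [(0.2, 1.0), (0.5, 2.0)]                # per-unit geometry ranges
result = differential_evolution(design_cost, bounds, seed=0, tol=1e-8)
print(result.x, result.fun)
```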
Method of fan sound mode structure determination
NASA Technical Reports Server (NTRS)
Pickett, G. F.; Sofrin, T. G.; Wells, R. W.
1977-01-01
A method for the determination of fan sound mode structure in the inlet of turbofan engines using in-duct acoustic pressure measurements is presented. The method is based on the simultaneous solution of a set of equations whose unknowns are modal amplitude and phase. A computer program for the solution of the equation set was developed. An additional computer program was developed to calculate microphone locations whose use yields an equation set that does not give rise to numerical instabilities. In addition to the development of a method for the determination of coherent modal structure, experimental and analytical approaches are developed for the determination of the amplitude frequency spectrum of randomly generated sound modes for use in narrow annulus ducts. Two approaches are defined: one based on the use of cross-spectral techniques and the other based on the use of an array of microphones.
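A minimal sketch of the coherent-mode solution step, assuming the usual circumferential mode expansion p(theta_k) = sum_m c_m exp(i m theta_k): with more microphone positions than modes, amplitudes and phases follow from a complex least-squares solve, and well-spread microphone locations keep the system well conditioned, which is the instability issue the abstract mentions.

```python
import numpy as np

def solve_modes(theta, p, modes):
    """Solve for circumferential mode coefficients from in-duct pressures.

    theta: microphone angles, p: complex pressures at those angles,
    modes: circumferential mode orders m. The complex least-squares
    solution gives amplitude |c_m| and phase angle(c_m) for each mode.
    """
    E = np.exp(1j * np.outer(theta, modes))      # K x M steering matrix
    c, *_ = np.linalg.lstsq(E, p, rcond=None)
    return c

modes = np.array([-2, -1, 0, 1, 2])
theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)   # 16 microphones
true_c = np.array([0, 0.5, 0, 1.0 * np.exp(1j * 0.3), 0])
p = np.exp(1j * np.outer(theta, modes)) @ true_c
print(np.round(solve_modes(theta, p, modes), 3))
```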
Systematic Method for Establishing Officer Grade Requirements Based Upon Job Demands.
ERIC Educational Resources Information Center
Christal, Raymond E.
This report presents interim results of a study developing a methodology for management engineering teams to determine the appropriate grade requirements for officer positions based on job content and responsibilities. The technology reported represents a modification and extension of methods developed between 1963 and 1966. Results indicated that…
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
NASA Astrophysics Data System (ADS)
Kim, Jae-Chang; Moon, Sung-Ki; Kwak, Sangshin
2018-04-01
This paper presents a direct model-based predictive control scheme for voltage source inverters (VSIs) with reduced common-mode voltages (CMVs). The developed method directly finds optimal vectors without using repetitive calculation of a cost function. To adjust output currents with the CMVs in the range of -Vdc/6 to +Vdc/6, the developed method uses voltage vectors, as finite control resources, excluding zero voltage vectors which produce the CMVs in the VSI within ±Vdc/2. In a model-based predictive control (MPC), not using zero voltage vectors increases the output current ripples and the current errors. To alleviate these problems, the developed method uses two non-zero voltage vectors in one sampling step. In addition, the voltage vectors scheduled to be used are directly selected at every sampling step once the developed method calculates the future reference voltage vector, saving the efforts of repeatedly calculating the cost function. And the two non-zero voltage vectors are optimally allocated to make the output current approach the reference current as close as possible. Thus, low CMV, rapid current-following capability and sufficient output current ripple performance are attained by the developed method. The results of a simulation and an experiment verify the effectiveness of the developed method.
Boushey, C J; Spoden, M; Zhu, F M; Delp, E J; Kerr, D A
2017-08-01
For nutrition practitioners and researchers, assessing dietary intake of children and adults with a high level of accuracy continues to be a challenge. Developments in mobile technologies have created a role for images in the assessment of dietary intake. The objective of this review was to examine peer-reviewed published papers covering development, evaluation and/or validation of image-assisted or image-based dietary assessment methods from December 2013 to January 2016. Images taken with handheld devices or wearable cameras have been used to assist traditional dietary assessment methods for portion size estimations made by dietitians (image-assisted methods). Image-assisted approaches can supplement either dietary records or 24-h dietary recalls. In recent years, image-based approaches integrating application technology for mobile devices have been developed (image-based methods). Image-based approaches aim at capturing all eating occasions by images as the primary record of dietary intake, and therefore follow the methodology of food records. The present paper reviews several image-assisted and image-based methods, their benefits and challenges; followed by details on an image-based mobile food record. Mobile technology offers a wide range of feasible options for dietary assessment, which are easier to incorporate into daily routines. The presented studies illustrate that image-assisted methods can improve the accuracy of conventional dietary assessment methods by adding eating occasion detail via pictures captured by an individual (dynamic images). All of the studies reduced underreporting with the help of images compared with results with traditional assessment methods. Studies with larger sample sizes are needed to better delineate attributes with regards to age of user, degree of error and cost.
ERIC Educational Resources Information Center
Kurtulus, Aytac
2013-01-01
The aim of this study was to investigate the effects of web-based interactive virtual tours on the development of prospective mathematics teachers' spatial skills. The study was designed based on experimental method. The "one-group pre-test post-test design" of this method was taken as the research model. The study was conducted with 3rd year…
ERIC Educational Resources Information Center
Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan
2012-01-01
Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…
Fault management for data systems
NASA Technical Reports Server (NTRS)
Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann
1993-01-01
Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
Applied Cognitive Task Analysis (ACTA) Methodology
1997-11-01
experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need ... We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates ... developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the
Assessment of sustainable urban transport development based on entropy and unascertained measure
Li, Yancang; Yang, Jing; Li, Yijie
2017-01-01
To find a more effective method for the assessment of sustainable urban transport development, a comprehensive assessment model of sustainable urban transport development was established based on the unascertained measure. Considering the factors influencing urban transport development, comprehensive assessment indexes were selected, including urban economic development, transport demand, environment quality and energy consumption, and an assessment system of sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated through the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to the actual conditions. Then, the grade was obtained by using the credible degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. The application practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development. PMID:29084281
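For the index-weighting step, a minimal sketch of the entropy weight coefficient method is given below, assuming benefit-type, positive-valued indexes (cost-type indexes would need inverting first); the index matrix is invented for illustration.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight coefficient method for index weighting.

    X: m alternatives x n indexes, assumed positive and benefit-type.
    Indexes whose values vary more across alternatives carry lower
    entropy and therefore receive higher weight.
    """
    P = X / X.sum(axis=0)                        # proportions per index
    k = 1.0 / np.log(X.shape[0])
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -k * (P * logs).sum(axis=0)              # entropy of each index
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()                           # normalized weights

X = np.array([[0.8, 120, 0.3], [0.6, 90, 0.5], [0.9, 150, 0.2]], float)
print(entropy_weights(X))
```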
Terminology model discovery using natural language processing and visualization techniques.
Zhou, Li; Tao, Ying; Cimino, James J; Chen, Elizabeth S; Liu, Hongfang; Lussier, Yves A; Hripcsak, George; Friedman, Carol
2006-12-01
Medical terminologies are important for unambiguous encoding and exchange of clinical information. The traditional manual method of developing terminology models is time-consuming and limited in the number of phrases that a human developer can examine. In this paper, we present an automated method for developing medical terminology models based on natural language processing (NLP) and information visualization techniques. Surgical pathology reports were selected as the testing corpus for developing a pathology procedure terminology model. The use of a general NLP processor for the medical domain, MedLEE, provides an automated method for acquiring semantic structures from a free text corpus and sheds light on a new high-throughput method of medical terminology model development. The use of an information visualization technique supports the summarization and visualization of the large quantity of semantic structures generated from medical documents. We believe that a general method based on NLP and information visualization will facilitate the modeling of medical terminologies.
Rao, Jinmeng; Qiao, Yanjun; Ren, Fu; Wang, Junxing; Du, Qingyun
2017-01-01
The purpose of this study was to develop a robust, fast and markerless mobile augmented reality method for registration, geovisualization and interaction in uncontrolled outdoor environments. We propose a lightweight deep-learning-based object detection approach for mobile or embedded devices; the vision-based detection results of this approach are combined with spatial relationships by means of the host device’s built-in Global Positioning System receiver, Inertial Measurement Unit and magnetometer. Virtual objects generated based on geospatial information are precisely registered in the real world, and an interaction method based on touch gestures is implemented. The entire method is independent of the network to ensure robustness to poor signal conditions. A prototype system was developed and tested on the Wuhan University campus to evaluate the method and validate its results. The findings demonstrate that our method achieves a high detection accuracy, stable geovisualization results and interaction. PMID:28837096
NASA Astrophysics Data System (ADS)
Sanchez, P.; Hinojosa, J.; Ruiz, R.
2005-06-01
Recently, neuromodeling methods for microwave devices have been developed. These methods are suitable for the model generation of novel devices. They allow fast and accurate simulations and optimizations. However, the development of model libraries with these methods is a formidable task, since they require massive input-output data provided by an electromagnetic simulator or measurements and repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior knowledge input (PKI) models, which capture the characteristics common to all the models in the library, and high-level ANNs which give the library model outputs from the base PKI models. The technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with the expected ones.
Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue
NASA Technical Reports Server (NTRS)
Ayache, S.; Haziza, M.; Cayrac, D.
1994-01-01
Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of the knowledge-based systems (KBS) technology, the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications, make it possible today to consider the wide operational deployment of KBS's in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.
Engineering large-scale agent-based systems with consensus
NASA Technical Reports Server (NTRS)
Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.
1994-01-01
The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
... DEPARTMENT OF AGRICULTURE Development of Technical Guidelines and Scientific Methods for... technical guidelines and scientific methods for quantifying greenhouse gas (GHG) emissions and carbon...-based methods to measure the carbon benefits from conservation and land management activities. In...
Computer-aided designing of immunosuppressive peptides based on IL-10 inducing potential
Nagpal, Gandharva; Usmani, Salman Sadullah; Dhanda, Sandeep Kumar; Kaur, Harpreet; Singh, Sandeep; Sharma, Meenu; Raghava, Gajendra P. S.
2017-01-01
In the past, numerous methods have been developed to predict MHC class II binders or T-helper epitopes for designing epitope-based vaccines against pathogens. In contrast, limited attempts have been made to develop methods for predicting T-helper epitopes/peptides that can induce a specific type of cytokine. This paper describes a method developed for predicting interleukin-10 (IL-10) inducing peptides; IL-10 is a cytokine responsible for suppressing the immune system. All models were trained and tested on 394 experimentally validated IL-10 inducing and 848 non-inducing peptides. It was observed that certain types of residues and motifs are more frequent in IL-10 inducing peptides than in non-inducing peptides. Based on this analysis, we developed composition-based models using various machine-learning techniques. A Random Forest-based model achieved the maximum Matthews correlation coefficient (MCC) value of 0.59 with an accuracy of 81.24%, developed using dipeptide composition. In order to facilitate the community, we developed a web server "IL-10pred", standalone packages and a mobile app for designing IL-10 inducing peptides (http://crdd.osdd.net/raghava/IL-10pred/). PMID:28211521
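A hedged sketch of the core feature-plus-classifier pipeline is given below: the 400-dimensional dipeptide composition feeds a scikit-learn Random Forest. The peptide sequences and labels are toy stand-ins, and the hyperparameters are assumptions, not the published IL-10pred settings.

```python
import numpy as np
from itertools import product
from sklearn.ensemble import RandomForestClassifier

AA = "ACDEFGHIKLMNPQRSTVWY"
DIPEPS = ["".join(p) for p in product(AA, repeat=2)]   # 400 features

def dipeptide_composition(seq):
    """Fraction of each of the 400 dipeptides in a peptide sequence."""
    counts = dict.fromkeys(DIPEPS, 0)
    for i in range(len(seq) - 1):
        counts[seq[i:i + 2]] += 1
    total = max(len(seq) - 1, 1)
    return np.array([counts[d] / total for d in DIPEPS])

# Toy stand-ins; the real model was trained on 394 IL-10 inducing and
# 848 non-inducing peptides from the IL-10pred dataset.
seqs = ["ACDKLM", "GHIKNP", "MMKLAC", "PQRSTV"]
X = np.vstack([dipeptide_composition(s) for s in seqs])
y = np.array([1, 0, 1, 0])                       # 1 = IL-10 inducing
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X))
```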
Spatial weighting approach in numerical method for disaggregation of MDGs indicators
NASA Astrophysics Data System (ADS)
Permai, S. D.; Mukhaiyar, U.; Satyaning PP, N. L. P.; Soleh, M.; Aini, Q.
2018-03-01
Disaggregation is used to separate and classify data based on certain characteristics or on administrative level. Disaggregated data are very important because some indicators are not measured for all characteristics. Detailed disaggregation of development indicators is important to ensure that everyone benefits from development and to support better development-related policymaking. This paper aims to explore different methods to disaggregate the national employment-to-population ratio indicator to the province and city level. A numerical approach is applied to overcome the unavailability of disaggregated data by constructing several spatial weight matrices based on neighbourhood, Euclidean distance and correlation. These methods can potentially be used and further developed to disaggregate development indicators to lower spatial levels, even by several demographic characteristics.
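As an illustration of one of the weighting schemes, the sketch below builds a row-standardized inverse Euclidean distance matrix from region centroids; the centroids are invented, and the neighbourhood- and correlation-based matrices would be constructed analogously.

```python
import numpy as np

def inverse_distance_weights(coords):
    """Row-standardized inverse Euclidean distance weight matrix.

    coords: n x 2 array of region centroids. One of several possible
    spatial weighting schemes; each row sums to 1 so a region's value
    can be expressed as a weighted average of its neighbours.
    """
    n = len(coords)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    W = np.zeros((n, n))
    off = ~np.eye(n, dtype=bool)                 # leave the diagonal at 0
    W[off] = 1.0 / dist[off]
    return W / W.sum(axis=1, keepdims=True)

centroids = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 2.0], [0.5, 1.5]])
print(np.round(inverse_distance_weights(centroids), 3))
```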
Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.
Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen
2017-11-01
A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRC using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0 written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within the MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
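A simplified sketch of the segment-translation idea follows; it shifts each succeeding segment in time so its vertex lands on the curve assembled so far, using linear interpolation as a stand-in for the paper's trigonometric vertex-to-connection-line placement, with invented recession data.

```python
import numpy as np

def master_recession_curve(segments):
    """Assemble an MRC by horizontally translating recession segments.

    segments: list of (t, h) pairs with h decreasing in time. Each
    segment's vertex (its highest value) is shifted in time onto the
    already-assembled curve; linear interpolation here stands in for
    the paper's trigonometric placement rule.
    """
    segments = sorted(segments, key=lambda s: s[1][0], reverse=True)
    mt = np.asarray(segments[0][0], float)
    mh = np.asarray(segments[0][1], float)
    for t, h in segments[1:]:
        t, h = np.asarray(t, float), np.asarray(h, float)
        order = np.argsort(mh)                   # interp needs ascending xp
        t0 = np.interp(h[0], mh[order], mt[order])
        mt = np.append(mt, t - t[0] + t0)        # place vertex at time t0
        mh = np.append(mh, h)
    order = np.argsort(mt)
    return mt[order], mh[order]

seg1 = (np.arange(6), [10.0, 8.0, 6.5, 5.4, 4.5, 3.8])
seg2 = (np.arange(4), [7.0, 5.8, 4.9, 4.2])
print(master_recession_curve([seg1, seg2]))
```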
Putt, Karson S; Pugh, Randall B
2013-01-01
Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter-plate-based assay built upon well-known and trusted titration chemical reactions. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and of the total amount of peroxides present in solution is described. The results of these methods were compared with those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.
Funding Education: Developing a Method of Allocation for Improvement
ERIC Educational Resources Information Center
BenDavid-Hadar, Iris
2018-01-01
Purpose: Resource allocation is a key policy instrument that affects the educational achievement distribution (EAD). The literature on methods of allocation is focused mainly on equity issues. The purpose of this paper is to develop a composite funding formula, which adds to the equity-based element (i.e. a needs-based element compensating for…
Veteran Teacher Engagement in Site-Based Professional Development: A Mixed Methods Study
ERIC Educational Resources Information Center
Houston, Biaze L.
2016-01-01
This research study examined how teachers self-report their levels of engagement, which factors they believe contribute most to their engagement, and which assumptions of andragogy most heavily influence teacher engagement in site-based professional development. This study employed a convergent parallel mixed methods design to study veteran…
ERIC Educational Resources Information Center
Flores, Ingrid M.
2015-01-01
Thirty preservice teachers enrolled in a field-based science methods course were placed at a public elementary school for coursework and for teaching practice with elementary students. Candidates focused on building conceptual understanding of science content and pedagogical methods through innovative curriculum development and other course…
Web-based data collection: detailed methods of a questionnaire and data gathering tool
Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R
2006-01-01
There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes data instrument design, data entry and management, and the data tables needed to store the results, and it attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556
Fragment-Based Drug Discovery in Academia: Experiences From a Tuberculosis Programme
NASA Astrophysics Data System (ADS)
Heikkila, Timo J.; Surade, Sachin; Silvestre, Hernani L.; Dias, Marcio V. B.; Ciulli, Alessio; Bromfield, Karen; Scott, Duncan; Howard, Nigel; Wen, Shijun; Wei, Alvin Hung; Osborne, David; Abell, Chris; Blundell, Tom L.
The problems associated with neglected diseases are often compounded by the increasing incidence of antibiotic resistance. Patient negligence and the abuse of antibiotics have led to explosive growth in cases of tuberculosis, with some M. tuberculosis strains becoming virtually untreatable. Structure-based drug development is viewed as a cost- and time-effective method for the discovery and development of hits into lead compounds. In this review we discuss the suitability of fragment-based methods for developing new chemotherapeutics against neglected diseases, providing examples from our tuberculosis programme.
Design and realization of retina-like three-dimensional imaging based on a MOEMS mirror
NASA Astrophysics Data System (ADS)
Cao, Jie; Hao, Qun; Xia, Wenze; Peng, Yuxin; Cheng, Yang; Mu, Jiaxing; Wang, Peng
2016-07-01
To balance the conflicting requirements of high-resolution, large-field-of-view, and real-time imaging, a retina-like imaging method based on time-of-flight (TOF) is proposed. Mathematical models of 3D imaging based on a MOEMS mirror are developed. Based on this method, we perform simulations of retina-like scanning properties, including compression of redundant information and rotation and scaling invariance. To validate the theory, we develop a prototype and conduct relevant experiments. The preliminary results agree well with the simulations.
A facile fluorescent "turn-off" method for sensing paraquat based on pyranine-paraquat interaction
NASA Astrophysics Data System (ADS)
Zhao, Zuzhi; Zhang, Fengwei; Zhang, Zipin
2018-06-01
Development of a technically simple yet effective method for paraquat (PQ) detection is of great importance due to its high clinical and environmental relevance. In this study, we developed a pyranine-based fluorescent "turn-off" method for PQ sensing based on the pyranine-PQ interaction. We investigated the dependence of the analytical performance of this method on experimental conditions such as ionic strength and medium pH. Under the optimized conditions, the method is sensitive and selective, and could be used for PQ detection in real-world samples. This study essentially provides a readily accessible fluorescent system for PQ sensing that is cheap, robust, and technically simple, and it is envisaged to find further clinical and environmental applications.
Williams, C.J.; Heglund, P.J.
2009-01-01
Habitat association models are commonly developed for individual animal species using generalized linear modeling methods such as logistic regression. We considered the issue of grouping species based on their habitat use so that management decisions can be based on sets of species rather than individual species. This research was motivated by a study of western landbirds in northern Idaho forests. The method we examined was to separately fit models to each species and to use a generalized Mahalanobis distance between coefficient vectors to create a distance matrix among species. Clustering methods were used to group species from the distance matrix, and multidimensional scaling methods were used to visualize the relations among species groups. Methods were also discussed for evaluating the sensitivity of the conclusions to outliers or influential data points. We illustrate these methods with data from the landbird study conducted in northern Idaho. Simulation results are presented to compare the success of this method to alternative methods using Euclidean distance between coefficient vectors and to methods that do not use habitat association models. These simulations demonstrate that our Mahalanobis-distance-based method was nearly always better than Euclidean-distance-based methods or methods not based on habitat association models. The methods used to develop candidate species groups are easily explained to other scientists and resource managers since they mainly rely on classical multivariate statistical methods. © 2008 Springer Science+Business Media, LLC.
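A minimal sketch of the distance-and-clustering step described above, assuming per-species coefficient vectors and covariance matrices have already been fitted; summing the two species' covariance matrices to form the metric is an illustrative choice, not necessarily the paper's exact generalized Mahalanobis construction.

import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
coefs = rng.normal(size=(6, 4))             # 6 species x 4 habitat coefficients
covs = [np.eye(4) * 0.1 for _ in range(6)]  # per-species coefficient covariances

n = len(coefs)
d = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        s_inv = np.linalg.inv(covs[i] + covs[j])   # generalized Mahalanobis metric
        diff = coefs[i] - coefs[j]
        d[i, j] = d[j, i] = np.sqrt(diff @ s_inv @ diff)

# Hierarchical clustering on the species-by-species distance matrix.
groups = fcluster(linkage(squareform(d), method="average"), t=3, criterion="maxclust")
print(groups)  # candidate species groups for management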
Measurement of Crystalline Silica Aerosol Using Quantum Cascade Laser-Based Infrared Spectroscopy.
Wei, Shijun; Kulkarni, Pramod; Ashley, Kevin; Zheng, Lina
2017-10-24
Inhalation exposure to airborne respirable crystalline silica (RCS) poses major health risks in many industrial environments. There is a need for new sensitive instruments and methods for in-field or near real-time measurement of crystalline silica aerosol. The objective of this study was to develop an approach, using quantum cascade laser (QCL)-based infrared spectroscopy (IR), to quantify airborne concentrations of RCS. Three sampling methods were investigated for their potential for effective coupling with QCL-based transmittance measurements: (i) conventional aerosol filter collection, (ii) focused spot sample collection directly from the aerosol phase, and (iii) dried spot obtained from deposition of liquid suspensions. Spectral analysis methods were developed to obtain IR spectra from the collected particulate samples in the range 750-1030 cm(-1). The new instrument was calibrated and the results were compared with standardized methods based on Fourier transform infrared (FTIR) spectrometry. Results show that significantly lower detection limits for RCS (≈330 ng), compared to conventional infrared methods, could be achieved with effective microconcentration and careful coupling of the particulate sample with the QCL beam. These results offer promise for further development of sensitive filter-based laboratory methods and portable sensors for near real-time measurement of crystalline silica aerosol.
Immobilizing affinity proteins to nitrocellulose: a toolbox for paper-based assay developers.
Holstein, Carly A; Chevalier, Aaron; Bennett, Steven; Anderson, Caitlin E; Keniston, Karen; Olsen, Cathryn; Li, Bing; Bales, Brian; Moore, David R; Fu, Elain; Baker, David; Yager, Paul
2016-02-01
To enable enhanced paper-based diagnostics with improved detection capabilities, new methods are needed to immobilize affinity reagents to porous substrates, especially for capture molecules other than IgG. To this end, we have developed and characterized three novel methods for immobilizing protein-based affinity reagents to nitrocellulose membranes. We have demonstrated these methods using recombinant affinity proteins for the influenza surface protein hemagglutinin, leveraging the customizability of these recombinant "flu binders" for the design of features for immobilization. The three approaches shown are: (1) covalent attachment of thiolated affinity protein to an epoxide-functionalized nitrocellulose membrane, (2) attachment of biotinylated affinity protein through a nitrocellulose-binding streptavidin anchor protein, and (3) fusion of affinity protein to a novel nitrocellulose-binding anchor protein for direct coupling and immobilization. We also characterized the use of direct adsorption for the flu binders, as a point of comparison and motivation for these novel methods. Finally, we demonstrated that these novel methods can provide improved performance to an influenza hemagglutinin assay, compared to a traditional antibody-based capture system. Taken together, this work advances the toolkit available for the development of next-generation paper-based diagnostics.
Development of a specification for flexible base construction.
DOT National Transportation Integrated Search
2014-01-01
The Texas Department of Transportation (TxDOT) currently uses Item 247 Flexible Base to specify a : pavement foundation course. The goal of this project was to evaluate the current method of base course : acceptance and investigate methods to r...
Gagnon, Jessica K.; Law, Sean M.; Brooks, Charles L.
2016-01-01
Protein-ligand docking is a commonly used method for lead identification and refinement. While traditional structure-based docking methods represent the receptor as a rigid body, recent developments have been moving toward the inclusion of protein flexibility. Proteins exist in an inter-converting ensemble of conformational states, but effectively and efficiently searching the conformational space available to both the receptor and ligand remains a well-appreciated computational challenge. To this end, we have developed the Flexible CDOCKER method as an extension of the family of complete docking solutions available within CHARMM. This method integrates atomically detailed side chain flexibility with grid-based docking methods, maintaining efficiency while allowing the protein and ligand configurations to explore their conformational space simultaneously. This is in contrast to existing approaches that use induced-fit like sampling, such as Glide or Autodock, where the protein or the ligand space is sampled independently in an iterative fashion. Presented here are developments to the CHARMM docking methodology to incorporate receptor flexibility and improvements to the sampling protocol as demonstrated with re-docking trials on a subset of the CCDC/Astex set. These developments within CDOCKER achieve docking accuracy competitive with or exceeding the performance of other widely utilized docking programs. PMID:26691274
Leng, Pei-Qiang; Zhao, Feng-Lan; Yin, Bin-Cheng; Ye, Bang-Ce
2015-05-21
We developed a novel colorimetric method for rapid detection of biogenic amines based on arylalkylamine N-acetyltransferase (aaNAT). The proposed method offers distinct advantages including simple handling, high speed, low cost, good sensitivity and selectivity.
Design of nuclease-based target recycling signal amplification in aptasensors.
Yan, Mengmeng; Bai, Wenhui; Zhu, Chao; Huang, Yafei; Yan, Jiao; Chen, Ailiang
2016-03-15
Compared with conventional antibody-based immunoassay methods, aptasensors based on nucleic acid aptamers have made at least two significant breakthroughs. One is that aptamers are more easily used for developing various simple and rapid homogeneous detection methods by "sample in, signal out" without multi-step washing. The other is that aptamers are more easily employed for developing highly sensitive detection methods using various nucleic acid-based signal amplification approaches. As many substances playing regulatory roles in physiology or pathology exist at extremely low concentrations, and many chemical contaminants occur in trace amounts in food or the environment, aptasensors with signal amplification contribute greatly to the detection of such targets. Among the signal amplification approaches in highly sensitive aptasensors, nuclease-based target recycling signal amplification has recently become a research focus because it offers easy design, simple operation, and rapid reaction, and can be readily developed into homogeneous assays. In this review, we summarize recent advances in the development of various nuclease-based target recycling signal amplification strategies, with the aim of providing a general guide for the design of aptamer-based ultrasensitive biosensing assays. Copyright © 2015 Elsevier B.V. All rights reserved.
Soejima, Mikiko; Tsuchiya, Yuji; Egashira, Kouichi; Kawano, Hiroyuki; Sagawa, Kimitaka; Koda, Yoshiro
2010-06-01
Anhaptoglobinemic patients run the risk of severe anaphylactic transfusion reactions because they produce antibodies against serum haptoglobin (Hp). Homozygosity for the Hp gene deletion (HP(del)) is the only known cause of congenital anhaptoglobinemia, and clinical diagnosis of HP(del) before transfusion is important to prevent anaphylactic shock. We recently developed a 5'-nuclease (TaqMan) real-time polymerase chain reaction (PCR) method. A SYBR Green I-based duplex real-time PCR assay using two forward primers and a common reverse primer, followed by melting curve analysis, was developed to determine HP(del) zygosity in a single tube. In addition, to obviate initial DNA extraction, we examined serially diluted blood samples as PCR templates. Allelic discrimination of HP(del) yielded optimal results at blood sample dilutions of 1:64 to 1:1024. The results from 2231 blood samples were fully concordant with those obtained by the TaqMan-based real-time PCR method. The detection rate of the HP(del) allele by the SYBR Green I-based method is comparable with that of the TaqMan-based method. This method is readily applicable owing to its low initial cost, can be run on economical real-time PCR machines, and is suitable for high-throughput analysis as an alternative method for allelic discrimination of HP(del).
Valavala, Sriram; Seelam, Nareshvarma; Tondepu, Subbaiah; Jagarlapudi, V Shanmukha Kumar; Sundarmurthy, Vivekanandan
2018-01-01
A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol are used in the manufacturing process of the tartaric acid-based pellets, chosen by considering the solubility of dipyridamole and the excipients in the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with flame ionization detection. Method validation was carried out in accordance with the Q2 guideline on validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis.
Hocalar, A; Türker, M; Karakuzu, C; Yüzgeç, U
2011-04-01
In this study, five previously developed state estimation methods are examined and compared for estimation of biomass concentrations in a production-scale fed-batch bioprocess. These methods are: (i) estimation based on a kinetic model of overflow metabolism; (ii) estimation based on a metabolic black-box model; (iii) estimation based on an observer; (iv) estimation based on an artificial neural network; and (v) estimation based on differential evolution. Biomass concentrations are estimated from available measurements and compared with experimental data obtained from large-scale fermentations. The advantages and disadvantages of the presented techniques are discussed with regard to accuracy, reproducibility, number of primary measurements required, and adaptation to different working conditions. Among the various techniques, the metabolic black-box method seems to have advantages, although it requires more measurements than the other methods. However, the required extra measurements are based on instruments commonly employed in an industrial environment. This method is used for developing model-based control of fed-batch yeast fermentations. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
Ermacora, Alessia; Hrnčiřík, Karel
2014-01-01
Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.
Fountoulakis, Konstantinos N; Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-02-01
This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. © The Author 2016. Published by Oxford University Press on behalf of CINP.
Policy Gradient Adaptive Dynamic Programming for Data-Based Optimal Control.
Luo, Biao; Liu, Derong; Wu, Huai-Ning; Wang, Ding; Lewis, Frank L
2017-10-01
The model-free optimal control problem of general discrete-time nonlinear systems is considered in this paper, and a data-based policy gradient adaptive dynamic programming (PGADP) algorithm is developed to design an adaptive optimal controller. By using offline and online data rather than a mathematical system model, the PGADP algorithm improves the control policy with a gradient descent scheme. The convergence of the PGADP algorithm is proved by demonstrating that the constructed Q-function sequence converges to the optimal Q-function. Based on the PGADP algorithm, the adaptive control method is developed with an actor-critic structure and the method of weighted residuals. Its convergence properties are analyzed, where the approximate Q-function converges to its optimum. Computer simulation results demonstrate the effectiveness of the PGADP-based adaptive control method.
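A toy actor-critic sketch in the spirit of the PGADP description above, with a linear-in-features Q-function (critic) and a gradient-descent policy step (actor); the scalar system, features, and gains are invented for illustration and are not the paper's algorithm or benchmark.

import numpy as np

rng = np.random.default_rng(1)
theta = np.zeros(3)   # critic weights for features [x^2, x*u, u^2]
w = 0.0               # actor parameter: u = w * x
gamma, a_c, a_a = 0.95, 0.05, 0.01

def features(x, u):
    return np.array([x * x, x * u, u * u])

for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)                # sampled (offline/online) data point
    u = w * x + 0.1 * rng.standard_normal()   # exploratory control
    x_next = 0.9 * x + 0.5 * u                # plant response, unknown to the algorithm
    cost = x * x + u * u
    # Critic: temporal-difference step toward the Bellman target.
    td = cost + gamma * features(x_next, w * x_next) @ theta - features(x, u) @ theta
    theta += a_c * td * features(x, u)
    # Actor: gradient descent on Q(x, u) with respect to u, projected onto w.
    dq_du = theta[1] * x + 2.0 * theta[2] * u
    w -= a_a * dq_du * x

print(theta, w)  # learned critic weights and policy gain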
Development of Advanced Methods of Structural and Trajectory Analysis for Transport Aircraft
NASA Technical Reports Server (NTRS)
Ardema, Mark D.
1996-01-01
In this report the author describes: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of flight path optimization. A method of estimating the load-bearing fuselage weight and wing weight of transport aircraft based on fundamental structural principles has been developed. This method of weight estimation represents a compromise between the rapid assessment of component weight using empirical methods based on actual weights of existing aircraft and detailed, but time-consuming, analysis using the finite element method. The method was applied to eight existing subsonic transports for validation and correlation. The resulting computer program, PDCYL, has been integrated into the weights-calculating module of the AirCraft SYNThesis (ACSYNT) computer program. ACSYNT has traditionally used only empirical weight estimation methods; PDCYL adds to ACSYNT a rapid, accurate means of assessing the fuselage and wing weights of unconventional aircraft. PDCYL also allows flexibility in the choice of structural concept, as well as a direct means of determining the impact of advanced materials on structural weight.
Rotor dynamic simulation and system identification methods for application to vacuum whirl data
NASA Technical Reports Server (NTRS)
Berman, A.; Giansante, N.; Flannelly, W. G.
1980-01-01
Methods of using rotor vacuum whirl data to improve the ability to model helicopter rotors were developed. The work consisted of the formulation of the equations of motion of elastic blades on a hub using a Galerkin method; the development of a general computer program for simulation of these equations; the study and implementation of a procedure for determining physical parameters based on measured data; and the application of a method for computing the normal modes and natural frequencies based on test data.
Application of remote sensing to reconnaissance geologic mapping and mineral exploration
NASA Technical Reports Server (NTRS)
Birnie, R. W.; Dykstra, J. D.
1978-01-01
A method of mapping geology at a reconnaissance scale and locating zones of possible hydrothermal alteration has been developed. This method is based on principal component analysis of Landsat digital data and is applied to the desert area of the Chagai Hills, Baluchistan, Pakistan. A method for airborne spectrometric detection of geobotanical anomalies associated with porphyry Cu-Mo mineralization at Heddleston, Montana has also been developed. This method is based on discriminants in the 0.67 micron and 0.79 micron regions of the spectrum.
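A brief sketch of the principal component analysis step named above, applied to a synthetic multiband "scene" standing in for Landsat digital data; the band count and image size are arbitrary assumptions.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
bands = rng.random((4, 100, 100))      # 4 spectral bands, 100x100 pixels (synthetic)
pixels = bands.reshape(4, -1).T        # rows = pixels, columns = bands

pca = PCA(n_components=3)
components = pca.fit_transform(pixels)           # decorrelated component scores
pc_images = components.T.reshape(3, 100, 100)    # back to image form for mapping
print(pca.explained_variance_ratio_)             # variance captured per component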
Facilitating Pre-Service Teachers to Develop Regulation of Cognition with Learning Management System
ERIC Educational Resources Information Center
Gutman, Mary
2017-01-01
The object of the present study is to propose a technologically based method for developing Regulation of Cognition (RC) among pre-service teachers in a pedagogical problem context. The research intervention was carried out by two groups during a Teaching Training Workshop, based on the IMPROVE instructional method, which was implemented in the…
Enhanced data validation strategy of air quality monitoring network.
Harkat, Mohamed-Faouzi; Mansouri, Majdi; Nounou, Mohamed; Nounou, Hazem
2018-01-01
Quick validation and detection of faults in measured air quality data is a crucial step towards achieving the objectives of air quality networks. Therefore, the objectives of this paper are threefold: (i) to develop a modeling technique that can be used to predict the normal behavior of air quality variables and help provide an accurate reference for monitoring purposes; (ii) to develop a fault detection method that can effectively and quickly detect any anomalies in measured air quality data. For this purpose, a new fault detection method based on the combination of the generalized likelihood ratio test (GLRT) and the exponentially weighted moving average (EWMA) is developed. GLRT is a well-known statistical fault detection method that relies on maximizing the detection probability for a given false alarm rate. In this paper, we propose a GLRT-based EWMA fault detection method that is able to detect changes in the values of certain air quality variables; (iii) to develop a fault isolation and identification method that allows defining the fault source(s) in order to properly apply appropriate corrective actions. Here, a reconstruction approach based on a Midpoint-Radii Principal Component Analysis (MRPCA) model is developed to handle the types of data and models associated with air quality monitoring networks. All the air quality modeling, fault detection, fault isolation, and reconstruction methods developed in this paper are validated using real air quality data (such as measurements of particulate matter, ozone, and nitrogen and carbon oxides). Copyright © 2017 Elsevier Inc. All rights reserved.
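A rough sketch of a combined EWMA/GLRT detector of the kind described above, applied to a mean shift in model residuals; the residual model, smoothing factor, and threshold are illustrative assumptions rather than the paper's tuned values.

import numpy as np

rng = np.random.default_rng(3)
residuals = rng.standard_normal(200)   # residuals w.r.t. a reference model
residuals[120:] += 1.5                 # injected fault: mean shift

lam, sigma2 = 0.2, 1.0
ewma_var = sigma2 * lam / (2.0 - lam)  # steady-state variance of the EWMA statistic
threshold = 10.0                       # would be set from a target false-alarm rate

z = 0.0
for t, e in enumerate(residuals):
    z = lam * e + (1.0 - lam) * z      # EWMA smoothing of the residuals
    glrt = z * z / ewma_var            # GLR statistic for a mean change
    if glrt > threshold:
        print(f"fault declared at sample {t}")
        break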
Côté, José; Cossette, Sylvie; Ramirez-Garcia, Pilar; Rouleau, Geneviève; Auger, Patricia; Boudreau, François; Gagnon, Marie-Pierre
2017-01-01
Background. In the domain of health behavior change, the deployment and utilization of information and communications technologies as a way to deliver interventions appear to be promising. This article describes the development of a web-based tailored intervention, TAVIE en santé, to support people living with HIV in the adoption of healthy behaviors. Methods. This intervention was developed through an Intervention Mapping (IM) framework and is based on the theory of planned behavior. Results. Crucial steps of IM are the selection of key determinants of behavior and the selection of useful theory-based intervention methods to change the targeted determinants (active ingredients). The content and the sequence of the intervention are then created based on these parameters. TAVIE en santé is composed of 7 interactive web sessions hosted by a virtual nurse. It aims to develop and strengthen skills required for behavior change. Based on an algorithm using individual cognitive data (attitude, perceived behavioral control, and intention), the number of sessions, theory-based intervention methods, and message contents are tailored to each user. Conclusion. TAVIE en santé is currently being evaluated. The use of IM allows developing interventions with a systematic approach based on theory, empirical evidence, and clinical and experiential knowledge.
Computational Methods in Drug Discovery
Sliwoski, Gregory; Kothiwale, Sandeepkumar; Meiler, Jens
2014-01-01
Computer-aided drug discovery/design methods have played a major role in the development of therapeutically important small molecules for over three decades. These methods are broadly classified as either structure-based or ligand-based methods. Structure-based methods are in principle analogous to high-throughput screening in that both target and ligand structure information is imperative. Structure-based approaches include ligand docking, pharmacophore, and ligand design methods. The article discusses the theory behind the most important methods and recent successful applications. Ligand-based methods use only ligand information for predicting activity depending on its similarity/dissimilarity to previously known active ligands. We review widely used ligand-based methods such as ligand-based pharmacophores, molecular descriptors, and quantitative structure-activity relationships. In addition, important tools such as target/ligand databases, homology modeling, ligand fingerprint methods, etc., necessary for successful implementation of various computer-aided drug discovery/design methods in a drug discovery campaign are discussed. Finally, computational methods for toxicity prediction and optimization for favorable physiologic properties are discussed with successful examples from the literature. PMID:24381236
Cellular Metabolomics for Exposure and Toxicity Assessment
We have developed NMR automation and cell quench methods for cell culture-based metabolomics to study chemical exposure and toxicity. Our flow automation method is robust and free of cross contamination. The direct cell quench method is rapid and effective. Cell culture-based met...
On Inertial Body Tracking in the Presence of Model Calibration Errors
Miezal, Markus; Taetz, Bertram; Bleser, Gabriele
2016-01-01
In inertial body tracking, the human body is commonly represented as a biomechanical model consisting of rigid segments with known lengths and connecting joints. The model state is then estimated via sensor fusion methods based on data from attached inertial measurement units (IMUs). This requires the relative poses of the IMUs w.r.t. the segments—the IMU-to-segment calibrations, subsequently called I2S calibrations—to be known. Since calibration methods based on static poses, movements and manual measurements are still the most widely used, potentially large human-induced calibration errors have to be expected. This work compares three newly developed/adapted extended Kalman filter (EKF) and optimization-based sensor fusion methods with an existing EKF-based method w.r.t. their segment orientation estimation accuracy in the presence of model calibration errors with and without using magnetometer information. While the existing EKF-based method uses a segment-centered kinematic chain biomechanical model and a constant angular acceleration motion model, the newly developed/adapted methods are all based on a free segments model, where each segment is represented with six degrees of freedom in the global frame. Moreover, these methods differ in the assumed motion model (constant angular acceleration, constant angular velocity, inertial data as control input), the state representation (segment-centered, IMU-centered) and the estimation method (EKF, sliding window optimization). In addition to the free segments representation, the optimization-based method also represents each IMU with six degrees of freedom in the global frame. In the evaluation on simulated and real data from a three segment model (an arm), the optimization-based method showed the smallest mean errors, standard deviations and maximum errors throughout all tests. It also showed the lowest dependency on magnetometer information and motion agility. Moreover, it was insensitive w.r.t. I2S position and segment length errors in the tested ranges. Errors in the I2S orientations were, however, linearly propagated into the estimated segment orientations. In the absence of magnetic disturbances, severe model calibration errors and fast motion changes, the newly developed IMU centered EKF-based method yielded comparable results with lower computational complexity. PMID:27455266
Two computational methods are proposed for estimation of the emission rate of volatile organic compounds (VOCs) from solvent-based indoor coating materials based on the knowledge of product formulation. The first method utilizes two previously developed mass transfer models with ...
NASA Astrophysics Data System (ADS)
Takei, Satoshi; Maki, Hirotaka; Sugahara, Kigen; Ito, Kenta; Hanabata, Makoto
2015-07-01
An electron beam (EB) lithography method using inedible cellulose-based resist material derived from woody biomass has been successfully developed. This method allows the use of pure water in the development process instead of the conventionally used tetramethylammonium hydroxide and anisole. The inedible cellulose-based biomass resist material, as an alternative to alpha-linked disaccharides in sugar derivatives that compete with food supplies, was developed by replacing the hydroxyl groups in the beta-linked disaccharides with EB-sensitive 2-methacryloyloxyethyl groups. A 75 nm line and space pattern at an exposure dose of 19 μC/cm2, a resist thickness uniformity of less than 0.4 nm on a 200 mm wafer, and low film thickness shrinkage under EB irradiation were achieved with this inedible cellulose-based biomass resist material using a water-based development process.
NASA Astrophysics Data System (ADS)
Wang, Dong
2016-03-01
Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal-processing-based methods mainly require expertise to interpret gear fault signatures, which is usually not easily achieved by ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest-neighbors-based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. At last, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical features, some other popular statistical models, including linear discriminant analysis, quadratic discriminant analysis, classification and regression trees, and the naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
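A condensed sketch of the pipeline described above: wavelet-packet statistical features, dimensionality reduction, then K-nearest neighbors. PyWavelets ships Daubechies filters only up to db38, so db8 stands in for the paper's db44, and the signals, labels, and the particular feature statistics are synthetic placeholders.

import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)

def wp_features(signal, wavelet="db8", level=3):
    wp = pywt.WaveletPacket(signal, wavelet=wavelet, maxlevel=level)
    feats = []
    for node in wp.get_level(level, order="natural"):
        c = node.data
        feats += [np.mean(c), np.std(c), np.max(np.abs(c))]  # simple statistics
    return np.array(feats)

# Synthetic "vibration" signals for 5 crack levels (labels 0..4), 20 each.
X = np.array([wp_features(rng.standard_normal(1024) * (1 + 0.2 * k))
              for k in range(5) for _ in range(20)])
y = np.repeat(np.arange(5), 20)

X_red = PCA(n_components=5).fit_transform(X)     # reduce redundant features
knn = KNeighborsClassifier(n_neighbors=3).fit(X_red, y)
print(knn.score(X_red, y))                       # training accuracy of the sketch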
Hernández, Marta; Rodríguez-Lázaro, David; Esteve, Teresa; Prat, Salomé; Pla, Maria
2003-12-15
Commercialization of several genetically modified crops has been approved worldwide to date. Uniplex polymerase chain reaction (PCR)-based methods to identify these different insertion events have been developed, but their use in the analysis of all commercially available genetically modified organisms (GMOs) is becoming progressively insufficient. These methods require a large number of assays to detect all possible GMOs present in a sample; therefore, the development of multiplex PCR systems using combined probes and primers targeted to sequences specific to various GMOs is needed for detection of this increasing number of GMOs. Here we report on the development of a multiplex real-time PCR suitable for multiple GMO identification, based on the intercalating dye SYBR Green I and the analysis of the melting curves of the amplified products. Using this method, different amplification products specific for Maximizer 176, Bt11, MON810, and GA21 maize and for GTS 40-3-2 soybean were obtained and identified by their specific Tm. We combined amplification of these products in a number of multiplex reactions and show the suitability of the methods for identification of GMOs with a sensitivity of 0.1% in duplex reactions. The described methods offer an economic and simple alternative to real-time PCR systems based on sequence-specific probes (i.e., TaqMan chemistry). These methods can be used as selection tests and further optimized for uniplex GMO quantification.
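A minimal sketch of melting-curve identification as described: each amplified product is called by matching its observed Tm against reference values. The reference temperatures and tolerance below are invented placeholders, not the validated values from the study.

REFERENCE_TM = {           # event -> expected product Tm (deg C, hypothetical)
    "Maximizer 176": 78.4,
    "Bt11": 81.0,
    "MON810": 84.2,
    "GA21": 86.9,
    "GTS 40-3-2": 88.5,
}
TOLERANCE = 0.5            # acceptable Tm deviation (deg C, assumed)

def call_events(observed_tms):
    calls = []
    for tm in observed_tms:
        matches = [name for name, ref in REFERENCE_TM.items()
                   if abs(tm - ref) <= TOLERANCE]
        calls.append((tm, matches[0] if matches else "unidentified"))
    return calls

print(call_events([81.2, 84.0, 90.1]))  # -> Bt11, MON810, unidentified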
Emergy analysis of an industrial park: the case of Dalian, China.
Geng, Yong; Zhang, Pan; Ulgiati, Sergio; Sarkis, Joseph
2010-10-15
With the rapid development of eco-industrial park projects in China, evaluating their overall eco-efficiency is becoming an important need and a big academic challenge. Developing ecologically conscious industrial park management requires analysis of both industrial and ecological systems. Traditional evaluation methods based on neoclassical economics and on embodied energy and exergy analyses have certain limitations because they treat environmental issues as secondary to the maximization of economic and technical objectives. Such methods focus primarily on the environmental impact of emissions and their economic consequences. These approaches ignore the contribution of ecological products and services as well as the load placed on environmental systems and the related carrying-capacity problems of economic and industrial development. This paper presents a new method based upon emergy analysis and synthesis. Such a method links economic and ecological systems together, highlighting the internal relations among the different subsystems and components. The emergy-based method provides insight into the environmental performance and sustainability of an industrial park. This paper describes the methodology of emergy analysis at the industrial park level and provides a series of emergy-based indices. A case study is investigated and discussed in order to show the emergy method's practical potential. Results from the Dalian Economic Development Zone (DEDZ) case show the potential of the emergy synthesis method at the industrial park level for environmental policy making. Its advantages and limitations are also discussed, with avenues for future research identified. Copyright © 2010 Elsevier B.V. All rights reserved.
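A toy emergy-accounting sketch: input flows are converted to solar emergy via unit emergy values (UEVs) and then combined into indices. The flows and UEVs are invented, and the specific indices shown (EYR, ELR, ESI) are conventional emergy indicators assumed for illustration rather than taken from the paper.

flows = {                       # name: (amount, UEV in sej/unit, category) - all invented
    "fuel":        (2.0e9, 6.6e4, "purchased"),
    "electricity": (1.5e9, 2.7e5, "purchased"),
    "rain":        (4.0e8, 3.1e4, "renewable"),
    "topsoil":     (1.0e7, 1.2e5, "nonrenewable"),
}

emergy = {cat: 0.0 for cat in ("purchased", "renewable", "nonrenewable")}
for amount, uev, cat in flows.values():
    emergy[cat] += amount * uev          # solar emjoules per category

F, R, N = emergy["purchased"], emergy["renewable"], emergy["nonrenewable"]
total = F + R + N
eyr = total / F                          # emergy yield ratio
elr = (F + N) / R                        # environmental loading ratio
print(f"EYR={eyr:.2f}, ELR={elr:.2f}, ESI={eyr / elr:.2f}")  # ESI = EYR/ELR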
AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES
Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...
A number of PCR-based methods for detecting human fecal material in environmental waters have been developed over the past decade, but these methods have rarely received independent comparative testing. Here, we evaluated ten of these methods (BacH, BacHum-UCD, B. thetaiotaomic...
A comparative potency method for cancer risk assessment has been developed based upon a constant relative potency hypothesis. This method was developed and tested using data from a battery of short-term mutagenesis bioassays, animal tumorigenicity data and human lung cancer risk ...
Kraft, Vadim; Grützke, Martin; Weber, Waldemar; Winter, Martin; Nowak, Sascha
2014-08-08
A method based on the coupling of ion chromatography (IC) and electrospray ionization mass spectrometry (ESI-MS) for the separation and determination of thermal decomposition products of LiPF6-based organic electrolytes is presented. The electrolytes used, LP30 and LP50, are commercially available and consist of 1 mol/l LiPF6 dissolved in ethylene carbonate/dimethyl carbonate and ethylene carbonate/ethyl methyl carbonate, respectively. For the separation method development, three ion chromatographic columns with different capacities and stationary phases were used and compared. Besides the known hydrolysis products of lithium hexafluorophosphate, several new organophosphates were separated and identified with the developed IC-ESI-MS method during aging investigations of the electrolytes. The chemical structures were elucidated with IC-ESI-MS/MS. Copyright © 2014 Elsevier B.V. All rights reserved.
Problem based learning: the effect of real time data on the website to student independence
NASA Astrophysics Data System (ADS)
Setyowidodo, I.; Pramesti, Y. S.; Handayani, A. D.
2018-05-01
Science learning has developed as an integrative science rather than as disciplinary education; in reality, however, national character development has not yet formed more creative and independent Indonesian citizens. Problem Based Learning based on real-time data on a website is a learning method that focuses on developing high-level thinking skills in problem-oriented situations by integrating technology into learning. The essence of this study is the presentation of authentic problems through real-time data on a website. The purpose of this research is to develop student independence through Problem Based Learning based on real-time data on a website. This study is development research, implemented using a purposive sampling technique. The study found an increase in student independence, with 47% of students in the very high category and 53% in the high category. This learning method can be said to be effective in improving students' learning independence in problem-oriented situations.
NASA Astrophysics Data System (ADS)
Cai, Yong; Cui, Xiangyang; Li, Guangyao; Liu, Wenyang
2018-04-01
The edge-smooth finite element method (ES-FEM) can improve the computational accuracy of triangular shell elements and the mesh partition efficiency of complex models. In this paper, an approach is developed to perform explicit finite element simulations of contact-impact problems with a graphical processing unit (GPU) using a special edge-smooth triangular shell element based on ES-FEM. Of critical importance for this problem is achieving finer-grained parallelism to enable efficient data loading and to minimize communication between the device and host. Four kinds of parallel strategies are then developed to efficiently solve these ES-FEM based shell element formulas, and various optimization methods are adopted to ensure aligned memory access. Special focus is dedicated to developing an approach for the parallel construction of edge systems. A parallel hierarchy-territory contact-searching algorithm (HITA) and a parallel penalty function calculation method are embedded in this parallel explicit algorithm. Finally, the program flow is well designed, and a GPU-based simulation system is developed, using Nvidia's CUDA. Several numerical examples are presented to illustrate the high quality of the results obtained with the proposed methods. In addition, the GPU-based parallel computation is shown to significantly reduce the computing time.
Jonnagaddala, Jitendra; Jue, Toni Rose; Chang, Nai-Wen; Dai, Hong-Jie
2016-01-01
The rapidly increasing volume of biomedical literature calls for automatic approaches to the recognition and normalization of disease mentions, in order to increase the precision and effectiveness of disease-based information retrieval. A variety of methods have been proposed to deal with the problem of disease named entity recognition and normalization. Among all the proposed methods, conditional random fields (CRFs) and the dictionary lookup method are widely used for named entity recognition and normalization, respectively. We herein developed a CRF-based model to allow automated recognition of disease mentions, and studied the effect of various techniques in improving the normalization results based on the dictionary lookup approach. The dataset from the BioCreative V CDR track was used to report the performance of the developed normalization methods and to compare them with other existing dictionary-lookup-based normalization methods. The best configuration achieved an F-measure of 0.77 for disease normalization, which outperformed the best dictionary-lookup-based baseline method studied in this work by an F-measure of 0.13. Database URL: https://github.com/TCRNBioinformatics/DiseaseExtract PMID:27504009
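A minimal sketch of the dictionary-lookup normalization step described above; the toy lexicon, the concept identifiers, and the lowercasing/punctuation normalization are assumptions, and the actual system pairs this step with a CRF recognizer.

import re

LEXICON = {                        # normalized surface form -> concept ID (toy entries)
    "hypertension": "MESH:D006973",
    "high blood pressure": "MESH:D006973",
    "diabetes mellitus": "MESH:D003920",
}

def normalize(mention):
    key = re.sub(r"[^\w\s]", "", mention.lower())  # strip punctuation, lowercase
    key = " ".join(key.split())                    # collapse whitespace
    return LEXICON.get(key, "unmapped")

for m in ["Hypertension", "high  blood pressure", "gout"]:
    print(m, "->", normalize(m))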
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yagnik, Gargey B.
The main goal of the presented research is the development of nanoparticle-based matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS). This dissertation includes the application of previously developed data acquisition methods, the development of novel sample preparation methods, the application and comparison of novel nanoparticle matrices, and a comparison of two nanoparticle matrix application methods for MALDI-MS and MALDI-MS imaging.
A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS
NASA Technical Reports Server (NTRS)
Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.
1989-01-01
In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge-based system methods of implementation. The processes most suitable for prototyping using rule-based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule-based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge-based system requirements. A quick-prototype knowledge-based system environment is researched and developed.
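A small forward-chaining sketch of the kind of rule-based diagnosis the prototype explored; the facts and rules about the potable water subsystem are invented for illustration (the actual prototype was not written in Python).

# Hypothetical sensor-derived facts about the potable water subsystem.
facts = {"pump_on": True, "flow_low": True, "filter_dp_high": True}
true_facts = {name for name, value in facts.items() if value}

# Each rule: (set of required conditions, conclusion). Invented examples.
rules = [
    ({"pump_on", "flow_low", "filter_dp_high"}, "suspect_clogged_filter"),
    ({"suspect_clogged_filter"}, "recommend_filter_replacement"),
]

inferred = set()
changed = True
while changed:                          # forward chain until no new conclusions fire
    changed = False
    for conditions, conclusion in rules:
        if conditions <= true_facts | inferred and conclusion not in inferred:
            inferred.add(conclusion)
            changed = True

print(inferred)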
Walsh, Jane C; Groarke, AnnMarie; Moss-Morris, Rona; Morrissey, Eimear; McGuire, Brian E
2017-01-01
Background. Cancer-related fatigue (CrF) is the most common and disruptive symptom experienced by cancer survivors. We aimed to develop a theory-based, interactive Web-based intervention designed to facilitate self-management and enhance coping with CrF following cancer treatment. Objective. The aim of our study was to outline the rationale, decision-making processes, methods, and findings which led to the development of a Web-based intervention to be tested in a feasibility trial. This paper outlines the process and method of development of the intervention. Methods. An extensive review of the literature and qualitative research was conducted to establish a therapeutic approach for this intervention, based on theory. The psychological principles used in the development process are outlined, and we also clarify hypothesized causal mechanisms. We describe decision-making processes involved in the development of the content of the intervention, input from the target patient group and stakeholders, the design of the website features, and the initial user testing of the website. Results. The cocreation of the intervention with the experts and service users allowed the design team to ensure that an acceptable intervention was developed. This evidence-based Web-based program is the first intervention of its kind based on self-regulation model theory, with the primary aim of targeting the representations of fatigue and enhancing self-management of CrF specifically. Conclusions. This research sought to integrate psychological theory, existing evidence of effective interventions, empirically derived principles of Web design, and the views of potential users into the systematic planning and design of the intervention: an easy-to-use website for cancer survivors. PMID:28676465
Multigrid Methods for Aerodynamic Problems in Complex Geometries
NASA Technical Reports Server (NTRS)
Caughey, David A.
1995-01-01
Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.
Methods for determining time of death.
Madea, Burkhard
2016-12-01
Medicolegal death time estimation must estimate the time since death reliably. Reliability can only be established empirically, by statistical analysis of errors in field studies. Determining the time since death requires the calculation of measurable data along a time-dependent curve back to the starting point. Various methods are used to estimate the time since death. The current gold standard for death time estimation is a previously established nomogram method based on the two-exponential model of body cooling. Great experimental and practical achievements have been realized using this nomogram method. To reduce the margin of error of the nomogram method, a compound method was developed based on electrical and mechanical excitability of skeletal muscle, pharmacological excitability of the iris, rigor mortis, and postmortem lividity. Further increasing the accuracy of death time estimation involves the development of conditional probability distributions for death time estimation based on the compound method. Although many studies have evaluated chemical methods of death time estimation, such methods play a marginal role in daily forensic practice. However, increased precision of death time estimation has recently been achieved by considering various influencing factors (i.e., preexisting diseases, duration of the terminal episode, and ambient temperature). Putrefactive changes may be used for death time estimation in water-immersed bodies. Furthermore, recently developed technologies, such as 1H magnetic resonance spectroscopy, can be used to quantitatively study decompositional changes. This review addresses the gold standard method of death time estimation in forensic practice and promising technological and scientific developments in the field.
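A brief sketch of back-calculating time since death from the two-exponential cooling model underlying the nomogram method; the constants follow the commonly published Henssge form for moderate ambient temperatures, and both they and the example readings should be treated as assumptions for illustration, not forensic practice.

import numpy as np
from scipy.optimize import brentq

def cooling_q(t_hours, body_mass_kg):
    # Two-exponential cooling curve; B depends on body mass (Henssge form, assumed).
    b = -1.2815 * body_mass_kg ** -0.625 + 0.0284
    return 1.25 * np.exp(b * t_hours) - 0.25 * np.exp(5.0 * b * t_hours)

def time_since_death(t_rectal, t_ambient, body_mass_kg, t0=37.2):
    # Standardized temperature ratio Q, then solve the curve back for t.
    q_measured = (t_rectal - t_ambient) / (t0 - t_ambient)
    return brentq(lambda t: cooling_q(t, body_mass_kg) - q_measured, 0.01, 72.0)

# Illustrative readings: rectal 30.0 C, ambient 18.0 C, 75 kg body.
print(f"{time_since_death(30.0, 18.0, 75.0):.1f} h since death (estimate)")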
Flexible Regenerative Nanoelectronics for Advanced Peripheral Neural Interfaces
2017-10-01
these materials will be developed based on 3D printing. Task 3: construct nerve guidance scaffolds comprising embedded mesh electrodes. Subtasks include developing photomask patterning methods (months 1-9, in progress, 50%) and 3D printing patterning methods (months 9-18, milestone 9/1/2017). In research into patterning techniques, a 10% gelatin methacrylate (GelMA) base gel was found to be the best for performing 3D printing of the gels.
It Really Works: Cultural Communication Proficiency.
ERIC Educational Resources Information Center
Bennett, Ruth, Ed.
This paper describes the cultural communication proficiency method of indigenous language instruction, developed at Humboldt State University's Center for Indian Community Development (California), and demonstrates the method with five Hupa lesson plans. The method is based on three principles: that Native American students learn by doing, learn…
Lin, Jinyao; Li, Xia
2016-04-01
Zoning eco-protected areas is important for ecological conservation and environmental management. Rapid and continuous urban expansion, however, may exert negative effects on the performance of practical zoning designs. Various methods have been developed for protected area zoning, but most of them fail to consider the conflicts between urban development (for the benefit of land developers) and ecological protection (local government). Some real-world zoning schemes even have to be modified occasionally after lengthy negotiations between the government and land developers. Therefore, our study presents a game theory-based method to deal with this problem. Future urban expansion in the study area will be predicted by a logistic-regression cellular automaton, while eco-protected areas will be delimited using a multi-objective optimization algorithm. Then, two types of conflicts between them can be resolved based on game theory, a theory of decision-making. We established a two-person dynamic game for each conflict zone. The ecological compensation mechanism was taken into account by simulating the negotiation processes between the government and land developers. A final zoning scheme can be obtained when the two sides reach agreement. The proposed method is applied to eco-protected area zoning in Guangzhou, a fast-growing city in China. The experiments indicate that conflicts between ecological protection and urban development inevitably arise when only traditional zoning methods are used. Based on game theory, our method can effectively resolve those conflicts and provide a relatively reasonable zoning scheme. This method is expected to support policy-making in environmental management and urban planning. Copyright © 2015 Elsevier Ltd. All rights reserved.
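An illustrative sketch of resolving a single protection-versus-development conflict zone as a simple game with an ecological-compensation offer; the payoff values and acceptance rule are invented for demonstration, and the paper's dynamic game and negotiation process are richer than this.

def resolve_conflict_zone(developer_profit, ecological_value, max_compensation):
    # Government offers compensation up to the zone's ecological value (capped
    # by budget); the developer accepts if the offer covers forgone profit.
    offer = min(ecological_value, max_compensation)
    if offer >= developer_profit:
        return "protected", offer          # agreement: zone enters the reserve
    return "developed", 0.0                # no agreement: zone left to develop

print(resolve_conflict_zone(developer_profit=5.0, ecological_value=8.0,
                            max_compensation=6.0))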
Detection of heavy metal by paper-based microfluidics.
Lin, Yang; Gritsenko, Dmitry; Feng, Shaolong; Teh, Yi Chen; Lu, Xiaonan; Xu, Jie
2016-09-15
Heavy metal pollution has shown great threat to the environment and public health worldwide. Current methods for the detection of heavy metals require expensive instrumentation and laborious operation, which can only be accomplished in centralized laboratories. Various microfluidic paper-based analytical devices have been developed recently as simple, cheap and disposable alternatives to conventional ones for on-site detection of heavy metals. In this review, we first summarize current development of paper-based analytical devices and discuss the selection of paper substrates, methods of device fabrication, and relevant theories in these devices. We then compare and categorize recent reports on detection of heavy metals using paper-based microfluidic devices on the basis of various detection mechanisms, such as colorimetric, fluorescent, and electrochemical methods. To finalize, the future development and trend in this field are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
Bioanalytical method transfer considerations of chromatographic-based assays.
Williard, Clark V
2016-07-01
Bioanalysis is an important part of the modern drug development process. The business practice of outsourcing and transferring bioanalytical methods from laboratory to laboratory has increasingly become a crucial strategy for successful and efficient delivery of therapies to the market. This chapter discusses important considerations when transferring various types of chromatographic-based assays in today's pharmaceutical research and development environment.
Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.
2018-01-01
Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS-based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed 18O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718
Translating expert system rules into Ada code with validation and verification
NASA Technical Reports Server (NTRS)
Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam
1991-01-01
The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code, and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system into Ada code, and prototype test tools were demonstrated which detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code (by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module) are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software, which was used to test the resultant code, is also discussed. This testing was performed automatically using Monte Carlo techniques based upon a constraint-based description of the required performance for the system.
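As a rough illustration of the translation idea (not the project's actual rule representation or code generator), the sketch below mechanically turns IF-THEN rules into stand-alone Ada-like procedures; the rule format, `RULES`, and the emitted template are invented.

```python
# Illustrative sketch of rule-to-code translation: each IF-THEN rule is
# emitted as a stand-alone procedure updating a fact base. The rule format
# and the emitted Ada text are invented for illustration only.

RULES = [
    {"name": "R1", "if": ["Engine_Hot", "Oil_Low"], "then": "Shutdown"},
    {"name": "R2", "if": ["Shutdown"], "then": "Alert_Crew"},
]

ADA_TEMPLATE = """procedure {name} (Facts : in out Fact_Set) is
begin
   if {condition} then
      Assert (Facts, {conclusion});
   end if;
end {name};"""

def translate(rule):
    condition = " and ".join(f"Holds (Facts, {p})" for p in rule["if"])
    return ADA_TEMPLATE.format(name=rule["name"], condition=condition,
                               conclusion=rule["then"])

for rule in RULES:
    print(translate(rule), end="\n\n")
```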
Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi
2013-01-01
In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of the GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to internationally harmonized guidelines was performed with blind DNA samples containing LY038 at mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5%, based on the definition in the ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
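The arithmetic behind conversion-factor-based quantitation and the reproducibility statistic can be sketched as follows; the copy numbers, conversion factor and laboratory results are invented placeholders, not values from the collaborative trial.

```python
import statistics

# Conversion-factor-based GMO quantitation (all values invented):
# GMO % (w/w) = (event copies / reference copies) / Cf * 100, where Cf is
# the event-to-reference copy ratio measured in 100% GM material.

def gmo_percent(event_copies, reference_copies, conversion_factor):
    return (event_copies / reference_copies) / conversion_factor * 100.0

print(f"{gmo_percent(1200, 50000, 0.45):.2f} %GMO")

# Reproducibility RSD (RSD_R) across laboratories for one blind sample:
lab_results = [4.6, 5.3, 5.1, 4.4, 5.8]   # measured GMO % at the 5.0% level
rsd_r = statistics.stdev(lab_results) / statistics.mean(lab_results) * 100
print(f"RSD_R = {rsd_r:.1f} %")
```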
MLACP: machine-learning-based prediction of anticancer peptides
Manavalan, Balachandran; Basith, Shaherin; Shin, Tae Hwan; Choi, Sun; Kim, Myeong Ok; Lee, Gwang
2017-01-01
Cancer is the second leading cause of death globally, and use of therapeutic peptides to target and kill cancer cells has received considerable attention in recent years. Identification of anticancer peptides (ACPs) through wet-lab experimentation is expensive and often time consuming; therefore, development of an efficient computational method is essential to identify potential ACP candidates prior to in vitro experimentation. In this study, we developed support vector machine- and random forest-based machine-learning methods for the prediction of ACPs using the features calculated from the amino acid sequence, including amino acid composition, dipeptide composition, atomic composition, and physicochemical properties. We trained our methods using the Tyagi-B dataset and determined the machine parameters by 10-fold cross-validation. Furthermore, we evaluated the performance of our methods on two benchmarking datasets, with our results showing that the random forest-based method outperformed the existing methods with an average accuracy and Matthews correlation coefficient value of 88.7% and 0.78, respectively. To assist the scientific community, we also developed a publicly accessible web server at www.thegleelab.org/MLACP.html. PMID:29100375
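A minimal sketch of this kind of pipeline, amino acid composition features plus a random forest, is shown below using scikit-learn. The peptide sequences and labels are toy placeholders, not the Tyagi-B dataset, and only one of the paper's four feature groups is computed.

```python
# Sketch of an AAC + random forest pipeline with scikit-learn. Sequences
# and labels are toy placeholders, not the Tyagi-B dataset; only the amino
# acid composition feature group is computed here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aac(seq):
    """Fraction of each of the 20 standard amino acids in the sequence."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

peptides = ["FAKKLAKKLKKLAKKLAK", "KWKLFKKIEKVGQNIRDGIIKAGPAVAVVGQATQIAK",
            "ACDEFGHIKLMNPQRSTVWY", "GSGSGSGSGSGSGSGSGS"]
labels = np.array([1, 1, 0, 0])            # 1 = anticancer, 0 = non-ACP (toy)

X = np.vstack([aac(p) for p in peptides])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, labels, cv=2).mean())  # paper used 10-fold CV
```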
Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches
NASA Technical Reports Server (NTRS)
Farassat, Fereidoun; Casper, Jay H.
2006-01-01
In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need further development for this method to become useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
Umesh P. Agarwal; Richard S. Reiner; Sally A. Ralph
2010-01-01
Two new methods based on FT-Raman spectroscopy, one simple, based on band intensity ratio, and the other using a partial least squares (PLS) regression model, are proposed to determine cellulose I crystallinity. In the simple method, crystallinity in cellulose I samples was determined based on univariate regression that was first developed using the Raman band...
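The simple band-intensity-ratio method reduces to a univariate calibration, sketched below with invented data (the band ratios and reference crystallinity values are placeholders, not the paper's measurements).

```python
# Univariate calibration for the band-intensity-ratio method: regress known
# crystallinity against a Raman band intensity ratio, then predict new
# samples. All calibration numbers are invented placeholders.
import numpy as np

ratio = np.array([0.8, 1.1, 1.5, 1.9, 2.4])          # I_band1 / I_band2
crystallinity = np.array([35., 46., 58., 70., 83.])  # % by reference method

slope, intercept = np.polyfit(ratio, crystallinity, 1)

def predict(r):
    return slope * r + intercept

print(f"estimated crystallinity at ratio 1.7: {predict(1.7):.1f} %")
```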
Vanadium based materials as electrode materials for high performance supercapacitors
NASA Astrophysics Data System (ADS)
Yan, Yan; Li, Bing; Guo, Wei; Pang, Huan; Xue, Huaiguo
2016-10-01
As a kind of supercapacitor, pseudocapacitors have attracted wide attention in recent years. The capacitance of electrochemical capacitors based on pseudocapacitance arises mainly from redox reactions between electrolytes and active materials. These materials usually have several oxidation states available for oxidation and reduction. Many research teams have focused on the development of alternative materials for electrochemical capacitors. Many transition metal oxides have been shown to be suitable as electrode materials for electrochemical capacitors. Among them, vanadium-based materials are being developed for this purpose. Vanadium-based materials are known as some of the best active materials for high power/energy density electrochemical capacitors due to their outstanding specific capacitance, long cycle life, high conductivity and good electrochemical reversibility. Different kinds of synthetic methods, such as the sol-gel method, hydrothermal/solvothermal method, template method, electrospinning method, atomic layer deposition, and electrodeposition, have been successfully applied to prepare vanadium-based electrode materials. In this review, we give an overall summary and evaluation of recent progress in the research of vanadium-based materials for electrochemical capacitors, including synthesis methods, the electrochemical performance of the electrode materials, and the devices.
Segurado, P; Caiola, N; Pont, D; Oliveira, J M; Delaigue, O; Ferreira, M T
2014-04-01
In this work we compare two Iberian and one pan-European fish-based methods to assess ecological quality in rivers: the Fish-based Index of Biotic Integrity for Portuguese Wadeable Streams (F-IBIP), the Mediterranean Index of Biotic Integrity (IBIMED) and the pan-European Fish Index (EFI+). The results presented herein were developed in the context of the 2nd phase of the Intercalibration Exercise (IC), as required by the Water Framework Directive (WFD). The IC is aimed at ensuring comparability of the quality boundaries among the different WFD assessment methods developed by the Member States for each biological quality element. Although the two national assessment methods were developed for very distinct regions of Iberia (Western and Eastern Iberian Peninsula), they share the same methodological background: both are type-specific and guild-based multimetric indices. EFI+ is a multimetric guild-based model, but it is site-specific and uses a predictive modelling approach. The three indices were computed for all sites included in the Iberian Intercalibration database to allow the direct comparison, by means of linear regressions, of the resulting three quality values per site. The quality boundary harmonization between the two Iberian methods was only possible through an indirect comparison between the two indices, using EFI+ as a common metric. The three indices were also shown to be responsive to a common set of human-induced pressures. This study highlights the need to develop general assessment methods adapted to wide geographical ranges with high species turnover to help intercalibrate assessment methods tailored to geographically more restricted regions. © 2013.
A Focusing Method in the Calibration Process of Image Sensors Based on IOFBs
Fernández, Pedro R.; Lázaro, José L.; Gardel, Alfredo; Cano, Ángel E.; Bravo, Ignacio
2010-01-01
A focusing procedure in the calibration process of image sensors based on Incoherent Optical Fiber Bundles (IOFBs) is described, using the information extracted from the fibers. This procedure differs from any other currently known focusing method due to the non-spatial in-out correspondence between fibers, which produces a natural codification of the image to transmit. Measuring focus is essential prior to carrying out calibration in order to guarantee accurate processing and decoding. Four algorithms have been developed to estimate the focus measure: two based on mean grey level, and the other two based on variance. In this paper, a few simple focus measures are defined and compared. Some experimental results concerning the focus measure and the accuracy of the developed methods are discussed in order to demonstrate their effectiveness. PMID:22315526
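A minimal sketch of the two families of focus measures described, one based on mean grey level and one on the variance of per-fiber grey levels, might look as follows; the synthetic "sharp" and "blurred" intensity data are invented for illustration.

```python
# Two simple focus measures over per-fiber grey levels: one based on mean
# grey level, one on variance. A well-focused bundle image shows higher
# contrast between fibers, hence higher variance. Data below are synthetic.
import numpy as np

def focus_mean(levels):
    return float(np.mean(levels))

def focus_variance(levels):
    return float(np.var(levels))

rng = np.random.default_rng(0)
sharp = rng.integers(0, 256, 1000).astype(float)  # high-contrast fibers
blurred = 0.2 * sharp + 102.0                     # defocus flattens contrast

print(focus_variance(sharp), focus_variance(blurred))   # sharp >> blurred
print(focus_mean(sharp), focus_mean(blurred))           # means stay similar
```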
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor-Pashow, K.; Fondeur, F.; White, T.
Savannah River National Laboratory (SRNL) was tasked with identifying and developing at least one, but preferably two, methods for quantifying the suppressor in the Next Generation Solvent (NGS) system. The suppressor is a guanidine derivative, N,N',N"-tris(3,7-dimethyloctyl)guanidine (TiDG). A list of 10 possible methods was generated, and screening experiments were performed for 8 of the 10 methods. After completion of the screening experiments, the non-aqueous acid-base titration was determined to be the most promising and was selected for further development as the primary method. 1H NMR also showed promising results in the screening experiments, and this method was selected for further development as the secondary method. Other methods, including 36Cl radiocounting and ion chromatography, also showed promise; however, due to the similarity to the primary method (titration) and the inability to differentiate between TiDG and TOA (tri-n-octylamine) in the blended solvent, 1H NMR was selected over these methods. Analysis of radioactive samples obtained from real waste ESS (extraction, scrub, strip) testing using the titration method showed good results. Based on these results, the titration method was selected as the method of choice for TiDG measurement. 1H NMR has been selected as the secondary (back-up) method, and additional work is planned to further develop this method and to verify it using radioactive samples. Procedures for analyzing radioactive samples of both pure NGS and blended solvent were developed and issued for both methods.
ERIC Educational Resources Information Center
Akerson, Valarie L.; Carter, Ingrid S.; Park Rogers, Meredith A.; Pongsanon, Khemmawadee
2018-01-01
In this mixed methods study, the researchers developed a video-based measure called a "Prediction Assessment" to determine preservice elementary teachers' abilities to predict students' scientific reasoning. The instrument is based on teachers' need to develop pedagogical content knowledge for teaching science. Developing a knowledge…
Culturally Based Intervention Development: The Case of Latino Families Dealing with Schizophrenia
ERIC Educational Resources Information Center
Barrio, Concepcion; Yamada, Ann-Marie
2010-01-01
Objectives: This article describes the process of developing a culturally based family intervention for Spanish-speaking Latino families with a relative diagnosed with schizophrenia. Method: Our iterative intervention development process was guided by a cultural exchange framework and based on findings from an ethnographic study. We piloted this…
A Method for Cognitive Task Analysis
1992-07-01
A method for cognitive task analysis is described based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the...model for applied areas such as the development of knowledge-based systems and training, are discussed. Problem solving, Cognitive Task Analysis, Knowledge, Strategies.
Venkateswarlu, Kambham; Rangareddy, Ardhgeri; Narasimhaiah, Kanaka; Sharma, Hemraj; Bandi, Naga Mallikarjuna Raja
2017-01-01
The main objective of the present study was to develop an RP-HPLC method for the estimation of Armodafinil in pharmaceutical dosage forms and to characterize its base hydrolytic products. The method was developed for Armodafinil estimation, and the base hydrolytic products were characterized. The separation was carried out on a C18 column using a mobile phase of water and methanol (45:55 %v/v). Eluents were detected at 220 nm at a flow rate of 1 ml/min. Stress studies were performed under milder conditions followed by stronger conditions so as to obtain sufficient degradation, around 20%. A total of five degradation products were detected and separated from the analyte. The linearity of the proposed method was investigated in the range of 20-120 µg/ml for Armodafinil. The detection limit and quantification limit were found to be 0.01183 µg/ml and 0.035 µg/ml, respectively. The precision (%RSD) was found to be less than 2%, and the recovery was between 98-102%. Armodafinil was found to be more sensitive to base hydrolysis and yielded its carboxylic acid as the degradant. The developed method was a stability-indicating assay, suitable for quantifying Armodafinil in the presence of possible degradants. The drug was sensitive to acid, base and photolytic stress and resistant to thermal and oxidative stress.
NASA Astrophysics Data System (ADS)
Ronglian, Yuan; Mingye, Ai; Qiaona, Jia; Yuxuan, Liu
2018-03-01
Sustainable development is the only way forward for human society. As an important part of the national economy, the steel industry is energy-intensive and needs to go further toward sustainable development. In this paper, we use the entropy method and the TOPSIS method to evaluate the development of China's steel industry during the "12th Five-Year Plan" from four aspects: resource utilization efficiency, main energy and material consumption, pollution status and resource reuse rate. We also put forward some suggestions for the development of China's steel industry.
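For readers unfamiliar with the combination, a compact sketch of entropy weighting followed by TOPSIS ranking is given below. The indicator matrix is invented (rows standing for the five plan years, columns for benefit-type indicators); it is not the paper's data.

```python
# Entropy weighting + TOPSIS on an invented indicator matrix (rows = the
# five plan years, columns = benefit-type indicators). Not the paper's data.
import numpy as np

X = np.array([[0.62, 0.55, 0.40, 0.30],
              [0.66, 0.58, 0.45, 0.36],
              [0.70, 0.60, 0.52, 0.41],
              [0.73, 0.66, 0.55, 0.47],
              [0.78, 0.70, 0.61, 0.52]])

P = X / X.sum(axis=0)                              # column-wise proportions
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))  # entropy per indicator
w = (1 - E) / (1 - E).sum()                        # entropy weights

V = w * X / np.linalg.norm(X, axis=0)              # weighted normalized matrix
best, worst = V.max(axis=0), V.min(axis=0)         # ideal / anti-ideal points
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)
closeness = d_worst / (d_best + d_worst)           # TOPSIS score per year
print(np.round(closeness, 3))                      # later years score higher
```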
EPA Scientists Develop Research Methods for Studying Mold Fact Sheet
In 2002, U.S. Environmental Protection Agency researchers developed a DNA-based Mold Specific Quantitative Polymerase Chain Reaction method (MSQPCR) for identifying and quantifying over 100 common molds and fungi.
NASA Astrophysics Data System (ADS)
Kleinhans, Ilse; Van Rooy, J. Louis
2016-05-01
A sound understanding of the various factors influencing and associated with the formation of sinkholes or subsidences on dolomite land is essential for the selection of appropriate rehabilitation methods. The investigation and rehabilitation of numerous sinkholes and subsidences located on dolomite in the East Rand of South Africa created an opportunity to develop a broad-based understanding of different karst environments, their susceptibility to sinkhole and subsidence formation, and best-practice rehabilitation methods. This paper is based on the guidelines developed whereby the geological model of the sinkhole or subsidence is used to recommend an appropriate rehabilitation method. Nine typical geological models with recommended rehabilitation methods are presented in this paper.
Development of an ELA-DRA gene typing method based on pyrosequencing technology.
Díaz, S; Echeverría, M G; It, V; Posik, D M; Rogberg-Muñoz, A; Pena, N L; Peral-García, P; Vega-Pla, J L; Giovambattista, G
2008-11-01
The polymorphism of the equine lymphocyte antigen (ELA) class II DRA gene has been detected by polymerase chain reaction-single-strand conformational polymorphism (PCR-SSCP) and reference strand-mediated conformation analysis. These methodologies allowed the identification of 11 ELA-DRA exon 2 sequences, three of which are widely distributed among domestic horse breeds. Herein, we describe the development of a pyrosequencing-based method applicable to ELA-DRA typing, by screening samples from eight different horse breeds previously typed by PCR-SSCP. This sequence-based method would be useful for high-throughput genotyping of major histocompatibility complex genes in horses and other animal species, making this system interesting as a rapid screening method for animal genotyping of immune-related genes.
Use of focused ultrasonication in activity-based profiling of deubiquitinating enzymes in tissue.
Nanduri, Bindu; Shack, Leslie A; Rai, Aswathy N; Epperson, William B; Baumgartner, Wes; Schmidt, Ty B; Edelmann, Mariola J
2016-12-15
To develop a reproducible tissue lysis method that retains enzyme function for activity-based protein profiling, we compared four different methods to obtain protein extracts from bovine lung tissue: focused ultrasonication, standard sonication, the mortar & pestle method, and homogenization combined with standard sonication. Focused ultrasonication and the mortar & pestle method were sufficiently effective for activity-based profiling of deubiquitinases in tissue, and focused ultrasonication also had the fastest processing time. We used the focused ultrasonicator for subsequent activity-based proteomic analysis of deubiquitinases to test the compatibility of this method with sample preparation for activity-based chemical proteomics. Copyright © 2016 Elsevier Inc. All rights reserved.
The Simulation of the Recharging Method Based on Solar Radiation for an Implantable Biosensor.
Li, Yun; Song, Yong; Kong, Xianyue; Li, Maoyuan; Zhao, Yufei; Hao, Qun; Gao, Tianxin
2016-09-10
A method of recharging implantable biosensors based on solar radiation is proposed. Firstly, the models of the proposed method are developed. Secondly, the recharging processes based on solar radiation are simulated using the Monte Carlo (MC) method, and the energy distributions of sunlight within the different layers of human skin are obtained and discussed. Finally, the simulation results are verified experimentally, which indicates that the proposed method will contribute to achieving a low-cost, convenient and safe way of recharging implantable biosensors.
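A toy version of the layered Monte Carlo simulation might look like the sketch below; the layer list and per-layer absorption probabilities are invented placeholders, not the optical constants of human skin used in the study.

```python
# Toy Monte Carlo sketch: photons step through skin layers and are absorbed
# with a per-layer probability. Layer names and probabilities are invented
# placeholders, not the optical constants used in the study.
import random

LAYERS = ["epidermis", "dermis", "subcutis (implant depth)"]
ABSORB_P = [0.35, 0.45, 1.0]     # absorption probability per layer (toy)

def simulate(n_photons=100_000, seed=1):
    random.seed(seed)
    deposited = [0] * len(LAYERS)
    for _ in range(n_photons):
        for i, p in enumerate(ABSORB_P):
            if random.random() < p:          # photon absorbed in layer i
                deposited[i] += 1
                break
    return [d / n_photons for d in deposited]

for name, frac in zip(LAYERS, simulate()):
    print(f"{name:28s} {frac:.3f}")
```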
USDA-ARS?s Scientific Manuscript database
Objective: To develop and evaluate a method for calculating the Healthy Eating Index-2005 (HEI-2005) with the widely used Nutrition Data System for Research (NDSR) based on the method developed for use with the US Department of Agriculture’s (USDA) Food and Nutrient Dietary Data System (FNDDS) and M...
ERIC Educational Resources Information Center
Pinheiro, Sandro O.; Rohrer, Jonathan D.; Heimann, C. F. Larry
This paper describes a mixed method evaluation study that was developed to assess faculty teaching behavior change in a faculty development fellowship program for community-based hospital faculty. Principles of adult learning were taught to faculty participants over the fellowship period. These included instruction in teaching methods, group…
Kim, Huiyong; Hwang, Sung June; Lee, Kwang Soon
2015-02-03
Among various CO2 capture processes, the aqueous amine-based absorption process is considered the most promising for near-term deployment. However, the performance evaluation of newly developed solvents still requires complex and time-consuming procedures, such as pilot plant tests or the development of a rigorous simulator. The absence of accurate and simple methods for calculating energy performance at an early stage of process development has lengthened, and increased the expense of, the development of economically feasible CO2 capture processes. In this paper, a novel but simple method to reliably calculate the regeneration energy in a standard amine-based carbon capture process is proposed. Careful examination of stripper behavior and exploitation of energy balance equations around the stripper allow the regeneration energy to be calculated using only vapor-liquid equilibrium and caloric data. The reliability of the proposed method was confirmed by comparison with rigorous simulations for two well-known solvents, monoethanolamine (MEA) and piperazine (PZ). The proposed method can predict the regeneration energy at various operating conditions with greater simplicity, greater speed, and higher accuracy than methods proposed in previous studies. This enables faster and more precise screening of various solvents and faster optimization of process variables, and can ultimately accelerate the development of economically deployable CO2 capture processes.
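The proposed method itself rests on stripper energy balances with VLE and caloric data; as context, the sketch below shows only the textbook decomposition of regeneration duty into sensible, vaporization and desorption terms, with rough MEA-like placeholder numbers rather than the paper's model.

```python
# Back-of-envelope decomposition of regeneration duty per kg CO2:
# q_reg = q_sensible + q_vaporization + q_desorption. Numbers are rough
# MEA-like placeholders, not the paper's VLE-based calculation.

def regeneration_energy(lean_loading, rich_loading, amine_wt=0.30,
                        dT=10.0, cp=3.8, dh_abs=1900.0, q_vap=900.0):
    """Energies in kJ per kg CO2 captured; cp in kJ/(kg K)."""
    d_loading = rich_loading - lean_loading        # mol CO2 per mol amine
    # Solvent circulated per kg CO2 grows as the loading swing shrinks
    # (61.08 g/mol is the molar mass of MEA, 44.0 g/mol that of CO2).
    solvent_per_co2 = (61.08 / amine_wt) / (44.0 * d_loading)
    q_sensible = solvent_per_co2 * cp * dT
    return q_sensible + q_vap + dh_abs

print(f"{regeneration_energy(0.25, 0.50) / 1000:.2f} MJ per kg CO2")
```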
A Novel Quantum Dots-Based Point of Care Test for Syphilis
NASA Astrophysics Data System (ADS)
Yang, Hao; Li, Ding; He, Rong; Guo, Qin; Wang, Kan; Zhang, Xueqing; Huang, Peng; Cui, Daxiang
2010-05-01
One-step lateral flow tests are recommended as the first-line screening for syphilis in primary healthcare settings in developing countries. However, they generally show low sensitivity. We describe here the development of a novel fluorescent POC (Point Of Care) test method to be used for syphilis screening. The method was designed to combine the rapidness of the lateral flow test and the sensitivity of fluorescent methods. 50 syphilis-positive specimens and 50 healthy specimens, confirmed by Treponema pallidum particle agglutination (TPPA), were tested with Quantum Dot-labeled and colloidal gold-labeled lateral flow test strips, respectively. The results showed that both the sensitivity and specificity of the quantum dot-based method reached 100% (95% confidence interval [CI], 91-100%), while those of the colloidal gold-based method were 82% (95% CI, 68-91%) and 100% (95% CI, 91-100%), respectively. In addition, the naked-eye detection limit of the quantum dot-based method reached 2 ng/ml of anti-TP47 polyclonal antibodies purified by affinity chromatography with TP47 antigen, tenfold better than that of the colloidal gold-based method. In conclusion, quantum dots were found to be suitable labels for lateral flow test strips. Their ease of use, sensitivity and low cost make them well suited for population-based on-site syphilis screening.
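The reported diagnostic figures can be reproduced from the raw counts; the sketch below uses exact Clopper-Pearson binomial intervals, which is an assumption, since the paper does not state its CI method.

```python
# Sensitivity/specificity with exact Clopper-Pearson 95% CIs; the choice of
# Clopper-Pearson is an assumption (the paper does not state its CI method).
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

for label, k, n in [("QD sensitivity", 50, 50),
                    ("QD specificity", 50, 50),
                    ("gold sensitivity", 41, 50)]:   # 82% of 50 positives
    lo, hi = clopper_pearson(k, n)
    print(f"{label}: {k / n:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```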
Development, history, and future of automated cell counters.
Green, Ralph; Wachsmann-Hogiu, Sebastian
2015-03-01
Modern automated hematology instruments use either optical methods (light scatter), impedance-based methods based on the Coulter principle (changes in electrical current induced by blood cells flowing through an electrically charged opening), or a combination of both optical and impedance-based methods. Progressive improvement in these instruments has allowed the enumeration and evaluation of blood cells with great accuracy, precision, and speed at very low cost. Future directions of hematology instrumentation include the addition of new parameters and the development of point-of-care instrumentation. In the future, in-vivo analysis of blood cells may allow noninvasive and near-continuous measurements. Copyright © 2015 Elsevier Inc. All rights reserved.
Task-based statistical image reconstruction for high-quality cone-beam CT
NASA Astrophysics Data System (ADS)
Dang, Hao; Webster Stayman, J.; Xu, Jennifer; Zbijewski, Wojciech; Sisniega, Alejandro; Mow, Michael; Wang, Xiaohui; Foos, David H.; Aygun, Nafi; Koliatsos, Vassilis E.; Siewerdsen, Jeffrey H.
2017-11-01
Task-based analysis of medical imaging performance underlies many ongoing efforts in the development of new imaging systems. In statistical image reconstruction, regularization is often formulated in terms that encourage smoothness and/or sharpness (e.g. a linear, quadratic, or Huber penalty) but without explicit formulation of the task. We propose an alternative regularization approach in which a spatially varying penalty is determined that maximizes task-based imaging performance at every location in a 3D image. We apply the method to model-based image reconstruction (MBIR; viz., penalized weighted least-squares, PWLS) in cone-beam CT (CBCT) of the head, focusing on the task of detecting a small, low-contrast intracranial hemorrhage (ICH), and we test the performance of the algorithm in the context of a recently developed CBCT prototype for point-of-care imaging of brain injury. Theoretical predictions of local spatial resolution and noise are computed via an optimization by which regularization (specifically, the quadratic penalty strength) is allowed to vary throughout the image to maximize the local task-based detectability index (d′). Simulation studies and test-bench experiments were performed using an anthropomorphic head phantom. Three PWLS implementations were tested: a conventional (constant) penalty; a certainty-based penalty derived to enforce a constant point-spread function (PSF); and the task-based penalty derived to maximize local detectability at each location. Conventional (constant) regularization exhibited a fairly strong degree of spatial variation in d′, and the certainty-based method achieved uniform PSF, but each exhibited a reduction in detectability compared to the task-based method, which improved detectability up to ~15%. The improvement was strongest in areas of high attenuation (skull base), where the conventional and certainty-based methods tended to over-smooth the data. The task-driven reconstruction method presents a promising regularization method in MBIR by explicitly incorporating task-based imaging performance as the objective. The results demonstrate improved ICH conspicuity and support the development of high-quality CBCT systems.
Research and Development of Web-Based Virtual Online Classroom
ERIC Educational Resources Information Center
Yang, Zongkai; Liu, Qingtang
2007-01-01
Building a web-based virtual learning environment depends on information technologies and concerns the technologies and theories that support learning methods. A web-based virtual online classroom is designed and developed based on learning theories and streaming media technologies. It is composed of two parts: instructional communicating environment…
An advanced analysis method of initial orbit determination with too short arc data
NASA Astrophysics Data System (ADS)
Li, Binzhe; Fang, Li
2018-02-01
This paper studies initial orbit determination (IOD) based on space-based angle measurements. Commonly, these space-based observations have short durations. As a result, classical initial orbit determination algorithms, such as the Laplace and Gauss methods, give poor results. In this paper, an advanced analysis method for initial orbit determination is developed for space-based observations. The admissible region and triangulation are introduced in the method. A genetic algorithm is also used to add constraints on the parameters. Simulation results show that the algorithm can successfully complete initial orbit determination.
Hardcastle, Thomas J
2016-01-15
High-throughput data are now commonplace in biological research. Rapidly changing technologies and applications mean that novel methods for detecting differential behaviour that account for a 'large P, small n' setting are required at an increasing rate. The development of such methods is, in general, done on an ad hoc basis, requiring further development cycles and leading to a lack of standardization between analyses. We present here a generalized method for identifying differential behaviour within high-throughput biological data through empirical Bayesian methods. This approach is based on our baySeq algorithm for identification of differential expression in RNA-seq data based on a negative binomial distribution, and in paired data based on a beta-binomial distribution. Here we show how the same empirical Bayesian approach can be applied to any parametric distribution, removing the need for lengthy development of novel methods for differently distributed data. Comparisons with existing methods developed to address specific problems in high-throughput biological data show that these generic methods can achieve equivalent or better performance. A number of enhancements to the basic algorithm are also presented to increase flexibility and reduce computational costs. The methods are implemented in the R baySeq (v2) package, available on Bioconductor http://www.bioconductor.org/packages/release/bioc/html/baySeq.html. tjh48@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Mcclellan, James H.; Ravichandran, Lakshminarayan; Tridandapani, Srini
2013-01-01
Two novel methods for detecting cardiac quiescent phases from B-mode echocardiography using a correlation-based frame-to-frame deviation measure were developed. Accurate knowledge of cardiac quiescence is crucial to the performance of many imaging modalities, including computed tomography coronary angiography (CTCA). Synchronous electrocardiography (ECG) and echocardiography data were obtained from 10 healthy human subjects (four male, six female, 23–45 years) and the interventricular septum (IVS) was observed using the apical four-chamber echocardiographic view. The velocity of the IVS was derived from active contour tracking and verified using tissue Doppler imaging echocardiography methods. In turn, the frame-to-frame deviation methods for identifying quiescence of the IVS were verified using active contour tracking. The timing of the diastolic quiescent phase was found to exhibit both inter- and intra-subject variability, suggesting that the current method of CTCA gating based on the ECG is suboptimal and that gating based on signals derived from cardiac motion are likely more accurate in predicting quiescence for cardiac imaging. Two robust and efficient methods for identifying cardiac quiescent phases from B-mode echocardiographic data were developed and verified. The methods presented in this paper will be used to develop new CTCA gating techniques and quantify the resulting potential improvement in CTCA image quality. PMID:26609501
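A minimal sketch of a correlation-based frame-to-frame deviation measure of this kind is shown below; the synthetic frames and the quiescence threshold are invented, and the actual methods operate on the tracked interventricular septum rather than whole frames.

```python
# Correlation-based frame-to-frame deviation: d(t) = 1 - corr(f_t, f_{t+1})
# over a region of interest; quiescence is flagged where d(t) stays below a
# threshold. Frames and threshold are synthetic placeholders.
import numpy as np

def frame_deviation(frames):
    """frames: (T, H, W) array; returns the T-1 successive deviations."""
    flat = frames.reshape(len(frames), -1)
    return np.array([1.0 - np.corrcoef(flat[t], flat[t + 1])[0, 1]
                     for t in range(len(frames) - 1)])

rng = np.random.default_rng(0)
base = rng.random((32, 32))
moving = [base + 0.5 * rng.random((32, 32)) for _ in range(5)]   # motion-like
still = [base + 0.01 * rng.random((32, 32)) for _ in range(5)]   # quiescent

dev = frame_deviation(np.array(moving + still))
print(np.round(dev, 3))
print("quiescent:", dev < 0.02)   # last frames fall below the threshold
```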
Pathway analysis with next-generation sequencing data.
Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao
2015-04-01
Although pathway analysis methods have been developed and successfully applied to association studies of common variants, statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators have observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of current methods are mainly due to their inability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic based on smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. Through intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct type 1 error rates. The power of the SFPCA-based statistic and 22 additional existing statistics is also evaluated. We found that the SFPCA-based statistic has much higher power than the other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.
NHEERL is conducting a demonstration project to develop tools and approaches for assessing the risks of multiple stressors to populations of piscivorous wildlife, leading to the development of risk-based criteria. Specifically, we are developing methods and approaches to assess...
Reddy, M Rami; Singh, U C; Erion, Mark D
2004-05-26
Free-energy perturbation (FEP) is considered the most accurate computational method for calculating relative solvation and binding free-energy differences. Despite some success in applying FEP methods to both drug design and lead optimization, FEP calculations are rarely used in the pharmaceutical industry. One factor limiting the use of FEP is its low throughput, which is attributed in part to the dependence of conventional methods on the user's ability to develop accurate molecular mechanics (MM) force field parameters for individual drug candidates and the time required to complete the process. In an attempt to find an FEP method that could eventually be automated, we developed a method that uses quantum mechanics (QM) for treating the solute, MM for treating the solute surroundings, and the FEP method for computing free-energy differences. The thread technique was used in all transformations and proved to be essential for the successful completion of the calculations. Relative solvation free energies for 10 structurally diverse molecular pairs were calculated, and the results were in close agreement with both the calculated results generated by conventional FEP methods and the experimentally derived values. While considerably more CPU demanding than conventional FEP methods, this method (QM/MM-based FEP) alleviates the need for development of molecule-specific MM force field parameters and therefore may enable future automation of FEP-based calculations. Moreover, calculation accuracy should be improved over conventional methods, especially for calculations reliant on MM parameters derived in the absence of experimental data.
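For context, all FEP variants rest on Zwanzig's perturbation identity; in the QM/MM-based scheme described here, the potential energies are evaluated with the combined QM/MM Hamiltonian:

```latex
% Zwanzig's free-energy perturbation identity; the ensemble average is
% taken over configurations sampled on state A.
\Delta A_{A \to B} = -k_{\mathrm B}T \,
  \ln \left\langle \exp\!\left[-\frac{U_B(\mathbf r) - U_A(\mathbf r)}{k_{\mathrm B}T}\right] \right\rangle_{A}
```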
Many PCR-based methods for microbial source tracking (MST) have been developed and validated within individual research laboratories. Inter-laboratory validation of these methods, however, has been minimal, and the effects of protocol standardization regimes have not been thor...
Mallik, Rangan; Wa, Chunling; Hage, David S.
2008-01-01
Two techniques were developed for the immobilization of proteins and other ligands to silica through sulfhydryl groups. These methods made use of maleimide-activated silica (the SMCC method) or iodoacetyl-activated silica (the SIA method). The resulting supports were tested for use in high-performance affinity chromatography by employing human serum albumin (HSA) as a model protein. Studies with normal and iodoacetamide-modified HSA indicated that these methods had a high selectivity for sulfhydryl groups on this protein, which accounted for the coupling of 77–81% of this protein to maleimide- or iodoacetyl-activated silica. These supports were also evaluated in terms of their total protein content, binding capacity, specific activity, non-specific binding, stability and chiral selectivity for several test solutes. HSA columns prepared using maleimide-activated silica gave the best overall results for these properties when compared to HSA that had been immobilized to silica through the Schiff base method (i.e., an amine-based coupling technique). A key advantage of the supports developed in this work is that they offer the potential for greater site-selective immobilization and ligand activity than amine-based coupling methods. These features make these supports attractive for the development of protein columns for such applications as the study of biological interactions and chiral separations. PMID:17297940
Woodruff, Tracey J; Sutton, Patrice
2014-10-01
Synthesizing what is known about the environmental drivers of health is instrumental to taking prevention-oriented action. Methods of research synthesis commonly used in environmental health lag behind systematic review methods developed in the clinical sciences over the past 20 years. We sought to develop a proof of concept of the "Navigation Guide," a systematic and transparent method of research synthesis in environmental health. The Navigation Guide methodology builds on best practices in research synthesis in evidence-based medicine and environmental health. Key points of departure from current methods of expert-based narrative review prevalent in environmental health include a prespecified protocol, standardized and transparent documentation including expert judgment, a comprehensive search strategy, assessment of "risk of bias," and separation of the science from values and preferences. Key points of departure from evidence-based medicine include assigning a "moderate" quality rating to human observational studies and combining diverse evidence streams. The Navigation Guide methodology is a systematic and rigorous approach to research synthesis that has been developed to reduce bias and maximize transparency in the evaluation of environmental health information. Although novel aspects of the method will require further development and validation, our findings demonstrated that improved methods of research synthesis under development at the National Toxicology Program and under consideration by the U.S. Environmental Protection Agency are fully achievable. The institutionalization of robust methods of systematic and transparent review would provide a concrete mechanism for linking science to timely action to prevent harm.
ERIC Educational Resources Information Center
Zou, Junhua; Liu, Qingtang; Yang, Zongkai
2012-01-01
Based on Competence Motivation Theory (CMT), a Moodle course for schoolchildren's table tennis learning was developed (The URL is http://www.bssepp.com, and this course allows guest access). The effects of the course on students' knowledge, perceived competence and interest were evaluated through quantitative methods. The sample of the study…
ERIC Educational Resources Information Center
Jesness, Bradley
This paper examines concepts in information-processing theory which are likely to be relevant to development and characterizes the methods and data upon which the concepts are based. Among the concepts examined are those which have slight empirical grounds. Other concepts examined are those which seem to have empirical bases but which are…
Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika
2017-01-01
Computer work is associated with musculoskeletal disorders (MSDs). Several methods have been developed to assess computer-work risk factors related to MSDs. This review aims to give an overview of the techniques currently available for pen-and-paper-based observational methods for assessing the ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 to 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review was developed to assess the risk factors, reliability and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were posture, office components, force and repetition. Of the seven methods, only five had been tested for reliability; they were proven reliable and were rated moderate to good. For validity, only four of the seven methods were tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and office environment, at office workstations and in computer work. Although the most important factor in developing a tool is proper validation of the exposure assessment techniques, not all existing observational methods have been tested for reliability and validity.
An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...
Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J
2017-12-01
qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design, and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
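The core quantity is easy to compute; the sketch below forms efficiency-weighted Cq values and an unpaired log-scale comparison. The efficiencies, Cq values, gene pairing and replicate counts are invented placeholders.

```python
# Efficiency-weighted Cq values, w = log10(E) * Cq, kept in log scale for
# the statistics. Efficiencies, Cq values and replicate counts are invented.
import math
from statistics import mean

def weighted(E, Cq):
    return math.log10(E) * Cq

# (target Cq, reference Cq) per replicate; E assumed per-assay here.
control = [weighted(1.95, c) - weighted(1.98, r)
           for c, r in [(24.1, 18.0), (24.4, 18.2), (23.9, 17.9)]]
treated = [weighted(1.95, c) - weighted(1.98, r)
           for c, r in [(22.0, 18.1), (22.3, 18.0), (21.8, 18.2)]]

# Unpaired comparison stays in log scale; back-transform only at the end.
delta = mean(control) - mean(treated)      # log10 fold-change estimate
print(f"fold change ~ {10 ** delta:.2f}")
```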
Model-Based Method for Sensor Validation
NASA Technical Reports Server (NTRS)
Vatan, Farrokh
2012-01-01
Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
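A minimal sketch of the analytical-redundant-relations idea follows: each relation should hold among healthy sensors, and a sensor implicated only in violated relations is inferred faulty. The plant relations, readings and tolerance below are invented; the actual method derives the ARRs systematically from the system model.

```python
# Sketch of validation via analytical redundant relations (ARRs): every
# relation should hold among healthy sensors; a sensor appearing only in
# violated relations is inferred faulty. Relations/readings are invented.

READINGS = {"flow_in": 10.0, "flow_out": 6.0, "level_rate": 4.0, "temp": 300.0}

ARRS = [  # (sensors involved, residual function); residual ~ 0 when healthy
    ({"flow_in", "flow_out", "level_rate"},
     lambda r: r["flow_in"] - r["flow_out"] - r["level_rate"]),
    ({"flow_in", "temp"},
     lambda r: r["temp"] - (280.0 + 2.0 * r["flow_in"])),
]

def suspects(readings, tol=0.5):
    violated = [s for s, f in ARRS if abs(f(readings)) > tol]
    satisfied = [s for s, f in ARRS if abs(f(readings)) <= tol]
    cleared = set().union(*satisfied) if satisfied else set()
    return (set().union(*violated) - cleared) if violated else set()

print(suspects(READINGS))        # consistent readings -> set()
READINGS["temp"] = 350.0         # inject a temperature sensor fault
print(suspects(READINGS))        # -> {'temp'}
```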
Shidahara, Miho; Watabe, Hiroshi; Kim, Kyeong Min; Kato, Takashi; Kawatsu, Shoji; Kato, Rikio; Yoshimura, Kumiko; Iida, Hidehiro; Ito, Kengo
2005-10-01
An image-based scatter correction (IBSC) method was developed to convert scatter-uncorrected into scatter-corrected SPECT images. The purpose of this study was to validate this method by means of phantom simulations and human studies with 99mTc-labeled tracers, based on comparison with the conventional triple energy window (TEW) method. The IBSC method corrects scatter on the reconstructed image I_μb^AC with Chang's attenuation correction factor. The scatter component image is estimated by convolving I_μb^AC with a scatter function followed by multiplication with an image-based scatter fraction function. The IBSC method was evaluated with Monte Carlo simulations and 99mTc-ethyl cysteinate dimer SPECT human brain perfusion studies obtained from five volunteers. The image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were compared. Using data obtained from the simulations, the image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were found to be nearly identical for both gray and white matter. In human brain images, no significant differences in image contrast were observed between the IBSC and TEW methods. The IBSC method is a simple scatter correction technique feasible for use in clinical routine.
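Schematically, the correction estimates the scatter component by convolving the attenuation-corrected image with a scatter function, scaling by a scatter fraction, and subtracting. In the sketch below, a Gaussian kernel and a uniform scatter fraction stand in for the paper's scatter function and image-based scatter fraction function; both are simplifying assumptions.

```python
# Schematic of the IBSC idea: scatter ~ k * (I_AC convolved with a scatter
# kernel); corrected image = I_AC - scatter. The Gaussian kernel and the
# uniform scatter fraction k are simplifying stand-ins for the paper's
# scatter function and image-based scatter fraction function.
import numpy as np
from scipy.ndimage import gaussian_filter

def ibsc_correct(i_ac, sigma=8.0, scatter_fraction=0.3):
    scatter = scatter_fraction * gaussian_filter(i_ac, sigma)
    return np.clip(i_ac - scatter, 0.0, None)

phantom = np.zeros((64, 64))
phantom[20:44, 20:44] = 100.0            # uniform hot region
corrected = ibsc_correct(phantom)
print(phantom.mean(), corrected.mean())  # counts reduced by scatter estimate
```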
Uncertainty Modeling for Structural Control Analysis and Synthesis
NASA Technical Reports Server (NTRS)
Campbell, Mark E.; Crawley, Edward F.
1996-01-01
The development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment is studied. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. A ground based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix which aids in the development of the uncertainty model. Once ground based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. In addition, closed loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.
Thupayagale-Tshweneagae, Gloria
2011-12-01
The article describes a framework and the process for the development of a peer-based mental health support programme and its implementation. The development of the programme is based on Erikson's theory on the adolescent phase of development, psycho-educational processes, the peer approach and the orphaned adolescents' lived experiences as a conceptual framework. A triangulation of five qualitative methods, photography, reflective diaries, focus groups, event history calendars and field notes, was used to capture the lived experiences of adolescents orphaned by HIV and AIDS. Analysis of data followed Colaizzi's method of data analysis. The combination of psycho-education, Erikson's stages of development and peer support assisted the participants to gain knowledge and skills to overcome adversity and to become more resilient. The peer-based mental health support programme, if used, would enhance the mental health of adolescent orphans.
2001-10-25
Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for
Condition number estimation of preconditioned matrices.
Kushida, Noriyuki
2015-01-01
The present paper introduces a condition number estimation method for preconditioned matrices. The newly developed method provides reasonable results, while the conventional method, which is based on the Lanczos connection, gives meaningless results. The Lanczos connection based method provides the condition numbers of coefficient matrices of systems of linear equations using information obtained through the preconditioned conjugate gradient method. Estimating the condition number of preconditioned matrices is sometimes important when describing the effectiveness of new preconditioners or selecting adequate preconditioners. Operating a preconditioner on a coefficient matrix is the simplest method of estimation. However, this is not possible for large-scale computing, especially if computation is performed on distributed memory parallel computers, because the preconditioned matrices become dense even if the original matrices are sparse. Although the Lanczos connection method can be used to calculate the condition number of preconditioned matrices, it is not considered applicable to large-scale problems because of its weakness with respect to numerical errors. Therefore, we have developed a robust and parallelizable method based on Hager's method. Feasibility studies were carried out for the diagonal scaling preconditioner and the SSOR preconditioner with a diagonal matrix, a tridiagonal matrix and Pei's matrix. As a result, the Lanczos connection method contains around 10% error in its results even for a simple problem, whereas the new method contains negligible errors. In addition, the newly developed method returns reasonable solutions when the Lanczos connection method fails with Pei's matrix, and with matrices generated by the finite element method.
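Hager's method estimates the 1-norm of a matrix from a few products with the matrix and its transpose, so it can be applied to a preconditioned operator without ever forming it explicitly. The sketch below is a simplified single-vector version; the dense test matrix and the explicit inverse (standing in for applying the preconditioned solve) are for demonstration only.

```python
# Simplified single-vector Hager-style 1-norm estimator: needs only products
# with A and A^T, so it can be applied to a preconditioned operator without
# forming it. Dense matrix and explicit inverse are for demonstration only.
import numpy as np

def hager_norm1(matvec, rmatvec, n, max_iter=10):
    x = np.full(n, 1.0 / n)
    est = 0.0
    for _ in range(max_iter):
        y = matvec(x)
        est = np.abs(y).sum()          # current lower bound on ||A||_1
        z = rmatvec(np.sign(y))
        j = int(np.argmax(np.abs(z)))
        if np.abs(z[j]) <= z @ x:      # no better unit vector found
            break
        x = np.zeros(n)
        x[j] = 1.0
    return est

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)) + 10.0 * np.eye(50)
A_inv = np.linalg.inv(A)   # stands in for applying the preconditioned solve
kappa = hager_norm1(A.dot, A.T.dot, 50) * hager_norm1(A_inv.dot, A_inv.T.dot, 50)
print(kappa, np.linalg.cond(A, 1))   # estimate vs. exact condition number
```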
Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-09-20
These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6, along with covariance files for the nuclear data, to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.
The Use of Intervention Mapping to Develop a Tailored Web-Based Intervention, Condom-HIM
2017-01-01
Background: Many HIV (human immunodeficiency virus) prevention interventions are currently being implemented and evaluated, with little information published on their development. A framework highlighting the method of development of an intervention can be used by others wanting to replicate interventions or develop similar interventions to suit other contexts and settings. It provides researchers with a comprehensive development process of the intervention. Objective: The objective of this paper was to describe how a systematic approach, intervention mapping, was used to develop a tailored Web-based intervention to increase condom use among HIV-positive men who have sex with men. Methods: The intervention was developed in consultation with a multidisciplinary team composed of academic researchers, community members, Web designers, and the target population. Intervention mapping involved a systematic process of 6 steps: (1) needs assessment; (2) identification of proximal intervention objectives; (3) selection of theory-based intervention methods and practical strategies; (4) development of intervention components and materials; (5) adoption, implementation, and maintenance; and (6) evaluation planning. Results: The application of intervention mapping resulted in the development of a tailored Web-based intervention for HIV-positive men who have sex with men, called Condom-HIM. Conclusions: Using intervention mapping as a systematic process to develop interventions is a feasible approach that specifically integrates the use of theory and empirical findings. Outlining the process used to develop a particular intervention provides clarification on the conceptual use of experimental interventions in addition to potentially identifying reasons for intervention failures. PMID:28428162
Music Retrieval Based on the Relation between Color Association and Lyrics
NASA Astrophysics Data System (ADS)
Nakamura, Tetsuaki; Utsumi, Akira; Sakamoto, Maki
Various methods for music retrieval have been proposed. Recently, many researchers have been developing methods based on the relationship between music and feelings. In our previous psychological study, we found a significant correlation between the colors evoked by songs and the colors evoked by their lyrics alone, and showed that a music retrieval system using lyrics could be developed. In this paper, we focus on the relationship among music, lyrics, and colors, and propose a music retrieval method that uses colors as queries and analyzes lyrics. The method estimates the colors evoked by a song from the song's lyrics. In the first step, words associated with colors are extracted from the lyrics. We considered two extraction methods: in the first, words are extracted based on the result of a psychological experiment; in the second, words from corpora for Latent Semantic Analysis are extracted in addition to those obtained from the psychological experiment. In the second step, the colors evoked by the extracted words are compounded, and the compounded colors are regarded as those evoked by the song. In the last step, the query colors are compared with the colors estimated from the lyrics, and a list of songs is presented based on the similarities. We evaluated the two methods described above and found that the method based on both the psychological experiment and the corpora performed better than the method based on the psychological experiment alone. These results show that the proposed method, using colors as queries and analyzing lyrics, is effective for music retrieval.
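As a toy illustration of the pipeline (the paper's actual word-color norms and LSA corpora are not given in the abstract), the sketch below assumes a small hand-made word-to-RGB dictionary, compounds the colors of matched lyric words by averaging, and ranks songs by distance to the query color. All names and values are hypothetical.

```python
import numpy as np

# Hypothetical word -> RGB association norms (stand-in for the paper's
# psychological-experiment data; values are made up for illustration).
WORD_COLORS = {
    "sun": (255, 200, 40), "sea": (20, 90, 200), "rose": (220, 30, 80),
    "night": (10, 10, 60), "leaf": (40, 160, 60),
}

def song_color(lyrics):
    """Compound (average) the colors evoked by color-associated lyric words."""
    hits = [WORD_COLORS[w] for w in lyrics.lower().split() if w in WORD_COLORS]
    return np.mean(hits, axis=0) if hits else None

def retrieve(query_rgb, songs):
    """Rank songs by Euclidean distance between query color and lyric color."""
    scored = []
    for title, lyrics in songs.items():
        c = song_color(lyrics)
        if c is not None:
            scored.append((title, float(np.linalg.norm(c - np.asarray(query_rgb)))))
    return sorted(scored, key=lambda kv: kv[1])  # smaller distance = better

songs = {"A": "the sun and the sea", "B": "a rose in the night"}
print(retrieve((0, 0, 80), songs))  # dark-blue query ranks song B first
```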
Development of Speaking Skills through Activity Based Learning at the Elementary Level
ERIC Educational Resources Information Center
Ul-Haq, Zahoor; Khurram, Bushra Ahmed; Bangash, Arshad Khan
2017-01-01
Purpose: This paper discusses an effective instructional method called "activity based learning" that can be used to develop the speaking skills of students at the elementary school level. The present study was conducted to determine the effect of activity based learning on the development of the speaking skills of low and high achievers…
ERIC Educational Resources Information Center
Shultz, Ginger V.; Li, Ye
2016-01-01
Problem-based learning methods support student learning of content as well as scientific skills. In the course of problem-based learning, students seek outside information related to the problem, and therefore, information literacy skills are practiced when problem-based learning is used. This work describes a mixed-methods approach to investigate…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-15
... that is based on rigorous scientifically based research methods to assess the effectiveness of a...) Relies on measurements or observational methods that provide reliable and valid data across evaluators... of innovative, cohesive models that are based on research and have demonstrated that they effectively...
Improved regulatory element prediction based on tissue-specific local epigenomic signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Yupeng; Gorkin, David U.; Dickel, Diane E.
Accurate enhancer identification is critical for understanding the spatiotemporal transcriptional regulation during development as well as the functional impact of disease-related noncoding genetic variants. Computational methods have been developed to predict the genomic locations of active enhancers based on histone modifications, but the accuracy and resolution of these methods remain limited. Here, we present an algorithm, regulatory element prediction based on tissue-specific local epigenetic marks (REPTILE), which integrates histone modification and whole-genome cytosine DNA methylation profiles to identify the precise location of enhancers. We tested the ability of REPTILE to identify enhancers previously validated in reporter assays. Compared with existing methods, REPTILE shows consistently superior performance across diverse cell and tissue types, and the enhancer locations are significantly more refined. We show that, by incorporating base-resolution methylation data, REPTILE greatly improves upon current methods for annotation of enhancers across a variety of cell and tissue types.
Automated variance reduction for MCNP using deterministic methods.
Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B
2005-01-01
In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
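The core of this CADIS-style approach is that target particle weights are taken inversely proportional to the deterministic adjoint ("importance") flux, so particles heading toward important regions are split and unimportant ones are rouletted. Below is a minimal mesh-based sketch assuming a precomputed adjoint flux array and an illustrative window width; it is not the MCNP5/PARTISN implementation.

```python
import numpy as np

def weight_windows(adjoint_flux, source_cell, source_weight=1.0, width=5.0):
    """Mesh-based weight-window bounds from a deterministic adjoint flux.

    The target (center) weight in each cell is inversely proportional to the
    adjoint flux, normalized so the source cell gets the source weight.
    `width` is the ratio of the upper to the lower window bound.
    """
    importance = np.asarray(adjoint_flux, dtype=float)
    center = source_weight * importance[source_cell] / importance
    lower = 2.0 * center / (1.0 + width)   # window centered on the target weight
    return lower, width * lower            # (lower, upper) bounds per mesh cell

# Toy 1D deep-penetration problem: adjoint flux grows toward the detector,
# so weight windows shrink and particles are split as they penetrate deeper.
phi_adj = np.exp(np.linspace(0.0, 6.0, 10))
lo, hi = weight_windows(phi_adj, source_cell=0)
print(np.round(lo, 4))
```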
The dynamic micro computed tomography at SSRF
NASA Astrophysics Data System (ADS)
Chen, R.; Xu, L.; Du, G.; Deng, B.; Xie, H.; Xiao, T.
2018-05-01
Synchrotron radiation micro-computed tomography (SR-μCT) is a critical technique for quantitatively characterizing the 3D internal structure of samples. Recently, dynamic SR-μCT has attracted considerable attention since it can track the three-dimensional structural evolution of a sample. A dynamic μCT method based on a monochromatic beam was developed at the X-ray Imaging and Biomedical Application Beamline at the Shanghai Synchrotron Radiation Facility by combining a compressed-sensing-based CT reconstruction algorithm with hardware upgrades. The monochromatic-beam-based method can achieve quantitative information and a lower dose than the white-beam-based method, in which the lower-energy part of the beam is absorbed by the sample rather than contributing to the final imaging signal. The developed method was successfully used to investigate the compression of the air sac during respiration in a bell cricket, providing new knowledge for further research on the insect respiratory system.
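The dose reduction comes from reconstructing with far fewer projections than filtered back-projection needs. As a simple stand-in for the beamline's compressed-sensing algorithm (which the abstract does not detail), the sketch below runs scikit-image's iterative SART reconstruction on a sparsely sampled sinogram of a test phantom.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon_sart

image = shepp_logan_phantom()
theta = np.linspace(0.0, 180.0, 30, endpoint=False)  # only 30 views (sparse)
sinogram = radon(image, theta=theta)

# A few SART sweeps; each pass refines the previous estimate.
recon = iradon_sart(sinogram, theta=theta)
for _ in range(3):
    recon = iradon_sart(sinogram, theta=theta, image=recon)

err = np.sqrt(np.mean((recon - image) ** 2))
print(f"RMS error after 4 SART passes: {err:.4f}")
```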
NASA Astrophysics Data System (ADS)
Drwal, Malgorzata N.; Agama, Keli; Pommier, Yves; Griffith, Renate
2013-12-01
Purely structure-based pharmacophores (SBPs) are an alternative method to ligand-based approaches and have the advantage of describing the entire interaction capability of a binding pocket. Here, we present the development of SBPs for topoisomerase I, an anticancer target with an unusual ligand binding pocket consisting of protein and DNA atoms. Different approaches to cluster and select pharmacophore features are investigated, including hierarchical clustering and energy calculations. In addition, the performance of SBPs is evaluated retrospectively and compared to the performance of ligand- and complex-based pharmacophores. SBPs emerge as a valid method in virtual screening and a complementary approach to ligand-focussed methods. The study further reveals that the choice of pharmacophore feature clustering and selection methods has a large impact on the virtual screening hit lists. A prospective application of the SBPs in virtual screening reveals that they can be used successfully to identify novel topoisomerase inhibitors.
Single Wall Carbon Nanotube Alignment Mechanisms for Non-Destructive Evaluation
NASA Technical Reports Server (NTRS)
Hong, Seunghun
2002-01-01
As proposed in our original proposal, we developed an innovative method to assemble millions of single-wall carbon nanotube (SWCNT)-based circuit components as fast as conventional microfabrication processes. This method is based on a surface-template assembly strategy. The new method removes one of the major bottlenecks in carbon-nanotube-based electrical applications and, potentially, may allow mass production of large numbers of SWCNT-based integrated devices of critical interest to NASA.
Treuer, H; Hoevels, M; Luyken, K; Gierich, A; Kocher, M; Müller, R P; Sturm, V
2000-08-01
We have developed a densitometric method for measuring the isocentric accuracy and the accuracy of marking the isocentre position for linear accelerator based radiosurgery with circular collimators and room lasers. Isocentric shots are used to determine the accuracy of marking the isocentre position with room lasers and star shots are used to determine the wobble of the gantry and table rotation movement, the effect of gantry sag, the stereotactic collimator alignment, and the minimal distance between gantry and table rotation axes. Since the method is based on densitometric measurements, beam spot stability is implicitly tested. The method developed is also suitable for quality assurance and has proved to be useful in optimizing isocentric accuracy. The method is simple to perform and only requires a film box and film scanner for instrumentation. Thus, the method has the potential to become widely available and may therefore be useful in standardizing the description of linear accelerator based radiosurgical systems.
NASA Astrophysics Data System (ADS)
Miner, Nadine Elizabeth
1998-09-01
This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies conducted provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually-realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for validation of any sound synthesis technique.
Experimental cocrystal screening and solution based scale-up cocrystallization methods.
Malamatari, Maria; Ross, Steven A; Douroumis, Dennis; Velaga, Sitaram P
2017-08-01
Cocrystals are crystalline single-phase materials composed of two or more different molecular and/or ionic compounds, generally in a stoichiometric ratio, which are neither solvates nor simple salts. If one of the components is an active pharmaceutical ingredient (API), the term pharmaceutical cocrystal is often used. There is growing interest among drug development scientists in exploring cocrystals as a means to address physicochemical, biopharmaceutical, and mechanical properties and to expand the solid form diversity of an API. Conventionally, coformers are selected based on crystal engineering principles, and equimolar mixtures of API and coformer are subjected to the solution-based crystallizations commonly employed in polymorph and salt screening. However, the availability of new knowledge on cocrystal phase behaviour in the solid state and in solution has spurred the development and implementation of more rational experimental cocrystal screening as well as scale-up methods. This review aims to provide an overview of commonly employed solid form screening techniques in drug development, with an emphasis on cocrystal screening methodologies. The latest developments in the understanding and use of cocrystal phase diagrams in both screening and solution-based scale-up methods are also presented. The final section is devoted to reviewing state-of-the-art research covering solution-based scale-up cocrystallization processes for different cocrystals, as well as more recent continuous crystallization methods. Copyright © 2017 Elsevier B.V. All rights reserved.
Method for determination of the frequency-contrast characteristics of electronic-optic systems
NASA Astrophysics Data System (ADS)
Mardirossian, Garo; Zhekov, Zhivko
The frequency-contrast characteristic is an important criterion for judging the quality of electronic-optic systems, which find increasing application in space research, astronomy, military applications, and other fields. The paper provides a brief description of the methods for determining the frequency-contrast characteristics of optic systems developed at the Space Research Institute of the Bulgarian Academy of Sciences. The suggested methods have been used to develop several electronic-optic systems incorporated in ground-based and aerospace scientific-research equipment. Based on the obtained practical results, it was concluded that the methods yield sufficiently precise data, which coincide well with results obtained using other methods.
Sawin, Kathleen J; Weiss, Marianne E; Johnson, Norah; Gralton, Karen; Malin, Shelly; Klingbeil, Carol; Lerret, Stacee M; Thompson, Jamie J; Zimmanck, Kim; Kaul, Molly; Schiffman, Rachel F
2017-03-01
Parents of hospitalized children, especially parents of children with complex and chronic health conditions, report not being adequately prepared for self-management of their child's care at home after discharge. No theory-based discharge intervention exists to guide pediatric nurses' preparation of parents for discharge. To develop a theory-based conversation guide to optimize nurses' preparation of parents for discharge and self-management of their child at home following hospitalization. Two frameworks and one method influenced the development of the intervention: the Individual and Family Self-Management Theory, Tanner's Model of Clinical Judgment, and the Teach-Back method. A team of nurse scientists, nursing leaders, nurse administrators, and clinical nurses developed and field tested the electronic version of a nine-domain conversation guide for use in acute care pediatric hospitals. The theory-based intervention operationalized self-management concepts, added components of nursing clinical judgment, and integrated the Teach-Back method. Development of a theory-based intervention, the translation of theoretical knowledge to clinical innovation, is an important step toward testing the effectiveness of the theory in guiding clinical practice. Clinical nurses will establish the practice relevance through future use and refinement of the intervention. © 2017 Sigma Theta Tau International.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2010 CFR
2010-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Modeling of Continuum Manipulators Using Pythagorean Hodograph Curves.
Singh, Inderjeet; Amara, Yacine; Melingui, Achille; Mani Pathak, Pushparaj; Merzouki, Rochdi
2018-05-10
Research on continuum manipulators is increasingly developing in the context of bionic robotics because of their many advantages over conventional rigid manipulators. Due to their soft structure, they have inherent flexibility, which makes controlling them with high performance a major challenge. Before elaborating a control strategy for such robots, it is essential to first reconstruct the behavior of the robot through the development of an approximate behavioral model. This model can be kinematic or dynamic depending on the operating conditions of the robot itself. Kinematically, two types of modeling methods exist to describe the robot behavior: quantitative methods, which describe a model-based approach, and qualitative methods, which describe a learning-based approach. In kinematic modeling of continuum manipulators, the assumption of constant curvature is often made to simplify the model formulation. In this work, a quantitative modeling method is proposed based on Pythagorean hodograph (PH) curves. The aim is to obtain a three-dimensional reconstruction of the shape of the continuum manipulator with variable curvature, allowing the calculation of its inverse kinematic model (IKM). The PH-based kinematic modeling of continuum manipulators shows considerable performance with respect to position accuracy, shape reconstruction, and time/cost of the model calculation compared with other kinematic modeling methods, for two cases: free-load manipulation and variable-load manipulation. This modeling method is applied to the compact bionic handling assistant (CBHA) manipulator for validation. The results are compared with other IKMs developed for the CBHA manipulator.
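What makes PH curves attractive for shape reconstruction is that their parametric speed, and hence arc length, is an exact polynomial rather than an integral that must be approximated. A minimal planar sketch of that property, using a complex "preimage" polynomial whose square is the hodograph (generic PH-curve construction, not the paper's 3D CBHA model):

```python
import numpy as np

# Complex preimage polynomial w(t); squaring it gives a hodograph whose
# magnitude |r'(t)| = |w(t)|^2 is itself a polynomial (the PH property).
w = np.poly1d([0.5 + 0.8j, 1.0 + 0.2j])      # linear w(t) -> a PH cubic
rprime = w * w                                # hodograph r'(t) = w(t)^2
r = np.polyint(rprime)                        # curve r(t), with r(0) = 0

# Exact parametric speed: sigma(t) = w(t) * conj(w)(t) = |w(t)|^2 for real t.
wc = np.poly1d(np.conj(w.coeffs))
sigma = w * wc                                # polynomial speed
s = np.polyint(sigma)                         # closed-form arc length s(t)

t = np.linspace(0.0, 1.0, 5)
print("curve points (x + iy):", r(t))
print("exact arc length at t=1:", s(1.0).real)
```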
NASA Astrophysics Data System (ADS)
Wang, Le
2003-10-01
Modern forest management poses an increasing need for detailed knowledge of forest information at different spatial scales. At the forest level, information on tree species assemblages is desired, whereas at or below the stand level, information on individual trees is preferred. Remote sensing provides an effective tool to extract the above information at multiple spatial scales in the continuous time domain. To date, the increasing volume and ready availability of high-spatial-resolution data have led to much wider application of remotely sensed products. Nevertheless, to make effective use of the improving spatial resolution, conventional pixel-based classification methods are far from satisfactory. Correspondingly, developing object-based methods has become a central challenge for researchers in the field of remote sensing. This thesis focuses on the development of methods for accurate individual tree identification and tree species classification. We develop a method in which individual tree crown boundaries and treetop locations are derived under a unified framework. We apply a two-stage approach with edge detection followed by marker-controlled watershed segmentation. Treetops are modeled from both radiometric and geometric aspects. Specifically, treetops are assumed to be represented by local radiation maxima and to be located near the center of the tree crown. A marker image is created from the derived treetops to guide a watershed segmentation that further differentiates overlapping trees and produces a segmented image comprised of individual tree crowns. The image segmentation method developed achieves a promising result for a 256 x 256 CASI image. Further effort is then made to extend our methods to multiple scales constructed from a wavelet decomposition. Scale-consistency and geometric-consistency criteria are designed to examine the gradients along the scale space in order to separate true crown boundaries from unwanted textures caused by branches and twigs. After the inverse wavelet transform, tree crown boundaries are enhanced while the unwanted textures are suppressed. Based on the enhanced image, an improvement is achieved when applying the two-stage method to a high-resolution aerial photograph. To improve tree species classification, we develop a new method to choose the optimal scale parameter with the aid of the Bhattacharyya distance (BD), a well-known index of class separability in traditional pixel-based classification. The optimal scale parameter is then fed into a region-growing-based segmentation as a break-off value. Our object classification achieves better accuracy in separating tree species when compared to conventional Maximum Likelihood Classification (MLC). In summary, we develop two object-based methods for identifying individual trees and classifying tree species from high-spatial-resolution imagery. Both methods achieve promising results and will promote the integration of remote sensing and GIS in forest applications.
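The treetop-as-marker idea maps directly onto standard tools: local radiometric maxima seed a marker-controlled watershed that splits touching crowns. A minimal sketch with scikit-image, assuming a single-band brightness image where crowns appear as bright blobs; the parameters are illustrative, not the dissertation's.

```python
import numpy as np
from scipy import ndimage as ndi  # noqa: F401 (often handy alongside skimage)
from skimage.feature import peak_local_max
from skimage.filters import gaussian, sobel
from skimage.segmentation import watershed

def segment_crowns(band, min_distance=5):
    """Marker-controlled watershed: treetops (local maxima) seed the segments."""
    smooth = gaussian(band, sigma=2)                          # suppress twig texture
    tops = peak_local_max(smooth, min_distance=min_distance)  # treetop pixels
    markers = np.zeros_like(band, dtype=int)
    markers[tuple(tops.T)] = np.arange(1, len(tops) + 1)
    edges = sobel(smooth)                                     # edge-strength surface
    mask = smooth > smooth.mean()                             # rough canopy mask
    return watershed(edges, markers, mask=mask)               # one label per crown

# Synthetic image with two overlapping "crowns".
yy, xx = np.mgrid[0:64, 0:64]
img = (np.exp(-((yy - 30)**2 + (xx - 22)**2) / 60.0)
       + np.exp(-((yy - 34)**2 + (xx - 40)**2) / 60.0))
labels = segment_crowns(img)
print("crowns found:", labels.max())
```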
Determination of free polysaccharide in Vi glycoconjugate vaccine against typhoid fever.
Giannelli, C; Cappelletti, E; Di Benedetto, R; Pippi, F; Arcuri, M; Di Cioccio, V; Martin, L B; Saul, A; Micoli, F
2017-05-30
Glycoconjugate vaccines based on the Vi capsular polysaccharide, directed against Salmonella enterica serovar Typhi, are licensed or in development against typhoid fever, an important cause of morbidity and mortality in developing countries. Quantification of free polysaccharide in conjugate vaccines is an important quality control for release, to monitor vaccine stability, and to ensure an appropriate immune response. However, we found that existing size-based separation methods are not appropriate, as free Vi binds non-specifically to both unconjugated and conjugated protein. We developed a method based on free Vi separation with Capto Adhere resin and quantification by HPAEC-PAD. The method has been tested on conjugates of Vi derived from Citrobacter freundii with different carrier proteins, such as CRM197, tetanus toxoid, and diphtheria toxoid. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework for single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and decreased acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential of single-pixel imaging.
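In frame terms, each structured-light pattern is one analysis vector and the photodetector reading is one inner product; with an orthogonal pattern set, reconstruction is just the adjoint. A minimal sketch assuming noiseless measurements and Hadamard patterns (one common pattern choice, not the paper's specific frames):

```python
import numpy as np
from scipy.linalg import hadamard

n = 16                                  # image is n x n pixels
N = n * n
H = hadamard(N).astype(float)           # rows = structured-light patterns

rng = np.random.default_rng(0)
scene = rng.random((n, n))              # unknown scene
x = scene.ravel()

y = H @ x                               # one single-pixel reading per pattern
x_hat = (H.T @ y) / N                   # adjoint reconstruction (H orthogonal)

print("max reconstruction error:", np.abs(x_hat - x).max())
```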
Genetics-based methods for detection of Salmonella spp. in foods.
Mozola, Mark A
2006-01-01
Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.
Deurenberg, Rikie; Vlayen, Joan; Guillo, Sylvie; Oliver, Thomas K; Fervers, Beatrice; Burgers, Jako
2008-03-01
Effective literature searching is particularly important for clinical practice guideline development. Sophisticated searching and filtering mechanisms are needed to help ensure that all relevant research is reviewed. To assess the methods used for the selection of evidence for guideline development by evidence-based guideline development organizations, a semistructured questionnaire assessing the databases, search filters, and evaluation methods used for literature retrieval was distributed to eight major organizations involved in evidence-based guideline development. All of the organizations used search filters as part of guideline development. The MEDLINE database was the primary source accessed for literature retrieval. The OVID or SilverPlatter interfaces were used in preference to the freely accessible PubMed interface. The Cochrane Library, EMBASE, CINAHL, and PsycINFO databases were also frequently used by the organizations. All organizations reported the intention to improve and validate their filters for finding literature specifically relevant to guidelines. In this first international survey of its kind, eight major guideline development organizations indicated a strong interest in identifying, improving, and standardizing search filters to improve guideline development. It is to be hoped that this will result in the standardization of, and open access to, search filters, an improvement in literature searching outcomes, and greater collaboration among guideline development organizations.
Microscale Concentration Measurements Using Laser Light Scattering Methods
NASA Technical Reports Server (NTRS)
Niederhaus, Charles; Miller, Fletcher
2004-01-01
The development of lab-on-a-chip devices for microscale biochemical assays has led to the need for microscale concentration measurements of specific analytes. While fluorescence methods are the current choice, they require developing fluorophore-tagged conjugates for each analyte of interest. In addition, fluorescence imaging is a volume-based method and can be limiting as smaller detection regions are required.
Adaptive Modal Identification for Flutter Suppression Control
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.; Drew, Michael; Swei, Sean S.
2016-01-01
In this paper, we will develop an adaptive modal identification method for identifying the frequencies and damping of a flutter mode based on model-reference adaptive control (MRAC) and least-squares methods. The least-squares parameter estimation will achieve parameter convergence in the presence of persistent excitation whereas the MRAC parameter estimation does not guarantee parameter convergence. Two adaptive flutter suppression control approaches are developed: one based on MRAC and the other based on the least-squares method. The MRAC flutter suppression control is designed as an integral part of the parameter estimation where the feedback signal is used to estimate the modal information. On the other hand, the separation principle of control and estimation is applied to the least-squares method. The least-squares modal identification is used to perform parameter estimation.
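A least-squares modal identification of this kind can be sketched by fitting a second-order autoregressive model to the measured modal response and converting the estimated coefficients to a frequency and damping ratio. The recursive least-squares update below is a generic textbook form, assuming persistent excitation; it is not the flight-control implementation from the paper.

```python
import numpy as np

def rls_ar2(x, dt, lam=0.99):
    """Recursive least squares for x[k] = a1*x[k-1] + a2*x[k-2] (+ noise)."""
    theta = np.zeros(2)                  # [a1, a2]
    P = 1e4 * np.eye(2)                  # large initial covariance
    for k in range(2, len(x)):
        phi = np.array([x[k-1], x[k-2]])
        K = P @ phi / (lam + phi @ P @ phi)
        theta += K * (x[k] - phi @ theta)
        P = (P - np.outer(K, phi @ P)) / lam
    # Map discrete AR poles back to continuous-time frequency and damping.
    z = np.roots([1.0, -theta[0], -theta[1]])
    s = np.log(z[0]) / dt
    wn = abs(s)
    return wn / (2*np.pi), -s.real / wn  # (frequency [Hz], damping ratio)

# Test signal: a 2 Hz mode with 3% damping, sampled at 200 Hz.
dt, zeta, f = 0.005, 0.03, 2.0
t = np.arange(0.0, 10.0, dt)
wn = 2*np.pi*f
x = np.exp(-zeta*wn*t) * np.sin(wn*np.sqrt(1 - zeta**2)*t)
print(rls_ar2(x, dt))   # approximately (2.0, 0.03)
```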
An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.
Nicklas, Janice A; Buel, Eric
2005-09-01
The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR based method offers the possibility of a faster and more quantitative assay. Alu sequences are primate-specific and are found in many copies in the human genome, making these sequences an excellent target, or marker, for human DNA. This paper describes the development of a real-time Alu sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on time, and automated quantitation, as well as a large dynamic range (128 ng/microL to 0.5 pg/microL).
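Quantitation in real-time PCR assays of this type is typically read off a standard curve: the threshold cycle (Ct) is linear in the log of input DNA, and the slope indicates amplification efficiency. A generic sketch with made-up calibration values, not data from this assay:

```python
import numpy as np

# Hypothetical standard curve: serial dilutions (ng/uL) and measured Ct values.
conc = np.array([128, 12.8, 1.28, 0.128, 0.0128])
ct = np.array([18.1, 21.5, 24.9, 28.3, 31.7])

slope, intercept = np.polyfit(np.log10(conc), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1          # ~1.0 means 100% efficiency

def quantify(ct_unknown):
    """Interpolate an unknown's concentration from its Ct value."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
print(f"unknown at Ct 26.0 -> {quantify(26.0):.3f} ng/uL")
```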
A novel knowledge-based potential for RNA 3D structure evaluation
NASA Astrophysics Data System (ADS)
Yang, Yi; Gu, Qi; Zhang, Ben-Gong; Shi, Ya-Zhou; Shao, Zhi-Gang
2018-03-01
Ribonucleic acids (RNAs) play a vital role in biology, and knowledge of their three-dimensional (3D) structure is required to understand their biological functions. Structure prediction methods have recently been developed to address this issue, but most existing methods generate a series of candidate RNA 3D structures, so evaluation of the predicted structures is generally indispensable. Although several methods have been proposed to assess RNA 3D structures, the existing methods are not precise enough. In this work, a new all-atom knowledge-based potential is developed for more accurately evaluating RNA 3D structures. The potential not only includes local and nonlocal interactions but also fully considers the specificity of each RNA by introducing a retraining mechanism. On extensive test sets generated by independent methods, the proposed potential correctly distinguished the native state and ranked near-native conformations so as to effectively select the best. Furthermore, the proposed potential precisely captured RNA structural features such as base stacking and base pairing. Comparisons with existing potentials show that the proposed potential is very reliable and accurate in RNA 3D structure evaluation. Project supported by the National Science Foundation of China (Grant Nos. 11605125, 11105054, 11274124, and 11401448).
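Knowledge-based (statistical) potentials are commonly derived by inverse Boltzmann statistics: interaction energies come from comparing observed atom-pair distance distributions against a reference state. The sketch below shows that generic conversion on synthetic data; the bin edges and temperature factor are illustrative, and this is not the paper's specific retrained potential.

```python
import numpy as np

def inverse_boltzmann(obs_dist, ref_dist, bins, kT=0.6):
    """Pair potential E(r) = -kT * ln( P_obs(r) / P_ref(r) ) per distance bin."""
    p_obs, _ = np.histogram(obs_dist, bins=bins, density=True)
    p_ref, _ = np.histogram(ref_dist, bins=bins, density=True)
    eps = 1e-9                                  # avoid log(0) in sparse bins
    return -kT * np.log((p_obs + eps) / (p_ref + eps))

# Toy data: native-like contacts cluster near 6 A; the reference is diffuse.
rng = np.random.default_rng(1)
obs = rng.normal(6.0, 0.5, 5000)                # "observed" pair distances (A)
ref = rng.uniform(3.0, 15.0, 5000)              # reference-state distances
bins = np.linspace(3.0, 15.0, 25)

E = inverse_boltzmann(obs, ref, bins)
i = np.argmin(E)
print("minimum-energy bin center:", 0.5 * (bins[i] + bins[i + 1]))
```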
Jonnagaddala, Jitendra; Jue, Toni Rose; Chang, Nai-Wen; Dai, Hong-Jie
2016-01-01
The rapidly increasing biomedical literature calls for an automatic approach to the recognition and normalization of disease mentions, in order to increase the precision and effectiveness of disease-based information retrieval. A variety of methods have been proposed to deal with the problem of disease named entity recognition and normalization. Among them, conditional random fields (CRFs) and dictionary lookup are widely used for named entity recognition and normalization, respectively. We herein developed a CRF-based model to allow automated recognition of disease mentions, and studied the effect of various techniques in improving the normalization results based on the dictionary lookup approach. The dataset from the BioCreative V CDR track was used to report the performance of the developed normalization methods and to compare them with other existing dictionary-lookup-based normalization methods. The best configuration achieved an F-measure of 0.77 for disease normalization, outperforming the best dictionary-lookup-based baseline method studied in this work by an F-measure of 0.13. Database URL: https://github.com/TCRNBioinformatics/DiseaseExtract. © The Author(s) 2016. Published by Oxford University Press.
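A CRF disease-mention recognizer of this general shape can be built with the `sklearn-crfsuite` package, where each token is a feature dictionary and labels follow a BIO scheme. The features below are a small illustrative subset (the paper's feature set is richer), and the two training sentences are made up:

```python
import sklearn_crfsuite

def token_features(sent, i):
    """Minimal per-token features for BIO disease tagging (illustrative only)."""
    w = sent[i]
    return {
        "lower": w.lower(),
        "is_title": w.istitle(),
        "suffix3": w[-3:],
        "prev": sent[i-1].lower() if i > 0 else "<BOS>",
        "next": sent[i+1].lower() if i < len(sent) - 1 else "<EOS>",
    }

sents = [["Patients", "with", "Parkinson", "disease", "were", "enrolled"],
         ["No", "evidence", "of", "breast", "cancer", "recurrence"]]
labels = [["O", "O", "B-Disease", "I-Disease", "O", "O"],
          ["O", "O", "O", "B-Disease", "I-Disease", "O"]]

X = [[token_features(s, i) for i in range(len(s))] for s in sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, labels)

test = ["She", "developed", "Parkinson", "disease"]
print(crf.predict([[token_features(test, i) for i in range(len(test))]]))
```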
Application of Competency-Based Education in Laparoscopic Training
Xue, Dongbo; Bo, Hong; Zhao, Song; Meng, Xianzhi
2015-01-01
Background and Objectives: To introduce competency-based education/developing a curriculum in the training of postgraduate students in laparoscopic surgery. Methods: This study selected postgraduate students before the implementation of competency-based education (n = 16) or after the implementation of competency-based education (n = 17). On the basis of the 5 competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, and professionalism, the research team created a developing-a-curriculum chart and specific improvement measures that were implemented in the competency-based education group. Results: On the basis of the developing-a-curriculum chart, the assessment of the 5 comprehensive competencies using the 360° assessment method indicated that the competency-based education group's competencies were significantly improved compared with those of the traditional group (P < .05). The improvement in the comprehensive assessment was also significant compared with the traditional group (P < .05). Conclusion: The implementation of competency-based education/developing-a-curriculum teaching helps to improve the comprehensive competencies of postgraduate students and enables them to become qualified clinicians equipped to meet society's needs. PMID:25901105
Type Theory, Computation and Interactive Theorem Proving
2015-09-01
postdoc Cody Roux, to develop new methods of verifying real-valued inequalities automatically. They developed a prototype implementation in Python [8] (an... he has developed new heuristic, geometric methods of verifying real-valued inequalities. A Python-based implementation has performed surprisingly... express complex mathematical and computational assertions. In this project, Avigad and Harper developed type-theoretic algorithms and formalisms that
Parameter Studies, time-dependent simulations and design with automated Cartesian methods
NASA Technical Reports Server (NTRS)
Aftosmis, Michael
2005-01-01
Over the past decade, NASA has made a substantial investment in developing adaptive Cartesian grid methods for aerodynamic simulation. Cartesian-based methods played a key role in both the Space Shuttle Accident Investigation and in NASA's return-to-flight activities. The talk will provide an overview of recent technological developments, focusing on the generation of large-scale aerodynamic databases, automated CAD-based design, and time-dependent simulations with bodies in relative motion. Automation, scalability, and robustness underlie all of these applications, and research in each of these topics will be presented.
Method of moments comparison for soot population modeling in turbulent combustion
NASA Astrophysics Data System (ADS)
Chong, Shao Teng; Im, Hong; Raman, Venkat
2017-11-01
Representation of the soot population is an important component of efficient computational prediction of particulate emissions, and there are a number of moments-based techniques of varying numerical complexity. In the past, development of such methods has been carried out principally on canonical laminar and 0-D flows; their application in realistic solvers developed for turbulent combustion, however, may face challenges ranging from turbulence closure to the selection of moment sets. In this work, the accuracy and relative computational expense of several common soot methods of moments are tested in canonical turbulent flames for different configurations. Large eddy simulation (LES) will be used as the turbulence modeling framework. In grid-filtered LES, the interaction of numerical and modeling errors is a first-order problem that can undermine the accuracy of soot predictions. In the past, special moments-based methods have been developed for solvers that transport high-frequency-content fluid fields and are able to reconstruct the particle size distribution. Here, a similar analysis will be carried out for the moment-based soot modeling approaches above. Specifically, realizability of moment methods with nonlinear advection schemes will be discussed.
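Realizability here means that a transported moment set must remain consistent with some non-negative particle size distribution; a standard test is that the Hankel matrices built from the moments are positive semidefinite. A small sketch of that check (generic moment theory, not a specific soot solver):

```python
import numpy as np

def is_realizable(m):
    """Check moment-vector realizability via Hankel matrix eigenvalues.

    For moments m0..m_{2n} of a distribution on [0, inf), the sequence is
    realizable iff the Hankel matrix H = [m_{i+j}] and the shifted matrix
    H1 = [m_{i+j+1}] are both positive semidefinite (Stieltjes condition).
    """
    n = (len(m) - 1) // 2
    H = np.array([[m[i + j] for j in range(n + 1)] for i in range(n + 1)])
    H1 = np.array([[m[i + j + 1] for j in range(n)] for i in range(n)])
    tol = -1e-12
    return bool(np.linalg.eigvalsh(H).min() >= tol
                and np.linalg.eigvalsh(H1).min() >= tol)

# Moments of a lognormal size distribution are realizable...
x = np.random.default_rng(2).lognormal(0.0, 0.5, 100_000)
m = [np.mean(x**k) for k in range(5)]
print(is_realizable(m))        # True
# ...but corrupting one moment (as advection errors can) breaks the test.
m[2] *= 0.5
print(is_realizable(m))        # False
```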
NASA Astrophysics Data System (ADS)
Baumgartner, Matthew P.; Evans, David A.
2018-01-01
Two of the major ongoing challenges in computational drug discovery are predicting the binding pose and affinity of a compound to a protein. The Drug Design Data Resource Grand Challenge 2 was developed to address these problems and to drive development of new methods. The challenge provided the 2D structures of compounds for which the organizers held blinded data in the form of 35 X-ray crystal structures and 102 binding affinity measurements, and challenged participants to predict the binding pose and affinity of the compounds. We tested a number of pose prediction methods as part of the challenge; we found that docking methods that incorporate protein flexibility (Induced Fit Docking) outperformed methods that treated the protein as rigid. We also found that using binding pose metadynamics, a molecular dynamics based method, to score docked poses provided the best predictions of our methods, with an average RMSD of 2.01 Å. We tested both structure-based (e.g., docking) and ligand-based methods (e.g., QSAR) in the affinity prediction portion of the competition. We found that our structure-based methods based on docking with Smina (Spearman ρ = 0.614) performed slightly better than our ligand-based methods (ρ = 0.543) and had performance equivalent to the other top methods in the competition. Despite the overall good performance of our methods in comparison to other participants in the challenge, there remains significant room for improvement, especially in cases such as these where protein flexibility plays a large role.
Power System Transient Diagnostics Based on Novel Traveling Wave Detection
NASA Astrophysics Data System (ADS)
Hamidi, Reza Jalilzadeh
Modern electrical power systems demand novel diagnostic approaches that enhance system resiliency by improving on state-of-the-art algorithms. The proliferation of high-voltage optical transducers and high time-resolution measurements provides opportunities to develop novel diagnostic methods for very fast transients in power systems. At the same time, emerging complex configurations, such as multi-terminal hybrid transmission systems, limit the applications of traditional diagnostic methods, especially in fault location and health monitoring. Impedance-based fault-location methods are inefficient for cross-bonded cables, which are widely used to connect offshore wind farms to the main grid. Thus, this dissertation first presents a novel traveling-wave-based fault-location method for hybrid multi-terminal transmission systems. The proposed method utilizes time-synchronized, high-sampling-rate voltage measurements. The traveling wave arrival times (ATs) are detected by observing the squares of the wavelet transformation coefficients. Using the ATs, an over-determined set of linear equations is developed for noise reduction; the faulty segment is then determined based on the characteristics of the resulting equation set, and the fault location is estimated. The accuracy and capabilities of the proposed fault-location method are evaluated, and compared to the existing traveling-wave-based method, for a wide range of fault parameters. In order to improve power system stability, auto-reclosing (AR), single-phase auto-reclosing (SPAR), and adaptive single-phase auto-reclosing (ASPAR) methods have been developed, with the final objective of distinguishing between transient and permanent faults so that transient faults can be cleared without de-energizing the healthy phases. However, the features of electrical arcs (transient faults) are severely influenced by a number of random parameters, including the convection of the air and plasma, wind speed, air pressure, and humidity. Therefore, the dead time (the de-energization duration of the faulty phase) is unpredictable, and conservatively long dead times are usually chosen by protection engineers. However, if the exact arc extinction time is determined, power system stability and quality will improve. Therefore, a new method for detecting arc extinction times, leading to a new ASPAR method utilizing power line carrier (PLC) signals, is presented. The efficiency of the proposed ASPAR method is verified through simulations and compared with existing ASPAR methods. High-sampling-rate measurements are prone to being skewed by environmental noise and analog-to-digital (A/D) converter quantization errors. Therefore, noise-contaminated measurements are the major source of uncertainties and errors in the outcomes of traveling-wave-based diagnostic applications. The existing AT-detection methods do not provide enough sensitivity and selectivity at the same time. Therefore, a new AT-detection method based on the short-time matrix pencil method (STMPM) is developed to accurately detect the ATs of traveling waves with low signal-to-noise ratios (SNRs). As STMPM is based on matrix algebra, it is challenging to implement in microprocessor-based fault locators.
Hence, a fully recursive and computationally efficient AT-detection method based on the adaptive discrete Kalman filter (ADKF) is introduced, which is suitable for microprocessors and able to accomplish accurate AT detection for online applications such as ultra-high-speed protection. Both proposed AT-detection methods are evaluated through extensive simulation studies, and their superior outcomes are compared to those of existing methods.
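The AT-detection idea in the fault-location method, watching for a spike in the squared wavelet coefficients of the measured voltage, can be sketched with PyWavelets. The mother wavelet, scale, and threshold below are illustrative choices, not the dissertation's tuned detector:

```python
import numpy as np
import pywt

fs = 1_000_000                       # 1 MHz sampling rate
t = np.arange(0.0, 2e-3, 1.0/fs)

# Synthetic measurement: 60 Hz component, noise, and a sharp traveling-wave
# front arriving at t = 1 ms (decaying exponential step).
rng = np.random.default_rng(3)
v = np.sin(2*np.pi*60*t) + 0.02*rng.standard_normal(t.size)
arrival = int(1e-3 * fs)
v[arrival:] += 0.5*np.exp(-(t[arrival:] - 1e-3)/5e-5)

# Squared fine-scale CWT coefficients spike at the wavefront arrival.
coeffs, _ = pywt.cwt(v, scales=[2], wavelet="mexh")
energy = coeffs[0]**2
detected = np.argmax(energy > 50*np.median(energy))
print(f"true AT: {arrival/fs*1e3:.3f} ms, detected: {detected/fs*1e3:.3f} ms")
```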
ERIC Educational Resources Information Center
Paleeri, Sankaranarayanan
2015-01-01
Transaction methods and approaches to value education have to change from lecturing to process-based methods, in line with the development of the constructivist approach. Process-based methods allow creative interpretation and active participation on the students' part. Teachers have to organize suitable activities to transact values through process…
Several library-independent Microbial Source Tracking methods have been developed to rapidly determine the source of fecal contamination. Thus far, none of these methods has been tested in tropical marine waters. In this study, we used a Bacteroides 16S rDNA PCR-based...
CAE "FOCUS" for modelling and simulating electron optics systems: development and application
NASA Astrophysics Data System (ADS)
Trubitsyn, Andrey; Grachev, Evgeny; Gurov, Victor; Bochkov, Ilya; Bochkov, Victor
2017-02-01
Electron optics is a theoretical basis of scientific instrument engineering. Mathematical simulation of the occurring processes is the basis for the contemporary design of complicated electron-optic devices. Problems of numerical mathematical simulation are effectively solved by means of CAE systems. CAE "FOCUS", developed by the authors, includes fast and accurate methods: the boundary element method (BEM) for electric field calculation, the Runge-Kutta-Fehlberg method for charged-particle trajectory computation with control of the calculation accuracy, and original methods for finding the conditions of angular and time-of-flight focusing. CAE "FOCUS" is organized as a collection of modules, each of which solves an independent (sub)task. A range of physical and analytical devices, in particular a high-power microfocus X-ray tube, has been developed using this software.
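The trajectory-integration step can be illustrated with SciPy's adaptive RK45 integrator, an embedded Runge-Kutta pair of the same family as Runge-Kutta-Fehlberg, with built-in error control. The uniform field below is a toy stand-in for a BEM-computed field; this is not code from FOCUS.

```python
import numpy as np
from scipy.integrate import solve_ivp

QE, ME = -1.602e-19, 9.109e-31        # electron charge (C) and mass (kg)

def efield(x, y):
    """Toy uniform field along -y (stand-in for a BEM-computed field)."""
    return np.array([0.0, -1e4])      # V/m

def rhs(t, s):
    """State s = [x, y, vx, vy]; electric part of the Lorentz force only."""
    x, y, vx, vy = s
    ex, ey = efield(x, y)
    return [vx, vy, (QE/ME)*ex, (QE/ME)*ey]

s0 = [0.0, 0.0, 1e6, 0.0]             # launched at 1000 km/s along x
sol = solve_ivp(rhs, (0.0, 2e-9), s0, method="RK45", rtol=1e-8, atol=1e-12)
print(f"final position: x={sol.y[0,-1]*1e3:.3f} mm, y={sol.y[1,-1]*1e3:.3f} mm")
```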
Stevens, Katherine; Palfreyman, Simon
2012-12-01
To describe how qualitative methods can be used in the development of descriptive systems of preference-based measures (PBMs) of health-related quality of life. The requirements of the National Institute for Health and Clinical Excellence and other agencies together with the increasing use of patient-reported outcome measures has led to an increase in the demand for PBMs. Recently, interest has grown in developing new PBMs and while previous research on PBMs has mainly focused on the methods of valuation, research into the methods of developing descriptive systems is an emerging field. Traditionally, descriptive systems of PBMs were developed by using top-down methods, where content was derived from existing measures, the literature, or health surveys. A contrasting approach is a bottom-up methodology, which takes the views of patients or laypeople on how their life is affected by their health. This approach generally requires the use of qualitative methods. Qualitative methods lend themselves well to the development of PBMs. They also ensure that the measure has appropriate language, content validity, and responsiveness to change. While the use of qualitative methods in the development of non-PBMs is fairly standard, their use in developing PBMs was until recently nonexistent. In this article, we illustrate the use of qualitative methods by presenting two case studies of recently developed PBMs, one generic and one condition specific. We outline the stages involved, discuss the strengths and weaknesses of the approach, and compare with the top-down approach used in the majority of PBMs to date. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Parametric synthesis of a robust controller on the basis of a mathematical programming method
NASA Astrophysics Data System (ADS)
Khozhaev, I. V.; Gayvoronskiy, S. A.; Ezangina, T. A.
2018-05-01
This paper derives sufficient conditions linking root indices of robust control quality with the coefficients of an interval characteristic polynomial, on the basis of a mathematical programming method. On the basis of these conditions, a method was developed for synthesizing PI and PID controllers that provide an aperiodic transient process with an acceptable stability degree and, consequently, an acceptable settling time. The method was applied to the problem of synthesizing a controller for the depth control system of an unmanned underwater vehicle.
Qiao, Tian-Min; Zhang, Jing; Li, Shu-Jiang; Han, Shan; Zhu, Tian-Hui
2016-10-01
Eucalyptus dieback disease, caused by Cylindrocladium scoparium, has occurred in the last few years in large Eucalyptus planting areas in China and other countries. Rapid, simple, and reliable diagnostic techniques are desired for the early detection of Eucalyptus dieback caused by C. scoparium, prior to the formulation of an efficient control plan. For this purpose, three PCR-based methods, nested PCR, multiplex PCR, and loop-mediated isothermal amplification (LAMP), were developed in this study for the detection of C. scoparium based on the translation elongation factor 1-alpha (tef1) and beta-tubulin genes. All three methods were highly specific to C. scoparium. The sensitivities of the nested PCR and LAMP were much higher than that of the multiplex PCR, and the sensitivity of the multiplex PCR was in turn higher than that of regular PCR. C. scoparium could be detected within 60 min from infected Eucalyptus plants by LAMP, while at least 2 h was needed by the other two methods. Using different Eucalyptus tissues as samples for C. scoparium detection, all three PCR-based methods gave much better detection results than regular PCR. Based on the results of this study, we conclude that any of the three PCR-based methods could be used as a diagnostic technology for the development of efficient control strategies for Eucalyptus dieback disease. In particular, LAMP was the most practical method for field application because of its one-step, rapid reaction, simple operation, single-tube format, and simple visualization of amplification products.
On the Development of Multi-Step Inverse FEM with Shell Model
NASA Astrophysics Data System (ADS)
Huang, Y.; Du, R.
2005-08-01
The inverse or one-step finite element approach is increasingly used in the sheet metal stamping industry to predict strain distribution and the initial blank shape in the preliminary design stage. Based on the existing theory, there are two types of methods: one based on the principle of virtual work and the other based on the principle of extremum work. Much research has been conducted to improve the accuracy of simulation results. For example, based on the virtual work principle, Batoz et al. developed a new method using triangular DKT shell elements, in which the bending and unbending effects are considered. Based on the principle of extremum work, Majlessi et al. proposed a multi-step inverse approach with membrane elements and applied it to an axisymmetric part. Lee et al. presented an axisymmetric shell element model to solve a similar problem. In this paper, a new multi-step inverse method is introduced with no limitation on the workpiece shape. It is a shell element model based on the virtual work principle. The new method is validated by comparison with a commercial software system (PAMSTAMP®). The comparison results indicate that the accuracy is good.
NASA Astrophysics Data System (ADS)
Smoczek, Jaroslaw
2015-10-01
The paper deals with the problem of reducing residual vibration and limiting transient oscillations of a flexible and underactuated system subject to variations in operating conditions. A comparative study of generalized predictive control (GPC) and a fuzzy scheduling scheme, developed on the basis of the P1-TS fuzzy theory, the local pole placement method, and interval analysis of the closed-loop system polynomial coefficients, is addressed to the problem of flexible crane control. Two alternatives of the GPC-based method are proposed that enable this technique to be realized either with or without a payload deflection sensor. The first control technique is based on the recursive least squares (RLS) method applied to estimate on-line the parameters of a linear parameter varying (LPV) model of the crane dynamic system. The second GPC-based approach relies on payload deflection feedback estimated using a pendulum model with parameters interpolated by the P1-TS fuzzy system. The feasibility and applicability of the developed methods were confirmed through experimental verification performed on a laboratory-scale overhead crane.
Iranian Tentacles into Iraq: The Basis and Extent of Iranian Influence into Southern Iraq
2009-01-01
cultural values, based on their historical development. Tehran uses methods along the spectrum of psychological persuasion to influence and subvert the ...49 pages. Most of those who comment on Iran's attempt to influence Iraqi Shia do so without considering the historical and cultural connection...
Properties of a Formal Method to Model Emergence in Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.
Reducing data friction through site-based data curation
NASA Astrophysics Data System (ADS)
Thomer, A.; Palmer, C. L.
2017-12-01
Much of geoscience research takes place at "scientifically significant sites": localities which have attracted a critical mass of scientific interest, and thereby merit protection by government bodies, as well as the preservation of specimen and data collections and the development of site-specific permitting requirements for access to the site and its associated collections. However, many data standards and knowledge organization schemas do not adequately describe key characteristics of the sites, despite their centrality to research projects. Through work conducted as part of the IMLS-funded Site-Based Data Curation (SBDC) project, we developed a Minimum Information Framework (MIF) for site-based science, in which "information about a site's structure" is considered a core class of information. Here we present our empirically-derived information framework, as well as the methods used to create it. We believe these approaches will lead to the development of more effective data repositories and tools, and thereby will reduce "data friction" in interdisciplinary, yet site-based, geoscience workflows. The Minimum Information Framework for Site-based Research was developed through work at two scientifically significant sites: the hot springs at Yellowstone National Park, which are key to geobiology research; and the La Brea Tar Pits, an important paleontology locality in Southern California. We employed diverse methods of participatory engagement, in which key stakeholders at our sites (e.g. curators, collections managers, researchers, permit officers) were consulted through workshops, focus groups, interviews, action research methods, and collaborative information modeling and systems analysis. These participatory approaches were highly effective in fostering on-going partnership among a diverse team of domain scientists, information scientists, and software developers. The MIF developed in this work may be viewed as a "proto-standard" that can inform future repository development and data standards. Further, the approaches used to develop the MIF represent an important step toward systematic methods of developing geoscience data standards. Finally, we argue that organizing data around aspects of a site makes data collections more accessible to a range of scientific communities.
Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A
2015-01-01
Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed ¹⁸O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.
NASA Astrophysics Data System (ADS)
Danala, Gopichandh; Wang, Yunzhi; Thai, Theresa; Gunderson, Camille C.; Moxley, Katherine M.; Moore, Kathleen; Mannel, Robert S.; Cheng, Samuel; Liu, Hong; Zheng, Bin; Qiu, Yuchen
2017-02-01
Accurate tumor segmentation is a critical step in the development of computer-aided detection (CAD) based quantitative image analysis schemes for early stage prognostic evaluation of ovarian cancer patients. The purpose of this investigation is to assess the efficacy of several different methods to segment metastatic tumors occurring in different organs of ovarian cancer patients. In this study, we developed a segmentation scheme consisting of eight different algorithms, which can be divided into three groups: 1) region growth based methods; 2) Canny operator based methods; and 3) partial differential equation (PDE) based methods. A total of 138 tumors acquired from 30 ovarian cancer patients were used to test the performance of these eight segmentation algorithms. The results demonstrate that each of the tested tumors can be successfully segmented by at least one of the eight algorithms without manual boundary correction. Furthermore, the modified region growth, classical Canny detector, fast marching, and threshold level set algorithms are suggested for the future development of ovarian cancer related CAD schemes. This study may provide a meaningful reference for developing novel quantitative image feature analysis schemes to more accurately predict the response of ovarian cancer patients to chemotherapy at an early stage.
NASA Astrophysics Data System (ADS)
Zapata, D.; Salazar, M.; Chaves, B.; Keller, M.; Hoogenboom, G.
2015-12-01
Thermal time models have been used to predict the development of many different species, including grapevine (Vitis vinifera L.). These models normally assume that there is a linear relationship between temperature and plant development. The goal of this study was to estimate the base temperature and duration in terms of thermal time for predicting veraison for four grapevine cultivars. Historical phenological data for four cultivars that were collected in the Pacific Northwest were used to develop the thermal time model. Base temperatures (Tb) of 0 and 10 °C and the best estimated Tb using three different methods were evaluated for predicting veraison in grapevine. Thermal time requirements for each individual cultivar were evaluated through analysis of variance, and means were compared using Fisher's test. The methods that were applied to estimate Tb for the development of wine grapes included the least standard deviation in heat units, the regression coefficient, and the development rate method. The estimated Tb varied among methods and cultivars. The development rate method provided the lowest Tb values for all cultivars. For the three methods, Chardonnay had the lowest Tb, ranging from 8.7 to 10.7 °C, while the highest Tb values were obtained for Riesling and Cabernet Sauvignon with 11.8 and 12.8 °C, respectively. Thermal time also differed among cultivars, when either the fixed or estimated Tb was used. Predictions of the beginning of ripening with the estimated base temperature resulted in the lowest variation in real days when compared with predictions using Tb = 0 or 10 °C, regardless of the method that was used to estimate Tb.
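To make the thermal-time idea concrete, the sketch below shows the degree-day accumulation such models rest on: daily mean temperature in excess of a base temperature Tb is summed until a phenological target (here, veraison) is reached. This is a minimal illustration, not the authors' code; the daily temperatures and the Tb value are assumed for the example.

```python
# Minimal degree-day accumulation above a base temperature Tb.
# Daily temperatures and Tb below are illustrative assumptions.
import numpy as np

def thermal_time(daily_mean_temp, t_base):
    """Cumulative degree days above t_base; days below Tb contribute zero."""
    return np.cumsum(np.clip(np.asarray(daily_mean_temp) - t_base, 0.0, None))

temps = [12.0, 15.5, 9.0, 18.2, 20.1]      # daily mean air temperature (degC)
print(thermal_time(temps, t_base=10.0))     # -> [2.0, 7.5, 7.5, 15.7, 25.8]
```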
Focke, Felix; Haase, Ilka; Fischer, Markus
2011-01-26
Usually, spices are identified morphologically using simple methods like magnifying glasses or microscopic instruments. Molecular biological methods like the polymerase chain reaction (PCR), on the other hand, enable accurate and specific detection even in complex matrices. Generally, the origins of spices are plants with diverse genetic backgrounds and relationships, and the processing methods used for the production of spices are complex and individual. Consequently, the development of a reliable DNA-based method for spice analysis is a challenging undertaking. However, once established, such a method can easily be adapted to less difficult food matrices. In the current study, several alternative methods for the isolation of DNA from spices were developed and evaluated in detail with regard to (i) purity (photometric), (ii) yield (fluorimetric), and (iii) amplifiability (PCR). Whole genome amplification methods were used to preamplify isolates to improve the ratio between amplifiable DNA and inhibiting substances. Specific primer sets were designed, and the PCR conditions were optimized to detect 18 spices selectively. Assays of self-made spice mixtures were performed to prove the applicability of the developed methods.
Tugwell, Peter; Pottie, Kevin; Welch, Vivian; Ueffing, Erin; Chambers, Andrea; Feightner, John
2011-01-01
Background: This article describes the evidence review and guideline development method developed for the Clinical Preventive Guidelines for Immigrants and Refugees in Canada by the Canadian Collaboration for Immigrant and Refugee Health Guideline Committee. Methods: The Appraisal of Guidelines for Research and Evaluation (AGREE) best-practice framework was combined with the recently developed Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to produce evidence-based clinical guidelines for immigrants and refugees in Canada. Results: A systematic approach was designed to produce the evidence reviews and apply the GRADE approach, including building on evidence from previous systematic reviews, searching for and comparing evidence between general and specific immigrant populations, and applying the GRADE criteria for making recommendations. This method was used for priority health conditions that had been selected by practitioners caring for immigrants and refugees in Canada. Interpretation: This article outlines the 14-step method that was defined to standardize the guideline development process for each priority health condition. PMID:20573711
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches for higher level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
Lamers, Romy E D; Cuypers, Maarten; Garvelink, Mirjam M; de Vries, Marieke; Bosch, J L H Ruud; Kil, Paul J M
2016-07-01
To develop a web-based decision aid (DA) for the treatment of lower urinary tract symptoms due to benign prostatic hyperplasia (LUTS/BPH). From February to September 2014, we performed a four-stage development process: 1) a two-round Delphi consensus method among urologists; 2) identification of patients' needs and expectations; 3) development of the DA content and structure; 4) usability testing with LUTS/BPH patients. Stage 1 (N=15): Dutch urologists reached consensus on 61% of the statements concerning users' criteria, decision options, structure, and medical content. Stage 2 (N=24): consensus was reached on 69% of statements concerning the need for improved information provision, the need for DA development, and the requirement that the DA should clarify patients' preferences. Stage 3: the DA was developed based on the results from stages 1 and 2. Stage 4 (N=10): pros of the DA were clear information provision, systematic design, and being easy to read and re-read. A LUTS/BPH DA containing VCEs was developed in cooperation with urologists and patients following a structured four-stage method and was found to be well accepted. This method can be adopted for the development of DAs to support other medical decisions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
2014-01-01
Background The negative impact of musculoskeletal diseases on the physical function and quality of life of people living in developing countries is considerable. This disabling effect is even more marked in low-socioeconomic communities within developing countries. In Mexico, there is a need to create community-based rehabilitation programs for people living with musculoskeletal diseases in low-socioeconomic areas. These programs should be directed to prevent and decrease disability, accommodating the specific local culture of communities. Objective The objective of this paper is to describe a research protocol designed to develop, implement, and evaluate culturally sensitive community-based rehabilitation programs aiming to decrease disability of people living with musculoskeletal diseases in two low-income Mexican communities. Methods A community-based participatory research approach is proposed, including multi and transdisciplinary efforts among the community, medical anthropology, and the health sciences. The project is structured in 4 main stages: (1) situation analysis, (2) program development, (3) program implementation, and (4) program evaluation. Each stage includes the use of quantitative and qualitative methods (mixed method program). Results So far, we obtained resources from a Mexican federal agency and completed stage one of the project at Chankom, Yucatán. We are currently receiving funding from an international agency to complete stage two at this same location. We expect that the project at Chankom will be concluded by December of 2017. On the other hand, we just started the execution of stage one at Nuevo León with funding from a Mexican federal agency. We expect to conclude the project at this site by September of 2018. Conclusions Using a community-based participatory research approach and a mixed method program could result in the creation of culturally sensitive community-based rehabilitation programs that promote community development and decrease the disabling effects of musculoskeletal diseases within two low-income Mexican communities. PMID:25474820
Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong
2015-02-01
Integration of heterogeneous systems is key to hospital information construction because of the complexity of the healthcare environment. Currently, during healthcare information system integration projects, participants usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that utilizes Business Process Model and Notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.
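As a rough illustration of the transformation step described above, the sketch below reads task elements from a BPMN 2.0 XML file and emits configuration entries. The output schema (one dictionary per task, with an assumed 'message-route' default) is a hypothetical stand-in, since the abstract does not specify the executable configuration format.

```python
# Sketch: extract BPMN 2.0 tasks and map them to integration-config entries.
# The 'type' default and output schema are illustrative assumptions.
import xml.etree.ElementTree as ET

BPMN_NS = 'http://www.omg.org/spec/BPMN/20100524/MODEL'  # standard BPMN 2.0 namespace

def bpmn_to_config(path):
    root = ET.parse(path).getroot()
    config = []
    for task in root.iter('{%s}task' % BPMN_NS):
        config.append({
            'step': task.get('id'),
            'description': task.get('name', ''),
            'type': 'message-route',   # assumed default integration action
        })
    return config
```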
Xiang, Xiaowei; Shang, Bing; Wang, Xiaozheng; Chen, Qinhua
2017-04-01
Yohimbine is a novel compound for the treatment of erectile dysfunction derived from natural products, and pharmacokinetic study is important for its further development as a new medicine. In this work, we developed a novel PEEK tube-based solid-phase microextraction (SPME)-HPLC method for the analysis of yohimbine in plasma and further for pharmacokinetic study. Poly(AA-EGDMA) was synthesized inside a PEEK tube as the sorbent for microextraction of yohimbine, and parameters that could influence extraction efficiency were systematically investigated. Under optimum conditions, the PEEK tube-based SPME method exhibits excellent enrichment efficiency towards yohimbine. Using berberine as internal standard, an online SPME-HPLC method was developed for the analysis of yohimbine in human plasma samples. The method has a wide linear range (2-1000 ng/mL) with an R² of 0.9962; the limit of detection was as low as 0.1 ng/mL using UV detection. Finally, a pharmacokinetic study of yohimbine was carried out by the online SPME-HPLC method and the results were compared with those of reported methods. Copyright © 2016 John Wiley & Sons, Ltd.
The Systematic Development of an Internet-Based Smoking Cessation Intervention for Adults.
Dalum, Peter; Brandt, Caroline Lyng; Skov-Ettrup, Lise; Tolstrup, Janne; Kok, Gerjo
2016-07-01
Objectives The objective of this project was to determine whether intervention mapping is a suitable strategy for developing an Internet- and text message-based smoking cessation intervention. Method We used the Intervention Mapping framework for planning health promotion programs. After a needs assessment, we identified important changeable determinants of cessation behavior, specified objectives for the intervention, selected theoretical methods for meeting our objectives, and operationalized change methods into practical intervention strategies. Results We found that "social cognitive theory," the "transtheoretical model/stages of change," "self-regulation theory," and "appreciative inquiry" were relevant theories for smoking cessation interventions. From these theories, we selected modeling/behavioral journalism, feedback, planning coping responses/if-then statements, gain frame/positive imaging, consciousness-raising, helping relationships, stimulus control, and goal-setting as suitable methods for an Internet- and text-based adult smoking cessation program. Furthermore, we identified computer tailoring as a useful strategy for adapting the intervention to individual users. Conclusion The Intervention Mapping method, with a clear link between behavioral goals, theoretical methods, and practical strategies and materials, proved useful for systematic development of a digital smoking cessation intervention for adults. © 2016 Society for Public Health Education.
Human swallowing simulation based on videofluorography images using Hamiltonian MPS method
NASA Astrophysics Data System (ADS)
Kikuchi, Takahiro; Michiwaki, Yukihiro; Kamiya, Tetsu; Toyama, Yoshio; Tamai, Tasuku; Koshizuka, Seiichi
2015-09-01
In developed nations, swallowing disorders and aspiration pneumonia have become serious problems. We developed a method to simulate the behavior of the organs involved in swallowing to clarify the mechanisms of swallowing and aspiration. The shape model is based on anatomically realistic geometry, and the motion model utilizes forced displacements based on realistic dynamic images to reflect the mechanisms of human swallowing. The soft tissue organs are modeled as nonlinear elastic material using the Hamiltonian MPS method. This method allows for stable simulation of the complex swallowing movement. A penalty method using metaballs is employed to simulate contact between organ walls and smooth sliding along the walls. We performed four numerical simulations under different analysis conditions to represent four cases of swallowing, including a healthy volunteer and a patient with a swallowing disorder. The simulation results were compared to examine the epiglottic downfolding mechanism, which strongly influences the risk of aspiration.
NASA Astrophysics Data System (ADS)
Yang, Kun; Xu, Quan-li; Peng, Shuang-yun; Cao, Yan-bo
2008-10-01
Based on a necessity analysis of GIS applications in earthquake disaster prevention, this paper discusses in depth the spatial integration of urban earthquake disaster loss evaluation models and visualization technologies, using network development methods such as COM/DCOM, ActiveX, and ASP, as well as spatial database development methods such as OO4O and ArcSDE based on ArcGIS software packages. Meanwhile, following software engineering principles, a solution for urban earthquake emergency response decision support systems based on GIS technologies is proposed, covering the system's logical structure, technical routes, realization methods, and function structure. Finally, the user interfaces of the test system are presented.
The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...
Final Report for X-ray Diffraction Sample Preparation Method Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ely, T. M.; Meznarich, H. K.; Valero, T.
WRPS-1500790, “X-ray Diffraction Saltcake Sample Preparation Method Development Plan/Procedure,” was originally prepared with the intent of improving the specimen preparation methodology used to generate saltcake specimens suitable for XRD-based solid phase characterization. At the time that this test plan document was originally developed, packed powder in cavity supports with collodion binder was the established XRD specimen preparation method. An alternate specimen preparation method less vulnerable, if not completely invulnerable, to preferred orientation effects was desired as a replacement for that method.
Chen, J D; Sun, H L
1999-04-01
Objective. To dynamically assess and predict the reliability of equipment by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Material System Analysis Activity (AMSAA) model was developed, composed of the AMSAA model and test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conform to expectations. Conclusion. It is suggested that this method should be further researched and popularized.
An Object-Based Requirements Modeling Method.
ERIC Educational Resources Information Center
Cordes, David W.; Carver, Doris L.
1992-01-01
Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…
NASA Astrophysics Data System (ADS)
Al-Chalabi, Rifat M. Khalil
1997-09-01
Development of an improvement to the computational efficiency of the existing nested iterative solution strategy of the Nodal Expansion Method (NEM) nodal based neutron diffusion code NESTLE is presented. The improvement in the solution strategy is the result of developing a multilevel acceleration scheme that does not suffer from the numerical stalling associated with a number of iterative solution methods. The acceleration scheme is based on the multigrid method, which is specifically adapted for incorporation into the NEM nonlinear iterative strategy. This scheme optimizes the computational interplay between the spatial discretization and the NEM nonlinear iterative solution process through the use of the multigrid method. The combination of the NEM nodal method, calculation of the homogenized neutron nodal balance coefficients (i.e., restriction operator), an efficient underlying smoothing algorithm (the power method of NESTLE), and the finer mesh reconstruction algorithm (i.e., prolongation operator), all operating on a sequence of coarser spatial nodes, constitutes the multilevel acceleration scheme employed in this research. Two implementations of the multigrid method into the NESTLE code were examined: the Imbedded NEM Strategy and the Imbedded CMFD Strategy. The main difference in implementation between the two methods is that in the Imbedded NEM Strategy, the NEM solution is required at every MG level. Numerical tests have shown that the Imbedded NEM Strategy suffers from divergence at coarse-grid levels; hence all the results for the different benchmarks presented here were obtained using the Imbedded CMFD Strategy. The novelties in the developed MG method are as follows: the formulation of the restriction and prolongation operators, and the selection of the relaxation method. The restriction operator utilizes a variation of the reactor physics, consistent homogenization technique. The prolongation operator is based upon a variant of the pin power reconstruction methodology. The relaxation method, which is the power method, utilizes a constant coefficient matrix within the NEM nonlinear iterative strategy. The choice of the MG nesting within the nested iterative strategy enables the incorporation of other nonlinear effects with no additional coding effort. In addition, if an eigenvalue problem is being solved, it remains an eigenvalue problem at all grid levels, simplifying coding implementation. The merit of the developed MG method was tested by incorporating it into the NESTLE iterative solver and employing it to solve four different benchmark problems. In addition to the base cases, three different sensitivity studies were performed, examining the effects of the number of MG levels, the homogenized coupling coefficients correction (i.e., restriction operator), and the fine-mesh reconstruction algorithm (i.e., prolongation operator). The multilevel acceleration scheme developed in this research provides the foundation for developing adaptive multilevel acceleration methods for steady-state and transient NEM nodal neutron diffusion equations. (Abstract shortened by UMI.)
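The smoother/restriction/prolongation interplay described above is the generic multigrid pattern. The sketch below is a minimal two-grid correction cycle for a 1D Poisson problem, not the NESTLE implementation: the weighted-Jacobi smoother stands in for NESTLE's power-method relaxation, and the simple stencil-based transfer operators stand in for the homogenization-based restriction and pin-power-style prolongation.

```python
# Minimal two-grid cycle for -u'' = f on [0,1]; illustrative stand-in only.
import numpy as np

def smooth(u, f, h, sweeps=3, omega=2/3):
    """Weighted-Jacobi relaxation (stand-in for the power-method smoother)."""
    for _ in range(sweeps):
        u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h*h*f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2*u[1:-1] - u[:-2] - u[2:]) / (h*h)
    return r

def restrict(r):
    """Full weighting onto the coarse grid (restriction operator)."""
    rc = r[::2].copy()
    rc[1:-1] = 0.25*r[1:-2:2] + 0.5*r[2:-1:2] + 0.25*r[3::2]
    return rc

def prolong(ec):
    """Linear interpolation back to the fine grid (prolongation operator)."""
    ef = np.zeros(2*(len(ec) - 1) + 1)
    ef[::2] = ec
    ef[1::2] = 0.5*(ec[:-1] + ec[1:])
    return ef

def two_grid(u, f, h):
    u = smooth(u, f, h)                        # pre-smoothing
    rc = restrict(residual(u, f, h))           # restrict residual
    nc, hc = len(rc) - 1, 2*h
    A = (np.diag(2*np.ones(nc-1)) - np.diag(np.ones(nc-2), 1)
         - np.diag(np.ones(nc-2), -1)) / hc**2
    ec = np.zeros_like(rc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])    # exact coarse-grid solve
    return smooth(u + prolong(ec), f, h)       # correct, then post-smooth

n = 64; h = 1.0/n
x = np.linspace(0, 1, n+1)
f = np.pi**2 * np.sin(np.pi*x)                 # exact solution is sin(pi x)
u = np.zeros(n+1)
for _ in range(10):
    u = two_grid(u, f, h)
print(np.max(np.abs(u - np.sin(np.pi*x))))     # error near discretization level
```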
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Wiegers, Ann L
2003-07-01
Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.
In silico platform for predicting and initiating β-turns in a protein at desired locations.
Singh, Harinder; Singh, Sandeep; Raghava, Gajendra P S
2015-05-01
Numerous studies have been performed for analysis and prediction of β-turns in a protein. This study focuses on analyzing, predicting, and designing of β-turns to understand the preference of amino acids in β-turn formation. We analyzed around 20,000 PDB chains to understand the preference of residues or pair of residues at different positions in β-turns. Based on the results, a propensity-based method has been developed for predicting β-turns with an accuracy of 82%. We introduced a new approach entitled "Turn level prediction method," which predicts the complete β-turn rather than focusing on the residues in a β-turn. Finally, we developed BetaTPred3, a Random forest based method for predicting β-turns by utilizing various features of four residues present in β-turns. The BetaTPred3 achieved an accuracy of 79% with 0.51 MCC that is comparable or better than existing methods on BT426 dataset. Additionally, models were developed to predict β-turn types with better performance than other methods available in the literature. In order to improve the quality of prediction of turns, we developed prediction models on a large and latest dataset of 6376 nonredundant protein chains. Based on this study, a web server has been developed for prediction of β-turns and their types in proteins. This web server also predicts minimum number of mutations required to initiate or break a β-turn in a protein at specified location of a protein. © 2015 Wiley Periodicals, Inc.
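For readers unfamiliar with propensity-based turn prediction, the sketch below computes the basic position-wise statistic such methods build on: how over-represented a residue is at a given position of a four-residue turn window relative to its background frequency. The toy windows and frequencies are assumptions for illustration; BetaTPred3 itself uses a richer Random Forest feature set.

```python
# Position-wise residue propensity for beta-turn windows (toy illustration).
from collections import Counter

def positional_propensity(turn_windows, background_freq, position):
    """Propensity > 1: residue is favored at this turn position."""
    counts = Counter(w[position] for w in turn_windows)
    total = sum(counts.values())
    return {res: (counts[res] / total) / background_freq[res]
            for res in background_freq if counts[res] > 0}

turns = ['NPGD', 'DPGS', 'NPDG']               # assumed 4-residue turn windows
bg = {'N': 0.045, 'P': 0.047, 'G': 0.071, 'D': 0.054, 'S': 0.066}
print(positional_propensity(turns, bg, position=1))   # P dominates position i+1
```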
Chemical Fingerprinting of Materials Developed Due to Environmental Issues
NASA Technical Reports Server (NTRS)
Smith, Doris A.; McCool, A. (Technical Monitor)
2000-01-01
Instrumental chemical analysis methods are developed and used to chemically fingerprint new and modified External Tank materials made necessary by changing environmental requirements. Chemical fingerprinting can detect and diagnose variations in material composition. To chemically characterize each material, fingerprint methods are selected from an extensive toolbox based on the material's chemistry and the ability of the specific methods to detect the material's critical ingredients. Fingerprint methods have been developed for a variety of materials including Thermal Protection System foams, adhesives, primers, and composites.
NASA Astrophysics Data System (ADS)
Zan, Hao; Li, Haowei; Jiang, Yuguang; Wu, Meng; Zhou, Weixing; Bao, Wen
2018-06-01
As part of our efforts to find ways and means to further improve the regenerative cooling technology in scramjets, experiments on the thermo-acoustic instability dynamics of hydrocarbon fuel flow have been conducted in horizontal circular tubes at different conditions. The experimental results indicate that there is a developing process from thermo-acoustic stability to instability. In order to gain a deep understanding of this developing process, the method of Multi-scale Shannon Wavelet Entropy (MSWE), based on a Wavelet Transform Correlation Filter (WTCF) and Multi-Scale Shannon Entropy (MSE), is adopted in this paper. The results demonstrate that the developing process of thermo-acoustic instability is well detected from noise and weak signals by the MSWE method, and that the differences among stability, the developing process, and instability can be identified. These properties render the method particularly powerful for early warning of thermo-acoustic instability of hydrocarbon fuel flowing in scramjet cooling channels. The mass flow rate and the inlet pressure influence the developing process of the thermo-acoustic instability. This investigation of thermo-acoustic instability dynamics at supercritical pressure based on the wavelet entropy method offers guidance on the control of the scramjet fuel supply, helping secure stable fuel flow in the regenerative cooling system.
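As a rough illustration of the entropy statistic involved, the sketch below computes a multi-scale wavelet Shannon entropy from the relative energy of the wavelet decomposition levels of a signal. It omits the paper's WTCF filtering stage; the wavelet family, level count, and test signals are assumptions.

```python
# Shannon entropy over the energy distribution of wavelet scales.
import numpy as np
import pywt

def wavelet_shannon_entropy(signal, wavelet='db4', level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energy = np.array([np.sum(c**2) for c in coeffs])
    p = energy / energy.sum()                 # relative energy per scale
    return float(-np.sum(p * np.log(p + 1e-12)))

t = np.linspace(0, 1, 2048)
quiet = np.sin(2*np.pi*50*t)                   # stable oscillation
noisy = quiet + 0.8*np.random.randn(t.size)    # crude proxy for developing instability
print(wavelet_shannon_entropy(quiet), wavelet_shannon_entropy(noisy))
```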
Bioforensics: Characterization of biological weapons agents by NanoSIMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, P K; Ghosal, S; Leighton, T J
2007-02-26
The anthrax attacks of Fall 2001 highlight the need to develop forensic methods based on multiple identifiers to determine the origin of biological weapons agents. Genetic typing methods (i.e., DNA and RNA-based) provide one attribution technology, but genetic information alone is not usually sufficient to determine the provenance of the material. Non-genetic identifiers, including elemental and isotopic signatures, provide complementary information that can be used to identify the means, geographic location, and date of production. Under LDRD funding, we have successfully developed the techniques necessary to perform bioforensic characterization with the NanoSIMS at the individual spore level. We have developed methods for elemental and isotopic characterization at the single spore scale. We have developed methods for analyzing spore sections to map elemental abundance within spores. We have developed rapid focused ion beam (FIB) sectioning techniques for spores that preserve elemental and structural integrity. And we have developed a high-resolution depth profiling method to characterize the elemental distribution in individual spores without sectioning. We used these newly developed methods to study the controls on elemental abundances in spores, characterize the elemental distribution in spores, and study elemental uptake by spores. Our work under this LDRD project attracted FBI and DHS funding for applied purposes.
Simulation-Based Valuation of Transactive Energy Systems
Huang, Qiuhua; McDermott, Tom; Tang, Yingying; ...
2018-05-18
Transactive Energy (TE) has been recognized as a promising technique for integrating responsive loads and distributed energy resources as well as advancing grid modernization. To help the industry better understand the value of TE and compare different TE schemes in a systematic and transparent manner, a comprehensive simulation-based TE valuation method is developed. The method has the following salient features: 1) it formally defines the valuation scenarios, use cases, baseline and valuation metrics; 2) an open-source simulation platform for transactive energy systems has been developed by integrating transmission, distribution and building simulators, and plugin TE and non-TE agents through the Framework for Network Co-Simulation (FNCS); 3) transparency and flexibility of the valuation is enhanced through separation of simulation and valuation, base valuation metrics and final valuation metrics. In conclusion, a valuation example based on the Smart Grid Interoperability Panel (SGIP) Use Case 1 is provided to demonstrate the developed TE simulation program and the valuation method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Kuangcai
The goal of this study is to help with future data analysis and experiment designs in rotational dynamics research using the DIC-based SPORT technique. Most current studies using DIC-based SPORT techniques are technical demonstrations. Understanding the mechanisms behind the observed rotational behaviors of the imaging probes should be the focus of future SPORT studies. More efforts are still needed in the development of new imaging probes, particle tracking methods, instrumentation, and advanced data analysis methods to further extend the potential of the DIC-based SPORT technique.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, J; Lasio, G; Chen, S
2015-06-15
Purpose: To develop a CBCT HU correction method using a patient-specific HU to mass density conversion curve, based on a novel image registration and organ mapping method, for head-and-neck radiation therapy. Methods: There are three steps to generate a patient-specific CBCT HU to mass density conversion curve. First, we developed a novel robust image registration method based on sparseness analysis to register the planning CT (PCT) and the CBCT. Second, a novel organ mapping method was developed to transfer the organs at risk (OAR) contours from the PCT to the CBCT, and the corresponding mean HU values of each OAR were measured in both the PCT and CBCT volumes. Third, a set of PCT and CBCT HU to mass density conversion curves were created based on the mean HU values of the OARs and the corresponding mass density of each OAR in the PCT. Then, we compared our proposed conversion curve with the traditional Catphan phantom based CBCT HU to mass density calibration curve. Both curves were input into the treatment planning system (TPS) for dose calculation. Last, the PTV and OAR doses, DVH, and dose distributions of the CBCT plans were compared to the original treatment plan. Results: One head-and-neck case, which contained a pair of PCT and CBCT volumes, was used. The dose differences between the PCT and CBCT plans using the proposed method are −1.33% for the mean PTV, 0.06% for PTV D95%, and −0.56% for the left neck. The dose differences between plans of PCT and CBCT corrected using the Catphan based method are −4.39% for mean PTV, 4.07% for PTV D95%, and −2.01% for the left neck. Conclusion: The proposed CBCT HU correction method achieves better agreement with the original treatment plan compared to the traditional Catphan based calibration method.
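A patient-specific conversion curve of the kind described reduces, at dose-calculation time, to a piecewise-linear HU-to-density lookup. The sketch below shows that final step; the anchor points are illustrative assumptions, not the paper's measured OAR values.

```python
# Piecewise-linear HU -> mass density mapping (anchor values assumed).
import numpy as np

hu_anchors      = np.array([-1000.0, -100.0, 40.0, 700.0])   # e.g. air, fat, muscle, bone
density_anchors = np.array([0.001, 0.95, 1.05, 1.45])        # g/cm^3

def hu_to_density(hu_volume):
    """Map CBCT HU values to mass density by linear interpolation."""
    return np.interp(hu_volume, hu_anchors, density_anchors)

print(hu_to_density(np.array([-500.0, 0.0, 300.0])))
```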
Poikane, Sandra; Johnson, Richard K; Sandin, Leonard; Schartau, Ann Kristin; Solimini, Angelo G; Urbanič, Gorazd; Arbačiauskas, Kęstutis; Aroviita, Jukka; Gabriels, Wim; Miler, Oliver; Pusch, Martin T; Timm, Henn; Böhmer, Jürgen
2016-02-01
Legislation in Europe has been adopted to determine and improve the ecological integrity of inland and coastal waters. Assessment is based on four biotic groups, including benthic macroinvertebrate communities. For lakes, benthic invertebrates have been recognized as one of the most difficult organism groups to use in ecological assessment, and hitherto their use in ecological assessment has been limited. In this study, we review and intercalibrate 13 benthic invertebrate-based tools across Europe. These assessment tools address different human impacts: acidification (3 methods), eutrophication (3 methods), morphological alterations (2 methods), and a combination of the last two (5 methods). For intercalibration, the methods were grouped into four intercalibration groups, according to the habitat sampled and putative pressure. Boundaries of the 'good ecological status' were compared and harmonized using direct or indirect comparison approaches. To enable indirect comparison of the methods, three common pressure indices and two common biological multimetric indices were developed for larger geographical areas. Additionally, we identified the best-performing methods based on their responsiveness to different human impacts. Based on these experiences, we provide practical recommendations for the development and harmonization of benthic invertebrate assessment methods in lakes and similar habitats. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Li, Hongzhi; Yang, Wei
2007-03-21
An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed, which is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
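The core move in any replica exchange scheme, including the design above, is a Metropolis swap test between neighboring replicas; the design's restriction that QM replicas never exchange directly with activated MM replicas is a matter of which pairs are proposed. The sketch below shows only the generic acceptance test, with assumed inverse temperatures and energies.

```python
# Metropolis acceptance test for swapping configurations of replicas i and j.
import math, random

def accept_swap(beta_i, beta_j, energy_i, energy_j):
    """Accept with probability min(1, exp[(beta_i - beta_j)(energy_i - energy_j)])."""
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0.0 or random.random() < math.exp(delta)

print(accept_swap(1.0, 0.8, -105.0, -98.0))   # assumed values for illustration
```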
Case-based Long-term Professional Development of Science Teachers
NASA Astrophysics Data System (ADS)
Dori, Yehudit J.; Herscovitz, Orit
2005-10-01
Reform efforts are often unsuccessful because they failed to understand that teachers play a key role in making educational reforms successful. This paper describes a long-term teacher professional development (PD) program aimed at educating and training teachers to teach interdisciplinary topics using case-based method in science. The research objective was to identify, follow and document the processes that science teachers went through as they assimilated the interdisciplinary, case-based science teaching approach. The research accompanied the PD program throughout its 3-year period. About 50 teachers, who took part in the PD program, were exposed to an interdisciplinary case-based teaching method. The research instruments included teacher portfolios, which contained projects and reflection questionnaires, classroom observations, teacher interviews, and student feedback questionnaires. The portfolios contained the projects that the teachers had carried out during the PD program, which included case studies and accompanying student activities. We found that the teachers gradually moved from exposure to new teaching methods and subject matter, through active learning and preparing case-based team projects, to interdisciplinary, active classroom teaching using the case studies they developed.
Nishikawa, Keizo; Iwamoto, Yoriko; Ishii, Masaru
2014-05-01
The development of methods for the differentiation of embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs) into functional cells has helped to analyze the mechanisms regulating cellular processes and to explore cell-based assays for drug discovery. Although several reports have demonstrated methods for the differentiation of mouse ESCs into osteoclast-like cells, it remains unclear whether these methods are applicable for the differentiation of iPSCs to osteoclasts. In this study, we developed a simple method for the stepwise differentiation of mouse ESCs and iPSCs into bone-resorbing osteoclasts based on a monoculture approach consisting of three steps. First, based on conventional hanging-drop methods, embryoid bodies (EBs) were produced from mouse ESCs or iPSCs. Second, EBs were cultured in medium supplemented with macrophage colony-stimulating factor (M-CSF) and differentiated into osteoclast precursors, which expressed CD11b. Finally, ESC- or iPSC-derived osteoclast precursors stimulated with receptor activator of nuclear factor-κB ligand (RANKL) and M-CSF formed large multinucleated osteoclast-like cells that expressed tartrate-resistant acid phosphatase and were capable of bone resorption. Molecular analysis showed that the expression of osteoclast marker genes such as Nfatc1, Ctsk, and Acp5 is increased in a RANKL-dependent manner. Thus, our procedure is simple and easy and should be helpful for stem cell-based bone research.
NASA Astrophysics Data System (ADS)
Huang, C.; Hsu, N.
2013-12-01
This study imports Low-Impact Development (LID) rainwater catchment technology into the Storm Water Management Model (SWMM) to design the spatial capacity and quantity of rain barrels for urban flood mitigation, and proposes a simulation-optimization model for effectively searching for the optimal design. In the simulation method, we design a series of regular spatial distributions of the capacity and quantity of rainwater catchment facilities, so that the flood reduction achieved by a variety of design forms can be simulated by SWMM. We then calculate the net benefit, which equals the reduction in inundation loss minus the facility cost; the best solution of the simulation method serves as the initial search solution for the optimization model. In the optimization method, we first use the simulation results and a Back-Propagation Neural Network (BPNN) to develop a water level simulation model of the urban drainage system, replacing SWMM, whose operation is based on a graphical user interface and which is hard to combine with an optimization model and method. We then embed the BPNN-based simulation model into the developed optimization model, whose objective function minimizes the negative net benefit. Finally, we establish a tabu search-based algorithm to optimize the planning solution (a skeleton of this search is sketched below). This study applies the developed method in Zhonghe Dist., Taiwan. Results showed that applying tabu search and the BPNN-based simulation model in the optimization model not only finds solutions better than the simulation method by 12.75%, but also resolves the limitations of previous studies. Furthermore, the optimized spatial rain barrel design can reduce inundation loss by 72% according to historical flood events.
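Below is a minimal tabu-search skeleton of the kind coupled to the BPNN surrogate above: it moves to the best non-tabu neighbor each iteration and keeps a short-term memory list to escape local optima. The neighborhood, tenure, and cost function are placeholders; the study's decision variables (barrel capacities and counts per subcatchment) would define them.

```python
# Generic tabu-search skeleton (neighborhood/cost/tenure are assumptions).
def tabu_search(initial, neighbors, cost, iterations=200, tenure=10):
    best = current = initial
    tabu = []                                   # short-term memory of visited solutions
    for _ in range(iterations):
        candidates = [s for s in neighbors(current) if s not in tabu]
        if not candidates:
            break
        current = min(candidates, key=cost)     # best admissible move, even if worse
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)                         # expire the oldest tabu entry
        if cost(current) < cost(best):
            best = current
    return best

# Toy usage: minimize (x - 7)^2 over integers by +/-1 moves.
print(tabu_search(0, lambda x: [x - 1, x + 1], lambda x: (x - 7)**2))
```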
Zheng, Dandan; Todor, Dorin A
2011-01-01
In real-time trans-rectal ultrasound (TRUS)-based high-dose-rate prostate brachytherapy, the accurate identification of the needle-tip position is critical for treatment planning and delivery. Currently, needle-tip identification on ultrasound images can be subject to large uncertainty and errors because of ultrasound image quality and imaging artifacts. To address this problem, we developed a method based on physical measurements, with simple and practical implementation, to improve the accuracy and robustness of needle-tip identification. Our method uses measurements of the residual needle length and an off-line pre-established coordinate transformation factor to calculate the needle-tip position on the TRUS images. The transformation factor was established through a one-time systematic set of measurements of the probe and template holder positions, applicable to all patients. To compare the accuracy and robustness of the proposed method and the conventional method (ultrasound detection) against the gold-standard X-ray fluoroscopy, extensive measurements were conducted in water and gel phantoms. In the water phantom, our method showed an average tip-detection accuracy of 0.7 mm compared with 1.6 mm for the conventional method. In the gel phantom (more realistic and tissue-like), our method maintained its level of accuracy, while the uncertainty of the conventional method was 3.4 mm on average, with maximum values of over 10 mm because of imaging artifacts. A novel method based on simple physical measurements was developed to accurately detect the needle-tip position for TRUS-based high-dose-rate prostate brachytherapy. The method demonstrated much improved accuracy and robustness over the conventional method. Copyright © 2011 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
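In essence, the measurement-based calculation reduces to simple geometry: the inserted depth is the full needle length minus the measured residual length outside the template, and the pre-established transformation maps the template hole coordinates into the TRUS image frame. The sketch below shows this arithmetic; all names and the simple offset form of the transformation are hypothetical simplifications of the paper's calibration.

```python
# Tip position from residual-length measurement (simplified geometry sketch).
import numpy as np

def needle_tip(hole_xy, needle_length_mm, residual_length_mm, template_to_trus_offset):
    """Return (x, y, depth) of the tip in the TRUS frame."""
    inserted = needle_length_mm - residual_length_mm     # length past the template
    x, y = np.asarray(hole_xy, float) + np.asarray(template_to_trus_offset, float)
    return np.array([x, y, inserted])

print(needle_tip(hole_xy=(15.0, 20.0), needle_length_mm=200.0,
                 residual_length_mm=62.5, template_to_trus_offset=(-1.2, 0.8)))
```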
Developing rapid methods for analyzing upland riparian functions and values.
Hruby, Thomas
2009-06-01
Regulators protecting riparian areas need to understand the integrity, health, beneficial uses, functions, and values of this resource. Up to now most methods providing information about riparian areas are based on analyzing condition or integrity. These methods, however, provide little information about functions and values. Different methods are needed that specifically address this aspect of riparian areas. In addition to information on functions and values, regulators have very specific needs that include: an analysis at the site scale, low cost, usability, and inclusion of policy interpretations. To meet these needs a rapid method has been developed that uses a multi-criteria decision matrix to categorize riparian areas in Washington State, USA. Indicators are used to identify the potential of the site to provide a function, the potential of the landscape to support the function, and the value the function provides to society. To meet legal needs fixed boundaries for assessment units are established based on geomorphology, the distance from "Ordinary High Water Mark" and different categories of land uses. Assessment units are first classified based on ecoregions, geomorphic characteristics, and land uses. This simplifies the data that need to be collected at a site, but it requires developing and calibrating a separate model for each "class." The approach to developing methods is adaptable to other locations as its basic structure is not dependent on local conditions.
Condition Number Estimation of Preconditioned Matrices
Kushida, Noriyuki
2015-01-01
The present paper introduces a condition number estimation method for preconditioned matrices. The newly developed method provides reasonable results, while the conventional method, which is based on the Lanczos connection, gives meaningless results. The Lanczos connection based method provides the condition numbers of coefficient matrices of systems of linear equations using information obtained through the preconditioned conjugate gradient method. Estimating the condition number of preconditioned matrices is sometimes important when describing the effectiveness of new preconditioners or selecting adequate preconditioners. Operating a preconditioner on a coefficient matrix is the simplest method of estimation. However, this is not possible for large-scale computing, especially if computation is performed on distributed memory parallel computers, because the preconditioned matrices become dense even if the original matrices are sparse. Although the Lanczos connection method can be used to calculate the condition number of preconditioned matrices, it is not considered applicable to large-scale problems because of its weakness with respect to numerical errors. Therefore, we have developed a robust and parallelizable method based on Hager's method. Feasibility studies were carried out for the diagonal scaling preconditioner and the SSOR preconditioner with a diagonal matrix, a tridiagonal matrix, and Pei's matrix. As a result, the Lanczos connection method contains around 10% error in the results even for a simple problem, whereas the new method contains negligible errors. In addition, the newly developed method returns reasonable solutions when the Lanczos connection method fails with Pei's matrix and matrices generated with the finite element method. PMID:25816331
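For context, Hager's estimator bounds the 1-norm of a matrix using only matrix-vector products, which is what makes it usable when the preconditioned matrix is never formed explicitly. SciPy's onenormest implements the Hager/Higham algorithm; the sketch below combines it with a sparse LU solve to estimate cond1(A) = ||A||1 ||A^-1||1 without forming the inverse. This is a generic illustration of the technique, not the paper's parallel implementation, and the test matrix is an assumption.

```python
# Hager/Higham-style 1-norm condition estimate without forming A^{-1}.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def estimate_cond1(A):
    lu = spla.splu(A.tocsc())                       # sparse LU factorization
    Ainv = spla.LinearOperator(
        A.shape,
        matvec=lu.solve,                            # x = A^{-1} b
        rmatvec=lambda b: lu.solve(b, trans='T'),   # x = A^{-T} b (needed by the estimator)
    )
    return spla.onenormest(A) * spla.onenormest(Ainv)

n = 1000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csc')
print(estimate_cond1(A))                            # grows roughly like n^2 for this stencil
```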
Sudo, Hirotaka; O'driscoll, Michael; Nishiwaki, Kenji; Kawamoto, Yuji; Gammell, Philip; Schramm, Gerhard; Wertli, Toni; Prinz, Heino; Mori, Atsuhide; Sako, Kazuhiro
2012-01-01
The application of a head space analyzer for oxygen concentration was examined to develop a novel ampoule leak test method. Studies using ampoules filled with ethanol-based solution and with nitrogen in the headspace demonstrated that the head space analysis (HSA) method showed sufficient sensitivity in detecting an ampoule crack. The proposed method is the use of HSA in conjunction with the pretreatment of an overpressurising process known as bombing to facilitate the oxygen flow through the crack in the ampoule. The method was examined in comparative studies with a conventional dye ingress method, and the results showed that the HSA method exhibits sensitivity superior to the dye method. The results indicate that the HSA method in combination with the bombing treatment provides potential application as a leak test for the detection of container defects not only for ampoule products with ethanol-based solutions, but also for testing lyophilized products in vials with nitrogen in the head space.
Polidori, David; Rowley, Clarence
2014-07-22
The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method.
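For reference, the traditional mono-exponential back-extrapolation that the authors compare against can be written in a few lines: fit a line to log concentration over an early sampling window, extrapolate to t = 0, and divide the dose by the extrapolated concentration. The sampling times, concentrations, and dose below are assumptions for illustration, not study data.

```python
# Classic mono-exponential back-extrapolation estimate of plasma volume.
import numpy as np

def plasma_volume(t_min, conc_mg_per_L, dose_mg):
    slope, intercept = np.polyfit(t_min, np.log(conc_mg_per_L), 1)
    c0 = np.exp(intercept)            # back-extrapolated ICG concentration at t = 0
    return dose_mg / c0               # plasma volume in litres

t = np.array([2.0, 3.0, 4.0, 5.0])    # minutes post-injection (assumed window)
c = np.array([6.1, 5.5, 5.0, 4.6])    # measured ICG concentration, mg/L (assumed)
print(plasma_volume(t, c, dose_mg=25.0))
```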
Barros, Ana B; Dias, Sonia F; Martins, Maria Rosario O
2015-10-30
In public health, hard-to-reach populations are often recruited by non-probabilistic sampling methods that produce biased results. In order to overcome this, several sampling methods have been improved and developed in the last years. The aim of this systematic review was to identify all current methods used to survey most-at-risk populations of men who have sex with men and sex workers. The review also aimed to assess if there were any relations between the study populations and the sampling methods used to recruit them. Lastly, we wanted to assess if the number of publications originated in middle and low human development (MLHD) countries had been increasing in the last years. A systematic review was conducted using electronic databases and a total of 268 published studies were included in the analysis. In this review, 11 recruitment methods were identified. Semi-probabilistic methods were used most commonly to survey men who have sex with men, and the use of the Internet was the method that gathered more respondents. We found that female sex workers were more frequently recruited through non-probabilistic methods than men who have sex with men (odds = 2.2; p < 0.05; confidence interval (CI) [1.1-4.2]). In the last 6 years, the number of studies based in middle and low human development countries increased more than the number of studies based in very high and high human development countries (odds = 2.5; p < 0.05; CI [1.3-4.9]). This systematic literature review identified 11 methods used to sample men who have sex with men and female sex workers. There is an association between the type of sampling method and the population being studied. The number of studies based in middle and low human development countries has increased in the last 6 years of this study.
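The association statistics quoted above are standard 2×2 odds ratios; the sketch below reproduces the calculation with a Wald confidence interval on the log scale. The counts are invented for illustration, not the review's data.

```python
# Odds ratio with a 95% Wald confidence interval (toy counts).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table rows = group, cols = outcome: [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(odds_ratio_ci(40, 60, 20, 66))            # e.g. OR ~ 2.2 with its CI
```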
Research on Chinese characters display of airborne MFD based on GL studio
NASA Astrophysics Data System (ADS)
Wang, Zhile; Dong, Junyu; Hu, Wenting; Cui, Yipeng
2018-04-01
GL Studio cannot display Chinese characters when developing an airborne multifunction display (MFD). This paper proposes a method of establishing a Chinese character font with GB2312 encoding and building the font table and the Chinese character display unit in GL Studio. We abstract the storage and display data model of Chinese characters, parse the GB encoding of the characters that the MFD receives, find the coordinates of the characters in the font table, and establish the dynamic control model and the dynamic display model of Chinese characters based on the display unit. In the GL Studio and VC++.NET environment, this model has been successfully applied to develop airborne MFDs in a variety of mission simulators. The method solves the problem that GL Studio cannot be used to develop MFD software for Chinese domestic aircraft, and it can also be used with other professional airborne MFD development tools such as IDATA. Experiments have proved that this is a fast, effective, scalable, and reconfigurable method for developing both actual equipment and simulators.
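To make the font-table lookup concrete: GB2312 encodes each Chinese character as two bytes, each taking one of 94 values, so the pair maps directly to a row (zone) and column (position) in a 94×94 glyph table. The sketch below derives that cell and a pixel origin; the 16×16 cell size is an assumption, and the paper's actual table layout in GL Studio may differ.

```python
# Locate a GB2312 character's cell in a 94x94 font table (cell size assumed).
def gb2312_cell(ch, cell_w=16, cell_h=16):
    hi, lo = ch.encode('gb2312')            # two bytes, each in 0xA1..0xFE
    row, col = hi - 0xA1, lo - 0xA1         # zone and position within the zone
    return (row, col), (col * cell_w, row * cell_h)

print(gb2312_cell('中'))                    # cell indices and pixel origin
```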
Development of a speech autocuer
NASA Astrophysics Data System (ADS)
Bedles, R. L.; Kizakvich, P. N.; Lawson, D. T.; McCartney, M. L.
1980-12-01
A wearable, visually based prosthesis for the deaf based upon the proven method for removing lipreading ambiguity known as cued speech was fabricated and tested. Both software and hardware developments are described, including a microcomputer, display, and speech preprocessor.
Performance-Based Assessment: An Alternative Assessment Process for Young Gifted Children.
ERIC Educational Resources Information Center
Hafenstein, Norma Lu; Tucker, Brooke
Performance-based assessment provides an alternative identification method for young gifted children. A performance-based identification process was developed and implemented to select three-, four-, and five-year-old children for inclusion in a school for gifted children. Literature regarding child development, characteristics of young gifted…
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2012 CFR
2012-04-01
Office of the Secretary, Department of Housing and Urban Development; Lead-Based Paint Poisoning Prevention in Certain Residential Structures; Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities (24 CFR, Housing and Urban Development, 2012-04-01 edition).
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2013 CFR
2013-04-01
Office of the Secretary, Department of Housing and Urban Development; Lead-Based Paint Poisoning Prevention in Certain Residential Structures; Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities (24 CFR, Housing and Urban Development, 2013-04-01 edition).
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2011 CFR
2011-04-01
Office of the Secretary, Department of Housing and Urban Development; Lead-Based Paint Poisoning Prevention in Certain Residential Structures; Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities (24 CFR, Housing and Urban Development, 2011-04-01 edition).
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2014 CFR
2014-04-01
Office of the Secretary, Department of Housing and Urban Development; Lead-Based Paint Poisoning Prevention in Certain Residential Structures; Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities (24 CFR, Housing and Urban Development, 2014-04-01 edition).
Implementing Assessment in an Outcome-Based Marketing Curriculum
ERIC Educational Resources Information Center
Borin, Norm; Metcalf, Lynn E.; Tietje, Brian C.
2008-01-01
This article describes the development and implementation of assessment in a new outcome-based marketing curriculum that was developed using a zero-based approach. Outcomes for the marketing curriculum were specified at the program, department, course, and lesson levels. Direct embedded assessments as well as indirect assessment methods were used…
NASA Astrophysics Data System (ADS)
Resita, I.; Ertikanto, C.
2018-05-01
This study aims to develop an electronic module design based on the Learning Content Development System (LCDS) to foster students' multi-representation skills in physics. The study uses a research and development method for the product design, and involves 90 students and 6 physics teachers randomly chosen from 3 different senior high schools in Lampung Province. The data were collected using questionnaires and analyzed with a quantitative descriptive method. Based on the data, 95% of the students use only one form of representation in solving physics problems, most often symbolic representation. Students are considered to understand a physics concept if they are able to convert one form of representation into the others. The product design of the LCDS-based electronic module presents textual, pictorial, symbolic, video, and animation representations.
NASA Astrophysics Data System (ADS)
Liu, Ruiwen; Jiao, Binbin; Kong, Yanmei; Li, Zhigang; Shang, Haiping; Lu, Dike; Gao, Chaoqun; Chen, Dapeng
2013-09-01
Micro-devices with a bi-material cantilever (BMC) commonly suffer initial curvature due to the mismatch of residual stress. Traditional corrective methods to reduce the residual stress mismatch generally involve the development of different material deposition recipes. In this paper, a new method for reducing residual stress mismatch in a BMC is proposed based on various previously developed deposition recipes. An initial material film is deposited using two or more developed deposition recipes. This first film is designed to introduce a stepped stress gradient, which is then balanced by overlapping a second material film on the first, using appropriate deposition recipes, to form a nearly stress-balanced structure. A theoretical model is proposed based on both the moment balance principle and equal total strain at the interface of two adjacent layers. Experimental results and analytical models suggest that the proposed method is effective in producing multi-layer micro-cantilevers that display balanced residual stresses. The method provides a generic solution to the problem of mismatched initial stresses which universally exists in micro-electro-mechanical systems (MEMS) devices based on a BMC. Moreover, the method can be incorporated into a MEMS design automation package for efficient design of various multiple-material-layer devices from a MEMS material library and developed deposition recipes.
Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing
2009-11-25
Various polymerase chain reaction (PCR) methods have been developed for the enforcement of genetically modified organism (GMO) labeling policies; among them, event-specific PCR detection based on the flanking sequence of the exogenous integration is the primary trend in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. The event-specific PCR primers and TaqMan probe were designed based upon the revealed 5' flanking sequence, and qualitative and quantitative PCR assays were established employing these primers and probes. In the qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA, corresponding to 10 copies of the haploid soybean genome. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were validated in-house by five researchers, and the results indicated that the developed event-specific PCR methods can be used for identification and quantification of MON89788 soybean and its derivatives.
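The reported qualitative LOD can be cross-checked with the standard copies-from-mass calculation sketched below; the genome size and the 650 g/mol per base pair figure are textbook approximations, not values from the paper.

```python
AVOGADRO = 6.022e23
BP_MASS = 650.0              # g/mol per base pair of double-stranded DNA
SOYBEAN_GENOME_BP = 1.1e9    # approximate haploid soybean genome size

def genome_copies(mass_ng, genome_bp=SOYBEAN_GENOME_BP):
    moles = (mass_ng * 1e-9) / (genome_bp * BP_MASS)
    return moles * AVOGADRO

print(genome_copies(0.01))   # ~8 copies, consistent with "0.01 ng ~ 10 copies"
```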
Beetles, Beechnuts, and Behavior: Using Nature-based Activities To Develop Social Skills.
ERIC Educational Resources Information Center
Henderson, Kelly
This paper describes an instructional method designed to increase opportunities for students to learn and practice appropriate social skills. The strategies for development and implementation of such structured programs of nature-based and animal-based activities are based in part on a pilot program in three urban elementary and middle schools.…
Recent trends related to the use of formal methods in software engineering
NASA Technical Reports Server (NTRS)
Prehn, Soren
1986-01-01
An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. The focus is on ongoing activities in Europe, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the formal method currently most widespread in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.
Cost estimating methods for advanced space systems
NASA Technical Reports Server (NTRS)
Cyr, Kelley
1988-01-01
The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed. The process of identifying variables which drive cost and the relationship between weight and cost are discussed. A theoretical model of cost is developed and tested using a historical data base of research and development projects.
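A minimal sketch of the weight-cost relationship mentioned above is a parametric cost estimating relationship (CER) of the common power-law form cost = a * weight^b, fit by least squares in log-log space; the data points are hypothetical, not from the historical database.

```python
import numpy as np

weight = np.array([120., 450., 900., 2300., 5100.])   # kg, hypothetical
cost = np.array([14., 38., 61., 120., 230.])          # $M, hypothetical

b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)  # slope, intercept
a = np.exp(log_a)
print(f"cost ~= {a:.2f} * weight^{b:.2f}")
print("estimate for a 1500 kg system:", a * 1500**b)
```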
A random spatial sampling method in a rural developing nation
Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas
2014-01-01
Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...
Collaborative voxel-based surgical virtual environments.
Acosta, Eric; Muniz, Gilbert; Armonda, Rocco; Bowyer, Mark; Liu, Alan
2008-01-01
Virtual Reality-based surgical simulators can utilize Collaborative Virtual Environments (C-VEs) to provide team-based training. To support real-time interactions, C-VEs are typically replicated on each user's local computer and a synchronization method helps keep all local copies consistent. This approach does not work well for voxel-based C-VEs since large and frequent volumetric updates make synchronization difficult. This paper describes a method that allows multiple users to interact within a voxel-based C-VE for a craniotomy simulator being developed. Our C-VE method requires smaller update sizes and provides faster synchronization update rates than volumetric-based methods. Additionally, we address network bandwidth/latency issues to simulate networked haptic and bone drilling tool interactions with a voxel-based skull C-VE.
Robust Methods for Moderation Analysis with a Two-Level Regression Model.
Yang, Miao; Yuan, Ke-Hai
2016-01-01
Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
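A minimal sketch of the second robust approach (Huber-type M-estimation) applied to a moderation model y = b0 + b1*x + b2*m + b3*x*m is given below, assuming the Python statsmodels package rather than the authors' R program; the data are simulated with heavy-tailed errors.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x, m = rng.normal(size=n), rng.normal(size=n)
e = rng.standard_t(df=3, size=n)                  # heavy-tailed errors
y = 1.0 + 0.5 * x + 0.3 * m + 0.4 * x * m + e     # true moderation effect b3 = 0.4

X = sm.add_constant(np.column_stack([x, m, x * m]))
fit = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # Huber-type weights
print(fit.params)   # robust estimates of b0, b1, b2, b3
```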
Task Based Language Teaching: Development of CALL
ERIC Educational Resources Information Center
Anwar, Khoirul; Arifani, Yudhi
2016-01-01
The dominant complexities of English teaching in Indonesia are about limited development of teaching methods and materials which still cannot optimally reflect students' needs (in particular of how to acquire knowledge and select the most effective learning models). This research is to develop materials with complete task-based activities by using…
A Study of Instructor Goals, Strategies and Stages of Group Development.
ERIC Educational Resources Information Center
Beadle, Mary Ellen; Feitler, Fred C.
A study examining the relationship between the importance of instructional goals and instructor use of instructional strategies based on the five stages of group development (orientation, norm development, conflict, productivity, and termination) is reported. Instructional design is concerned with prescribing methods of instruction based on…
On Anticipatory Development of Dual Education Based on the Systemic Approach
ERIC Educational Resources Information Center
Alshynbayeva, Zhuldyz; Sarbassova, Karlygash; Galiyeva, Temir; Kaltayeva, Gulnara; Bekmagambetov, Aidos
2016-01-01
The article addresses separate theoretical and methodical aspects of the anticipatory development of dual education in the Republic of Kazakhstan based on the systemic approach. It states the need to develop orientating basis of prospective professional activities in students. We define the concepts of anticipatory cognition and anticipatory…
Digital Signal Processing Based on a Clustering Algorithm for Ir/Au TES Microcalorimeter
NASA Astrophysics Data System (ADS)
Zen, N.; Kunieda, Y.; Takahashi, H.; Hiramoto, K.; Nakazawa, M.; Fukuda, D.; Ukibe, M.; Ohkubo, M.
2006-02-01
In recent years, cryogenic microcalorimeters using the superconducting transition edge have been under development for possible application to astronomical X-ray observations. To improve the energy resolution of superconducting transition edge sensors (TES), several correction methods have been developed. Among them, a clustering method based on digital signal processing has recently been proposed. In this paper, we applied the clustering method to an Ir/Au bilayer TES. The method resulted in almost a 10% improvement in energy resolution. Furthermore, from the point of view of imaging X-ray spectroscopy, we applied the clustering method to pixellated Ir/Au-TES devices. We thus show how a clustering method that sorts signals by their shapes is also useful for position identification.
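A generic sketch of shape-based pulse clustering (not the paper's specific algorithm) is shown below: digitized TES pulses are amplitude-normalized and grouped by shape, so each cluster can be corrected or calibrated separately; scikit-learn's k-means stands in for the clustering step.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_pulses(pulses, n_clusters=3):
    """pulses: (n_events, n_samples) array of digitized waveforms."""
    shapes = pulses / np.abs(pulses).max(axis=1, keepdims=True)  # normalize amplitude
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(shapes)

pulses = np.random.default_rng(1).normal(size=(500, 256))  # stand-in data
print(np.bincount(cluster_pulses(pulses)))                 # events per cluster
```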
Effects of empty bins on image upscaling in capsule endoscopy
NASA Astrophysics Data System (ADS)
Rukundo, Olivier
2017-07-01
This paper presents a preliminary study of the effect of empty bins on image upscaling in capsule endoscopy. The study was conducted based on the results of existing contrast enhancement and interpolation methods. A low-contrast enhancement method based on pixel consecutiveness and a modified bilinear weighting scheme was developed to distinguish between necessary and unnecessary empty bins, in an effort to minimize the number of empty bins in the input image before further processing. Linear interpolation methods were used for upscaling input images with stretched histograms. Upscaling error differences and similarity indices between pairs of interpolation methods were quantified using the mean squared error and feature similarity index techniques. Simulation results demonstrated more promising effects with the developed method than with the other contrast enhancement methods mentioned.
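A minimal sketch (assuming OpenCV, not the author's pipeline) of how upscaling error differences between interpolation methods can be quantified with the mean squared error, as in the study:

```python
import cv2
import numpy as np

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in frame

up_linear = cv2.resize(img, (256, 256), interpolation=cv2.INTER_LINEAR)
up_cubic = cv2.resize(img, (256, 256), interpolation=cv2.INTER_CUBIC)

mse = np.mean((up_linear.astype(float) - up_cubic.astype(float)) ** 2)
print("MSE between the two upscaled images:", mse)
```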
NASA Astrophysics Data System (ADS)
Yoo, Byungjin; Hirata, Katsuhiro; Oonishi, Atsurou
In this study, a coupled analysis method for flat panel speakers driven by a giant magnetostrictive material (GMM) based actuator was developed. The sound field produced by such a speaker depends on the vibration of the flat panel, which results from the magnetostrictive property of the GMM. To predict the sound pressure level (SPL) in the audio-frequency range, it is therefore necessary to take into account not only the magnetostriction property of the GMM but also the effect of eddy currents and the vibration characteristics of the actuator and the flat panel. In this paper, a coupled electromagnetic-structural-acoustic analysis method developed using the finite element method (FEM) is presented and used to predict the performance of a flat panel speaker in the audio-frequency range. The validity of the analysis method is verified by comparison with measurement results from a prototype speaker.
Developing Mathematics Problems Based on PISA Level of Change and Relationships Content
ERIC Educational Resources Information Center
Ahyan, Shahibul; Zulkardi; Darmawijoyo
2014-01-01
This research aims to produce valid and practical PISA-level mathematics problems with change-and-relationships content that have a potential effect for junior high school students. A development research method developed by Akker, Gravemeijer, McKenney and Nieveen is used in this research. This development research consists of three stages;…
Chui, Huixia; Domish, Larissa; Hernandez, Drexler; Wang, Gehua
2016-01-01
Identification and typing of bacteria occupy a large fraction of time and work in clinical microbiology laboratories. With the certification of some MS platforms in recent years, more applications and tests of MS‐based diagnosis methods for bacteria identification and typing have been created, not only on well‐accepted MALDI‐TOF‐MS‐based fingerprint matches, but also on solving the insufficiencies of MALDI‐TOF‐MS‐based platforms and advancing the technology to areas such as targeted MS identification and typing of bacteria, bacterial toxin identification, antibiotics susceptibility/resistance tests, and MS‐based diagnostic method development on unique bacteria such as Clostridium and Mycobacteria. This review summarizes the recent development in MS platforms and applications in bacteria identification and typing of common pathogenic bacteria. PMID:26751976
Teaching and assessing procedural skills using simulation: metrics and methodology.
Lammers, Richard L; Davenport, Moira; Korley, Frederick; Griswold-Theodorson, Sharon; Fitch, Michael T; Narang, Aneesh T; Evans, Leigh V; Gross, Amy; Rodriguez, Elliot; Dodge, Kelly L; Hamann, Cara J; Robey, Walter C
2008-11-01
Simulation allows educators to develop learner-focused training and outcomes-based assessments. However, the effectiveness and validity of simulation-based training in emergency medicine (EM) requires further investigation. Teaching and testing technical skills require methods and assessment instruments that are somewhat different than those used for cognitive or team skills. Drawing from work published by other medical disciplines as well as educational, behavioral, and human factors research, the authors developed six research themes: measurement of procedural skills; development of performance standards; assessment and validation of training methods, simulator models, and assessment tools; optimization of training methods; transfer of skills learned on simulator models to patients; and prevention of skill decay over time. The article reviews relevant and established educational research methodologies and identifies gaps in our knowledge of how physicians learn procedures. The authors present questions requiring further research that, once answered, will advance understanding of simulation-based procedural training and assessment in EM.
Microchannel gel electrophoretic separation systems and methods for preparing and using
Herr, Amy E; Singh, Anup K; Throckmorton, Daniel J
2015-02-24
A micro-analytical platform for performing electrophoresis-based immunoassays was developed by integrating photopolymerized cross-linked polyacrylamide gels within a microfluidic device. The microfluidic immunoassays are performed by gel electrophoretic separation and quantifying analyte concentration based upon conventional polyacrylamide gel electrophoresis (PAGE). To retain biological activity of proteins and maintain intact immune complexes, native PAGE conditions were employed. Both direct (non-competitive) and competitive immunoassay formats are demonstrated in microchips for detecting toxins and biomarkers (cytokines, c-reactive protein) in bodily fluids (serum, saliva, oral fluids). Further, a description of gradient gels fabrication is included, in an effort to describe methods we have developed for further optimization of on-chip PAGE immunoassays. The described chip-based PAGE immunoassay method enables immunoassays that are fast (minutes) and require very small amounts of sample (less than a few microliters). Use of microfabricated chips as a platform enables integration, parallel assays, automation and development of portable devices.
Microchannel gel electrophoretic separation systems and methods for preparing and using
Herr, Amy; Singh, Anup K; Throckmorton, Daniel J
2013-09-03
A micro-analytical platform for performing electrophoresis-based immunoassays was developed by integrating photopolymerized cross-linked polyacrylamide gels within a microfluidic device. The microfluidic immunoassays are performed by gel electrophoretic separation and quantifying analyte concentration based upon conventional polyacrylamide gel electrophoresis (PAGE). To retain biological activity of proteins and maintain intact immune complexes, native PAGE conditions were employed. Both direct (non-competitive) and competitive immunoassay formats are demonstrated in microchips for detecting toxins and biomarkers (cytokines, c-reactive protein) in bodily fluids (serum, saliva, oral fluids). Further, a description of gradient gels fabrication is included, in an effort to describe methods we have developed for further optimization of on-chip PAGE immunoassays. The described chip-based PAGE immunoassay method enables immunoassays that are fast (minutes) and require very small amounts of sample (less than a few microliters). Use of microfabricated chips as a platform enables integration, parallel assays, automation and development of portable devices.
Bridge Displacement Monitoring Method Based on Laser Projection-Sensing Technology
Zhao, Xuefeng; Liu, Hao; Yu, Yan; Xu, Xiaodong; Hu, Weitong; Li, Mingchu; Ou, Jingping
2015-01-01
Bridge displacement is the most basic evaluation index of the health status of a bridge structure. Existing measurement methods for bridge displacement largely fail to realize long-term, real-time dynamic monitoring of bridge structures because of their low degree of automation and insufficient precision, causing bottlenecks and restrictions. To solve this problem, we proposed a bridge displacement monitoring system based on laser projection-sensing technology. First, the laser spot recognition method was studied. Second, the software for the displacement monitoring system was developed. Finally, a series of experiments using this system was conducted, and the results show that the system has high measurement accuracy and speed. We aim to develop a low-cost, high-accuracy and long-term monitoring method for bridge displacement based on these preliminary efforts. PMID:25871716
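A hedged sketch of the laser-spot recognition step: threshold the camera frame and take the intensity centroid of the bright spot via image moments. This illustrates the general approach with OpenCV, not the paper's exact algorithm.

```python
import cv2
import numpy as np

def laser_spot_center(gray):
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask, True)                  # treat mask as a binary image
    if m["m00"] == 0:
        return None                              # no spot found
    return m["m10"] / m["m00"], m["m01"] / m["m00"]   # centroid (x, y) in pixels

frame = np.zeros((480, 640), np.uint8)
cv2.circle(frame, (320, 240), 5, 255, -1)        # synthetic laser spot
print(laser_spot_center(frame))                  # ~(320.0, 240.0)
```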
Comprehensive evaluation of global energy interconnection development index
NASA Astrophysics Data System (ADS)
Liu, Lin; Zhang, Yi
2018-04-01
Against the background of building a global energy interconnection and realizing green and low-carbon development, this article constructs a global energy interconnection development index system based on the current situation of global energy interconnection development. Using the entropy method to determine the weights of the index components and the grey relational analysis method to evaluate the selected countries, the article obtains a global energy interconnection development index ranking and a classification into development levels.
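A minimal sketch of the entropy weighting step described above: column entropies of a normalized indicator matrix determine objective weights, with lower-entropy (more discriminating) indicators weighted more heavily. The indicator matrix is hypothetical.

```python
import numpy as np

def entropy_weights(X):
    """X: (countries, indicators) matrix of positive indicator values."""
    P = X / X.sum(axis=0)                          # column-normalize
    n = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(n)   # entropy of each indicator
    d = 1.0 - E                                    # degree of divergence
    return d / d.sum()                             # normalized weights

X = np.array([[3.2, 40., 0.7],
              [2.1, 55., 0.5],
              [4.4, 35., 0.9]])                    # hypothetical indicator values
print(entropy_weights(X))
```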
Development of a photogrammetric method of measuring tree taper outside bark
David R. Larsen
2006-01-01
A photogrammetric method is presented for measuring tree diameters outside bark using calibrated control ground-based digital photographs. The method was designed to rapidly collect tree taper information from subject trees for the development of tree taper equations. Software that is commercially available, but designed for a different purpose, can be readily adapted...
A Method of Assembling Compact Coherent Fiber-Optic Bundles
NASA Technical Reports Server (NTRS)
Martin, Stefan; Liu, Duncan; Levine, Bruce Martin; Shao, Michael; Wallace, James
2007-01-01
A method of assembling coherent fiber-optic bundles in which all the fibers are packed together as closely as possible is undergoing development. The method is based, straightforwardly, on the established concept of hexagonal close packing; hence, the development efforts are focused on fixtures and techniques for practical implementation of hexagonal close packing of parallel optical fibers.
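A small sketch of the hexagonal close packing geometry underlying the method: fiber centers sit on rows offset by half a diameter and spaced d*sqrt(3)/2 apart, so every interior fiber touches six neighbors. The parameters are illustrative.

```python
import math

def hex_pack_centers(rows, cols, d=1.0):
    """Centers of close-packed circles of diameter d."""
    dy = d * math.sqrt(3) / 2                     # row pitch for touching circles
    return [((c + 0.5 * (r % 2)) * d, r * dy)
            for r in range(rows) for c in range(cols)]

for x, y in hex_pack_centers(3, 4):
    print(f"({x:.2f}, {y:.2f})")
```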
Microsiemens or Milligrams: Measures of Ionic Mixtures
In December of 2016, EPA released the Draft Field-Based Methods for Developing Aquatic Life Criteria for Specific Conductivity for public comment. Once final, states and authorized tribes may use these methods to derive field-based ecoregional ambient Aquatic Life Ambient Water Q...
Recent technological advances have driven rapid development of DNA-based methods designed to facilitate detection and monitoring of invasive species in aquatic environments. These tools promise to significantly alleviate difficulties associated with traditional monitoring approac...
Governance for public health and health equity: The Tröndelag model for public health work.
Lillefjell, Monica; Magnus, Eva; Knudtsen, Margunn SkJei; Wist, Guri; Horghagen, Sissel; Espnes, Geir Arild; Maass, Ruca; Anthun, Kirsti Sarheim
2018-06-01
Multi-sectoral governance of population health is linked to the realization that health is the property of many societal systems. This study aims to contribute knowledge and methods that can strengthen the capacity of municipalities to work in a more systematic, knowledge-based and multi-sectoral way in promoting health and health equity in the population. A process evaluation was conducted, applying a mixed-methods research design combining qualitative and quantitative data collection methods. Processes strengthening the systematic and multi-sectoral development, implementation and evaluation of research-based measures to promote health, quality of life, and health equity in, for and with municipalities were revealed. A step-by-step model that emphasizes the promotion of knowledge-based, systematic, multi-sectoral public health work, as well as joint ownership of local resources, initiatives and policies, has been developed. Implementation of systematic, knowledge-based and multi-sectoral governance of public health measures in municipalities demands a shared understanding of the challenges, an updated overview of population health and impact factors, anchoring in plans, new skills and methods for the selection and implementation of measures, and the development of trust, ownership, and shared ethics and goals among those involved.
NASA Astrophysics Data System (ADS)
Pfefer, Joshua; Agrawal, Anant
2012-03-01
In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.
Milne, Marjorie E; Steward, Christopher; Firestone, Simon M; Long, Sam N; O'Brien, Terrence J; Moffat, Bradford A
2016-04-01
OBJECTIVE To develop representative MRI atlases of the canine brain and to evaluate 3 methods of atlas-based segmentation (ABS). ANIMALS 62 dogs without clinical signs of epilepsy and without MRI evidence of structural brain disease. PROCEDURES The MRI scans from 44 dogs were used to develop 4 templates on the basis of brain shape (brachycephalic, mesaticephalic, dolichocephalic, and combined mesaticephalic and dolichocephalic). Atlas labels were generated by segmenting the brain, ventricular system, hippocampal formation, and caudate nuclei. The MRI scans from the remaining 18 dogs were used to evaluate 3 methods of ABS (manual brain extraction and application of a brain shape-specific template [A], automatic brain extraction and application of a brain shape-specific template [B], and manual brain extraction and application of a combined template [C]). The performance of each ABS method was compared by calculation of the Dice and Jaccard coefficients, with manual segmentation used as the gold standard. RESULTS Method A had the highest mean Jaccard coefficient and was the most accurate ABS method assessed. Measures of overlap for ABS methods that used manual brain extraction (A and C) ranged from 0.75 to 0.95 and compared favorably with repeated measures of overlap for manual extraction, which ranged from 0.88 to 0.97. CONCLUSIONS AND CLINICAL RELEVANCE Atlas-based segmentation was an accurate and repeatable method for segmentation of canine brain structures. It could be performed more rapidly than manual segmentation, which should allow the application of computer-assisted volumetry to large data sets and clinical cases and facilitate neuroimaging research and disease diagnosis.
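A minimal sketch of the overlap metrics used above to score atlas-based against manual segmentation, computed on binary label volumes:

```python
import numpy as np

def dice_jaccard(a, b):
    """a, b: boolean (or 0/1) segmentation masks of equal shape."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    dice = 2.0 * inter / (a.sum() + b.sum())
    jaccard = inter / np.logical_or(a, b).sum()
    return dice, jaccard

rng = np.random.default_rng(2)
auto = rng.random((32, 32, 32)) > 0.5    # stand-in automatic segmentation
manual = auto.copy()
manual[0] ^= True                        # perturb one slice of the "manual" mask
print(dice_jaccard(auto, manual))
```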
Visvanathan, Rizliya; Jayathilake, Chathuni; Liyanage, Ruvini
2016-11-15
For the first time, a reliable, simple, rapid and high-throughput analytical method for the detection and quantification of α-amylase inhibitory activity using the glucose assay kit was developed. The new method facilitates rapid screening of a large number of samples, reduces labor, time and reagents and is also suitable for kinetic studies. This method is based on the reaction of maltose with glucose oxidase (GOD) and the development of a red quinone. The test is done in microtitre plates with a total volume of 260μL and an assay time of 40min including the pre-incubation steps. The new method is tested for linearity, sensitivity, precision, reproducibility and applicability. The new method is also compared with the most commonly used 3,5-dinitrosalicylic acid (DNSA) method for determining α-amylase activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
Using qualitative methods to develop a contextually tailored instrument: Lessons learned
Lee, Haeok; Kiang, Peter; Kim, Minjin; Semino-Asaro, Semira; Colten, Mary Ellen; Tang, Shirley S.; Chea, Phala; Peou, Sonith; Grigg-Saito, Dorcas C.
2015-01-01
Objective: To develop a population-specific instrument to inform hepatitis B virus (HBV) and human papilloma virus (HPV) prevention education and intervention based on data and evidence obtained from the targeted population of Khmer mothers reflecting their socio-cultural and health behaviors. Methods: The principles of community-based participatory research (CBPR) guided the development of a standardized survey interview. Four stages of development and testing of the survey instrument took place in order to inform the quantitative health survey used to collect data in stage five of the project. This article reports only on Stages 1-4. Results: This process created a new quantitative measure of HBV and HPV prevention behavior based on the revised Network Episode Model and informed by the targeted population. The CBPR method facilitated the application and translation of abstract theoretical ideas of HBV and HPV prevention behavior into culturally-relevant words and expressions of Cambodian Americans (CAs). Conclusions: The design of an instrument development process that accounts for distinctive socio-cultural backgrounds of CA refugee/immigrant women provides a model for use in developing future health surveys that are intended to aid minority-serving health care professionals and researchers as well as targeted minority populations. PMID:27981114
Zhang, Le; Lawson, Ken; Yeung, Bernice; Wypych, Jette
2015-01-06
A purity method based on capillary zone electrophoresis (CZE) has been developed for the separation of isoforms of a highly glycosylated protein. The separation was found to be driven by the number of sialic acids attached to each isoform. The method has been characterized using orthogonal assays and shown to have excellent specificity, precision and accuracy. We have demonstrated the CZE method is a useful in-process assay to support cell culture and purification development of this glycoprotein. Compared to isoelectric focusing (IEF), the CZE method provides more quantitative results and higher sample throughput with excellent accuracy, qualities that are required for process development. In addition, the CZE method has been applied in the stability testing of purified glycoprotein samples.
Kadam, Shantanu; Vanka, Kumar
2013-02-15
Methods based on the stochastic formulation of chemical kinetics have the potential to accurately reproduce the dynamical behavior of various biochemical systems of interest. However, the computational expense makes them impractical for the study of real systems. Attempts to render these methods practical have led to the development of accelerated methods, where the reaction numbers are modeled by Poisson random numbers. However, for certain systems, such methods give rise to physically unrealistic negative numbers for species populations. The methods which make use of binomial variables, in place of Poisson random numbers, have since become popular, and have been partially successful in addressing this problem. In this manuscript, the development of two new computational methods, based on the representative reaction approach (RRA), has been discussed. The new methods endeavor to solve the problem of negative numbers, by making use of tools like the stochastic simulation algorithm and the binomial method, in conjunction with the RRA. It is found that these newly developed methods perform better than other binomial methods used for stochastic simulations, in resolving the problem of negative populations. Copyright © 2012 Wiley Periodicals, Inc.
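A generic sketch (not the RRA itself) of why binomial updates avoid negative populations: the number of firings of a reaction in a step is drawn from a binomial capped by the available reactant molecules, whereas a Poisson draw has no such cap.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def binomial_step(n_reactant, rate, dt):
    """Advance a first-order reaction A -> B by dt without overshooting."""
    p = 1.0 - math.exp(-rate * dt)          # per-molecule firing probability
    fired = rng.binomial(n_reactant, p)     # can never exceed n_reactant
    return n_reactant - fired

n = 50
for _ in range(10):
    n = binomial_step(n, rate=2.0, dt=0.1)
print("remaining A molecules:", n)          # stays >= 0 by construction
```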
Tanuja, Penmatsa; Venugopal, Namburi; Sashidhar, Rao Beedu
2007-01-01
A simple thin-layer chromatography-digital image-based analytical method has been developed for the quantitation of the botanical pesticide azadirachtin. The method was validated by analyzing azadirachtin in spiked food matrixes and processed commercial pesticide formulations, using acidified vanillin reagent as a postchromatographic derivatizing agent. The separated azadirachtin was clearly identified as a green spot. The Rf value was found to be 0.55, identical to that of a reference standard. A standard calibration plot was established using a reference standard, based on linear regression analysis [r2 = 0.996; y = 371.43 + (634.82)x]. The sensitivity of the method was found to be 0.875 microg azadirachtin. Spiking studies conducted at the 1 ppm (microg/g) level in various agricultural matrixes, such as brinjal, tomato, coffee, and cotton seeds, revealed recoveries of azadirachtin in the range of 67-92%. The azadirachtin content of commercial neem formulations analyzed by the method was in the range of 190-1825 ppm (microg/mL). Further, the present method was compared with an immunoanalytical method, enzyme-linked immunosorbent assay (ELISA), developed earlier in our laboratory. Statistical comparison of the 2 methods, using Fisher's F-test, indicated no significant difference in variance, suggesting that the methods are comparable.
NASA Astrophysics Data System (ADS)
Prigozhin, Leonid; Sokolovsky, Vladimir
2018-05-01
We consider the fast Fourier transform (FFT) based numerical method for thin film magnetization problems (Vestgården and Johansen 2012 Supercond. Sci. Technol. 25 104001), compare it with the finite element methods, and evaluate its accuracy. Proposed modifications of this method implementation ensure stable convergence of iterations and enhance its efficiency. A new method, also based on the FFT, is developed for 3D bulk magnetization problems. This method is based on a magnetic field formulation, different from the popular h-formulation of eddy current problems typically employed with the edge finite elements. The method is simple, easy to implement, and can be used with a general current–voltage relation; its efficiency is illustrated by numerical simulations.
ERIC Educational Resources Information Center
Eryilmaz, Ali
2015-01-01
The aim of the present study is to investigate the effectiveness of a teaching method, based on subjective well-being-increasing activities and engagement-increasing activities, that was developed for university students in the present study. The method of the present study is a mixed method. Thus, the most important feature of it has…
NASA Technical Reports Server (NTRS)
Saravanos, D. A.
1993-01-01
The development of novel composite mechanics for the analysis of damping in composite laminates and structures and the more significant results of this effort are summarized. Laminate mechanics based on piecewise continuous in-plane displacement fields are described that can represent both intralaminar stresses and interlaminar shear stresses and the associated effects on the stiffness and damping characteristics of a composite laminate. Among other features, the mechanics can accurately model the static and damped dynamic response of either thin or thick composite laminates, as well as specialty laminates with embedded compliant damping layers. The discrete laminate damping theory is further incorporated into structural analysis methods. In this context, an exact semi-analytical method was developed for the simulation of the damped dynamic response of composite plates. A finite-element-based method and a specialty four-node plate element were also developed for the analysis of composite structures of variable shape and boundary conditions. Numerous evaluations and applications demonstrate the quality and superiority of the mechanics in predicting the damped dynamic characteristics of composite structures. Finally, work focused on the development of optimal tailoring methods for the design of thick composite structures based on the developed analytical capability. Applications to composite plates illustrated the influence of composite mechanics on the optimal design of composites and the potential for significant deviations in the resulting designs when more simplified (classical) laminate theories are used.
Flight-Test Evaluation of Flutter-Prediction Methods
NASA Technical Reports Server (NTRS)
Lind, RIck; Brenner, Marty
2003-01-01
The flight-test community routinely spends considerable time and money to determine a range of flight conditions, called a flight envelope, within which an aircraft is safe to fly. The cost of determining a flight envelope could be greatly reduced if there were a method of safely and accurately predicting the speed associated with the onset of an instability called flutter. Several methods have been developed with the goal of predicting flutter speeds to improve the efficiency of flight testing. These methods include (1) data-based methods, in which one relies entirely on information obtained from the flight tests and (2) model-based approaches, in which one relies on a combination of flight data and theoretical models. The data-driven methods include one based on extrapolation of damping trends, one that involves an envelope function, one that involves the Zimmerman-Weissenburger flutter margin, and one that involves a discrete-time auto-regressive model. An example of a model-based approach is that of the flutterometer. These methods have all been shown to be theoretically valid and have been demonstrated on simple test cases; however, until now, they have not been thoroughly evaluated in flight tests. An experimental apparatus called the Aerostructures Test Wing (ATW) was developed to test these prediction methods.
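A hedged sketch of the simplest data-driven approach named above, damping-trend extrapolation: fit modal damping versus airspeed and extrapolate to the zero crossing, which estimates the flutter onset speed. The data are synthetic.

```python
import numpy as np

speed = np.array([150., 170., 190., 210., 230.])         # knots, synthetic
damping = np.array([0.062, 0.051, 0.038, 0.027, 0.014])  # modal damping ratios

a, b = np.polyfit(speed, damping, 1)              # damping ~= a * speed + b
print("predicted flutter onset speed:", -b / a)   # damping crosses zero here
```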
Determination of cellulose I crystallinity by FT-Raman spectroscopy
Umesh P. Agarwal; Richard S. Reiner; Sally A. Ralph
2009-01-01
Two new methods based on FT-Raman spectroscopy, one simple, based on band intensity ratio, and the other, using a partial least-squares (PLS) regression model, are proposed to determine cellulose I crystallinity. In the simple method, crystallinity in semicrystalline cellulose I samples was determined based on univariate regression that was first developed using the...
ERIC Educational Resources Information Center
Said, Asnah; Syarif, Edy
2016-01-01
This research aimed to evaluate an online tutorial program design applying problem-based learning to the Research Methods course currently implemented in the Open Distance Learning (ODL) system. The students must take a Research Methods course to prepare themselves for academic writing projects. Problem-based learning basically emphasizes the process of…
Data-Based Decision-Making: Developing a Method for Capturing Teachers' Understanding of CBM Graphs
ERIC Educational Resources Information Center
Espin, Christine A.; Wayman, Miya Miura; Deno, Stanley L.; McMaster, Kristen L.; de Rooij, Mark
2017-01-01
In this special issue, we explore the decision-making aspect of "data-based decision-making". The articles in the issue address a wide range of research questions, designs, methods, and analyses, but all focus on data-based decision-making for students with learning difficulties. In this first article, we introduce the topic of…
Efficient and accurate adverse outcome pathway (AOP) based high-throughput screening (HTS) methods use a systems biology based approach to computationally model in vitro cellular and molecular data for rapid chemical prioritization; however, not all HTS assays are grounded by rel...
Metallic Nanostructures Based on DNA Nanoshapes
Shen, Boxuan; Tapio, Kosti; Linko, Veikko; Kostiainen, Mauri A.; Toppari, Jari Jussi
2016-01-01
Metallic nanostructures have inspired extensive research over several decades, particularly within the field of nanoelectronics and increasingly in plasmonics. Due to the limitations of conventional lithography methods, the development of bottom-up fabricated metallic nanostructures has become more and more in demand. The remarkable development of DNA-based nanostructures has provided many successful methods and realizations for these needs, such as chemical DNA metallization via seeding or ionization, as well as DNA-guided lithography and casting of metallic nanoparticles by DNA molds. These methods offer high resolution, versatility and throughput and could enable the fabrication of arbitrarily-shaped structures with a 10-nm feature size, thus bringing novel applications into view. In this review, we cover the evolution of DNA-based metallic nanostructures, starting from the metallized double-stranded DNA for electronics and progress to sophisticated plasmonic structures based on DNA origami objects. PMID:28335274
Application of artificial neural networks in nonlinear analysis of trusses
NASA Technical Reports Server (NTRS)
Alam, J.; Berke, L.
1991-01-01
A method is developed to incorporate a neural network model, based upon the backpropagation algorithm, for material response into nonlinear elastic truss analysis using the initial stiffness method. Different network configurations are developed to assess the accuracy of neural network modeling of nonlinear material response. In addition, a scheme based upon linear interpolation of material data is also implemented for comparison purposes. It is found that the neural network approach can yield very accurate results if used with care. For the type of problems under consideration, it offers a viable alternative to other material modeling methods.
Probabilistic methods for rotordynamics analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
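A plain Monte Carlo sketch of the quantity being computed (the paper's fast probability integration and adaptive importance sampling are far more efficient): the probability that a random second-order system M x'' + C x' + K x = 0 has an eigenvalue with positive real part. The parameter distributions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def unstable(c, k):
    A = np.array([[0.0, 1.0], [-k, -c]])     # first-order (companion) form, M = 1
    return np.real(np.linalg.eigvals(A)).max() > 0

trials = 20_000
hits = sum(unstable(rng.normal(0.05, 0.05), rng.normal(1.0, 0.1))
           for _ in range(trials))
print("P(instability) ~=", hits / trials)
```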
Dai, Sheng-Yun; Xu, Bing; Shi, Xin-Yuan; Xu, Xiang; Sun, Ying-Qiang; Qiao, Yan-Jiang
2017-03-01
This study aimed to propose a continual improvement strategy based on quality by design (QbD). As an example, an ultra-high performance liquid chromatography (UPLC) method was developed to accomplish the method transfer from HPLC to UPLC for Panax notoginseng saponins (PNS) and achieve continual improvement of the PNS method based on QbD. Plackett-Burman screening design and Box-Behnken optimization design were employed to further understand the relationship between the critical method parameters (CMPs) and critical method attributes (CMAs), and then the Bayesian design space was built. The separation degree of the critical peaks (ginsenoside Rg₁ and ginsenoside Re) was over 2.0 and the analysis time was less than 17 min for a method chosen from the design space, with 20% initial acetonitrile concentration, 10 min of isocratic time and a gradient slope of 6%•min⁻¹. Finally, the optimum method was validated by accuracy profile. Based on the same analytical target profile (ATP), a comparison of HPLC and UPLC, including the chromatographic method, CMA identification, the CMP-CMA model and the system suitability test (SST), indicated that the UPLC method could shorten the analysis time, improve the critical separation and satisfy the requirements of the SST. In all, the HPLC method could be replaced by UPLC for the quantitative analysis of PNS. Copyright© by the Chinese Pharmaceutical Association.
Matsumoto, Hiroshi; Saito, Fumiyo; Takeyoshi, Masahiro
2015-12-01
Recently, the development of several gene expression-based prediction methods has been attempted in the field of toxicology. CARCINOscreen® is a gene expression-based screening method to predict the carcinogenicity of chemicals targeting the liver with high accuracy. In this study, we investigated the applicability of the gene expression-based screening method to SD and Wistar rats by using CARCINOscreen®, originally developed with F344 rats, with two carcinogens, 2,4-diaminotoluene and thioacetamide, and two non-carcinogens, 2,6-diaminotoluene and sodium benzoate. After a 28-day repeated dose test was conducted with each chemical in SD and Wistar rats, microarray analysis was performed using total RNA extracted from each liver. The obtained gene expression data were applied to CARCINOscreen®. Prediction scores obtained by CARCINOscreen® for the known carcinogens were > 2 in all strains of rats, while the non-carcinogens gave prediction scores below 0.5. These results suggest that the gene expression-based screening method, CARCINOscreen®, can be applied to SD and Wistar rats, widely used strains in toxicological studies, by setting an appropriate prediction-score boundary to classify chemicals into carcinogens and non-carcinogens.
Mesh quality oriented 3D geometric vascular modeling based on parallel transport frame.
Guo, Jixiang; Li, Shun; Chui, Yim Pan; Qin, Jing; Heng, Pheng Ann
2013-08-01
While a number of methods have been proposed to reconstruct geometrically and topologically accurate 3D vascular models from medical images, little attention has been paid to constantly maintain high mesh quality of these models during the reconstruction procedure, which is essential for many subsequent applications such as simulation-based surgical training and planning. We propose a set of methods to bridge this gap based on parallel transport frame. An improved bifurcation modeling method and two novel trifurcation modeling methods are developed based on 3D Bézier curve segments in order to ensure the continuous surface transition at furcations. In addition, a frame blending scheme is implemented to solve the twisting problem caused by frame mismatch of two successive furcations. A curvature based adaptive sampling scheme combined with a mesh quality guided frame tilting algorithm is developed to construct an evenly distributed, non-concave and self-intersection free surface mesh for vessels with distinct radius and high curvature. Extensive experiments demonstrate that our methodology can generate vascular models with better mesh quality than previous methods in terms of surface mesh quality criteria. Copyright © 2013 Elsevier Ltd. All rights reserved.
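A minimal sketch of the building block named above, a 3D cubic Bezier segment evaluated from four control points (illustrative values); the method joins such segments smoothly at furcations.

```python
import numpy as np

def bezier3(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter values t in [0, 1]."""
    t = np.asarray(t)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

P = [np.array(p, float) for p in [(0, 0, 0), (1, 0, 0), (2, 1, 0), (3, 1, 1)]]
curve = bezier3(*P, np.linspace(0, 1, 50))   # 50 centerline samples
print(curve[0], curve[-1])                   # endpoints coincide with p0 and p3
```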
A Quantum-Based Similarity Method in Virtual Screening.
Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal
2015-10-02
One of the most widely-used techniques for ligand-based virtual screening is similarity searching. This study adopted concepts of quantum mechanics to present a state-of-the-art similarity method for molecules inspired by quantum theory. The representation of molecular compounds in a mathematical quantum space plays a vital role in the development of the quantum-based similarity approach. One of the key concepts of quantum theory is the use of complex numbers; hence, this study proposed three techniques to embed and re-represent molecular compounds in a complex-number format. The quantum-based similarity method developed in this study, which depends on the complex pure Hilbert space of molecules, is called Standard Quantum-Based (SQB). Recall of retrieved active molecules was measured at the top 1% and top 5% of the ranked output, and significance tests were used to evaluate the proposed methods. The MDL Drug Data Report (MDDR), Maximum Unbiased Validation (MUV) and Directory of Useful Decoys (DUD) data sets were used for experiments and were represented by 2D fingerprints. Simulated virtual screening experiments show that the effectiveness of the SQB method was significantly increased, owing to the representational power of molecular compounds in complex-number form, compared to the Tanimoto benchmark similarity measure.
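For reference, a sketch of the Tanimoto benchmark that SQB is compared against (the quantum-based measure itself is not reproduced here), on binary 2D fingerprints represented as sets of on-bit positions:

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto coefficient of two binary fingerprints (as sets of on bits)."""
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

query = {1, 5, 9, 42, 77}            # hypothetical on bits
candidate = {1, 5, 10, 42, 80, 91}
print(tanimoto(query, candidate))    # 3 / (5 + 6 - 3) = 0.375
```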
An Ellipsoidal Particle-Finite Element Method for Hypervelocity Impact Simulation. Chapter 1
NASA Technical Reports Server (NTRS)
Shivarama, Ravishankar; Fahrenthold, Eric P.
2004-01-01
A number of coupled particle-element and hybrid particle-element methods have been developed for the simulation of hypervelocity impact problems, to avoid certain disadvantages associated with the use of pure continuum-based or pure particle-based methods. To date these methods have employed spherical particles. In recent work a hybrid formulation has been extended to the ellipsoidal particle case. A model formulation approach based on Lagrange's equations, with particle entropies serving as generalized coordinates, avoids the angular momentum conservation problems that have been reported with ellipsoidal smoothed particle hydrodynamics models.
Vissenberg, Charlotte; Nierkens, Vera; Uitewaal, Paul J. M.; Middelkoop, Barend J. C.; Nijpels, Giel; Stronks, Karien
2017-01-01
This article describes the development of the social network-based intervention Powerful Together with Diabetes which aims to improve diabetes self-management (DSM) among patients with type 2 diabetes living in socioeconomically deprived neighborhoods by stimulating social support for DSM and diminishing social influences hindering DSM (e.g., peer pressure and social norms). The intervention was specifically developed for patients with Dutch, Turkish, Moroccan, and Surinamese backgrounds. The intervention was developed according to Intervention Mapping. This article describes the first four steps of Intervention Mapping: (1) the needs assessment; (2) development of performance and change objectives; (3) selection of theory-based methods and strategies; and (4) the translation of these into an organized program. These four steps resulted in Powerful Together with Diabetes, a 10-month group-based intervention consisting of 24 meetings, 6 meetings for significant others, and 2 meetings for participants and their spouses. The IM method resulted in a tailored approach with a specific focus on the social networks of its participants. This article concludes that the IM method helped our planning team to tailor the intervention to the needs of our target population and facilitated our evaluation design. However, in hindsight, the intervention could have been improved by investing more in participatory planning and community involvement. PMID:29326916
Some important considerations in the development of stress corrosion cracking test methods.
NASA Technical Reports Server (NTRS)
Wei, R. P.; Novak, S. R.; Williams, D. P.
1972-01-01
Some of the precautions required in the development of fracture-mechanics-based test methods for studying stress corrosion cracking are discussed. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of crack-growth kinetics, which determine time to failure and life, are examined. It is shown that the basic assumptions of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and non-steady-state crack growth must also be properly taken into account in determining crack growth kinetics, if valid data are to be obtained from fracture-mechanics-based test methods.
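As a worked example of the linear-elastic quantity underlying such test methods, the stress intensity factor K_I = Y * sigma * sqrt(pi * a) for a crack of depth a under remote stress sigma, where Y is a geometry factor; the values below are illustrative, not from the paper.

```python
import math

def k_i(stress_mpa, crack_m, Y=1.12):
    """Mode-I stress intensity factor; Y = 1.12 approximates an edge crack."""
    return Y * stress_mpa * math.sqrt(math.pi * crack_m)

print(k_i(300.0, 0.005), "MPa*sqrt(m)")   # ~42 for a 5 mm crack at 300 MPa
```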
Frequency analysis of uncertain structures using imprecise probability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Modares, Mehdi; Bergerson, Joshua
2015-01-01
Two new methods for finite element based frequency analysis of a structure with uncertainty are developed. An imprecise probability formulation based on enveloping p-boxes is used to quantify the uncertainty present in the mechanical characteristics of the structure. For each element, independent variations are considered. Using the two developed methods, P-box Frequency Analysis (PFA) and Interval Monte-Carlo Frequency Analysis (IMFA), sharp bounds on natural circular frequencies at different probability levels are obtained. These methods establish a framework for handling incomplete information in structural dynamics. Numerical example problems are presented that illustrate the capabilities of the new methods along with discussions on their computational efficiency.
Wright, Adam; Laxmisan, Archana; Ottosen, Madelene J; McCoy, Jacob A; Butten, David; Sittig, Dean F
2012-01-01
Objective We describe a novel, crowdsourcing method for generating a knowledge base of problem–medication pairs that takes advantage of manually asserted links between medications and problems. Methods Through iterative review, we developed metrics to estimate the appropriateness of manually entered problem–medication links for inclusion in a knowledge base that can be used to infer previously unasserted links between problems and medications. Results Clinicians manually linked 231 223 medications (55.30% of prescribed medications) to problems within the electronic health record, generating 41 203 distinct problem–medication pairs, although not all were accurate. We developed methods to evaluate the accuracy of the pairs, and after limiting the pairs to those meeting an estimated 95% appropriateness threshold, 11 166 pairs remained. The pairs in the knowledge base accounted for 183 127 total links asserted (76.47% of all links). Retrospective application of the knowledge base linked 68 316 medications not previously linked by a clinician to an indicated problem (36.53% of unlinked medications). Expert review of the combined knowledge base, including inferred and manually linked problem–medication pairs, found a sensitivity of 65.8% and a specificity of 97.9%. Conclusion Crowdsourcing is an effective, inexpensive method for generating a knowledge base of problem–medication pairs that is automatically mapped to local terminologies, up-to-date, and reflective of local prescribing practices and trends. PMID:22582202
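The reported sensitivity and specificity follow from a standard confusion matrix; the counts below are hypothetical placeholders chosen to reproduce the quoted percentages, not the study's data.

```python
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

print(sens_spec(tp=658, fn=342, tn=979, fp=21))   # -> (0.658, 0.979)
```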
NASA Astrophysics Data System (ADS)
Sabale, Pramod M.; George, Jerrin Thomas; Srivatsan, Seergazhi G.
2014-08-01
Given the biological and therapeutic significance of telomeres and other G-quadruplex forming sequences in the human genome, it is highly desirable to develop simple methods to study these structures, which can also be implemented in screening formats for the discovery of G-quadruplex binders. The majority of telomere detection methods developed so far are laborious and use elaborate assay and instrumental setups, and hence are not amenable to discovery platforms. Here, we describe the development of a simple homogeneous fluorescence turn-on method, which uses a unique combination of an environment-sensitive fluorescent nucleobase analogue, the superior base pairing property of PNA, and the DNA-binding and fluorescence quenching properties of graphene oxide, to detect human telomeric DNA repeats of varying lengths. Our results demonstrate that this method, which does not involve a rigorous assay setup, would provide new opportunities to study G-quadruplex structures. Electronic supplementary information (ESI) available: figures, tables, experimental procedures and NMR spectra. See DOI: 10.1039/c4nr00878b
2014-08-01
Methodology: The methods for this research followed qualitative research methods based on specific criteria (Creswell, 2007) with a constructionist ... strategic questions are being asked, and what methods are being used to develop insight that help guide long-term strategies and short-term ... methods used to develop insight. *Note: The term VUCA stands for volatility, uncertainty, complexity, and ambiguity and is used interchangeably in
Behaviour of levee on soft soil caused by rapid drawdown
NASA Astrophysics Data System (ADS)
Upomo, Togani Cahyadi; Effendi, Mahmud Kori; Kusumawardani, Rini
2018-03-01
Rapid drawdown is a condition in which a water level that has reached its peak suddenly drops. While the water level is at its peak, the hydrostatic pressure helps stabilize the slope. When the water level decreases there are two effects: first, reduced hydrostatic pressure, and second, modification of the pore water pressure. Rapid drawdown commonly occurs in hydraulic structures such as dams and levees. This study discusses the behaviour of a levee on soft soil subjected to rapid drawdown. The analysis is based on the method developed by the US Army Corps of Engineers and the modified method developed by Duncan, Wright, and Wong. Results of the analysis show that in the drawdown condition, for a 1 m drop in water level, the safety factors obtained with the US Army Corps of Engineers method were 1.16 and 0.976, while those based on the Duncan, Wright, and Wong method were 1.244 and 1.117. For a 0.5 m drop in water level, the safety factors based on the US Army Corps of Engineers method were 1.287 and 1.09, while those for Duncan, Wright, and Wong were 1.357 and 1.194.
Dong, Xinran; Hao, Yun; Wang, Xiao; Tian, Weidong
2016-01-01
Pathway or gene set over-representation analysis (ORA) has become a routine task in functional genomics studies. However, currently widely used ORA tools employ statistical methods such as Fisher’s exact test that reduce a pathway into a list of genes, ignoring the constitutive functional non-equivalent roles of genes and the complex gene-gene interactions. Here, we develop a novel method named LEGO (functional Link Enrichment of Gene Ontology or gene sets) that takes into consideration these two types of information by incorporating network-based gene weights in ORA analysis. In three benchmarks, LEGO achieves better performance than Fisher and three other network-based methods. To further evaluate LEGO’s usefulness, we compare LEGO with five gene expression-based and three pathway topology-based methods using a benchmark of 34 disease gene expression datasets compiled by a recent publication, and show that LEGO is among the top-ranked methods in terms of both sensitivity and prioritization for detecting target KEGG pathways. In addition, we develop a cluster-and-filter approach to reduce the redundancy among the enriched gene sets, making the results more interpretable to biologists. Finally, we apply LEGO to two lists of autism genes, and identify relevant gene sets to autism that could not be found by Fisher. PMID:26750448
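The core idea of incorporating network-based gene weights into over-representation analysis can be sketched as follows (a hypothetical weighted variant for illustration only; LEGO's actual scoring, weighting scheme, and null model differ in detail):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs: per-gene network-derived weights and one pathway's member set.
genes = [f"g{i}" for i in range(1000)]
weight = dict(zip(genes, rng.uniform(0.1, 1.0, size=1000)))   # stand-in for network weights
pathway = set(genes[:50])
hits = set(rng.choice(genes, size=100, replace=False))        # e.g. differentially expressed genes

def score(hit_set):
    # Weighted overlap: unlike Fisher's exact test, genes contribute unequally.
    return sum(weight[g] for g in hit_set & pathway)

observed = score(hits)
null = [score(set(rng.choice(genes, size=len(hits), replace=False))) for _ in range(2000)]
p = (1 + sum(s >= observed for s in null)) / (1 + len(null))  # permutation p-value
print(f"weighted enrichment score {observed:.2f}, permutation p = {p:.4f}")
```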
Liang, Liang; Liu, Minliang; Martin, Caitlin; Sun, Wei
2018-05-09
Advances in structural finite element analysis (FEA) and medical imaging have made it possible to investigate the in vivo biomechanics of human organs such as blood vessels, for which organ geometries at the zero-pressure level need to be recovered. Although FEA-based inverse methods are available for zero-pressure geometry estimation, these methods typically require iterative computation, which is time-consuming and may not be suitable for time-sensitive clinical applications. In this study, using machine learning (ML) techniques, we developed an ML model to estimate the zero-pressure geometry of the human thoracic aorta given two pressurized geometries of the same patient at two different blood pressure levels. For the ML model development, an FEA-based method was used to generate a dataset of aorta geometries of 3125 virtual patients. The ML model, which was trained and tested on the dataset, is capable of recovering zero-pressure geometries consistent with those generated by the FEA-based method. Thus, this study demonstrates the feasibility and great potential of using ML techniques as a fast surrogate of FEA-based inverse methods to recover zero-pressure geometries of human organs.
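In outline, such a surrogate is a supervised regression from two pressurized shapes to the zero-pressure shape. Below is a minimal sketch on synthetic data; the mesh size, the toy inflation model, and the linear estimator are assumptions for illustration, not the paper's architecture:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_patients, n_coords = 3125, 300   # e.g. 100 mesh nodes x 3 coordinates (assumed)

# Synthetic stand-in for the FEA-generated dataset: a zero-pressure shape plus
# two pressure-inflated versions of it with patient-specific stiffness.
zero_p = rng.normal(size=(n_patients, n_coords))
stiff = rng.uniform(0.5, 1.5, size=(n_patients, 1))
geom_p1 = zero_p * (1 + 0.05 / stiff)          # shape at the lower pressure
geom_p2 = zero_p * (1 + 0.09 / stiff)          # shape at the higher pressure

X = np.hstack([geom_p1, geom_p2])              # inputs: the two pressurized geometries
model = Ridge(alpha=1e-3).fit(X[:3000], zero_p[:3000])
pred = model.predict(X[3000:])
print("held-out RMSE:", np.sqrt(np.mean((pred - zero_p[3000:]) ** 2)))
```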
NASA Astrophysics Data System (ADS)
Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon
2015-10-01
An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop for the first time an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, size-exclusion chromatography was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained an SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be used generally for SI-traceable absolute quantification of proteins, especially pure-protein standards.
Development of microwave assisted spectrophotometric method for the determination of glucose
NASA Astrophysics Data System (ADS)
Ali, Asif; Hussain, Zahid; Arain, Muhammad Balal; Shah, Nasrullah; Khan, Khalid Mohammad; Gulab, Hussain; Zada, Amir
2016-01-01
A spectrophotometric method was developed based on the microwave-assisted synthesis of a Maillard product. Various conditions of the reaction were optimized by varying the relative concentrations of the reagents, the operating temperature and the volumes of solutions used in the reaction in the microwave synthesizer. The absorbance of the microwave-synthesized Maillard product was measured in the range of 360-740 nm using a UV-Visible spectrophotometer. Based on the maximum absorbance, 370 nm was selected as the optimum wavelength for further studies. The LOD and LOQ for glucose were found to be 3.08 μg mL⁻¹ and 9.33 μg mL⁻¹, with a standard deviation of ±0.05. The developed method was also applicable to urine samples.
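For context, detection and quantification limits of this kind are commonly estimated from the calibration curve as LOD = 3.3σ/S and LOQ = 10σ/S (σ: residual standard deviation of the response, S: slope). A small sketch with made-up calibration data — the paper's exact procedure is not stated in the abstract:

```python
import numpy as np

conc = np.array([5, 10, 20, 40, 80.0])                  # glucose standards, ug/mL (made up)
absorb = np.array([0.052, 0.101, 0.198, 0.405, 0.810])  # absorbance at 370 nm (made up)

slope, intercept = np.polyfit(conc, absorb, 1)          # linear calibration fit
resid_sd = np.std(absorb - (slope * conc + intercept), ddof=2)

print(f"LOD = {3.3 * resid_sd / slope:.2f} ug/mL")      # ICH-style 3.3*sigma/S
print(f"LOQ = {10 * resid_sd / slope:.2f} ug/mL")
```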
Airframe Icing Research Gaps: NASA Perspective
NASA Technical Reports Server (NTRS)
Potapczuk, Mark
2009-01-01
Current Airframe Icing Technology Gaps: Development of a full 3D ice accretion simulation model. Development of an improved simulation model for SLD conditions. CFD modeling of stall behavior for ice-contaminated wings/tails. Computational methods for simulation of stability and control parameters. Analysis of thermal ice protection system performance. Quantification of 3D ice shape geometric characteristics. Development of accurate ground-based simulation of SLD conditions. Development of scaling methods for SLD conditions. Development of advanced diagnostic techniques for assessment of tunnel cloud conditions. Identification of critical ice shapes for aerodynamic performance degradation. Aerodynamic scaling issues associated with testing scale model ice shape geometries. Development of altitude scaling methods for thermal ice protection systems. Development of accurate parameter identification methods. Measurement of stability and control parameters for an ice-contaminated swept wing aircraft. Creation of control law modifications to prevent loss of control during icing encounters. 3D ice shape geometries. Collection efficiency data for ice shape geometries. SLD ice shape data, in-flight and ground-based, for simulation verification. Aerodynamic performance data for 3D geometries and various icing conditions. Stability and control parameter data for iced aircraft configurations. Thermal ice protection system data for simulation validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael S. Zhdanov
2005-03-09
The research during the first year of the project was focused on developing the foundations of a new geophysical technique for mineral exploration and mineral discrimination, based on electromagnetic (EM) methods. The proposed new technique is based on examining the spectral induced polarization (IP) effects in electromagnetic data using modern distributed acquisition systems and advanced methods of 3-D inversion. The analysis of IP phenomena is usually based on models with a frequency-dependent complex conductivity distribution. One of the most popular is the Cole-Cole relaxation model. In this progress report we have constructed and analyzed a different physical and mathematical model of the IP effect based on the effective-medium theory. We have developed a rigorous mathematical model of multi-phase conductive media, which can provide a quantitative tool for evaluation of the type of mineralization, using the conductivity relaxation model parameters. The parameters of the new conductivity relaxation model can be used for discrimination of the different types of rock formations, which is an important goal in mineral exploration. The solution of this problem requires development of an effective numerical method for EM forward modeling in 3-D inhomogeneous media. During the first year of the project we have developed a prototype 3-D IP modeling algorithm using the integral equation (IE) method. Our IE forward modeling code INTEM3DIP is based on the contraction IE method, which improves the convergence rate of the iterative solvers. This code can handle various types of sources and receivers to compute the effect of a complex resistivity model. We have tested the working version of the INTEM3DIP code for computer simulation of the IP data for several models, including a southwest US porphyry model and a Kambalda-style nickel sulfide deposit. The numerical modeling study clearly demonstrates how the various complex resistivity models manifest differently in the observed EM data. These modeling studies lay a background for future development of the IP inversion method, directed at determining the electrical conductivity and the intrinsic chargeability distributions, as well as the other parameters of the relaxation model simultaneously. The new technology envisioned in this proposal will be used for the discrimination of different rocks, and in this way will provide an ability to distinguish between uneconomic mineral deposits and the location of zones of economic mineralization and geothermal resources.
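The Cole-Cole relaxation model mentioned above is commonly written (in the Pelton form) as ρ(ω) = ρ₀[1 − η(1 − 1/(1 + (iωτ)^C))], with chargeability η, time constant τ and frequency exponent C. A short sketch evaluating it over frequency — the parameter values are illustrative only, not from the report:

```python
import numpy as np

def cole_cole_resistivity(omega, rho0=100.0, eta=0.3, tau=0.01, c=0.5):
    # Pelton-form complex resistivity: dc resistivity rho0, chargeability eta,
    # relaxation time tau, frequency exponent c (all values here are made up).
    return rho0 * (1 - eta * (1 - 1 / (1 + (1j * omega * tau) ** c)))

omega = 2 * np.pi * np.logspace(-2, 4, 7)       # angular frequencies, rad/s
for w, r in zip(omega, cole_cole_resistivity(omega)):
    print(f"omega={w:10.3f}  |rho|={abs(r):7.2f}  phase={np.angle(r, deg=True):6.2f} deg")
```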
NASA Astrophysics Data System (ADS)
Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan
Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e., engineers and artists. VGSD includes heterogeneous disciplines, e.g., creative arts, game/content design, and software engineering. Improving team collaboration and process support thus remains an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practices can help game developers improve game development processes in a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper presents (a) first results with a focus on process/method support and (b) a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.
Trimming Line Design using New Development Method and One Step FEM
NASA Astrophysics Data System (ADS)
Chung, Wan-Jin; Park, Choon-Dal; Yang, Dong-yol
2005-08-01
In most automobile panel manufacturing, trimming is generally performed prior to flanging. Finding a feasible trimming line is crucial to obtaining an accurate edge profile after flanging. The section-based method develops the blank along section planes and finds the trimming line by generating a loop of end points; it suffers from inaccurate results in regions with out-of-section motion. On the other hand, the simulation-based method can produce a more accurate trimming line through an iterative strategy. However, due to time limitations and the lack of information in initial die design, it is still not widely accepted in industry. In this study, a new fast method to find a feasible trimming line is proposed. One-step FEM is used to analyze the flanging process, because the desired final shape after flanging can be defined and most strain paths in flanging are simple. When using one-step FEM, the main obstacle is the generation of the initial guess. A robust initial-guess generation method is developed to handle badly shaped meshes, very different mesh sizes and undercut parts. The new method develops a 3D triangular mesh in a propagational way from the final mesh onto the drawing tool surface. In addition, to remedy mesh distortion during development, an energy minimization technique is utilized. The trimming line is extracted from the outer boundary after the one-step FEM simulation. This method shows many benefits, since the trimming line can be obtained in the early design stage. The developed method is successfully applied to complex industrial applications such as the flanging of fender and door outer panels.
Improved regulatory element prediction based on tissue-specific local epigenomic signatures
He, Yupeng; Gorkin, David U.; Dickel, Diane E.; Nery, Joseph R.; Castanon, Rosa G.; Lee, Ah Young; Shen, Yin; Visel, Axel; Pennacchio, Len A.; Ren, Bing; Ecker, Joseph R.
2017-01-01
Accurate enhancer identification is critical for understanding the spatiotemporal transcriptional regulation during development as well as the functional impact of disease-related noncoding genetic variants. Computational methods have been developed to predict the genomic locations of active enhancers based on histone modifications, but the accuracy and resolution of these methods remain limited. Here, we present an algorithm, regulatory element prediction based on tissue-specific local epigenetic marks (REPTILE), which integrates histone modification and whole-genome cytosine DNA methylation profiles to identify the precise location of enhancers. We tested the ability of REPTILE to identify enhancers previously validated in reporter assays. Compared with existing methods, REPTILE shows consistently superior performance across diverse cell and tissue types, and the enhancer locations are significantly more refined. We show that, by incorporating base-resolution methylation data, REPTILE greatly improves upon current methods for annotation of enhancers across a variety of cell and tissue types. REPTILE is available at https://github.com/yupenghe/REPTILE/. PMID:28193886
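Schematically, enhancer prediction of this kind is supervised classification over epigenomic features. A minimal stand-in follows (random forest on synthetic histone-modification and methylation features; REPTILE's actual feature construction, including tissue-specific deviation scores, is considerably more elaborate):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 4000
# Synthetic per-region features: H3K4me1 and H3K27ac signal, mean CG methylation.
h3k4me1 = rng.gamma(2.0, 1.0, n)
h3k27ac = rng.gamma(2.0, 1.0, n)
mcg = rng.beta(5, 2, n)

# Toy ground truth: enhancers tend to show high acetylation and low methylation.
is_enh = ((h3k27ac > 2.5) & (mcg < 0.6)).astype(int)

X = np.column_stack([h3k4me1, h3k27ac, mcg])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:3000], is_enh[:3000])
print("held-out accuracy:", clf.score(X[3000:], is_enh[3000:]))
```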
NASA Astrophysics Data System (ADS)
Deka, Jashmini; Mojumdar, Aditya; Parisse, Pietro; Onesti, Silvia; Casalis, Loredana
2017-03-01
Helicases are essential enzymes which are widespread in all life-forms. Due to their central role in nucleic acid metabolism, they are emerging as important targets for anti-viral, antibacterial and anti-cancer drugs. The development of easy, cheap, fast and robust biochemical assays to measure helicase activity, overcoming the limitations of the current methods, is a prerequisite for the discovery of helicase inhibitors through high-throughput screenings. We have developed a method which exploits the optical properties of DNA-conjugated gold nanoparticles (AuNP) and meets the required criteria. The method was tested with the catalytic domain of the human RecQ4 helicase and compared with a conventional FRET-based assay. The AuNP-based assay produced similar results but is simpler, more robust and cheaper than FRET. Therefore, our nanotechnology-based platform shows the potential to provide a useful alternative to the existing conventional methods for following helicase activity and for screening small-molecule libraries as potential helicase inhibitors.
A Strength-Based Approach to Teacher Professional Development
ERIC Educational Resources Information Center
Zwart, Rosanne C.; Korthagen, Fred A. J.; Attema-Noordewier, Saskia
2015-01-01
Based on positive psychology, self-determination theory and a perspective on teacher quality, this study proposes and examines a strength-based approach to teacher professional development. A mixed method pre-test/post-test design was adopted to study perceived outcomes of the approach for 93 teachers of six primary schools in the Netherlands and…
ERIC Educational Resources Information Center
Nickles, George
2007-01-01
This article describes using Work Action Analysis (WAA) as a method for identifying requirements for a web-based portal that supports a professional development program. WAA is a cognitive systems engineering method for modeling multi-agent systems to support design and evaluation. A WAA model of the professional development program of the…
Jung, Lan-Hee; Choi, Jeong-Hwa; Bang, Hyun-Mi; Shin, Jun-Ho; Heo, Young-Ran
2015-02-01
This research was conducted to compare lecture-and experience-based methods of nutritional education as well as provide fundamental data for developing an effective nutritional education program in elementary schools. A total of 110 students in three elementary schools in Jeollanam-do were recruited and randomly distributed in lecture-and experience-based groups. The effects of education on students' dietary knowledge, dietary behaviors, and dietary habits were analyzed using a pre/post-test. Lecture-and experience-based methods did not significantly alter total scores for dietary knowledge in any group, although lecture-based method led to improvement for some detailed questions. In the experience-based group, subjects showed significant alteration of dietary behaviors, whereas lecture-based method showed alteration of dietary habits. These outcomes suggest that lecture-and experience-based methods led to differential improvement of students' dietary habits, behaviors, and knowledge. To obtain better nutritional education results, both lectures and experiential activities need to be considered.
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of its software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest-neighbor prediction model performance on the same data set.
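The analogy-based idea reduces to predicting a new project's effort from its nearest historical neighbors in feature space. A toy sketch with invented project features follows — this is the generic k-NN estimation pattern, not the calibrated NASA model:

```python
import numpy as np

# Historical projects: [KSLOC, team experience (1-5), requirements volatility (1-5)]
X = np.array([[12, 3, 2], [45, 2, 4], [30, 4, 1], [80, 3, 3], [22, 5, 2.0]])
effort = np.array([40, 210, 90, 400, 60.0])    # person-months (made up)

def estimate(project, k=2):
    # Normalize features, then average the efforts of the k nearest analogues.
    scale = X.max(axis=0)
    d = np.linalg.norm(X / scale - np.asarray(project) / scale, axis=1)
    return effort[np.argsort(d)[:k]].mean()

print(estimate([35, 3, 2]))  # estimated person-months for a new project
```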
Googling DNA sequences on the World Wide Web.
Hajibabaei, Mehrdad; Singer, Gregory A C
2009-11-10
New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm, and implemented it for searching species-specific genomic sequences (DNA barcodes), using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages, developing pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data and provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
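The word-based idea can be illustrated with a tiny inverted index over overlapping sequence words (a hypothetical sketch: the word length and toy barcodes below are assumptions, and the published algorithm's Google Desktop integration is not reproduced here):

```python
from collections import defaultdict

def words(seq, w=8):
    # Break a sequence into overlapping fixed-length words, the searchable units.
    return {seq[i:i + w] for i in range(len(seq) - w + 1)}

library = {
    "barcode_A": "ACGTACGTGGTTAACCGGTT",
    "barcode_B": "TTGGCCAATTACGTACGTAA",
}

index = defaultdict(set)        # word -> barcode ids, as a search engine indexes terms
for name, seq in library.items():
    for word in words(seq):
        index[word].add(name)

query = "ACGTACGTGG"
hits = defaultdict(int)         # alignment-free scoring: count shared words
for word in words(query):
    for name in index.get(word, ()):
        hits[name] += 1
print(sorted(hits.items(), key=lambda kv: -kv[1]))
```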
Recent advances in Lanczos-based iterative methods for nonsymmetric linear systems
NASA Technical Reports Server (NTRS)
Freund, Roland W.; Golub, Gene H.; Nachtigal, Noel M.
1992-01-01
In recent years, there has been a true revival of the nonsymmetric Lanczos method. On the one hand, the possible breakdowns in the classical algorithm are now better understood, and so-called look-ahead variants of the Lanczos process have been developed, which remedy this problem. On the other hand, various new Lanczos-based iterative schemes for solving nonsymmetric linear systems have been proposed. This paper gives a survey of some of these recent developments.
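For reference, the classical nonsymmetric (two-sided) Lanczos recurrence that these methods build on is sketched below; the breakdown check is exactly the situation that look-ahead variants are designed to survive (a minimal sketch with no look-ahead implemented):

```python
import numpy as np

def two_sided_lanczos(A, b, c, k, tol=1e-12):
    """Build biorthogonal bases V, W such that W.T @ A @ V is tridiagonal."""
    n = len(b)
    V, W = np.zeros((n, k)), np.zeros((n, k))
    V[:, 0] = b / np.linalg.norm(b)
    W[:, 0] = c / (c @ V[:, 0])                 # enforce w_0^T v_0 = 1
    for j in range(k - 1):
        alpha = W[:, j] @ A @ V[:, j]
        v = A @ V[:, j] - alpha * V[:, j] - (V[:, j - 1] * (W[:, j - 1] @ A @ V[:, j]) if j else 0)
        w = A.T @ W[:, j] - alpha * W[:, j] - (W[:, j - 1] * (V[:, j - 1] @ A.T @ W[:, j]) if j else 0)
        if abs(w @ v) < tol:                    # serious breakdown: look-ahead needed here
            raise RuntimeError(f"Lanczos breakdown at step {j}")
        V[:, j + 1] = v / np.linalg.norm(v)
        W[:, j + 1] = w / (w @ V[:, j + 1])     # rescale to keep biorthogonality
    return V, W

A = np.random.default_rng(0).normal(size=(6, 6))
V, W = two_sided_lanczos(A, np.ones(6), np.ones(6), 4)
print(np.round(W.T @ A @ V, 8))                 # approximately tridiagonal
```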
Software-engineering challenges of building and deploying reusable problem solvers.
O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A
2009-11-01
Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
Computational methods in drug discovery.
Leelananda, Sumudu P; Lindert, Steffen
2016-01-01
The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein-ligand docking, pharmacophore modeling and QSAR techniques are reviewed.
The Development and Evaluation of Speaking Learning Model by Cooperative Approach
ERIC Educational Resources Information Center
Darmuki, Agus; Andayani; Nurkamto, Joko; Saddhono, Kundharu
2018-01-01
A cooperative approach-based Speaking Learning Model (SLM) has been developed to improve speaking skill of Higher Education students. This research aimed at evaluating the effectiveness of cooperative-based SLM viewed from the development of student's speaking ability and its effectiveness on speaking activity. This mixed method study combined…
Physical Activity and Positive Youth Development: Impact of a School-Based Program
ERIC Educational Resources Information Center
Madsen, Kristine A.; Hicks, Katherine; Thompson, Hannah
2011-01-01
Background: Protective factors associated with positive youth development predict health and education outcomes. This study explored trends in these protective factors and in physical activity among low-income students, and determined the impact of a school-based youth development program on these trends. Methods: This study used a…
ERIC Educational Resources Information Center
Gonczi, Amanda L.; Maeng, Jennifer L.; Bell, Randy L.; Whitworth, Brooke A.
2016-01-01
This mixed-methods study sought to identify professional development implementation variables that may influence participant (a) adoption of simulations, and (b) use for inquiry-based science instruction. Two groups (Cohort 1, N = 52; Cohort 2, N = 104) received different professional development. Cohort 1 was focused on Web site use mechanics.…
ERIC Educational Resources Information Center
Rubin, Allen; Parrish, Danielle E.
2010-01-01
Objective: This report describes the development and preliminary findings regarding the reliability, validity, and sensitivity of a scale that has been developed to assess practitioners' perceived familiarity with, attitudes about, and implementation of the phases of the evidence-based practice (EBP) process. Method: After a panel of national…
Developing Results-Based Leadership Attributes and Team Cohesiveness through Action Learning
ERIC Educational Resources Information Center
Troupe, David
2010-01-01
Those who develop leaders in manufacturing settings have little data that describe the usefulness of action learning as a method of developing leaders' abilities to improve results-based leadership attributes or perceptions about their team's cohesiveness. The two purposes of this study were to evaluate an action learning program with regards to…
Validated HPLC Determination of 4-Dimethylaminoantipyrine in Different Suppository Bases
Kalmár, É; Kormányos, B.; Szakonyi, G.; Dombi, G.
2014-01-01
Suppositories are important tools for individual therapy, especially in paediatrics, and an instrumental assay method has become necessary for the quality control of dosage units. The aim of this work was to develop a rapid, effective high-performance liquid chromatography method to assay aminophenazone in extemporaneous suppositories prepared with two different suppository bases, adeps solidus and massa macrogoli. With a novel sample preparation method developed by the authors, 4-dimethylaminoantipyrine was determined in these suppository bases with 95-105% recovery. The measurements were carried out on a Shimadzu Prominence ultra high-performance liquid chromatography system equipped with a 20 μl sample loop. The separation was achieved on a Hypersil ODS column, with methanol, sodium acetate buffer (pH 5.5±0.05, 0.05 M, 60:40, v/v) as the mobile phase at a flow rate of 1.5 ml/min. The chromatograms were acquired at 253 nm. The chromatographic method was fully validated in accordance with current guidelines. The presented data demonstrate the successful development of a rapid, efficient and robust sample preparation and high-performance liquid chromatography method for the routine quality control of the dosage units of suppositories containing 4-dimethylaminoantipyrine. PMID:24799736
Estimating Development Cost of an Interactive Website Based Cancer Screening Promotion Program
Lairson, David R.; Chung, Tong Han; Smith, Lisa G.; Springston, Jeffrey K.; Champion, Victoria L.
2015-01-01
Objectives The aim of this study was to estimate the initial development costs for an innovative talk-show-format tailored intervention delivered via the interactive web, for increasing cancer screening in women 50 to 75 who were non-adherent to screening guidelines for colorectal cancer and/or breast cancer. Methods The cost of the intervention development was estimated from a societal perspective. Micro-costing methods plus vendor contract costs were used to estimate cost. Staff logs were used to track personnel time. Non-personnel costs include all additional resources used to produce the intervention. Results The development cost of the interactive web-based intervention was $0.39 million, of which 77% was direct cost. About 98% of the cost was incurred in personnel time cost, contract cost and overhead cost. Conclusions The new web-based disease prevention medium required substantial investment in health promotion and media specialist time. The development cost was primarily driven by the high level of human capital required. The cost of intervention development is important information for assessing and planning future public and private investments in web-based health promotion interventions. PMID:25749548
Development and In Vitro Bioactivity Profiling of Alternative Sustainable Nanomaterials
Sustainable, environmentally benign nanomaterials (NMs) are being designed as functionality-based alternatives to conventional metal-based NMs in order to minimize potential risk to human health and the environment. Development of rapid methods to evaluate the ...
NASA Astrophysics Data System (ADS)
Ji, Yang; Chen, Hong; Tang, Hongwu
2017-06-01
A highly accurate wide-angle scheme, based on the generalized multistep scheme in the propagation direction, is developed for the finite difference beam propagation method (FD-BPM). Compared with the previously presented method, simulations show that our method yields a more accurate solution and permits a much larger step size.
ERIC Educational Resources Information Center
Williams, Gina E.
2017-01-01
The purpose of this case study was to examine the play based teaching and learning methods at one particular elementary school in southeastern Massachusetts with the aim of identifying methods and practices that are seen as essential in developing the academic and social skills in kindergarten students. This school of study has been utilizing…
ERIC Educational Resources Information Center
Wilcox, Amie K.; Shoulders, Catherine W.; Myers, Brian E.
2014-01-01
Calls for increased interdisciplinary education have led to the development of numerous teaching methods designed to help teachers provide meaningful experiences for their students. However, methods of guiding teachers in the successful adoption of innovative teaching methods are not firmly set. This qualitative study sought to better understand…
Invasive pulmonary aspergillosis: current diagnostic methodologies and a new molecular approach.
Moura, S; Cerqueira, L; Almeida, A
2018-05-13
The fungus Aspergillus fumigatus is the main pathogenic agent responsible for invasive pulmonary aspergillosis. Immunocompromised patients are more likely to develop this pathology due to a decrease in the immune system's defense capacity. Despite the low occurrence of invasive pulmonary aspergillosis, this pathology presents high rates of mortality, mostly due to late and unspecific diagnosis. Currently, the diagnostic methods used to detect this fungal infection are conventional mycological examination (direct microscopic examination, histological examination, and culture), imaging, non-culture-based tests for the detection of galactomannan, β(1,3)-glucan and an extracellular glycoprotein, and molecular tests based on PCR. However, most of these methods do not detect the species A. fumigatus; they only allow identification of the genus Aspergillus. The development of more specific detection methods is therefore of great importance, and fluorescent in situ hybridization-based molecular methods can be a good alternative for this purpose. This review points out that most of the methods used for the diagnosis of invasive pulmonary aspergillosis do not allow detection of the fungus at the species level, and that fluorescence in situ hybridization-based molecular methods are a promising approach to A. fumigatus detection.
Kamoun, Choumouss; Payen, Thibaut; Hua-Van, Aurélie; Filée, Jonathan
2013-10-11
Insertion sequences (ISs) and their non-autonomous derivatives (MITEs) are important components of prokaryotic genomes, inducing duplication, deletion, rearrangement or lateral gene transfer. Although ISs and MITEs are relatively simple and basic genetic elements, their detection remains a difficult task due to their remarkable sequence diversity. With the advent of high-throughput genome and metagenome sequencing technologies, the development of fast, reliable and sensitive methods for IS and MITE detection has become an important challenge. So far, almost all studies dealing with prokaryotic transposons have used classical BLAST-based detection methods against reference libraries. Here we introduce alternative detection methods that either take advantage of the structural properties of the elements (de novo methods) or use an additional library-based method employing profile HMM searches. In this study, we have developed three different workflows dedicated to IS and MITE detection: the first two use de novo methods detecting either repeated sequences or the presence of inverted repeats; the third uses 28 in-house transposase alignment profiles with HMM search methods. We have compared the respective performances of each method using a reference dataset of 30 archaeal and 30 bacterial genomes, in addition to simulated and real metagenomes. Compared to a BLAST-based method using ISFinder as the library, the de novo methods significantly improve IS and MITE detection. For example, in the 30 archaeal genomes, we discovered 30 new elements (+20%) in addition to the 141 multi-copy elements already detected by the BLAST approach. Many of the new elements correspond to ISs belonging to unknown or highly divergent families. The total number of MITEs even doubled with the discovery of elements displaying very limited sequence similarity to their respective autonomous partners (mainly in the inverted repeats of the elements). Concerning metagenomes, with the exception of short-read data (<300 bp), for which both techniques seem equally limited, profile HMM searches considerably improve the detection of transposase-encoding genes (up to +50%) while generating a low level of false positives compared to BLAST-based methods. Compared to classical BLAST-based methods, the sensitivity of the de novo and profile HMM methods developed in this study allows a better and more reliable detection of transposons in prokaryotic genomes and metagenomes. We believe that future studies involving IS and MITE identification in genomic data should combine at least one de novo and one library-based method, with optimal results obtained by running the two de novo methods in addition to a library-based search. For metagenomic data, profile HMM search should be favored; a BLAST-based step is only useful for the final annotation into groups and families.
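One of the de novo signals mentioned above, terminal inverted repeats, can be screened for in a few lines (a simplified exact-match scan on a made-up element; real IS/MITE detectors tolerate mismatches and apply length and spacing constraints):

```python
def revcomp(s):
    return s[::-1].translate(str.maketrans("ACGT", "TGCA"))

def find_tir(seq, min_len=10, max_len=30):
    # Look for a terminal inverted repeat: the element's 5' end matching
    # the reverse complement of its 3' end (exact match only, for simplicity).
    for L in range(max_len, min_len - 1, -1):
        if seq[:L] == revcomp(seq[-L:]):
            return seq[:L]
    return None

tir = "GGTTCTGTTGCAAAGTTT"                        # hypothetical 18-bp terminal repeat
element = tir + "ATGCATACGATCGGCATTAC" * 3 + revcomp(tir)
print(find_tir(element, min_len=10, max_len=25))  # -> the 18-bp TIR
```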
Systematic methods for the design of a class of fuzzy logic controllers
NASA Astrophysics Data System (ADS)
Yasin, Saad Yaser
2002-09-01
Fuzzy logic control, a relatively new branch of control, can be used effectively whenever conventional control techniques become inapplicable or impractical. Various attempts have been made to create a generalized fuzzy control system and to formulate an analytically based fuzzy control law. In this study, two methods, the left and right parameterization method and the normalized spline-based membership function method, were utilized for formulating analytical fuzzy control laws in important practical control applications. The first model was used to design an idle speed controller, while the second was used for an inverted pendulum control problem. The results of both showed that a fuzzy logic control system based on the developed models can be used effectively to control highly nonlinear and complex systems. This study also investigated the application of fuzzy control in areas not yet fully utilizing fuzzy logic control. Three important practical applications pertaining to the automotive industry were studied. The first automotive-related application was the idle speed control of spark ignition engines, using two fuzzy control methods: (1) left and right parameterization, and (2) fuzzy clustering techniques with experimental data. The simulation and experimental results showed that a fuzzy controller with conventional-controller-like performance can be designed based only on experimental data and intuitive knowledge of the system. In the second application, the automotive cruise control problem, a fuzzy control model was developed using a parameter-adaptive proportional-plus-integral-plus-derivative (PID)-type fuzzy logic controller. Results were comparable to those using linearized conventional PID and linear quadratic regulator (LQR) controllers, and in certain cases and conditions the developed controller outperformed the conventional PID and LQR controllers. The third application involved the air/fuel ratio control problem, using fuzzy clustering techniques, experimental data, and a conversion algorithm to develop a fuzzy-based control algorithm. Results were similar to those obtained by recently published conventional-control-based studies. The influence of the fuzzy inference operators and parameters on the performance and stability of the fuzzy logic controller was also studied. Results indicated that the selection of certain parameters, or combinations of parameters, greatly affects the performance and stability of the fuzzy controller. Diagnostic guidelines for tuning or changing certain factors or parameters to improve controller performance were developed, based on knowledge gained from conventional control methods and from the experimental and simulation results of this study.
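To make the ingredients concrete, here is a minimal single-input Mamdani-style fuzzy controller with triangular membership functions and centroid defuzzification — a generic textbook construction with made-up rule parameters, not one of the dissertation's parameterizations:

```python
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function peaking at b, zero outside [a, c].
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_control(error):
    # Fuzzify the error into three linguistic terms.
    mu = {
        "neg":  tri(error, -2.0, -1.0, 0.0),
        "zero": tri(error, -1.0,  0.0, 1.0),
        "pos":  tri(error,  0.0,  1.0, 2.0),
    }
    u = np.linspace(-1, 1, 201)                       # output universe of discourse
    # Clip each rule's output set by its firing strength, aggregate by max.
    agg = np.maximum.reduce([
        np.minimum(mu["neg"],  tri(u,  0.0, 0.5, 1.0)),   # IF e is neg  THEN u is pos
        np.minimum(mu["zero"], tri(u, -0.5, 0.0, 0.5)),   # IF e is zero THEN u is zero
        np.minimum(mu["pos"],  tri(u, -1.0, -0.5, 0.0)),  # IF e is pos  THEN u is neg
    ])
    return np.sum(u * agg) / np.sum(agg)              # centroid defuzzification

print(fuzzy_control(-0.6), fuzzy_control(0.0), fuzzy_control(0.9))
```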
PREDICTING THE EFFECTIVENESS OF CHEMICAL-PROTECTIVE CLOTHING MODEL AND TEST METHOD DEVELOPMENT
A predictive model and test method were developed for determining the chemical resistance of protective polymeric gloves exposed to liquid organic chemicals. The prediction of permeation through protective gloves by solvents was based on theories of the solution thermodynamics of...
EPA's Office of Research and Development (ORD) develops innovative methods for use in environmental monitoring and assessment by scientists in Regions, states, and Tribes. Molecular-biology-based methods are not yet established in the environmental monitoring "tool box". SRI (Sci...
The ReaxFF reactive force-field: Development, applications, and future directions
Senftle, Thomas; Hong, Sungwook; Islam, Md Mahbubul; ...
2016-03-04
The reactive force-field (ReaxFF) interatomic potential is a powerful computational tool for exploring, developing and optimizing material properties. Methods based on the principles of quantum mechanics (QM), while offering valuable theoretical guidance at the electronic level, are often too computationally intense for simulations that consider the full dynamic evolution of a system. Alternatively, empirical interatomic potentials that are based on classical principles require significantly fewer computational resources, which enables simulations to better describe dynamic processes over longer timeframes and on larger scales. Such methods, however, typically require a predefined connectivity between atoms, precluding simulations that involve reactive events. The ReaxFF method was developed to help bridge this gap. Approaching the gap from the classical side, ReaxFF casts the empirical interatomic potential within a bond-order formalism, thus implicitly describing chemical bonding without expensive QM calculations. This article provides an overview of the development, application, and future directions of the ReaxFF method.
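The bond-order idea at ReaxFF's core can be illustrated with the uncorrected sigma-bond-order term, BO' = exp[p₁ (r/r₀)^p₂] with p₁ < 0 — schematic parameter values below, not fitted force-field constants, and the full force field adds bond-order corrections and many coupled energy terms:

```python
import numpy as np

def sigma_bond_order(r, r0=1.39, p1=-0.1, p2=6.0):
    # Uncorrected sigma bond order: ~1 near the equilibrium length r0 and
    # smoothly -> 0 as the pair separates, so connectivity emerges from
    # geometry rather than a fixed bond list (parameter values are made up).
    return np.exp(p1 * (r / r0) ** p2)

for r in (1.2, 1.39, 1.8, 2.5):
    print(f"r = {r:4.2f} A  BO' = {sigma_bond_order(r):.3f}")
```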
Risk-Based Object Oriented Testing
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert
2000-01-01
Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
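A toy version of ranking classes for risk-based testing, taking complexity metrics as a proxy for failure probability (the class names, metrics, and weights are all invented for illustration; they are not the NASA methodology's actual scoring):

```python
# Hypothetical class metrics: (cyclomatic complexity, weighted methods per class)
classes = {
    "TelemetryParser": (34, 48),
    "ConfigLoader":    (8, 12),
    "AttitudeControl": (27, 61),
    "LogFormatter":    (5, 9),
}

def risk(metrics, w_cc=0.6, w_wmc=0.4):
    cc, wmc = metrics
    return w_cc * cc + w_wmc * wmc   # higher score = more failure-prone, test first

for name, m in sorted(classes.items(), key=lambda kv: -risk(kv[1])):
    print(f"{name:16s} risk = {risk(m):5.1f}")
```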
SiC-Based Composite Materials Obtained by Siliconizing Carbon Matrices
NASA Astrophysics Data System (ADS)
Shikunov, S. L.; Kurlov, V. N.
2017-12-01
We have developed a method for fabrication of parts of complicated configuration from composite materials based on SiC ceramics, which employs the interaction of silicon melt with the carbon matrix having a certain composition and porosity. For elevating the operating temperatures of ceramic components, we have developed a method for depositing protective silicon-carbide coatings that is based on the interaction of the silicon melt and vapor with carbon obtained during thermal splitting of hydrocarbon molecules. The new structural ceramics are characterized by higher operating temperatures; chemical stability; mechanical strength; thermal shock, wear and radiation resistance; and parameters stability.
Ooka, Tadasuke; Terajima, Jun; Kusumoto, Masahiro; Iguchi, Atsushi; Kurokawa, Ken; Ogura, Yoshitoshi; Asadulghani, Md; Nakayama, Keisuke; Murase, Kazunori; Ohnishi, Makoto; Iyoda, Sunao; Watanabe, Haruo; Hayashi, Tetsuya
2009-09-01
Enterohemorrhagic Escherichia coli O157 (EHEC O157) is a food-borne pathogen that has raised worldwide public health concern. The development of simple and rapid strain-typing methods is crucial for the rapid detection and surveillance of EHEC O157 outbreaks. In the present study, we developed a multiplex PCR-based strain-typing method for EHEC O157, which is based on the variability in genomic location of IS629 among EHEC O157 strains. This method is very simple, in that the procedures are completed within 2 h, the analysis can be performed without the need for special equipment or techniques (requiring only conventional PCR and agarose gel electrophoresis systems), the results can easily be transformed into digital data, and the genes for the major virulence markers of EHEC O157 (the stx(1), stx(2), and eae genes) can be detected simultaneously. Using this method, 201 EHEC O157 strains showing different XbaI digestion patterns in pulsed-field gel electrophoresis (PFGE) analysis were classified into 127 types, and outbreak-related strains showed identical or highly similar banding patterns. Although this method is less discriminatory than PFGE, it may be useful as a primary screening tool for EHEC O157 outbreaks.
2014-01-01
Background The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. Methods We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Conclusions Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method. PMID:25052018
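The back-extrapolation principle itself is simple: fit the early log-concentration decay, extrapolate to injection time, and divide the dose by the extrapolated concentration. Below is a sketch of the traditional mono-exponential variant with made-up samples; the paper's proposed physiologically based correction is not reproduced here:

```python
import numpy as np

dose_mg = 25.0
t = np.array([2.0, 3.0, 4.0, 5.0])            # minutes after ICG injection (made up)
conc = np.array([4.1, 3.6, 3.15, 2.78])       # plasma ICG concentration, mg/L (made up)

# Mono-exponential decay: ln C(t) = ln C0 - k*t; extrapolate back to t = 0.
slope, intercept = np.polyfit(t, np.log(conc), 1)
c0 = np.exp(intercept)
pv_litres = dose_mg / c0                      # plasma volume = dose / C0
print(f"C0 = {c0:.2f} mg/L, plasma volume = {pv_litres:.2f} L")
```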
Masters, Kevin S; Ross, Kaile M; Hooker, Stephanie A; Wooldridge, Jennalee L
2018-05-18
There has been a notable disconnect between theories of behavior change and behavior change interventions. Because few interventions are both explicitly and adequately theory-based, investigators cannot assess the impact of theory on intervention effectiveness. Theory-based interventions, designed to deliberately engage the theory's proposed mechanisms of change, are needed to adequately test theories. Thus, systematic approaches to theory-based intervention development are needed. This article will introduce and discuss the psychometric method of developing theory-based interventions. The psychometric approach to intervention development utilizes basic psychometric principles at each step of the intervention development process in order to build a theoretically driven intervention to, subsequently, be tested in process (mechanism) and outcome studies. Five stages of intervention development are presented as follows: (i) Choice of theory; (ii) Identification and characterization of key concepts and expected relations; (iii) Intervention construction; (iv) Initial testing and revision; and (v) Empirical testing of the intervention. Examples of this approach from the Colorado Meaning-Activity Project (COMAP) are presented. Based on self-determination theory integrated with meaning or purpose, and utilizing a motivational interviewing approach, the COMAP intervention is individually based with an initial interview followed by smart phone-delivered interventions for increasing daily activity. The psychometric approach to intervention development is one method to ensure careful consideration of theory in all steps of intervention development. This structured approach supports developing a research culture that endorses deliberate and systematic operationalization of theory into behavior change intervention from the outset of intervention development.
Spline Approximation of Thin Shell Dynamics
NASA Technical Reports Server (NTRS)
delRosario, R. C. H.; Smith, R. C.
1996-01-01
A spline-based method for approximating thin shell dynamics is presented here. While the method is developed in the context of the Donnell-Mushtari thin shell equations, it can be easily extended to the Byrne-Flugge-Lur'ye equations or other models for shells of revolution as warranted by applications. The primary requirements for the method include accuracy, flexibility and efficiency in smart material applications. To accomplish this, the method was designed to be flexible with regard to boundary conditions, material nonhomogeneities due to sensors and actuators, and inputs from smart material actuators such as piezoceramic patches. The accuracy of the method was also of primary concern, both to guarantee full resolution of structural dynamics and to facilitate the development of PDE-based controllers which ultimately require real-time implementation. Several numerical examples provide initial evidence demonstrating the efficacy of the method.
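For a flavor of what a spline basis provides, the sketch below evaluates a generic cubic B-spline basis with SciPy (requires SciPy 1.8+ for BSpline.design_matrix); this is textbook machinery, not the paper's Donnell-Mushtari discretization:

```python
import numpy as np
from scipy.interpolate import BSpline

# Clamped cubic B-spline basis on [0, 1]: displacement fields are approximated
# as linear combinations of these local, smooth basis functions.
k = 3
t = np.concatenate([[0.0] * k, np.linspace(0, 1, 9), [1.0] * k])  # clamped knot vector

x = np.linspace(0, 1, 5)
B = BSpline.design_matrix(x, t, k).toarray()   # rows: points, cols: basis functions
print(B.shape)          # (5, 11) -- 11 = len(t) - k - 1 basis functions
print(B.sum(axis=1))    # partition of unity: each row sums to 1
```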
Development of a paper-based carbon nanotube sensing microfluidic device for biological detection.
Yang, Shih-I; Lei, Kin Fong; Tsai, Shiao-Wen; Hsu, Hsiao-Ting
2013-01-01
Carbon nanotubes (CNTs) have been utilized for biological detection due to their extreme sensitivity to biological molecules. A paper-based CNT sensing microfluidic device has been developed for the detection of protein binding, i.e., biotin-avidin binding. We have developed a fabrication method that allows controlled deposition of bundled CNTs with well-defined dimensions to form sensors on paper. Then, polydimethylsiloxane (PDMS) was used to pattern the hydrophobic boundary on paper to form the reaction sites. The proposed fabrication method is based on a vacuum filtration process with a metal mask covering a filter paper to define the sensor dimensions. The length and width of the CNT-based sensors are readily controlled by the metal mask, and the thickness by the weight of CNT powder used during the filtration process. Homogeneous deposition of CNTs with well-defined dimensions can be achieved. The CNT-based sensor on paper has been demonstrated for the detection of protein binding. Biotin was first immobilized on the CNT sidewalls, and an avidin suspension was applied to the site. Biotin-avidin binding was measured by the resistance change of the sensor, a label-free detection method. The results showed that CNTs are sensitive to biological molecules and that the proposed paper-based CNT sensing device is a possible candidate for point-of-care biosensors. Thus, electrical bioassays on paper-based microfluidics can be realized to develop low-cost, sensitive, and specific diagnostic devices.
NASA Astrophysics Data System (ADS)
Xiao, Fengjun; Li, Chengzhi; Sun, Jiangman; Zhang, Lianjie
2017-09-01
To study the rapid growth of research on organic photovoltaic (OPV) technology, development trends in the relevant research are analyzed using CiteSpace, a software tool for text mining and visualization of scientific literature. With this analytical method, the outputs and cooperation of authors, the hot research topics, the vital references, and the development trend of OPV are identified and visualized. Unlike traditional review articles written by OPV experts, this work provides a new method for quantitatively visualizing the development of OPV technology research over the past decade.
NASA Astrophysics Data System (ADS)
Volosovitch, Anatoly E.; Konopaltseva, Lyudmila I.
1995-11-01
Well-known methods of optical diagnostics, databases for their storage, and expert systems (ES) for their development are analyzed. A computer information system is developed, based on a hybrid ES built on a modern DBMS. As an example, the structural and constructive circuits of hybrid integrated-optical devices based on laser diodes, diffusion waveguides, geodetic lenses, package-free linear photodiode arrays, etc. are presented. The features of the methods and test results, as well as promising directions of work related to hybrid integrated-optical devices in the field of metrology, are discussed.
NASA Astrophysics Data System (ADS)
Yakunin, Alexander N.; Aban'shin, Nikolay P.; Avetisyan, Yuri A.; Akchurin, Georgy G.; Akchurin, Garif G.
2018-04-01
A model for calculating the electrostatic field in the system "probe of a tunnel microscope - a nanostructure based on a DLC film" was developed. Finite-element modeling of the field localization was carried out, taking into account the morphological and topological features of the nanostructure. The obtained results and their interpretation contribute to the development of concepts for modeling tunneling electric transport processes. The possibility of effectively using tunneling microscopy methods in the development of new nanophotonic devices is shown.
USDA-ARS?s Scientific Manuscript database
A Multilocus Sequence Typing (MLST) method based on allelic variation of 7 chromosomal loci was developed for characterizing genotypes within the genus Bradyrhizobium. With the method 29 distinct multilocus genotypes (GTs) were identified among 191 culture collection soybean strains. The occupancy ...
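A minimal sketch of the genotype-assignment step implied here: strains sharing the same 7-locus allele profile receive the same multilocus genotype (GT). The strain names below are real Bradyrhizobium designations, but the locus names and allele profiles are invented for illustration.

```python
# Each strain is typed by its allele numbers at 7 chromosomal loci; strains
# sharing a profile share a multilocus genotype (GT). Profiles are made up.
LOCI = ("glnII", "recA", "atpD", "dnaK", "gyrB", "rpoB", "glnA")  # assumed loci

profiles = {
    "USDA110": (1, 1, 2, 1, 3, 1, 2),
    "USDA6":   (2, 1, 2, 1, 3, 1, 2),
    "S23321":  (1, 1, 2, 1, 3, 1, 2),   # same profile as USDA110 -> same GT
}

gt_ids, assignments = {}, {}
for strain, profile in sorted(profiles.items()):
    gt = gt_ids.setdefault(profile, len(gt_ids) + 1)  # new GT for new profile
    assignments[strain] = gt

print(assignments)   # {'S23321': 1, 'USDA110': 1, 'USDA6': 2}
```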
CFD Analysis of the SBXC Glider Airframe
2016-06-01
... based mathematically on finite element methods. To validate and verify the methodology developed, a mathematical comparison was made with previous research data ... greater than 15 m/s. Subject terms: finite element method, computational fluid dynamics, Y Plus, mesh element quality, aerodynamic data, fluid dynamics.
Development of a Contact Permeation Test Fixture and Method
2013-04-01
... direct contact with the skin, indicates the need for a quantitative contact test method. Comparison tests were conducted with VX on a standardized ... Guide for the Care and Use of Laboratory Animals (8th ed.; National Research Council: Washington, DC, 2011). This test was also performed in ...
Model-Free Optimal Tracking Control via Critic-Only Q-Learning.
Luo, Biao; Liu, Derong; Huang, Tingwen; Wang, Ding
2016-10-01
Model-free control is an important and promising topic in the control field that has attracted extensive attention in the past few years. In this paper, we aim to solve the model-free optimal tracking control problem for nonaffine nonlinear discrete-time systems. A critic-only Q-learning (CoQL) method is developed, which learns the optimal tracking control from real system data and thus avoids solving the tracking Hamilton-Jacobi-Bellman equation. First, the Q-learning algorithm is proposed based on the augmented system, and its convergence is established. Using only one neural network for approximating the Q-function, the CoQL method is developed to implement the Q-learning algorithm. Furthermore, the convergence of the CoQL method is proved with consideration of the neural network approximation error. With the convergent Q-function obtained from the CoQL method, the adaptive optimal tracking control is designed based on the gradient descent scheme. Finally, the effectiveness of the developed CoQL method is demonstrated through simulation studies. The developed CoQL method learns from off-policy data and is implemented with a critic-only structure; thus it is easy to realize and overcomes the inadequate-exploration problem.
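The sketch below is a tabular surrogate for the critic-only idea on a toy scalar tracking problem: a single Q-table over the augmented state is updated off-policy by temporal-difference learning. The paper's method instead uses a neural-network critic on nonaffine nonlinear systems, with a convergence analysis; the dynamics, reference, and rewards here are invented.

```python
import numpy as np

# Toy tracking problem: drive state x toward a constant reference on a grid.
# A single critic (the Q-table) is learned off-policy, echoing the
# critic-only structure; the paper uses a neural-network approximator.
rng = np.random.default_rng(0)
X = np.arange(-5, 6)            # discretized states
U = np.array([-1, 0, 1])        # actions
r_ref = 2                       # constant reference to track
Q = np.zeros((len(X), len(U)))
gamma, alpha = 0.9, 0.1

def step(x, u):
    return int(np.clip(x + u, -5, 5))

for _ in range(500):
    x = int(rng.choice(X))
    for _ in range(30):
        ui = int(rng.integers(len(U)))       # off-policy: random exploration
        xn = step(x, U[ui])
        reward = -abs(xn - r_ref)            # tracking cost
        xi, xni = x + 5, xn + 5              # indices into Q
        td = reward + gamma * Q[xni].max() - Q[xi, ui]
        Q[xi, ui] += alpha * td              # critic-only TD update
        x = xn

policy = U[Q.argmax(axis=1)]
print(policy)    # greedy actions should push each state toward r_ref = 2
```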
Application of the critical pathway and integrated case teaching method to nursing orientation.
Goodman, D
1997-01-01
Nursing staff development programs must be responsive to current changes in healthcare. New nursing staff must be prepared to manage continuous change and to function competently in clinical practice. The orientation pathway, based on a case management model, is used as a structure for the orientation phase of staff development. The integrated case is incorporated as a teaching strategy in orientation. The integrated case method is based on discussion and analysis of patient situations with emphasis on role modeling and integration of theory and skill. The orientation pathway and integrated case teaching method provide a useful framework for orientation of new staff. Educators, preceptors and orientees find the structure provided by the orientation pathway very useful. Orientation that is developed, implemented and evaluated based on a case management model with the use of an orientation pathway and incorporation of an integrated case teaching method provides a standardized structure for orientation of new staff. This approach is designed for the adult learner, promotes conceptual reasoning, and encourages the social and contextual basis for continued learning.
Towards fully automated structure-based function prediction in structural genomics: a case study.
Watson, James D; Sanderson, Steve; Ezersky, Alexandra; Savchenko, Alexei; Edwards, Aled; Orengo, Christine; Joachimiak, Andrzej; Laskowski, Roman A; Thornton, Janet M
2007-04-13
As the global Structural Genomics projects have picked up pace, the number of structures annotated in the Protein Data Bank as hypothetical protein or unknown function has grown significantly. A major challenge now involves the development of computational methods to assign functions to these proteins accurately and automatically. As part of the Midwest Center for Structural Genomics (MCSG) we have developed a fully automated functional analysis server, ProFunc, which performs a battery of analyses on a submitted structure. The analyses combine a number of sequence-based and structure-based methods to identify functional clues. After the first stage of the Protein Structure Initiative (PSI), we review the success of the pipeline and the importance of structure-based function prediction. As a dataset, we have chosen all structures solved by the MCSG during the 5 years of the first PSI. Our analysis suggests that two of the structure-based methods are particularly successful and provide examples of local similarity that is difficult to identify using current sequence-based methods. No one method is successful in all cases, so, through the use of a number of complementary sequence and structural approaches, the ProFunc server increases the chances that at least one method will find a significant hit that can help elucidate function. Manual assessment of the results is a time-consuming process and subject to individual interpretation and human error. We present a method based on the Gene Ontology (GO) schema using GO-slims that can allow the automated assessment of hits with a success rate approaching that of expert manual assessment.
Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.
Menges, Achim
2012-03-01
Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
Link, Daphna; Braginsky, Michael B; Joskowicz, Leo; Ben Sira, Liat; Harel, Shaul; Many, Ariel; Tarrasch, Ricardo; Malinger, Gustavo; Artzi, Moran; Kapoor, Cassandra; Miller, Elka; Ben Bashat, Dafna
2018-01-01
Accurate fetal brain volume estimation is of paramount importance in evaluating fetal development. The aim of this study was to develop an automatic method for fetal brain segmentation from magnetic resonance imaging (MRI) data, and to create for the first time a normal volumetric growth chart based on a large cohort. A semi-automatic segmentation method based on Seeded Region Growing algorithm was developed and applied to MRI data of 199 typically developed fetuses between 18 and 37 weeks' gestation. The accuracy of the algorithm was tested against a sub-cohort of ground truth manual segmentations. A quadratic regression analysis was used to create normal growth charts. The sensitivity of the method to identify developmental disorders was demonstrated on 9 fetuses with intrauterine growth restriction (IUGR). The developed method showed high correlation with manual segmentation (r2 = 0.9183, p < 0.001) as well as mean volume and volume overlap differences of 4.77 and 18.13%, respectively. New reference data on 199 normal fetuses were created, and all 9 IUGR fetuses were at or below the third percentile of the normal growth chart. The proposed method is fast, accurate, reproducible, user independent, applicable with retrospective data, and is suggested for use in routine clinical practice. © 2017 S. Karger AG, Basel.
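A minimal 2D sketch of the Seeded Region Growing idea (the study applies it to 3D MRI volumes with additional pre- and post-processing): starting from a seed, 4-connected pixels are absorbed while their intensity stays within a tolerance of the running region mean. The image and tolerance below are synthetic.

```python
import numpy as np
from collections import deque

def seeded_region_growing(img, seed, tol=10.0):
    """Grow a region from `seed`, adding 4-connected pixels whose intensity
    stays within `tol` of the running region mean (a common SRG variant)."""
    mask = np.zeros(img.shape, dtype=bool)
    mask[seed] = True
    total, count = float(img[seed]), 1
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                    and not mask[ny, nx]
                    and abs(img[ny, nx] - total / count) <= tol):
                mask[ny, nx] = True
                total += float(img[ny, nx])
                count += 1
                queue.append((ny, nx))
    return mask

# Toy 2D "slice": bright square (brain-like region) on a dark background.
img = np.zeros((64, 64))
img[20:40, 20:40] = 100.0
print(seeded_region_growing(img, seed=(30, 30)).sum())  # 400 pixels
```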
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
Objective To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. Background In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. Methods In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Results Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. Conclusion The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies. PMID:25148262
An exploratory survey of methods used to develop measures of performance
NASA Astrophysics Data System (ADS)
Hamner, Kenneth L.; Lafleur, Charles A.
1993-09-01
Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metrics development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method, and should be more likely to produce high-quality metrics that will result in continuous process improvement.
NASA Technical Reports Server (NTRS)
Hinchey, Mike
2006-01-01
The explosion of capabilities and new products within ICT (Information and Communication Technology) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions of quality and correctness of software. These assumptions are encouraged by software producers and vendors, who have not succeeded in finding a way to overcome the lack of an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts in future exploration missions, which may well be the most ambitious computer-based systems ever developed. Such missions entail levels of complexity that beg for new methods for system development. NASA-led research in such areas as sensor networks, formal methods, autonomic computing, and requirements-based programming (to name but a few) will offer some innovative approaches to achieving correctness in complex system development.
Reiman, Anne; Pandey, Sarojini; Lloyd, Kate L; Dyer, Nigel; Khan, Mike; Crockard, Martin; Latten, Mark J; Watson, Tracey L; Cree, Ian A; Grammatopoulos, Dimitris K
2016-11-01
Background Detection of disease-associated mutations in patients with familial hypercholesterolaemia is crucial for early interventions to reduce risk of cardiovascular disease. Screening for these mutations represents a methodological challenge, since more than 1200 different causal mutations in the low-density lipoprotein receptor have been identified. A number of methodological approaches have been developed for screening by clinical diagnostic laboratories. Methods Using primers targeting the low-density lipoprotein receptor, apolipoprotein B, and proprotein convertase subtilisin/kexin type 9, we developed a novel Ion Torrent-based targeted re-sequencing method. We validated this in a small West Midlands (UK) cohort of 58 patients screened in parallel with other mutation-targeting methods, such as multiplex polymerase chain reaction (Elucigene FH20), oligonucleotide arrays (Randox familial hypercholesterolaemia array) or the Illumina next-generation sequencing platform. Results In this small cohort, the next-generation sequencing method achieved excellent analytical performance characteristics and showed 100% and 89% concordance with the Randox array and the Elucigene FH20 assay, respectively. Investigation of the discrepant results identified two cases of mutation misclassification by the Elucigene FH20 multiplex polymerase chain reaction assay. A number of novel mutations not previously reported were also identified by the next-generation sequencing method. Conclusions Ion Torrent-based next-generation sequencing can deliver a suitable alternative for the molecular investigation of familial hypercholesterolaemia patients, especially when comprehensive mutation screening for rare or unknown mutations is required.
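Concordance between two screening methods, as reported above, reduces to set overlap between their mutation calls. The sketch below shows one way to compute it; the variant identifiers are illustrative and not taken from the study.

```python
# Illustrative only: variant calls as (gene, variant) pairs, not real data.
ngs_calls   = {("LDLR", "c.301G>A"), ("APOB", "c.10580G>A"), ("PCSK9", "var3")}
array_calls = {("LDLR", "c.301G>A"), ("APOB", "c.10580G>A"), ("PCSK9", "var3")}
pcr_calls   = {("LDLR", "c.301G>A"), ("APOB", "c.10580G>A"), ("PCSK9", "var9")}

def concordance(a, b):
    """Fraction of agreeing calls between two methods (Jaccard-style)."""
    return len(a & b) / len(a | b)

print(concordance(ngs_calls, array_calls))  # 1.0 -> "100% concordance"
print(concordance(ngs_calls, pcr_calls))    # 0.5
```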
Detection and categorization of bacteria habitats using shallow linguistic analysis
2015-01-01
Background Information regarding bacteria biotopes is important for several research areas including health sciences, microbiology, and food processing and preservation. One of the challenges for scientists in these domains is the huge amount of information buried in the text of electronic resources. Developing methods to automatically extract bacteria habitat relations from the text of these electronic resources is crucial for facilitating research in these areas. Methods We introduce a linguistically motivated rule-based approach for recognizing and normalizing names of bacteria habitats in biomedical text by using an ontology. Our approach is based on shallow syntactic analysis of the text, which includes sentence segmentation, part-of-speech (POS) tagging, partial parsing, and lemmatization. In addition, we propose two methods for identifying bacteria habitat localization relations. The underlying assumption of the first method is that discourse changes with a new paragraph; therefore, it operates on a paragraph basis. The second method performs a more fine-grained analysis of the text and operates on a sentence basis. We also develop a novel anaphora resolution method for bacteria coreferences and incorporate it into the sentence-based relation extraction approach. Results We participated in the Bacteria Biotope (BB) Task of the BioNLP Shared Task 2013. Our system (Boun) achieved the second-best performance with a 68% Slot Error Rate (SER) in Sub-task 1 (Entity Detection and Categorization), and ranked third with an F-score of 27% in Sub-task 2 (Localization Event Extraction). This paper reports the system implemented for the shared task, including the novel methods developed and the improvements obtained after the official evaluation. The extensions include the expansion of the OntoBiotope ontology using the training set for Sub-task 1, and the novel sentence-based relation extraction method incorporated with anaphora resolution for Sub-task 2. These extensions resulted in promising results for Sub-task 1 with a SER of 68%, and state-of-the-art performance for Sub-task 2 with an F-score of 53%. Conclusions Our results show that a linguistically oriented approach based on shallow syntactic analysis of the text is as effective as machine learning approaches for the detection and ontology-based normalization of habitat entities. Furthermore, the newly developed sentence-based relation extraction system with the anaphora resolution module significantly outperforms the paragraph-based one, as well as the other systems that participated in the BB Shared Task 2013. PMID:26201262
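A toy sketch of the dictionary-matching core of such a rule-based habitat recognizer: tokens are lemmatized and looked up, longest match first, in an ontology term map. The lemmatizer and the OntoBiotope-style entries are stand-ins; the actual Boun system adds POS tagging, partial parsing, and anaphora resolution.

```python
# Hypothetical OntoBiotope-style entries: lemmatized term -> concept ID.
ONTOLOGY = {
    "rumen": "OBT:001", "soil": "OBT:002", "dairy product": "OBT:003",
}

def lemmatize(token):
    # Stand-in lemmatizer: lowercases and strips a plural 's'.
    t = token.lower()
    return t[:-1] if t.endswith("s") else t

def match_habitats(sentence):
    lemmas = [lemmatize(tok) for tok in sentence.split()]
    hits = []
    for n in (2, 1):                       # prefer longer (bigram) matches
        for i in range(len(lemmas) - n + 1):
            phrase = " ".join(lemmas[i:i + n])
            if phrase in ONTOLOGY:
                hits.append((phrase, ONTOLOGY[phrase]))
    return hits

print(match_habitats("Lactobacilli thrive in dairy products and in soils"))
# [('dairy product', 'OBT:003'), ('soil', 'OBT:002')]
```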
Fu, Yili; Gao, Wenpeng; Chen, Xiaoguang; Zhu, Minwei; Shen, Weigao; Wang, Shuguo
2010-01-01
The reference system based on the fourth ventricular landmarks (including the fastigial point and ventricular floor plane) is used in medical image analysis of the brain stem. The objective of this study was to develop a rapid, robust, and accurate method for the automatic identification of this reference system on T1-weighted magnetic resonance images. The fully automated method developed in this study consisted of four stages: preprocessing of the data set, expectation-maximization algorithm-based extraction of the fourth ventricle in the region of interest, a coarse-to-fine strategy for identifying the fastigial point, and localization of the base point. The method was evaluated on 27 Brain Web data sets qualitatively and 18 Internet Brain Segmentation Repository data sets and 30 clinical scans quantitatively. The results of qualitative evaluation indicated that the method was robust to rotation, landmark variation, noise, and inhomogeneity. The results of quantitative evaluation indicated that the method was able to identify the reference system with an accuracy of 0.7 +/- 0.2 mm for the fastigial point and 1.1 +/- 0.3 mm for the base point. It took <6 seconds for the method to identify the related landmarks on a personal computer with an Intel Core 2 6300 processor and 2 GB of random-access memory. The proposed method for the automatic identification of the reference system based on the fourth ventricular landmarks was shown to be rapid, robust, and accurate. The method has potentially utility in image registration and computer-aided surgery.
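The second stage above uses an expectation-maximization fit to separate the ventricle from surrounding tissue inside a region of interest. Below is a minimal 1D stand-in with synthetic intensities and a two-component Gaussian mixture via scikit-learn; the intensity values are invented, not taken from the study.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy stand-in: separate a dark, CSF-like class from brighter tissue inside
# a region of interest using a two-class EM fit (synthetic T1 intensities).
rng = np.random.default_rng(1)
roi_intensities = np.r_[rng.normal(30, 5, 500),     # ventricle-like class
                        rng.normal(90, 10, 1500)]   # tissue-like class

gm = GaussianMixture(n_components=2, random_state=0)
labels = gm.fit_predict(roi_intensities.reshape(-1, 1))
ventricle_class = int(np.argmin(gm.means_.ravel()))  # darker class on T1
print("estimated class means:", np.sort(gm.means_.ravel()))
print("voxels labeled ventricle:", int((labels == ventricle_class).sum()))
```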
Sutton, Patrice
2014-01-01
Background: Synthesizing what is known about the environmental drivers of health is instrumental to taking prevention-oriented action. Methods of research synthesis commonly used in environmental health lag behind systematic review methods developed in the clinical sciences over the past 20 years. Objectives: We sought to develop a proof of concept of the “Navigation Guide,” a systematic and transparent method of research synthesis in environmental health. Discussion: The Navigation Guide methodology builds on best practices in research synthesis in evidence-based medicine and environmental health. Key points of departure from current methods of expert-based narrative review prevalent in environmental health include a prespecified protocol, standardized and transparent documentation including expert judgment, a comprehensive search strategy, assessment of “risk of bias,” and separation of the science from values and preferences. Key points of departure from evidence-based medicine include assigning a “moderate” quality rating to human observational studies and combining diverse evidence streams. Conclusions: The Navigation Guide methodology is a systematic and rigorous approach to research synthesis that has been developed to reduce bias and maximize transparency in the evaluation of environmental health information. Although novel aspects of the method will require further development and validation, our findings demonstrated that improved methods of research synthesis under development at the National Toxicology Program and under consideration by the U.S. Environmental Protection Agency are fully achievable. The institutionalization of robust methods of systematic and transparent review would provide a concrete mechanism for linking science to timely action to prevent harm. Citation: Woodruff TJ, Sutton P. 2014. The Navigation Guide systematic review methodology: a rigorous and transparent method for translating environmental health science into better health outcomes. Environ Health Perspect 122:1007–1014; http://dx.doi.org/10.1289/ehp.1307175 PMID:24968373
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.
2015-12-01
Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment combined with an ever-expanding wealth of global climate projection information creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that is compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.
Bhattacharyya, Sanghita; Srivastava, Aradhana; Knight, Marian
2014-11-13
In India there is a thrust towards promoting institutional delivery, resulting in problems of overcrowding and compromise to quality of care. Review of near-miss obstetric events has been suggested to be useful to investigate health system functioning, complementing maternal death reviews. The aim of this project was to identify the key elements required for a near-miss review programme for India. A structured review was conducted to identify methods used in assessing near-miss cases. The findings of the structured review were used to develop a suggested framework for conducting near-miss reviews in India. A pool of experts in near-miss review methods in low and middle income countries (LMICs) was identified for vetting the framework developed. Opinions were sought about the feasibility of implementing near-miss reviews in India, the processes to be followed, factors that made implementation successful and the associated challenges. A draft of the framework was revised based on the experts' opinions. Five broad methods of near-miss case review/audit were identified: Facility-based near-miss case review, confidential enquiries, criterion-based clinical audit, structured case review (South African Model) and home-based interviews. The opinion of the 11 stakeholders highlighted that the methods that a facility adopts should depend on the type and number of cases the facility handles, availability and maintenance of a good documentation system, and local leadership and commitment of staff. A proposed framework for conducting near-miss reviews was developed that included a combination of criterion-based clinical audit and near-miss review methods. The approach allowed for development of a framework for researchers and planners seeking to improve quality of maternal care not only at the facility level but also beyond, encompassing community health workers and referral. Further work is needed to evaluate the implementation of this framework to determine its efficacy in improving the quality of care and hence maternal and perinatal morbidity and mortality.
ERIC Educational Resources Information Center
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles
2017-01-01
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
ERIC Educational Resources Information Center
Kouhpayehzadeh, Jalil; Baradaran, Hamid; Arabshahi, Kamran Soltani; Knill-Jones, Robin
2006-01-01
Introduction: Evidence-based medicine (EBM) has been introduced in medical schools worldwide, but there is little known about effective methods for teaching EBM skills, particularly in developing countries. This study assesses the impact of an EBM workshop on clinical teachers' attitudes and use of EBM skills. Methods: Seventy-two clinical…
NASA Astrophysics Data System (ADS)
Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.
2017-05-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
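A compact sketch of the grouped, variance-based index at the heart of such an analysis, using the standard pick-freeze estimator S_G = Var(E[Y|X_G]) / Var(Y) on a toy three-input model. The grouping into "boundary conditions" and "permeability" merely mirrors the naming above; the model and input distributions are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Columns 0-1: "boundary condition" group; column 2: "permeability".
    return x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 2] ** 2

N, d = 100_000, 3
A = rng.uniform(-1, 1, (N, d))
B = rng.uniform(-1, 1, (N, d))

def first_order_group_index(group):
    """Pick-freeze estimate of the first-order Sobol index for a group."""
    C = B.copy()
    C[:, group] = A[:, group]          # "freeze" the group's columns from A
    yA, yC = model(A), model(C)
    return np.cov(yA, yC)[0, 1] / yA.var()

print("S_boundary     ~", first_order_group_index([0, 1]))   # ~0.99
print("S_permeability ~", first_order_group_index([2]))      # ~0.01
```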
RRCRank: a fusion method using rank strategy for residue-residue contact prediction.
Jing, Xiaoyang; Dong, Qiwen; Lu, Ruqian
2017-09-02
In structural biology, protein residue-residue contacts play a crucial role in protein structure prediction. Researchers have found that predicted residue-residue contacts can effectively constrain the conformational search space, which is significant for de novo protein structure prediction. Over the last few decades, various methods have been developed to predict residue-residue contacts; in particular, significant performance has been achieved by fusion methods in recent years. In this work, a novel fusion method based on a rank strategy is proposed to predict contacts. Unlike traditional regression or classification strategies, the contact prediction task is regarded as a ranking task. Two kinds of features are extracted from correlated-mutation methods and ensemble machine-learning classifiers, and the proposed method then uses a learning-to-rank algorithm to predict the contact probability of each residue pair. First, we performed two benchmark tests of the proposed fusion method (RRCRank) on the CASP11 and CASP12 datasets. The test results show that the RRCRank method outperforms other well-developed methods, especially for medium- and short-range contacts. Second, in order to verify the superiority of the ranking strategy, we predicted contacts using traditional regression and classification strategies based on the same features as the ranking strategy. Compared with these two traditional strategies, the proposed ranking strategy shows better performance for all three contact types, in particular for long-range contacts. Third, the proposed RRCRank was compared with several state-of-the-art methods in CASP11 and CASP12. The results show that RRCRank achieves comparable prediction precision and is better than three methods in most assessment metrics. The learning-to-rank algorithm is introduced to develop a novel rank-based method for residue-residue contact prediction of proteins, which achieves state-of-the-art performance based on the extensive assessment.
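A toy sketch of the evaluation convention used in contact prediction: pairs are ranked by a fused score and precision is computed over the top L/5 predictions. The fusion here is a fixed weighted sum, whereas RRCRank learns the fusion with a learning-to-rank algorithm; all scores and contacts below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)
L = 100                                   # sequence length
n_pairs = 300
true_contact = rng.random(n_pairs) < 0.2  # synthetic ground truth

# Two feature scores (e.g., coevolution and classifier outputs), fused here
# by a simple weighted sum for illustration.
s1 = true_contact * 0.5 + rng.random(n_pairs)
s2 = true_contact * 0.3 + rng.random(n_pairs)
fused = 0.6 * s1 + 0.4 * s2

order = np.argsort(-fused)                # rank pairs, best first
top = order[: L // 5]
precision = true_contact[top].mean()
print(f"top-L/5 precision: {precision:.2f}")
```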
A multivariate quadrature based moment method for LES based modeling of supersonic combustion
NASA Astrophysics Data System (ADS)
Donde, Pratik; Koo, Heeseok; Raman, Venkat
2012-07-01
The transported probability density function (PDF) approach is a powerful technique for large eddy simulation (LES) based modeling of scramjet combustors. In this approach, a high-dimensional transport equation for the joint composition-enthalpy PDF needs to be solved. Quadrature based approaches provide deterministic Eulerian methods for solving the joint-PDF transport equation. In this work, it is first demonstrated that the numerical errors associated with LES require special care in the development of PDF solution algorithms. The direct quadrature method of moments (DQMOM) is one quadrature-based approach developed for supersonic combustion modeling. This approach is shown to generate inconsistent evolution of the scalar moments. Further, gradient-based source terms that appear in the DQMOM transport equations are severely underpredicted in LES leading to artificial mixing of fuel and oxidizer. To overcome these numerical issues, a semi-discrete quadrature method of moments (SeQMOM) is formulated. The performance of the new technique is compared with the DQMOM approach in canonical flow configurations as well as a three-dimensional supersonic cavity stabilized flame configuration. The SeQMOM approach is shown to predict subfilter statistics accurately compared to the DQMOM approach.
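The common core of quadrature-based moment methods is representing the subfilter PDF by a small set of weighted nodes, so any raw moment is a finite sum. A minimal sketch with invented weights and abscissas:

```python
import numpy as np

# Two-node quadrature representation of a composition PDF.
weights = np.array([0.3, 0.7])       # node weights (sum to 1)
abscissas = np.array([0.2, 0.8])     # node locations (e.g., mixture fraction)

def moment(k):
    """k-th raw moment of the quadrature representation: m_k = sum w_i x_i^k."""
    return float(np.sum(weights * abscissas ** k))

print([moment(k) for k in range(4)])  # m0..m3
# DQMOM transports the weights and abscissas directly; the SeQMOM variant
# is formulated to avoid the gradient source-term errors described above.
```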
Stochastic simulation by image quilting of process-based geological models
NASA Astrophysics Data System (ADS)
Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef
2017-09-01
Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad-hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
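A stripped-down sketch of the patch-selection step in image quilting: the candidate patch minimizing the sum of squared differences over the overlap with already-placed pixels is chosen. Real implementations add min-cut stitching, data conditioning, and the voxel-reuse template criterion described above; the training image here is random noise for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
TI = rng.random((64, 64))                 # stand-in 2D training image
patch, overlap = 8, 3

def best_patch(left_neighbor):
    """Return the TI patch whose left overlap columns best match (by SSD)
    the right overlap columns of the already-placed `left_neighbor`."""
    target = left_neighbor[:, -overlap:]
    best, best_ssd = None, np.inf
    for i in range(TI.shape[0] - patch):
        for j in range(TI.shape[1] - patch):
            cand = TI[i:i + patch, j:j + patch]
            ssd = np.sum((cand[:, :overlap] - target) ** 2)
            if ssd < best_ssd:
                best, best_ssd = cand, ssd
    return best

seed = TI[:patch, :patch]                 # first placed patch
print(best_patch(seed).shape)             # (8, 8)
```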
Dalum, Peter; Schaalma, Herman; Kok, Gerjo
2012-02-01
The objective of this project was to develop a theory- and evidence-based adolescent smoking cessation intervention using both new and existing materials. We used the Intervention Mapping framework for planning health promotion programmes. Based on a needs assessment, we identified important and changeable determinants of cessation behaviour, specified change objectives for the intervention programme, selected theoretical change methods for accomplishing intervention objectives and finally operationalized change methods into practical intervention strategies. We found that guided practice, modelling, self-monitoring, coping planning, consciousness raising, dramatic relief and decisional balance were suitable methods for adolescent smoking cessation. We selected behavioural journalism, guided practice and Motivational Interviewing as strategies in our intervention. Intervention Mapping helped us to develop a systematic adolescent smoking cessation intervention with a clear link between behavioural goals, theoretical methods, practical strategies and materials and with a strong focus on implementation and recruitment. This paper does not present evaluation data.
Grape colour phenotyping: development of a method based on the reflectance spectrum.
Rustioni, Laura; Basilico, Roberto; Fiori, Simone; Leoni, Alessandra; Maghradze, David; Failla, Osvaldo
2013-01-01
The colour of fruit is an important quality factor for cultivar classification and phenotyping techniques. Besides subjective visual evaluation, new instruments and techniques can be used. This work aims at developing an objective, fast, easy and non-destructive method as a useful support for evaluating grape colour under different cultural and environmental conditions, as well as for the breeding process and germplasm evaluation, supporting plant characterization and biodiversity preservation. Colours of 120 grape varieties were studied using reflectance spectra. The classification was realized using cluster and discriminant analysis. Reflectance of the whole berry surface was also compared with the absorption properties of single skin extracts. A phenotyping method based on the reflectance spectra was developed, producing reliable colour classifications. A cultivar-independent index for pigment content evaluation has also been obtained. This work allowed the classification of berry colour using an objective method. Copyright © 2013 John Wiley & Sons, Ltd.
Uncertainty Propagation Methods for High-Dimensional Complex Systems
NASA Astrophysics Data System (ADS)
Mukherjee, Arpan
Researchers are developing ever smaller aircraft called Micro Aerial Vehicles (MAVs). The Space Robotics Group has joined the field by developing a dragonfly-inspired MAV. This thesis presents two contributions to this project. The first is the development of a dynamical model of the internal MAV components to be used for tuning design parameters and as a future plant model. This model is derived using the Lagrangian method and differs from others because it accounts for the internal dynamics of the system. The second contribution of this thesis is an estimation algorithm that can be used to determine prototype performance and verify the dynamical model from the first part. Based on the Gauss-Newton Batch Estimator, this algorithm uses a single camera and known points of interest on the wing to estimate the wing kinematic angles. Unlike other single-camera methods, this method is probabilistically based rather than being geometric.
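A minimal Gauss-Newton batch estimator on an invented sinusoidal measurement model, standing in for the thesis's camera-based wing-angle observations: each iteration solves the normal equations J^T J delta = J^T r for the parameter update.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 200)
true_theta = np.array([0.8, 12.0, 0.3])    # amplitude A, frequency w, phase p

def h(theta):                              # illustrative measurement model
    A, w, p = theta
    return A * np.sin(w * t + p)

z = h(true_theta) + 0.02 * rng.standard_normal(t.size)  # noisy measurements

theta = np.array([0.9, 11.5, 0.2])         # initial guess near the truth
for _ in range(10):
    A, w, p = theta
    r = z - h(theta)                       # residuals
    J = np.column_stack([np.sin(w * t + p),           # dh/dA
                         A * t * np.cos(w * t + p),   # dh/dw
                         A * np.cos(w * t + p)])      # dh/dp
    theta = theta + np.linalg.solve(J.T @ J, J.T @ r) # Gauss-Newton step
print(theta)    # should approach [0.8, 12.0, 0.3]
```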
Fluorescent Nanomaterials for the Development of Latent Fingerprints in Forensic Sciences
Li, Ming; Yu, Aoyang; Zhu, Ye
2018-01-01
This review presents an overview on the application of latent fingerprint development techniques in forensic sciences. At present, traditional developing methods such as powder dusting, cyanoacrylate fuming, chemical method, and small particle reagent method, have all been gradually compromised given their emerging drawbacks such as low contrast, sensitivity, and selectivity, as well as high toxicity. Recently, much attention has been paid to the use of fluorescent nanomaterials including quantum dots (QDs) and rare earth upconversion fluorescent nanomaterials (UCNMs) due to their unique optical and chemical properties. Thus, this review lays emphasis on latent fingerprint development based on QDs and UCNMs. Compared to latent fingerprint development by traditional methods, the new methods using fluorescent nanomaterials can achieve high contrast, sensitivity, and selectivity while showing reduced toxicity. Overall, this review provides a systematic overview on such methods. PMID:29657570
Laser Spot Tracking Based on Modified Circular Hough Transform and Motion Pattern Analysis
Krstinić, Damir; Skelin, Ana Kuzmanić; Milatić, Ivan
2014-01-01
Laser pointers are one of the most widely used interactive and pointing devices in different human-computer interaction systems. Existing approaches to vision-based laser spot tracking are designed for controlled indoor environments with the main assumption that the laser spot is very bright, if not the brightest, spot in images. In this work, we are interested in developing a method for an outdoor, open-space environment, which could be implemented on embedded devices with limited computational resources. Under these circumstances, none of the assumptions of existing methods for laser spot tracking can be applied, yet a novel and fast method with robust performance is required. Throughout the paper, we will propose and evaluate an efficient method based on modified circular Hough transform and Lucas–Kanade motion analysis. Encouraging results on a representative dataset demonstrate the potential of our method in an uncontrolled outdoor environment, while achieving maximal accuracy indoors. Our dataset and ground truth data are made publicly available for further development. PMID:25350502
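A standard circular Hough detection step via OpenCV, shown on a synthetic frame. The paper modifies this transform and couples it with Lucas-Kanade motion analysis; the parameters below are illustrative and would typically need tuning on real footage.

```python
import cv2
import numpy as np

# Synthetic grayscale frame with one bright, laser-like spot.
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(frame, (160, 120), 6, 255, -1)
frame = cv2.GaussianBlur(frame, (5, 5), 0)

circles = cv2.HoughCircles(frame, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                           param1=100, param2=10, minRadius=3, maxRadius=12)
if circles is not None:
    x, y, r = circles[0][0]
    print(f"spot at ({x:.0f}, {y:.0f}), radius {r:.0f}px")
```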
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
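On the quantitative side, qHNMR rests on integrals being proportional to the number of contributing nuclei: normalizing each integral per proton and referencing an internal calibrant of known concentration yields molar concentrations. A sketch with invented integrals (the EGCG signal and calibrant values are illustrative, not from the paper):

```python
# Relative qHNMR quantitation against an internal calibrant in the same
# spectrum: concentration scales with integral per proton.
def qhnmr_conc(I_analyte, n_H_analyte, I_cal, n_H_cal, conc_cal_mM):
    """Analyte concentration from 1H integrals acquired under identical,
    quantitative conditions."""
    return (I_analyte / n_H_analyte) / (I_cal / n_H_cal) * conc_cal_mM

# e.g., a catechin aromatic signal (2H) vs. a calibrant singlet (9H, 5 mM):
print(qhnmr_conc(I_analyte=1.30, n_H_analyte=2,
                 I_cal=2.25, n_H_cal=9, conc_cal_mM=5.0))  # -> 13.0 mM
```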
Fully-Implicit Reconstructed Discontinuous Galerkin Method for Stiff Multiphysics Problems
NASA Astrophysics Data System (ADS)
Nourgaliev, Robert
2015-11-01
A new reconstructed Discontinuous Galerkin (rDG) method, based on orthogonal basis/test functions, is developed for fluid flows on unstructured meshes. Orthogonality of basis functions is essential for enabling robust and efficient fully-implicit Newton-Krylov based time integration. The method is designed for generic partial differential equations, including transient, hyperbolic, parabolic or elliptic operators, which are attributed to many multiphysics problems. We demonstrate the method's capabilities for solving compressible fluid-solid systems (in the low Mach number limit), with phase change (melting/solidification), as motivated by applications in Additive Manufacturing. We focus on the method's accuracy (in both space and time), as well as robustness and solvability of the system of linear equations involved in the linearization steps of Newton-based methods. The performance of the developed method is investigated for highly-stiff problems with melting/solidification, emphasizing the advantages from tight coupling of mass, momentum and energy conservation equations, as well as orthogonality of basis functions, which leads to better conditioning of the underlying (approximate) Jacobian matrices, and rapid convergence of the Krylov-based linear solver. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, and funded by the LDRD at LLNL under project tracking code 13-SI-002.
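A small illustration of why orthogonal bases matter here: projecting onto Legendre modes diagonalizes the element mass matrix, so each modal coefficient is obtained independently by quadrature. This is only a sketch of the ingredient; the rDG method embeds it in a fully implicit Newton-Krylov solver.

```python
import numpy as np
from numpy.polynomial import legendre

nodes, wts = legendre.leggauss(8)          # Gauss-Legendre quadrature on [-1,1]

def modal_coeffs(f, n_modes=4):
    """Modal coefficients of f in the orthogonal Legendre basis:
    c_n = <f, P_n> / <P_n, P_n>, with <P_n, P_n> = 2 / (2n + 1)."""
    coeffs = []
    for n in range(n_modes):
        Pn = legendre.Legendre.basis(n)(nodes)
        coeffs.append((wts * f(nodes) * Pn).sum() * (2 * n + 1) / 2.0)
    return np.array(coeffs)

print(modal_coeffs(lambda x: x ** 2))      # ~[1/3, 0, 2/3, 0]
```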
A Model-Driven Development Method for Management Information Systems
NASA Astrophysics Data System (ADS)
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, a Management Information System (MIS) has been developed without using formal methods. With such informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly respond to changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems, applying the model-driven development method to a component of the model theory approach. The experiment showed that the reduction in effort exceeded 30% of the total development effort.
DOT National Transportation Integrated Search
2017-01-01
The findings from the proof of concept with mechanics-based models for flexible base suggest additional validation work should be performed, draft construction specification frameworks should be developed, and work extending the technology to stabili...
Code of Federal Regulations, 2010 CFR
2010-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2014 CFR
2014-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
How-Kit, Alexandre; Tost, Jörg
2015-01-01
A number of molecular diagnostic assays have been developed in recent years for mutation detection. Although these methods have become increasingly sensitive, most of them are incompatible with a sequencing-based readout and require prior knowledge of the mutation present in the sample. Consequently, coamplification at lower denaturation temperature (COLD)-PCR-based methods have been developed; they combine high analytical sensitivity, due to mutation enrichment in the sample, with the identification of known or unknown mutations by downstream sequencing experiments. Among these methods, the recently developed Enhanced-ice-COLD-PCR has emerged as the most powerful, as it outperformed the other COLD-PCR-based methods in terms of mutation enrichment and the simplicity of the experimental setup. Indeed, E-ice-COLD-PCR is very versatile: it can be used on all types of PCR platforms and is applicable to different types of samples, including fresh-frozen, FFPE, and plasma samples. The technique relies on the incorporation of an LNA-containing blocker probe in the PCR reaction, followed by selective heteroduplex denaturation that enables amplification of the mutant allele while amplification of the wild-type allele is prevented. Combined with Pyrosequencing®, a very quantitative high-resolution sequencing technology, E-ice-COLD-PCR can detect and identify mutations with a limit of detection down to 0.01%.
Enzymatic method for measuring starch gelatinization in dry products in situ
USDA-ARS?s Scientific Manuscript database
An enzymatic method based on hydrolysis of starch by amyloglucosidase and measurement of D-glucose released by glucose oxidase-peroxidase was developed to measure both gelatinized starch and hydrolyzable starch in situ of dried starchy products. Efforts focused on the development of sample handling ...
Directive and Non-Directive Movement in Child Therapy.
ERIC Educational Resources Information Center
Krason, Katarzyna; Szafraniec, Grazyna
1999-01-01
Presents a new authorship method of child therapy based on visualization through motion. Maintains that this method stimulates motor development and musical receptiveness, and promotes personality development. Suggests that improvised movement to music facilitates the projection mechanism and that directed movement starts the channeling phase.…
ERIC Educational Resources Information Center
Fox, Edward A.
1987-01-01
Discusses the CODER system, which was developed to investigate the application of artificial intelligence methods to increase the effectiveness of information retrieval systems, particularly those involving heterogeneous documents. Highlights include the use of PROLOG programing, blackboard-based designs, knowledge engineering, lexicological…
Advanced Small Modular Reactor Economics Model Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Thomas J.
2014-10-01
The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, it describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo–based methods are commonly used to handle uncertainty, especially when implemented as a stand-alone script in a language such as Python or MATLAB. However, a script-based model requires each potential user to have access to an interpreter or executable capable of running the script. Making the model accessible to multiple independent analysts is best accomplished by implementing it in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo–based method: using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses, or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires simplifying assumptions. These assumptions do not necessarily call the analytical results into question; in fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the component-by-component analysis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which would benefit from research and development to decrease the absolute cost.
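As a rough illustration of the uncertainty discussion above, the following sketch (with made-up component costs, not ORNL's figures) compares Monte Carlo sampling against propagation of error for a simple additive cost roll-up. For a linear roll-up of independent line items the two agree almost exactly, consistent with the report's finding that the propagation-of-error shortcut introduces essentially negligible error.

```python
import numpy as np

# Illustrative (not ORNL's) SMR cost line items, in arbitrary units:
# mean and standard deviation of each component estimate.
components = {
    "capital":      (1000.0, 300.0),  # capital cost dominates the uncertainty
    "operations":   (250.0, 40.0),
    "fuel":         (120.0, 15.0),
    "decommission": (60.0, 10.0),
}
means = np.array([m for m, s in components.values()])
sigmas = np.array([s for m, s in components.values()])

# Propagation of error for a sum of independent items: variances add.
poe_mean, poe_sigma = means.sum(), np.sqrt((sigmas ** 2).sum())

# Monte Carlo estimate of the same total, assuming normal line items.
rng = np.random.default_rng(0)
totals = rng.normal(means, sigmas, size=(100_000, len(means))).sum(axis=1)

print(f"propagation of error: {poe_mean:.1f} +/- {poe_sigma:.1f}")
print(f"Monte Carlo:          {totals.mean():.1f} +/- {totals.std():.1f}")
```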
[Amperometric biosensor for lactate analysis in wines and grape must during fermentation].
Shkotova, L V; Horiushkina, T B; Slast'ia, E A; Soldatkin, O P; Tranh-Minh, S; Chovelon, J M; Dziadevych, S V
2005-01-01
An amperometric biosensor based on lactate oxidase was developed for the determination of lactate, and two methods of immobilizing lactate oxidase on the surface of industrial screen-printed platinum SensLab electrodes were compared. A sensor with lactate oxidase immobilized in the Resydrol polymer by physical adsorption is characterized by a narrower dynamic range and a greater response than a biosensor with lactate oxidase immobilized in poly(3,4-ethylenedioxythiophene) by electrochemical polymerization. The operational stability of the developed biosensor was studied, and it was shown that the immobilization method does not influence stability. Lactate was analyzed in wine and during wine fermentation, and a high correlation was shown between the data obtained with the amperometric lactate biosensor and a standard ion chromatography method. The developed biosensor could be applied in the food industry for the control and optimization of the wine fermentation process and for quality control of wine.
Deep learning architecture for air quality predictions.
Li, Xiang; Peng, Ling; Hu, Yuan; Shao, Jing; Chi, Tianhe
2016-11-01
With the rapid development of urbanization and industrialization, many developing countries are suffering from heavy air pollution. Governments and citizens have expressed increasing concern regarding air pollution because it affects human health and sustainable development worldwide. Current air quality prediction methods mainly use shallow models; however, these methods produce unsatisfactory results, which inspired us to investigate methods of predicting air quality based on deep architecture models. In this paper, a novel spatiotemporal deep learning (STDL)-based air quality prediction method that inherently considers spatial and temporal correlations is proposed. A stacked autoencoder (SAE) model is used to extract inherent air quality features, and it is trained in a greedy layer-wise manner. Compared with traditional time series prediction models, our model can predict the air quality of all stations simultaneously and shows temporal stability across all seasons. Moreover, a comparison with the spatiotemporal artificial neural network (STANN), autoregressive moving average (ARMA), and support vector regression (SVR) models demonstrates that the proposed method has superior performance for air quality prediction.
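The greedy layer-wise pretraining described above can be sketched as follows; the station count, layer sizes, and synthetic data are our assumptions, not the authors' configuration.

```python
# A minimal sketch of greedy layer-wise stacked-autoencoder training with
# TensorFlow/Keras; all shapes and hyperparameters are illustrative.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 12)).astype("float32")     # e.g. 12 monitoring stations
y = (X @ rng.normal(size=(12, 1))).astype("float32")  # stand-in pollution targets

def train_autoencoder(data, hidden):
    """Train one autoencoder layer on `data` and return its encoder part."""
    inp = tf.keras.Input(shape=(data.shape[1],))
    code = tf.keras.layers.Dense(hidden, activation="sigmoid")(inp)
    out = tf.keras.layers.Dense(data.shape[1])(code)
    ae = tf.keras.Model(inp, out)
    ae.compile(optimizer="adam", loss="mse")
    ae.fit(data, data, epochs=20, batch_size=64, verbose=0)
    return tf.keras.Model(inp, code)

# Each layer learns to reconstruct the representation of the layer below it.
enc1 = train_autoencoder(X, 32)
enc2 = train_autoencoder(enc1.predict(X, verbose=0), 16)

# Stack the pretrained encoders and fine-tune with a prediction head.
model = tf.keras.Sequential([enc1, enc2, tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=64, verbose=0)
```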
Readout electronics for the GEM detector
NASA Astrophysics Data System (ADS)
Kasprowicz, G.; Czarski, T.; Chernyshova, M.; Czyrkowski, H.; Dabrowski, R.; Dominik, W.; Jakubowska, K.; Karpinski, L.; Kierzkowski, K.; Kudla, I. M.; Pozniak, K.; Rzadkiewicz, J.; Salapa, Z.; Scholz, M.; Zabolotny, W.
2011-10-01
A novel approach to the Gas Electron Multiplier (GEM) detector readout is presented. Unlike commonly used methods based on discriminators [2,3] and analogue FIFOs [1], the method developed uses simultaneously sampling high-speed ADCs and advanced FPGA-based processing logic to estimate the energy of every single photon. The method is applied to every GEM strip signal. It is especially useful in the case of crystal-based spectrometers for soft X-rays, where higher-order reflections need to be identified and rejected [5].
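Conceptually, the per-photon energy estimation might look like the sketch below, which thresholds a continuously sampled strip signal and integrates each pulse above baseline; the threshold, baseline, and pulse shapes are illustrative assumptions, not the FPGA implementation. Higher-order reflections could then be rejected with a cut on the integrated energy.

```python
import numpy as np

def photon_energies(samples, baseline, threshold):
    """Return the integrated charge of each pulse crossing `threshold`."""
    above = (samples - baseline) > threshold
    energies, i = [], 0
    while i < len(samples):
        if above[i]:
            j = i
            while j < len(samples) and above[j]:
                j += 1
            energies.append(float(np.sum(samples[i:j] - baseline)))
            i = j
        else:
            i += 1
    return energies

# Synthetic strip signal: noise plus two photons of different energy.
rng = np.random.default_rng(1)
sig = rng.normal(0.0, 0.5, 2000)
sig[400:410] += 20.0    # first-order reflection
sig[1200:1210] += 60.0  # higher-order reflection, to be rejected by energy cut
print(photon_energies(sig, baseline=0.0, threshold=5.0))
```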
Zhang, Hongshen; Chen, Ming
2013-11-01
In-depth studies on the recycling of typical automotive exterior plastic parts are significant and beneficial for environmental protection, energy conservation, and the sustainable development of China. In the current study, several methods were used to analyze the recycling industry model for typical exterior parts of passenger vehicles in China. The strengths, weaknesses, opportunities, and threats of the current recycling industry for these parts were analyzed comprehensively with the SWOT method. The internal factor evaluation matrix and external factor evaluation matrix were used to evaluate the internal and external factors of the recycling industry. The recycling industry was found to respond well to all the factors and to face good development opportunities. A cross-link strategy analysis for the recycling of typical exterior parts in the Chinese passenger car industry was then conducted based on the SWOT analysis strategies and the established SWOT matrix. Finally, based on this research, a recycling industry model led by automobile manufacturers is proposed. Copyright © 2013 Elsevier Ltd. All rights reserved.
Radiometric Calibration of the Earth Observing System's Imaging Sensors
NASA Technical Reports Server (NTRS)
Slater, Philip N. (Principal Investigator)
1997-01-01
The work on the grant was mainly directed towards developing new, accurate, redundant methods for the in-flight, absolute radiometric calibration of satellite multispectral imaging systems and refining the accuracy of methods already in use. Initially the work was in preparation for the calibration of MODIS and HIRIS (before the development of that sensor was canceled), with the realization that it would be applicable to most imaging multi- or hyperspectral sensors provided their spatial or spectral resolutions were not too coarse. The work involved three different ground-based, in-flight calibration methods: reflectance-based, radiance-based, and the diffuse-to-global irradiance ratio used with the reflectance-based method. This continuing research had the dual advantage of: (1) developing several independent methods to create the redundancy that is essential for the identification and, hopefully, the elimination of systematic errors; and (2) refining the measurement techniques and algorithms that can be used not only for improving calibration accuracy but also for the reverse process of retrieving ground reflectances from calibrated remote-sensing data. The grant also supported other projects, such as the ratioing radiometer approach to on-board calibration, which has been further developed by SBRS as the 'solar diffuser stability monitor' and is incorporated into the most important on-board calibration system for MODIS. Another spin-off from the grant funding was a study of solar diffuser materials. Journal citations, titles, and abstracts of publications authored by faculty, staff, and students are also attached.
ERIC Educational Resources Information Center
Coholic, Diana; Eys, Mark; Lougheed, Sean
2012-01-01
We discuss preliminary findings from a study that investigated the effectiveness of a Holistic Arts-Based Group Program (HAP) for the development of resilience in children in need. The HAP teaches mindfulness using arts-based methods, and aims to teach children how to understand their feelings and develop their strengths. We assessed the…
Hamadani, Behrang H; Roller, John; Dougherty, Brian; Yoon, Howard W
2012-07-01
An absolute differential spectral response measurement system for solar cells is presented. The system couples an array of light emitting diodes with an optical waveguide to provide large area illumination. Two unique yet complementary measurement methods were developed and tested with the same measurement apparatus. Good agreement was observed between the two methods based on testing of a variety of solar cells. The first method is a lock-in technique that can be performed over a broad pulse frequency range. The second method is based on synchronous multifrequency optical excitation and electrical detection. An innovative scheme for providing light bias during each measurement method is discussed.
Abdul Kamal Nazer, Meeran Mohideen; Hameed, Abdul Rahman Shahul; Riyazuddin, Patel
2004-01-01
A simple and rapid potentiometric method for the estimation of ascorbic acid in pharmaceutical dosage forms has been developed. The method is based on treating ascorbic acid with iodine and titrating the iodide produced, equivalent to the ascorbic acid, with silver nitrate using a Copper-Based Mercury Film Electrode (CBMFE) as the indicator electrode. An interference study was carried out to check for possible interference from usual excipients and other vitamins. The precision and accuracy of the method were assessed by the application of the lack-of-fit test and other statistical methods. The results of the proposed method and the British Pharmacopoeia method were compared using F- and t-tests of significance.
NASA Astrophysics Data System (ADS)
Annetta, Leonard A.; Frazier, Wendy M.; Folta, Elizabeth; Holmes, Shawn; Lamb, Richard; Cheng, Meng-Tzu
2013-02-01
Design-based research principles guided the study of 51 secondary science teachers in the second year of a 3-year professional development project. The project entailed the creation of student-centered, inquiry-based science video games. A professional development model appropriate for infusing innovative technologies into standards-based curricula was employed to determine how science teachers' attitudes and efficacy were impacted while designing science-based video games. The study's mixed-method design ascertained teacher efficacy on five factors related to technology and gaming (general computer use, science learning, inquiry teaching and learning, synchronous chat/text, and playing video games) using a web-based survey. Qualitative data in the form of online blog posts were gathered during the project to assist in the triangulation and assessment of teacher efficacy. Data analyses consisted of an analysis of variance and serial coding of teacher reflective responses. Results indicated that participants who used computers daily had higher efficacy when using inquiry-based teaching methods and in science teaching and learning. Additional emergent findings revealed possible motivating factors for efficacy. This professional development project focused on inquiry as a pedagogical strategy, standards-based science learning as a means to develop content knowledge, and creating video games as technological knowledge. The project was consistent with the Technological Pedagogical Content Knowledge (TPCK) framework, in which the overlap of the three components indicates the development of an integrated understanding of the suggested relationships. Findings provide suggestions for the development of standards-based science education software, its integration into the curriculum, and strategies for implementing technology into teaching practices.
An adaptive signal-processing approach to online adaptive tutoring.
Bergeron, Bryan; Cline, Andrew
2011-01-01
Conventional intelligent or adaptive online tutoring systems rely on domain-specific models of learner behavior based on rules, deep domain knowledge, and other resource-intensive methods. We have developed and studied a domain-independent methodology of adaptive tutoring based on signal-processing approaches that obviate the need for the construction of explicit expert and student models. A key advantage of our method over conventional approaches is a lower barrier to entry for educators who want to develop adaptive online learning materials.
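The paper does not spell out its algorithm, so the following is only a hedged sketch of one signal-processing approach in this spirit: the learner's correctness stream is smoothed with an exponential moving average and item difficulty is adapted toward a target success rate, with no expert or student model anywhere.

```python
def next_difficulty(history, difficulty, alpha=0.3, target=0.7, step=0.1):
    """history: iterable of 0/1 correctness; returns (difficulty, smoothed)."""
    ewma = 0.5  # neutral prior on the learner's success rate
    for correct in history:
        ewma = alpha * correct + (1 - alpha) * ewma
    if ewma > target:           # learner is coasting: make items harder
        difficulty += step
    elif ewma < target - 0.2:   # learner is struggling: ease off
        difficulty -= step
    return max(0.0, min(1.0, difficulty)), ewma

d, e = next_difficulty([1, 1, 0, 1, 1, 1], difficulty=0.5)
print(f"difficulty -> {d:.2f} (smoothed success rate {e:.2f})")
```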
ERIC Educational Resources Information Center
Sahrir, Muhammad Sabri; Alias, Nor Aziah; Ismail, Zawawi; Osman, Nurulhuda
2012-01-01
The design and development research, first proposed by Brown and Collins in the 1990s, is currently among the well-known methods in educational research to test theory and validate its practicality. The method is also known as developmental research, design research, design-based research, formative research and design-cased and possesses…
Johnson, K M; Jones, S C; Iverson, D
2009-09-01
To formulate 'best practice' guidelines for social marketing programmes for adolescents' and young adults' sun protection. A Delphi consensus process. Eleven experts in sun protection and social marketing participated in a Delphi consensus process, where they were asked to provide up to 10 key points, based on their knowledge and practical experience, which they felt were most important in developing social marketing interventions for the primary prevention of skin cancer among adolescents and young adults. After reaching consensus, the evidence base for each guideline was determined and graded via the Scottish Intercollegiate Guideline Network grading system. Participants were then asked to indicate how strongly they rated the finalized 15 recommendations based on all aspects relating to their knowledge and practical opinion, as well as the research evidence, on a visual analogue scale. The resultant 15 guidelines offer general principles for sun protection interventions utilizing a social marketing approach. This method of guideline development brought the expertise of practitioners to the forefront of guideline development, whilst still utilizing established methods of evidence confirmation. It thus offers a useful method for guideline development in a public health context.
Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.
1993-05-01
Data Management Systems: Components used to store, manage, and retrieve data. Data management includes knowledge bases, database management...Application Development Tools and Methods X/Open and POSIX APIs Integrated Design Support System (IDS) Knowledge-Based Systems (KBS) Application...IDEF1x) Yourdon Jackson System Design (JSD) Knowledge-Based Systems (KBSs) Structured Systems Development (SSD) Semantic Unification Meta-Model
Methods for estimating flood frequency in Montana based on data through water year 1998
Parrett, Charles; Johnson, Dave R.
2004-01-01
Annual peak discharges having recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years (T-year floods) were determined for 660 gaged sites in Montana and in adjacent areas of Idaho, Wyoming, and Canada, based on data through water year 1998. The updated flood-frequency information was subsequently used in regression analyses, either ordinary or generalized least squares, to develop equations relating T-year floods to various basin and climatic characteristics, equations relating T-year floods to active-channel width, and equations relating T-year floods to bankfull width. The equations can be used to estimate flood frequency at ungaged sites. Montana was divided into eight regions, within which flood characteristics were considered to be reasonably homogeneous, and the three sets of regression equations were developed for each region. A measure of the overall reliability of the regression equations is the average standard error of prediction. The average standard errors of prediction for the equations based on basin and climatic characteristics ranged from 37.4 percent to 134.1 percent. Average standard errors of prediction for the equations based on active-channel width ranged from 57.2 percent to 141.3 percent. Average standard errors of prediction for the equations based on bankfull width ranged from 63.1 percent to 155.5 percent. In most regions, the equations based on basin and climatic characteristics generally had smaller average standard errors of prediction than equations based on active-channel or bankfull width. An exception was the Southeast Plains Region, where all equations based on active-channel width had smaller average standard errors of prediction than equations based on basin and climatic characteristics or bankfull width. Methods for weighting estimates derived from the basin- and climatic-characteristic equations and the channel-width equations also were developed. The weights were based on the cross correlation of residuals from the different methods and the average standard errors of prediction. When all three methods were combined, the average standard errors of prediction ranged from 37.4 percent to 120.2 percent. Weighting of estimates reduced the standard errors of prediction for all T-year flood estimates in four regions, reduced the standard errors of prediction for some T-year flood estimates in two regions, and provided no reduction in average standard error of prediction in two regions. A computer program for solving the regression equations, weighting estimates, and determining reliability of individual estimates was developed and placed on the USGS Montana District World Wide Web page. A new regression method, termed Region of Influence regression, also was tested. Test results indicated that the Region of Influence method was not as reliable as the regional equations based on generalized least squares regression. Two additional methods for estimating flood frequency at ungaged sites located on the same streams as gaged sites also are described. The first method, based on a drainage-area-ratio adjustment, is intended for use on streams where the ungaged site of interest is located near a gaged site. The second method, based on interpolation between gaged sites, is intended for use on streams that have two or more streamflow-gaging stations.
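The weighting step can be illustrated with the standard minimum-variance combination of two correlated estimates; the discharge values and errors below are invented, and the report's actual weights are not reproduced here.

```python
import numpy as np

def weighted_estimate(q1, s1, q2, s2, r):
    """Combine estimates q1, q2 with standard errors s1, s2 and residual
    correlation r so that the variance of the combination is minimized."""
    w1 = (s2**2 - r * s1 * s2) / (s1**2 + s2**2 - 2 * r * s1 * s2)
    w2 = 1.0 - w1
    q = w1 * q1 + w2 * q2
    var = (w1 * s1)**2 + (w2 * s2)**2 + 2 * w1 * w2 * r * s1 * s2
    return q, np.sqrt(var)

# 100-year flood: regression estimate vs. channel-width estimate (made-up).
q, s = weighted_estimate(q1=850.0, s1=0.40 * 850, q2=700.0, s2=0.60 * 700, r=0.3)
print(f"weighted Q100 ~ {q:.0f} cfs, standard error ~ {s:.0f} cfs")
```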
Ahmed, Amal B; Abdelrahman, Maha M; Abdelwahab, Nada S; Salama, Fathy M
2016-11-01
Newly established TLC-densitometric and RP-HPLC methods were developed and validated for the simultaneous determination of Piracetam (PIR) and Vincamine (VINC) in their pharmaceutical formulation and in the presence of PIR and VINC degradation products, PD and VD, respectively. The proposed TLC-densitometric method is based on the separation and quantitation of the studied components using a developing system that consists of chloroform-methanol-glacial acetic acid-triethylamine (8 + 2 + 0.1 + 0.1, v/v/v/v) on TLC silica gel 60 F254 plates, followed by densitometric scanning at 230 nm. On the other hand, the developed RP-HPLC method is based on the separation of the studied components using an isocratic elution of 0.05 M KH2PO4 (containing 0.1% triethylamine adjusted to pH 3 with orthophosphoric acid)-methanol (95 + 5, v/v) on a C8 column at a flow rate of 1 mL/min with diode-array detection at 230 nm. The developed methods were validated according to International Conference on Harmonization guidelines and demonstrated good accuracy and precision. Moreover, the developed TLC-densitometric and RP-HPLC methods are suitable as stability-indicating assay methods for the simultaneous determination of PD and VD either in bulk powder or pharmaceutical formulation. The results were statistically compared with those obtained by the reported RP-HPLC method using t- and F-tests.
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong
2016-05-01
With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscope industry and medical device regulatory agencies; however, no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, which makes them difficult to understand. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion and, based on it, ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
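Numerically, the local-magnification idea can be sketched as follows: with an assumed grid target, ML is the derivative of imaged height with respect to object height, and DPH follows from comparing the measured edge height to the height predicted by the center magnification. The geometry and numbers are illustrative, not the authors' formulation.

```python
import numpy as np

# True grid-point heights on the target (mm) and their imaged heights (px),
# with barrel distortion compressing the periphery (synthetic numbers).
obj = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
img = np.array([0.0, 49.0, 95.0, 136.0, 170.0])

# Local magnification: derivative of image height w.r.t. object height.
ML = np.gradient(img, obj)
print("local magnification across the field:", np.round(ML, 2))

# Picture height distortion: measured edge height vs. the height the center
# magnification would predict for an undistorted image.
ideal_edge = ML[0] * obj[-1]
DPH = 100.0 * (img[-1] - ideal_edge) / ideal_edge
print(f"picture height distortion ~ {DPH:.1f}% (negative = barrel)")
```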
Problem-Based Learning and Structural Redesign in a Choral Methods Course
ERIC Educational Resources Information Center
Freer, Patrick
2017-01-01
This article describes the process of structural redesign of an undergraduate music education choral methods course. A framework incorporating Problem-based Learning was developed to promote individualized student learning. Ten students participated in the accompanying research study, contributing an array of written and spoken comments as well as…
Explorations in Using Arts-Based Self-Study Methods
ERIC Educational Resources Information Center
Samaras, Anastasia P.
2010-01-01
Research methods courses typically require students to conceptualize, describe, and present their research ideas in writing. In this article, the author describes her exploration in using arts-based techniques for teaching research to support the development of students' self-study research projects. The pedagogical approach emerged from the…
Teaching/Learning Methods and Students' Classification of Food Items
ERIC Educational Resources Information Center
Hamilton-Ekeke, Joy-Telu; Thomas, Malcolm
2011-01-01
Purpose: This study aims to investigate the effectiveness of a teaching method (TLS (Teaching/Learning Sequence)) based on a social constructivist paradigm on students' conceptualisation of classification of food. Design/methodology/approach: The study compared the TLS model developed by the researcher based on the social constructivist paradigm…
USDA-ARS?s Scientific Manuscript database
Due to the availability of numerous spectral, spatial, and contextual features, the determination of optimal features and class separabilities can be a time consuming process in object-based image analysis (OBIA). While several feature selection methods have been developed to assist OBIA, a robust c...
Hydrologic impacts of climate change and urbanization in Las Vegas Wash Watershed, Nevada
In this study, a cell-based model for the Las Vegas Wash (LVW) Watershed in Clark County, Nevada, was developed by combining traditional hydrologic modeling methods (Thornthwaite's water balance model and the Soil Conservation Service's Curve Number method) with the pixel-base...
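For reference, the Curve Number step of such a cell-based model follows the standard SCS relation Q = (P - Ia)^2 / (P - Ia + S); the sketch below, with illustrative parameters, also shows why urbanization, which raises CN, increases simulated runoff.

```python
def scs_runoff(P, CN):
    """Direct runoff Q (inches) from rainfall P (inches) and curve number CN."""
    S = 1000.0 / CN - 10.0  # potential maximum retention
    Ia = 0.2 * S            # initial abstraction (conventional 20% of S)
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# Urbanization raises CN (more impervious surface), increasing runoff.
for cn in (65, 80, 95):
    print(f"CN={cn}: Q = {scs_runoff(2.5, cn):.2f} in from 2.5 in of rain")
```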
A method of evaluating crown fuels in forest stands.
Rodney W. Sando; Charles H. Wick
1972-01-01
A method of describing the crown fuels in a forest fuel complex based on crown weight and crown volume was developed. A computer program is an integral part of the method. Crown weight data are presented in graphical form and are separated into hardwood and coniferous fuels. The fuel complex is described using total crown weight per acre, mean height to the base of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kubic, William Louis; Jenkins, Rhodri W.; Moore, Cameron M.
Chemical pathways for converting biomass into fuels produce compounds for which key physical and chemical property data are unavailable. We developed an artificial neural network based group contribution method for estimating cetane and octane numbers that captures the complex dependence of fuel properties of pure compounds on chemical structure and is statistically superior to current methods.
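The idea of replacing a linear group-contribution model with a neural network can be illustrated with a toy sketch; the group counts, target values, and network size below are fabricated for illustration and are not the authors' dataset or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Columns: counts of [CH3, CH2, CH, aromatic C, OH] groups per molecule.
X = rng.integers(0, 8, size=(200, 5)).astype(float)
# Synthetic "cetane number" with a mildly nonlinear group dependence.
y = (20 + 3 * X[:, 1] - 4 * X[:, 3]
     + 1.5 * X[:, 0] * X[:, 2] / (1 + X[:, 4])
     + rng.normal(0, 2, 200))

# A small network replaces the usual linear sum of group contributions.
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(X[:150], y[:150])
print("held-out R^2:", round(model.score(X[150:], y[150:]), 3))
```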
ERIC Educational Resources Information Center
??lekhina, ??rina Borisovna
2015-01-01
The present study examines the professional development problems of a high school teacher, who is both a scientist and a teacher. Teaching and research activities are integrated through methodical activity, and the methodical competency of a teacher is defined as a basis in the context of competence-based education. The methodical…
Automatic rule generation for high-level vision
NASA Technical Reports Server (NTRS)
Rhee, Frank Chung-Hoon; Krishnapuram, Raghu
1992-01-01
A new fuzzy-set-based technique developed for decision making is discussed: a method to generate fuzzy decision rules automatically for image analysis. The paper proposes generating rule-based approaches to problems such as autonomous navigation and image understanding automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.
Fast modular data acquisition system for GEM-2D detector
NASA Astrophysics Data System (ADS)
Kasprowicz, G.; Byszuk, Adrian; Wojeński, A.; Zienkiewicz, P.; Czarski, T.; Chernyshova, M.; Poźniak, K.; Rzadkiewicz, J.; Zabolotny, W.; Juszczyk, B.
2014-11-01
A novel approach to two-dimensional Gas Electron Multiplier (GEM) detector readout is presented. Unlike commonly used methods based on discriminators and analogue FIFOs, the method developed uses simultaneously sampling high-speed ADCs with a fast hybrid integrator and advanced FPGA-based processing logic to estimate the energy of every single photon. The method is applied to every GEM strip/pixel signal. It is especially useful for crystal-based spectrometers for soft X-rays, 2D imaging for plasma tomography, and all applications where the energy resolution of every single photon is required. For the purpose of the detector readout, a novel, highly modular and extendable concept of the measurement platform was developed; it is an evolution of the already deployed measurement system for the JET spectrometer.
Reverté, Laia; Soliño, Lucía; Carnicer, Olga; Diogène, Jorge; Campàs, Mònica
2014-01-01
The emergence of marine toxins in water and seafood may have a considerable impact on public health. Although the tendency in Europe is to consolidate, when possible, official reference methods based on instrumental analysis, the development of alternative or complementary methods providing functional or toxicological information may provide advantages in terms of risk identification, but also low cost, simplicity, ease of use and high-throughput analysis. This article gives an overview of the immunoassays, cell-based assays, receptor-binding assays and biosensors that have been developed for the screening and quantification of emerging marine toxins: palytoxins, ciguatoxins, cyclic imines and tetrodotoxins. Their advantages and limitations are discussed, as well as their possible integration in research and monitoring programs. PMID:25431968
Development of Pre-Service Science Teachers' Awareness of Sustainable Water Use
ERIC Educational Resources Information Center
Cankaya, Cemile; Filik Iscen, Cansu
2015-01-01
Water is a vital resource for sustainable development. The aim of this research was to develop and evaluate pre-service science teachers' awareness of sustainable water usage. The research used a mixed-methods design: the quantitative part was based on a single-group pretest-posttest experimental design, and the qualitative data…
Methods to achieve accurate projection of regional and global raster databases
Usery, E.L.; Seong, J.C.; Steinwand, D.R.; Finn, M.P.
2002-01-01
This research aims at building a decision support system (DSS) for selecting an optimum projection considering various factors, such as pixel size, areal extent, number of categories, spatial pattern of categories, resampling methods, and error correction methods. Specifically, this research will investigate three goals theoretically and empirically and, using the empirical base of knowledge already developed from these results, develop an expert system for the map projection of raster data for regional and global database modeling. The three theoretical goals are as follows: (1) the development of a dynamic projection that adjusts projection formulas for latitude on the basis of raster cell size to maintain equal-sized cells; (2) the investigation of the relationships between the raster representation and the distortion of features, number of categories, and spatial pattern; and (3) the development of an error correction and resampling procedure that is based on error analysis of raster projection.
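The first goal can be illustrated under simple spherical assumptions: to keep raster cells of roughly equal ground area, the longitudinal cell step must widen with latitude in proportion to 1/cos(latitude). The sketch below is ours, not the DSS implementation.

```python
import math

R = 6371.0  # mean Earth radius, km

def lon_step_for_equal_cells(lat_deg, lat_step_deg, target_area_km2):
    """Longitude step (degrees) giving a cell of target_area at this latitude."""
    cell_height = R * math.radians(lat_step_deg)
    cell_width = target_area_km2 / cell_height
    return math.degrees(cell_width / (R * math.cos(math.radians(lat_deg))))

for lat in (0, 30, 60, 80):
    step = lon_step_for_equal_cells(lat, lat_step_deg=1.0, target_area_km2=12300.0)
    print(f"lat {lat:2d}N: 1.00 deg x {step:.2f} deg cell ~ 12300 km^2")
```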
A New Design Method of Automotive Electronic Real-time Control System
NASA Astrophysics Data System (ADS)
Zuo, Wenying; Li, Yinguo; Wang, Fengjuan; Hou, Xiaobo
The structure and functionality of automotive electronic control systems are becoming more and more complex, and the traditional manual-programming development mode can no longer satisfy development needs. To meet the demands for diversity and rapid development of real-time control systems, this paper proposes a new design method for automotive electronic control systems based on Simulink/RTW, combining the model-based design approach with automatic code generation technology. First, the algorithms are designed and a control system model is built in Matlab/Simulink. Embedded code is then generated automatically by RTW, and the automotive real-time control system is developed in an OSEK/VDX operating system environment. The new development mode can significantly shorten the development cycle of automotive electronic control systems; improve code portability, reusability, and scalability; and has practical value for the development of real-time control systems.
Peripleural lung disease detection based on multi-slice CT images
NASA Astrophysics Data System (ADS)
Matsuhiro, M.; Suzuki, H.; Kawata, Y.; Niki, N.; Nakano, Y.; Ohmatsu, H.; Kusumoto, M.; Tsuchida, T.; Eguchi, K.; Kaneko, M.
2015-03-01
With the development of multi-slice CT technology, it has become possible to obtain accurate 3D images of the lung field in a short time, and many image processing methods need to be developed to support this. Detecting peripleural lung disease is difficult because the disease lies outside the lung region as usually extracted, since lung extraction is often performed by threshold processing. The proposed method uses the thoracic inner region, extracted from the inner cavity of the bones as well as the air region, and covers peripleural disease such as lung nodules, calcification, pleural effusion, and pleural plaque. We applied this method to 50 cases, including 39 peripleural disease cases, and it detected all 39 with 2.9 false positives per case.
Lima, Eliana Martins; Diniz, Danielle G Almeida; Antoniosi-Filho, Nelson R
2005-07-15
This paper describes the development of a gas chromatography (GC) method for the assay of isotretinoin in its isolated form and in pharmaceutical formulations. Isotretinoin soft and hard gelatin capsules were prepared with various excipients. The performance of the proposed GC method was compared to that of traditional high-performance liquid chromatography (HPLC) systems for this substance, and the GC parameters were established based on several preliminary tests, including thermal analysis of isotretinoin. Results showed that gas chromatography with flame ionization detection (GC-FID) exhibited a separation efficiency superior to that of HPLC, particularly for separating isotretinoin degradation products. The method was proven to be effectively applicable to stability evaluation assays of isotretinoin and isotretinoin-based pharmaceuticals.
A geometrically based method for automated radiosurgery planning.
Wagner, T H; Yi, T; Meeks, S L; Bova, F J; Brechner, B L; Chen, Y; Buatti, J M; Friedman, W A; Foote, K D; Bouchet, L G
2000-12-01
A geometrically based method of multiple isocenter linear accelerator radiosurgery treatment planning optimization was developed, based on a target's solid shape. Our method uses an edge detection process to determine the optimal sphere packing arrangement with which to cover the planning target. The sphere packing arrangement is converted into a radiosurgery treatment plan by substituting the isocenter locations and collimator sizes for the spheres. This method is demonstrated on a set of 5 irregularly shaped phantom targets, as well as a set of 10 clinical example cases ranging from simple to very complex in planning difficulty. Using a prototype implementation of the method and standard dosimetric radiosurgery treatment planning tools, feasible treatment plans were developed for each target. The treatment plans generated for the phantom targets showed excellent dose conformity and acceptable dose homogeneity within the target volume. The algorithm was able to generate a radiosurgery plan conforming to the Radiation Therapy Oncology Group (RTOG) guidelines on radiosurgery for every clinical and phantom target examined. This automated planning method can serve as a valuable tool to assist treatment planners in rapidly and consistently designing conformal multiple isocenter radiosurgery treatment plans.
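The geometric core of such a method can be sketched as a greedy packing loop driven by the Euclidean distance transform: repeatedly place the largest sphere that still fits inside the uncovered target. This is our simplified reading, not the authors' exact algorithm; a real planner would also snap radii to the available collimator sizes.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def pack_spheres(target, min_radius=2.0):
    """target: 3D boolean array. Returns a list of (z, y, x, radius)."""
    uncovered = target.copy()
    spheres = []
    while True:
        dist = distance_transform_edt(uncovered)  # radius of largest inscribed sphere
        r = dist.max()
        if r < min_radius:
            break
        c = np.unravel_index(np.argmax(dist), dist.shape)
        spheres.append((*c, float(r)))
        zz, yy, xx = np.ogrid[:target.shape[0], :target.shape[1], :target.shape[2]]
        ball = (zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 <= r**2
        uncovered &= ~ball  # mark this sphere's volume as covered
    return spheres

# Ellipsoidal phantom target.
zz, yy, xx = np.ogrid[:40, :40, :40]
phantom = ((zz - 20) / 8.0)**2 + ((yy - 20) / 12.0)**2 + ((xx - 20) / 6.0)**2 <= 1.0
for s in pack_spheres(phantom):
    print(f"isocenter at voxel {s[:3]}, sphere radius {s[3]:.1f} voxels")
```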
Williamson, J; Ranyard, R; Cuthbert, L
2000-05-01
This study is an evaluation of a process tracing method developed for naturalistic decisions, in this case a consumer choice task. The method is based on Huber et al.'s (1997) Active Information Search (AIS) technique, but develops it by providing spoken rather than written answers to respondents' questions, and by including think aloud instructions. The technique is used within a conversation-based situation, rather than the respondent thinking aloud 'into an empty space', as is conventionally the case in think aloud techniques. The method results in a concurrent verbal protocol as respondents make their decisions, and a retrospective report in the form of a post-decision summary. The method was found to be virtually non-reactive in relation to think aloud, although the variable of Preliminary Attribute Elicitation showed some evidence of reactivity. This was a methodological evaluation, and as such the data reported are essentially descriptive. Nevertheless, the data obtained indicate that the method is capable of producing information about decision processes which could have theoretical importance in terms of evaluating models of decision-making.
2014-01-01
Background Systematic planning could improve the generally moderate effectiveness of interventions to enhance adherence to clinical practice guidelines. The aim of our study was to demonstrate how the process of Intervention Mapping was used to develop an intervention to address the lack of adherence to the national CPG for low back pain by Dutch physical therapists. Methods We systematically developed a program to improve adherence to the Dutch physical therapy guidelines for low back pain. Based on multi-method formative research, we formulated program and change objectives. Selected theory-based methods of change and practical applications were combined into an intervention program. Implementation and evaluation plans were developed. Results Formative research revealed influential determinants for physical therapists and practice quality managers. Self-regulation was appropriate because both the physical therapists and the practice managers needed to monitor current practice and make and implement plans for change. The program stimulated interaction between practice levels by emphasizing collective goal setting. It combined practical applications, such as knowledge transfer and discussion-and-feedback, based on theory-based methods, such as consciousness raising and active learning. The implementation plan incorporated the wider environment. The evaluation plan included an effect and process evaluation. Conclusions Intervention Mapping is a useful framework for formative data in program planning in the field of clinical guideline implementation. However, a decision aid for selecting which of the determinants of guideline adherence identified in the formative research to address may increase the efficiency of the Intervention Mapping process. PMID:24428945
A synthesis of fluorescent starch based on carbon nanoparticles for fingerprints detection
NASA Astrophysics Data System (ADS)
Li, Hongren; Guo, Xingjia; Liu, Jun; Li, Feng
2016-10-01
A pyrolysis method for synthesizing carbon nanoparticles (CNPs) was developed using malic acid and ammonium oxalate as raw materials. The incorporation of a minor amount of carbon nanoparticles into starch powder imparts remarkable color tunability. Based on this phenomenon, an environmentally friendly fluorescent starch powder for detecting latent fingerprints on non-porous surfaces was prepared. Fingerprints on different non-porous surfaces developed with this powder showed very good fluorescent images under ultraviolet excitation. The method, which uses fluorescent starch powder as the fluorescent marker, is simple, rapid, and green. Experimental results illustrated the effectiveness of the proposed method, enabling its practical application in forensic science.
Tantishaiyakul, V; Poeaknapo, C; Sribun, P; Sirisuppanon, K
1998-06-01
A rapid, simple, and direct assay procedure based on first-derivative spectrophotometry, using zero-crossing and peak-to-base measurements at 234 and 324 nm, respectively, has been developed for the specific determination of dextromethorphan HBr and bromhexine HCl in tablets. Calibration graphs were linear, with correlation coefficients of 0.9999 for both analytes. The limits of detection were 0.033 and 0.103 microgram ml-1 for dextromethorphan HBr and bromhexine HCl, respectively. An HPLC method was developed as the reference method. The results obtained by first-derivative spectrophotometry were in good agreement with those found by the HPLC method.
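The zero-crossing principle is easy to demonstrate numerically: at a wavelength where one component's first-derivative spectrum crosses zero, the derivative of the mixture responds to the other component alone. The spectra below are synthetic, not the paper's data.

```python
import numpy as np

wl = np.linspace(200, 360, 801)  # wavelength grid, nm
gauss = lambda c, w: np.exp(-((wl - c) / w) ** 2)

# Synthetic absorbance spectra of the two analytes and their mixture.
dextromethorphan = 0.8 * gauss(278, 18)
bromhexine = 0.5 * gauss(248, 22) + 0.3 * gauss(310, 25)
mixture = dextromethorphan + bromhexine

d1_mix = np.gradient(mixture, wl)  # first-derivative spectra
d1_bro = np.gradient(bromhexine, wl)
d1_dex = np.gradient(dextromethorphan, wl)

# At a zero-crossing of bromhexine's derivative, the mixture's derivative
# equals dextromethorphan's, so dextromethorphan can be read there directly.
idx = np.argmin(np.abs(d1_bro[150:450])) + 150
print(f"zero-crossing near {wl[idx]:.1f} nm: mixture dA/dwl = {d1_mix[idx]:.5f}, "
      f"dextromethorphan alone = {d1_dex[idx]:.5f}")
```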
Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz
2016-01-01
Objectives 1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; 2) To exploit expert knowledge in the BN development since further data acquisition is usually not possible; 3) To ensure the BN model can be used for interventional analysis; 4) To demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data is available. Method The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous, consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recently validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. Results When employed with the same datasets, the DSVM-MSS demonstrated competitive to superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. Conclusions This development process is applicable to any application domain that involves large-scale decision analysis based on such complex information, rather than on data with hard facts, and in conjunction with the incorporation of expert knowledge for decision support via intervention. The novelty extends to challenging decision scientists to reason about building models based on what information is really required for inference, rather than on what data is available, and hence forces decision scientists to use available data in a much smarter way. PMID:26830286
Developing points-based risk-scoring systems in the presence of competing risks.
Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P
2016-09-30
Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
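The paper itself supplies R code; purely as a schematic illustration in Python, the generic points construction divides each regression coefficient by a points constant and rounds to an integer. The coefficients below are invented, and the competing-risks (e.g. Fine-Gray) regression step is assumed to have been done already.

```python
# Hypothetical subdistribution-hazard coefficients (log-hazard-ratio scale)
# for cardiovascular death, with other-cause death as the competing risk.
coefs = {"age_per_5yr": 0.25, "diabetes": 0.40, "prior_mi": 0.55}
B = 0.25  # points constant: 1 point = the effect of 5 years of age

score_table = {k: round(v / B) for k, v in coefs.items()}
print("points per risk factor:", score_table)

# Total points for a 70-year-old diabetic (age reference 50) without prior MI.
total = score_table["age_per_5yr"] * ((70 - 50) // 5) + score_table["diabetes"]
print("total points:", total)
```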
Principles for Developing Competency-Based Education Programs
ERIC Educational Resources Information Center
Johnstone, Sally M.; Soares, Louis
2014-01-01
The 2013 US college/university policy agenda, "Making College Affordable: A Better Agenda for the Middle Class," highlighted the role of developing technologies, institutional curriculum-design processes, and new delivery methods as keys to providing quality, affordable postsecondary education. Competency-based education (CBE) is given…
He, Jianbo; Li, Jijie; Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan
2015-01-01
Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which are usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design has meant that the arithmetic mean or adjusted mean of each line's observations had to be used as the response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication error were simulated to show the power of QTL detection under different degrees of error control. The simulation results showed that the arithmetic mean method, which is equivalent to a method under a randomized complete block design (RCBD), was very sensitive to the size of the block variance: as block variance increased, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and of the adjusted mean method did not change with block variance. The CIMLD method showed 1.2- to 7.6-fold higher QTL detection power than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example, and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by the arithmetic and adjusted mean methods, respectively.
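The value of controlling block-within-replication error can be shown with a small simulation of our own (not the paper's code): the same genotype effect is tested with and without a simple block adjustment, and the block-adjusted test recovers the signal that block variance otherwise buries.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_blocks, per_block = 20, 10
genotype = rng.integers(0, 2, size=(n_blocks, per_block))  # QTL genotype 0/1
block_eff = rng.normal(0, 2.0, size=(n_blocks, 1))         # block-level noise
y = 0.8 * genotype + block_eff + rng.normal(0, 1.0, genotype.shape)

# Naive test pooling over blocks (akin to the arithmetic-mean/RCBD route).
t_naive = stats.ttest_ind(y[genotype == 1], y[genotype == 0])

# Block-adjusted test: center each block before testing (error control).
y_adj = y - y.mean(axis=1, keepdims=True)
t_adj = stats.ttest_ind(y_adj[genotype == 1], y_adj[genotype == 0])
print(f"naive p = {t_naive.pvalue:.3g}, block-adjusted p = {t_adj.pvalue:.3g}")
```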
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
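The mapping from a polynomial phase coefficient to a single tone can be sketched for the quadratic case: the second-order instantaneous moment s(n + 2*tau) * conj(s(n)) of a quadratic-phase signal is a single tone at angular frequency 4*a2*tau, so a2 can be read off an FFT peak. This is the standard construction, not the paper's full algorithm.

```python
import numpy as np

N, tau = 512, 64
n = np.arange(N)
a1, a2 = 0.05, 1.2e-4                  # polynomial phase coefficients
s = np.exp(1j * (a1 * n + a2 * n**2))  # noiseless analytic fringe signal

# Second-order instantaneous moment: a single tone at frequency 4*a2*tau.
m = s[2 * tau:] * np.conj(s[:-2 * tau])

F = 8192                               # zero-padded FFT for finer resolution
k = np.argmax(np.abs(np.fft.fft(m, F)))
omega = 2 * np.pi * k / F
print(f"estimated a2 = {omega / (4 * tau):.3e} (true {a2:.1e})")
```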
Wilson-Sands, Cathy; Brahn, Pamela; Graves, Kristal
2015-01-01
Validating participants' ability to correctly perform cardiopulmonary resuscitation (CPR) skills during basic life support courses can be a challenge for nursing professional development specialists. This study compares two methods of basic life support training, instructor-led and computer-based learning with voice-activated manikins, to identify if one method is more effective for performance of CPR skills. The findings suggest that a computer-based learning course with voice-activated manikins is a more effective method of training for improved CPR performance.
NASA Astrophysics Data System (ADS)
Vasilevsky, A. M.; Konoplev, G. A.; Stepanova, O. S.; Toropov, D. K.; Zagorsky, A. L.
2016-04-01
A novel direct spectrophotometric method for the quantitative determination of the Oxiphore® drug substance (a synthetic polyhydroquinone complex) in food supplements is developed. Absorption spectra of Oxiphore® water solutions in the ultraviolet region are presented. Sample preparation procedures and mathematical methods for post-analytical processing of the spectra are discussed. The basic characteristics of the automatic CCD-based UV spectrophotometer and the special software implementing the developed method are described. The results of trials of the developed method and software are analyzed: the error of determination of the Oxiphore® concentration in water solutions of the isolated substance and in single-component food supplements did not exceed 15% (the average error was 7-10%).