Sample records for widely employed method

  1. Developing Employability Skills in Information System Graduates: Traditional vs. Innovative Teaching Methods

    ERIC Educational Resources Information Center

    Osmani, Mohamad; Hindi, Nitham M.; Weerakkody, Vishanth

    2018-01-01

    It is widely acknowledged that traditional teaching methods such as lectures, textbooks and case study techniques on their own are not adequate for improving the most in-demand employability skills for graduates. The aim of this article is to explore the potential impact that novel learning and teaching methods can have on improving the…

  2. Employer Preferences for Resumes and Cover Letters

    ERIC Educational Resources Information Center

    Schullery, Nancy M.; Ickes, Linda; Schullery, Stephen E.

    2009-01-01

    This article reports the results of a survey of employers' preferences for resume style, resume delivery method, and cover letters. Employers still widely prefer the standard chronological resume, with only 3% desiring a scannable resume. The vast majority of employers prefer electronic delivery, either by email (46%) or at the company's Web site…

  3. Using CBPR Methods in College Health Research: Exploring Excessive Alcohol Consumption

    ERIC Educational Resources Information Center

    Bulmer, Sandra M.; Barton, Barbara A.; Liefeld, Julie; Montauti, Sara; Santos, Stephanie; Richard, Melissa; Hnath, Laura; Pelletier, Kara; Lalanne, Jude

    2016-01-01

    Community-based participatory research (CBPR) is a collaborative methodology that uniquely involves stakeholders in all stages of the research process. CBPR has been widely utilized in the field of public health, but not widely employed with college populations. This study utilized CBPR methods within a college community to gain insight into…

  4. Efficient Wide Baseline Structure from Motion

    NASA Astrophysics Data System (ADS)

    Michelini, Mario; Mayer, Helmut

    2016-06-01

    This paper presents a Structure from Motion approach for complex unorganized image sets. To achieve high accuracy and robustness, image triplets are employed and (an approximate) camera calibration is assumed to be known. The focus lies on a complete linking of images even in the case of large image distortions, e.g., caused by wide baselines, as well as weak baselines. A method for embedding image descriptors into Hamming space is proposed for fast image similarity ranking. The latter is employed to limit the number of pairs to be matched by a wide baseline method. An iterative graph-based approach is proposed, formulating image linking as the search for a terminal Steiner minimum tree in a line graph. Finally, additional links are determined and employed to improve the accuracy of the pose estimation. By this means, loops in long image sequences are implicitly closed. The potential of the proposed approach is demonstrated by results for several complex image sets, also in comparison with VisualSFM.
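
    As an illustration of the Hamming-space similarity ranking described above, the following minimal sketch (not the authors' implementation) ranks candidate image pairs by Hamming distance between binary global descriptors; the embedding step, the descriptor length and the data are assumed.

      # Minimal sketch: rank candidate image pairs by Hamming distance between
      # binary descriptors, as a stand-in for the embedding-based similarity
      # ranking used to limit wide-baseline matching. Codes are random here.
      import numpy as np

      def hamming_rank(codes, query_idx, top_k=5):
          """codes: (n_images, n_bits) array of 0/1 binary descriptors."""
          query = codes[query_idx]
          dists = np.count_nonzero(codes != query, axis=1)  # Hamming distance
          dists[query_idx] = codes.shape[1] + 1             # exclude the query itself
          return np.argsort(dists)[:top_k]

      rng = np.random.default_rng(0)
      codes = rng.integers(0, 2, size=(100, 256))           # hypothetical 256-bit codes
      print(hamming_rank(codes, query_idx=0))               # most similar images to image 0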

  5. Electrolytes for Wide Operating Temperature Lithium-Ion Cells

    NASA Technical Reports Server (NTRS)

    Smart, Marshall C. (Inventor); Bugga, Ratnakumar V. (Inventor)

    2016-01-01

    Provided herein are electrolytes for lithium-ion electrochemical cells, electrochemical cells employing the electrolytes, methods of making the electrochemical cells and methods of using the electrochemical cells over a wide temperature range. Included are electrolyte compositions comprising a lithium salt, a cyclic carbonate, a non-cyclic carbonate, and a linear ester and optionally comprising one or more additives.

  6. Application of Taguchi methods to infrared window design

    NASA Astrophysics Data System (ADS)

    Osmer, Kurt A.; Pruszynski, Charles J.

    1990-10-01

    Dr. Genichi Taguchi, a prominent quality consultant, reduced a branch of statistics known as "Design of Experiments" to a cookbook methodology that can be employed by any competent engineer. This technique has been extensively employed by Japanese manufacturers, and is widely credited with helping them attain their current level of success in low-cost, high-quality product design and fabrication. Although this technique was originally put forth as a tool to streamline the determination of improved production processes, it can also be applied to a wide range of engineering problems. As part of an internal research project, this method of experimental design has been adapted to window trade studies and materials research. Two of these analyses are presented herein, and have been chosen to illustrate the breadth of applications to which the Taguchi method can be applied.
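
    As a rough illustration of the Design of Experiments idea behind the Taguchi method (not the window trade studies reported in the paper), the sketch below runs an L4(2^3) orthogonal array and estimates main effects by averaging a response at each factor level; the factor names and response values are hypothetical.

      # Illustrative sketch of a Taguchi-style analysis: evaluate an L4(2^3)
      # orthogonal array and estimate each factor's main effect as the
      # difference between level-averaged responses.
      import numpy as np

      L4 = np.array([[0, 0, 0],
                     [0, 1, 1],
                     [1, 0, 1],
                     [1, 1, 0]])                 # L4 orthogonal array: 3 factors, 2 levels
      response = np.array([8.2, 7.9, 9.4, 9.1])  # made-up quality metric per run

      for j, name in enumerate(["material", "thickness", "coating"]):
          lo = response[L4[:, j] == 0].mean()
          hi = response[L4[:, j] == 1].mean()
          print(f"{name}: main effect = {hi - lo:+.2f}")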

  7. Cell membrane antigen-antibody complex dissociation by the widely used glycine-HCl method: an unreliable procedure for studying antibody internalization.

    PubMed

    Tsaltas, G; Ford, C H

    1993-02-01

    Methods following the process of binding and internalization of antibodies to cell surface antigens have often employed low pH isoosmolar buffers in order to dissociate surface antigen-antibody complexes. One of the most widely used buffers is a 0.05 M glycine-HCl buffer, pH 2.8. Since the efficacy of action of this buffer was critical to a series of internalization experiments employing monoclonal antibodies (mAbs) to carcinoembryonic antigen (CEA)-expressing cancer cell lines in this laboratory, we tested its performance in a number of different assays. Our results indicate that this buffer only partially dissociates antigen-antibody bonds and therefore can introduce major inaccuracies in internalization experiments.

  8. Socrates and the Madness of Method

    ERIC Educational Resources Information Center

    Schneider, Jack

    2012-01-01

    What do we know about Socrates and the teaching method that, having taken his name, has become widely used from kindergarten through postgraduate seminars? Practitioners employing so-called Socratic methods exhibit vastly different styles, the author says, noting that "we may be mistaking common phrasing for common practice." The differences…

  9. The Myth of "Scientific Method" in Contemporary Educational Research

    ERIC Educational Resources Information Center

    Rowbottom, Darrell Patrick; Aiston, Sarah Jane

    2006-01-01

    Whether educational research should employ the "scientific method" has been a recurring issue in its history. Hence, textbooks on research methods continue to perpetuate the idea that research students ought to choose between competing camps: "positivist" or "interpretivist". In reference to one of the most widely referred to educational research…

  10. A new method for solid surface topographical studies using nematic liquid crystals

    NASA Astrophysics Data System (ADS)

    Baber, N.; Strugalski, Z.

    1984-03-01

    A new simple method has been developed to investigate the topography of a wide range of solid surfaces using nematic liquid crystals. Polarizing microscopy is employed. The usefulness of the method for detecting weak mechanical effects has been demonstrated. An application in criminology is foreseen.

  11. School-Wide Positive Behavioral Interventions and Supports for Students with Emotional and Behavioral Disorders

    ERIC Educational Resources Information Center

    McCurdy, Barry L.; Thomas, Lisa; Truckenmiller, Adrea; Rich, Sara House; Hillis-Clark, Patricia; Lopez, Juan Carlos

    2016-01-01

    This investigation employed a participatory action research method involving school psychology consultants and educators to design and evaluate the impact of school-wide positive behavioral interventions and supports in a self-contained school serving students with emotional and behavioral disorders. The traditional practices of a universal…

  12. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  13. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Reid, Ray D. (Inventor); Hug, William F. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  14. Assessment in Simulations

    ERIC Educational Resources Information Center

    Raymond, Chad; Usherwood, Simon

    2013-01-01

    Simulations are employed widely as teaching tools in political science, yet evidence of their pedagogical effectiveness, in comparison to other methods of instruction, is mixed. The assessment of learning outcomes is often a secondary concern in simulation design, and the qualitative and quantitative methods used to evaluate outcomes are…

  15. Military applications and examples of near-surface seismic surface wave methods (Invited)

    NASA Astrophysics Data System (ADS)

    Sloan, S.; Stevens, R.

    2013-12-01

    Although not always widely known or publicized, the military uses a variety of geophysical methods for a wide range of applications--some that are already common practice in the industry while others are truly novel. Some of those applications include unexploded ordnance detection, general site characterization, anomaly detection, countering improvised explosive devices (IEDs), and security monitoring, to name a few. Techniques used may include, but are not limited to, ground penetrating radar, seismic, electrical, gravity, and electromagnetic methods. Seismic methods employed include surface wave analysis, refraction tomography, and high-resolution reflection methods. Although the military employs geophysical methods, that does not necessarily mean that those methods enable or support combat operations--often times they are being used for humanitarian applications within the military's area of operations to support local populations. The work presented here will focus on the applied use of seismic surface wave methods, including multichannel analysis of surface waves (MASW) and backscattered surface waves, often in conjunction with other methods such as refraction tomography or body-wave diffraction analysis. Multiple field examples will be shown, including explosives testing, tunnel detection, pre-construction site characterization, and cavity detection.

  16. Comparison of employer productivity metrics to lost productivity estimated by commonly used questionnaires

    PubMed Central

    Gardner, Bethany T.; Dale, Ann Marie; Buckner-Petty, Skye; Van Dillen, Linda; Amick, Benjamin C.; Evanoff, Bradley

    2016-01-01

    Objective: To assess construct and discriminant validity of four health-related work productivity loss questionnaires in relation to employer productivity metrics, and to describe variation in economic estimates of productivity loss provided by the questionnaires in healthy workers. Methods: 58 billing office workers completed surveys including health information and four productivity loss questionnaires. Employer productivity metrics and work hours were also obtained. Results: Productivity loss questionnaires were weakly to moderately correlated with employer productivity metrics. Workers with more health complaints reported greater health-related productivity loss than healthier workers, but showed no loss on employer productivity metrics. Economic estimates of productivity loss showed wide variation among questionnaires, yet no loss of actual productivity. Conclusions: Additional studies are needed comparing questionnaires with objective measures in larger samples and other industries, to improve measurement methods for health-related productivity loss. PMID:26849261

  17. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  18. Organism and population-level ecological models for ...

    EPA Pesticide Factsheets

    Ecological risk assessment typically focuses on animal populations as endpoints for regulatory ecotoxicology. Scientists at USEPA are developing models for animal populations exposed to a wide range of chemicals, from pesticides to emerging contaminants. Modeled taxa include aquatic and terrestrial invertebrates, fish, amphibians, and birds; the models employ a wide range of methods, from matrix-based projection models to mechanistic bioenergetics models and spatially explicit population models.

  19. Force Enhancement Packages for Countering Nuclear Threats in the 2022-2027 Time Frame

    DTIC Science & Technology

    2015-09-01

    characterization methods. • Apply proper radioisotope identification techniques. c. A one-week CNT operations exercise at Fort Belvoir, Virginia. Team members...on experiments to seek better methods, holding active teaching until later. The team expects that better methods would involve collection using...conduct more effective wide-area searches than those commonly employed by civil law enforcement agencies. The IDA team suggests that better methods

  20. Measuring sap flow in plants

    USDA-ARS?s Scientific Manuscript database

    Sap flow measurements provide a powerful tool for quantifying plant water use and monitoring qualitative physiological responses of plants to environmental conditions. As such, sap flow methods are widely employed to investigate the agronomic, ecological and hydrological outcomes of plant growth. T...

  1. Critique of Sikkink and Keane's comparison of surface fuel sampling techniques

    Treesearch

    Clinton S. Wright; Roger D. Ottmar; Robert E. Vihnanek

    2010-01-01

    The 2008 paper of Sikkink and Keane compared several methods to estimate surface fuel loading in western Montana: two widely used inventory techniques (planar intersect and fixed-area plot) and three methods that employ photographs as visual guides (photo load, photoload macroplot and photo series). We feel, however, that their study design was inadequate to evaluate...

  2. The Brief History of Environmental Education and Its Changes from 1972 to Present in Iran

    ERIC Educational Resources Information Center

    Shobeiri, Seyed Mohammad; Meiboudi, Hossein; Kamali, Fatemeh Ahmadi

    2014-01-01

    The present study investigates environmental education (EE) before and after Iran's Islamic Revolution. The research method is case study, and among the case study methods, historical analysis has been used in this research. A wide array of sources were employed, from government performance reports to documents, records, books, and articles…

  3. SUMMARY REPORT CONTROL OF NOX EMISSIONS BY REBURNING

    EPA Science Inventory

    This report covers NOx control employing reburning technology: A new, effective method of controlling NOx emissions from a wide range of stationary combustion sources including large, coal-fired, utility boilers. Although reburning potentially is applicable ...

  4. Zu Problemen statistischer Methoden in der Sprachwissenschaft (Problems of Statistical Methods in Linguistics)

    ERIC Educational Resources Information Center

    Zorn, Klaus

    1973-01-01

    Discussion of statistical apparatus employed in L. Doncheva-Mareva's article on the wide-spread usage of the present and future tense forms with future meaning in German letters, Deutsch als Fremdsprache, n1 1971. (RS)

  5. Informal employment in high-income countries for a health inequalities research: A scoping review.

    PubMed

    Julià, Mireia; Tarafa, Gemma; O'Campo, Patricia; Muntaner, Carles; Jódar, Pere; Benach, Joan

    2015-01-01

    Informal employment (IE) is one of the least studied employment conditions in public health research, mainly due to the difficulty of its conceptualization and measurement, which has resulted in the lack of a unified concept and a common method of measurement. The aim of this review is to identify literature on IE in order to improve its definition and methods of measurement, with special attention given to high-income countries, to be able to study the possible impact on health inequalities within and between countries. A scoping review of definitions and methods of measurement of IE was conducted by reviewing relevant databases and grey literature and analyzing selected articles. We found a wide spectrum of terms for describing IE as well as definitions and methods of measurement. We provide a definition of IE to be used in health inequalities research in high-income countries. Direct methods such as surveys can capture more information about workers and firms in order to estimate IE. These results can be used in further investigations about the impacts of IE on health inequalities. Public health research must improve monitoring and analysis of IE in order to know the impacts of this employment condition on health inequalities.

  6. Investigating the Utility of a GPA Institutional Adjustment Index

    ERIC Educational Resources Information Center

    Didier, Thomas; Kreiter, Clarence D.; Buri, Russell; Solow, Catherine

    2006-01-01

    Background: Grading standards vary widely across undergraduate institutions. If, during the medical school admissions process, GPA is considered without reference to the institution attended, it will disadvantage applicants from undergraduate institutions employing rigorous grading standards. Method: A regression-based GPA institutional equating…

  7. A Student-Designed Grammar Quiz on the Web: A Constructive Mode of Grammar Instruction

    ERIC Educational Resources Information Center

    Fukushima, Tatsuya

    2006-01-01

    The World Wide Web has frequently been incorporated into second/foreign language (L2 hereafter) course instruction during the past decade. However, many teaching methods essentially employed it as a reactive mode of L2 instruction. Many other methods have been designed to facilitate constructive L2 production. They are, however, limited in their…

  8. The Development of a Robot-Based Learning Companion: A User-Centered Design Approach

    ERIC Educational Resources Information Center

    Hsieh, Yi-Zeng; Su, Mu-Chun; Chen, Sherry Y.; Chen, Gow-Dong

    2015-01-01

    A computer-vision-based method is widely employed to support the development of a variety of applications. In this vein, this study uses a computer-vision-based method to develop a playful learning system, which is a robot-based learning companion named RobotTell. Unlike existing playful learning systems, a user-centered design (UCD) approach is…

  9. Community-based programmes to promote use of bicycle helmets in children aged 0-14 years: a systematic review.

    PubMed

    Spinks, Anneliese; Turner, Cathy; McClure, Rod; Acton, Caroline; Nixon, Jim

    2005-09-01

    Hospital-based research has shown that wearing a helmet reduces the risk of head injury in bicycle riders. These studies have provided the impetus for community-wide interventions to increase the numbers of cyclists who wear helmets; however, the effectiveness of such programmes is undetermined. This study employs extensive search strategies to review the scientific literature to establish the effectiveness of community-wide programmes to increase helmet use among cyclists. Thirteen community-wide intervention studies using substantive methodologies were located in 16 published papers. The community-wide interventions include mandating helmet wearing, education campaigns, distribution of free or subsidized helmets or, more frequently, combinations of all of these methods of influence. All studies reported success in influencing helmet wearing across communities. However, none of the studies reveals enough detail of the mix of techniques employed in the interventions to allow them to be replicated. While it is encouraging that all of the studies showed positive results, the way forward for further implementation of helmet wearing is for adequate documentation of successful interventions.

  10. Self-recalibration of a robot-assisted structured-light-based measurement system.

    PubMed

    Xu, Jing; Chen, Rui; Liu, Shuntao; Guan, Yong

    2017-11-10

    The structured-light-based measurement method is widely employed in numerous fields. However, for industrial inspection, to achieve complete scanning of a workpiece and overcome occlusion, the measurement system needs to be moved to different viewpoints. Moreover, frequent reconfiguration of the measurement system may be needed based on the size of the measured object, making the self-recalibration of extrinsic parameters indispensable. To this end, this paper proposes an automatic self-recalibration and reconstruction method, wherein a robot arm is employed to move the measurement system for complete scanning; the self-recalibration is achieved using fundamental matrix calculations and point cloud registration, without the need for an accurate calibration gauge. Experimental results demonstrate the feasibility and accuracy of our method.
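
    A minimal sketch of the self-recalibration idea, assuming matched image points and a known intrinsic matrix K are already available: it recovers the relative camera pose between two viewpoints with OpenCV's essential-matrix routines. This is an illustration of the general principle, not the paper's pipeline, which works from fundamental matrix calculations and point cloud registration (not reproduced here).

      # Sketch: estimate the relative pose (R, t) between two viewpoints from
      # matched pixel coordinates, without a calibration gauge. pts1, pts2 and
      # the intrinsic matrix K are assumed to come from earlier processing.
      import cv2
      import numpy as np

      def relative_pose(pts1, pts2, K):
          """pts1, pts2: (N, 2) float arrays of matched pixel coordinates."""
          E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                            method=cv2.RANSAC,
                                            prob=0.999, threshold=1.0)
          _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
          return R, t   # rotation and unit-scale translation between viewpoints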

  11. Growth of Bulk Wide Bandgap Semiconductor Crystals and Their Potential Applications

    NASA Technical Reports Server (NTRS)

    Chen, Kuo-Tong; Shi, Detang; Morgan, S. H.; Collins, W. Eugene; Burger, Arnold

    1997-01-01

    Developments in bulk crystal growth research for electro-optical devices in the Center for Photonic Materials and Devices since its establishment have been reviewed. Purification processes and single crystal growth systems employing physical vapor transport and Bridgman methods were assembled and used to produce high purity and superior quality wide bandgap materials such as heavy metal halides and II-VI compound semiconductors. Comprehensive material characterization techniques have been employed to reveal the optical, electrical and thermodynamic properties of crystals, and the results were used to establish improved material processing procedures. Postgrowth treatments such as passivation, oxidation, chemical etching and metal contacting during the X-ray and gamma-ray device fabrication process have also been investigated and low noise threshold with improved energy resolution has been achieved.

  12. Quantification of Carbon Nanotubes in Different Environmental Matrices by a Microwave Induced Heating Method

    EPA Science Inventory

    Carbon nanotubes (CNTs) have been incorporated into numerous consumer products, and have also been employed in various industrial areas because of their extraordinary properties. The large scale production and wide applications of CNTs make their release into the environment a ma...

  13. Federal Consulting: Strategies and Tools for the Career Development Professional.

    ERIC Educational Resources Information Center

    Kahnweiler, Jennifer B.; Pressman, Sue

    The Federal Government is America's largest employer and is expanding consulting opportunities for career development professionals. Increased Federal mandates for outsourcing have opened wide doors for the entrepreneurial-spirited career counselors and created new challenges for traditional methods of offering career services. As consultants who…

  14. Remote Sensing of Soils for Environmental Assessment and Management.

    NASA Technical Reports Server (NTRS)

    DeGloria, Stephen D.; Irons, James R.; West, Larry T.

    2014-01-01

    The next generation of imaging systems integrated with complex analytical methods will revolutionize the way we inventory and manage soil resources across a wide range of scientific disciplines and application domains. This special issue highlights those systems and methods for the direct benefit of environmental professionals and students who employ imaging and geospatial information for improved understanding, management, and monitoring of soil resources.

  15. Integrated Processing in Planning and Understanding.

    DTIC Science & Technology

    1986-12-01

    to language analysis seemed necessary. The second observation was the rather commonsense one that it is easier to understand a foreign language ...syntactic analysis Probably the most widely employed method for natural language analysis is augmented transition network parsing, or ATNs (Thorne, Bratley...accomplished. It is for this reason that the programming language Prolog, which implements that general method, has proven so well-suited to writing ATN

  16. Unreported sauna use in anorexia nervosa: evidence from the world-wide-web.

    PubMed

    Vähäsoini, A; Vazquez, R; Birmingham, C L; Gutierrez, E

    2004-03-01

    Weight loss methods employed in anorexia nervosa (AN) are vomiting, laxatives, diuretics, enemas, suppositories, ipecac, weight loss medications and inadequate insulin in diabetics. Some methods result in weight loss from fluid depletion and not a reduction in body fat. Sauna use causes rapid fluid loss, but has not been reported in the medical literature as a weight loss strategy used in AN. We found reports of sauna use in AN on the world-wide-web are rare. We hypothesize that the warming caused by the use of sauna, may result in physical improvement in AN and thereby reduce its acceptability as a weight loss strategy.

  17. Hiring Practices for Human Resource Professionals: Implications for Counseling and Curriculum Development.

    ERIC Educational Resources Information Center

    Goza, Barbara K.; Lau, Andrea DeBellis

    1992-01-01

    Employers (n=107) of human resource professionals described their hiring practices. Only 13 companies had human resource internship placements for college students. The most widely used methods for recruiting were newspapers, informal channels, and internal recruitment. The highest ratings for initial screening criteria were given to job experience in…

  18. Job Redesign for Older Workers--Case Studies.

    ERIC Educational Resources Information Center

    Rothberg, Herman J.

    Industrial establishments successfully used methods of job redesign to maintain the employment and productivity, as well as the morale, of aging employees. Examples of job redesign were found in a wide variety of manufacturing industries. Case studies were made in plants producing aircraft engines, aluminum framing, building materials, carpets,…

  19. EPR parameters of L-α-alanine radicals in aqueous solution: a first-principles study

    NASA Astrophysics Data System (ADS)

    Janbazi, Mehdi; T. Azar, Yavar; Ziaie, Farhood

    2018-07-01

    The EPR (electron paramagnetic resonance) response for a wide range of possible alanine radicals has been analysed employing quantum chemical methods. A strong correlation between the geometric structure of these radicals and their EPR parameters has been shown in this work. A significant solvent effect on the EPR parameters has been shown employing both explicit and implicit solvent models. In relatively good agreement with experiment, the stable conformations of these radicals in acidic and basic conditions were determined, and a new conformation was suggested based on possible proton transfer in the intermediate pH range. The employed methodology, along with experimental results, may be used for the characterisation of different radiation-induced amino acid radicals.

  20. Health Risk Reduction Programs in Employer-Sponsored Health Plans: Part II—Law and Ethics

    PubMed Central

    Rothstein, Mark A.; Harrell, Heather L.

    2011-01-01

    Objective: We sought to examine the legal and ethical implications of workplace health risk reduction programs (HRRPs) using health risk assessments, individually focused risk reduction, and financial incentives to promote compliance. Methods: We conducted a literature review, analyzed relevant statutes and regulations, and considered the effects of these programs on employee health privacy. Results: A variety of laws regulate HRRPs, and there is little evidence that employer-sponsored HRRPs violate these provisions; infringement on individual health privacy is more difficult to assess. Conclusion: Although current laws permit a wide range of employer health promotion activities, HRRPs also may entail largely unquantifiable costs to employee privacy and related interests. PMID:19625971

  1. [L.A. Blumenfeld and study of photosynthesis by spectroscopy methods at Chair of Biophysics, Faculty of Physics, Moscow State University].

    PubMed

    Kukushkin, A K

    2013-01-01

    Nowadays, spectroscopy methods are widely employed to study photosynthesis. For instance, fluorescence methods are often used to study virtually all steps of the photosynthesis process. Theoretical models of the phenomena under study are important for the interpretation of experimental data. The decisive role of L.A. Blumenfeld, the former head of the Chair of Biophysics, Faculty of Physics, Moscow State University, in the study of the photosynthesis process is shown in this work.

  2. Dynamically balanced fuel nozzle and method of operation

    DOEpatents

    Richards, George A.; Janus, Michael C.; Robey, Edward H.

    2000-01-01

    An apparatus and method of operation designed to reduce undesirably high pressure oscillations in lean premix combustion systems burning hydrocarbon fuels are provided. Natural combustion and nozzle acoustics are employed to generate multiple fuel pockets which, when burned in the combustor, counteract the oscillations caused by variations in heat release in the combustor. A hybrid of active and passive control techniques, the apparatus and method eliminate combustion oscillations over a wide operating range, without the use of moving parts or electronics.

  3. Combined high contrast and wide field of view in the scanning laser ophthalmoscope through dual detection of light paths

    NASA Astrophysics Data System (ADS)

    Carles, Guillem; Muyo, Gonzalo; van Hemert, Jano; Harvey, Andrew R.

    2017-11-01

    We demonstrate a multimode detection system in a scanning laser ophthalmoscope (SLO) that enables simultaneous operation in confocal, indirect, and direct modes to permit an agile trade between image contrast and optical sensitivity across the retinal field of view to optimize the overall imaging performance, enabling increased contrast in very wide-field operation. We demonstrate the method on a wide-field SLO employing a hybrid pinhole at its image plane, to yield a twofold increase in vasculature contrast in the central retina compared to its conventional direct mode while retaining high-quality imaging across a wide field of the retina, of up to 200 deg and 20 μm on-axis resolution.

  4. Comparison of Employer Productivity Metrics to Lost Productivity Estimated by Commonly Used Questionnaires.

    PubMed

    Gardner, Bethany T; Dale, Ann Marie; Buckner-Petty, Skye; Van Dillen, Linda; Amick, Benjamin C; Evanoff, Bradley

    2016-02-01

    The aim of the study was to assess construct and discriminant validity of four health-related work productivity loss questionnaires in relation to employer productivity metrics, and to describe variation in economic estimates of productivity loss provided by the questionnaires in healthy workers. Fifty-eight billing office workers completed surveys including health information and four productivity loss questionnaires. Employer productivity metrics and work hours were also obtained. Productivity loss questionnaires were weakly to moderately correlated with employer productivity metrics. Workers with more health complaints reported greater health-related productivity loss than healthier workers, but showed no loss on employer productivity metrics. Economic estimates of productivity loss showed wide variation among questionnaires, yet no loss of actual productivity. Additional studies are needed comparing questionnaires with objective measures in larger samples and other industries, to improve measurement methods for health-related productivity loss.

  5. Creating Sensitive Environments for Parent Involvement Meetings

    ERIC Educational Resources Information Center

    Warner, Laverne; Barrera, John

    2005-01-01

    The most important step to parent involvement is helping parents to value education. Successful parent involvement often hinges on employing a wide variety of presentation methods to meet parents' needs. Foremost, parents must learn to become effective collaborators with the school. When the focus is on the value of education, a plethora of topics…

  6. Ethnography of Novices' First Use of Web Search Engines: Affective Control in Cognitive Processing.

    ERIC Educational Resources Information Center

    Nahl, Diane

    1998-01-01

    This study of 18 novice Internet users employed a structured self-report method to investigate affective and cognitive operations in the following phases of World Wide Web searching: presearch formulation, search statement formulation, search strategy, and evaluation of results. Users also rated their self-confidence as searchers and satisfaction…

  7. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Bhartia, Rohit (Inventor); Reid, Ray D. (Inventor)

    2017-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  8. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2018-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  9. A Versatile Method for Nanostructuring Metals, Alloys and Metal Based Composites

    NASA Astrophysics Data System (ADS)

    Gurau, G.; Gurau, C.; Bujoreanu, L. G.; Sampath, V.

    2017-06-01

    A new severe plastic deformation method based on High Pressure Torsion is described. The method, patented as High Speed High Pressure Torsion (HSHPT), shows a wide scope and excellent adaptability, assuring a large degree of plastic deformation in metals and alloys, even in hard-to-deform or brittle alloys. The paper presents results obtained on aluminium, magnesium, titanium, iron and copper alloys. In addition, the capability of HSHPT to process metallic composites is described. OM, SEM, TEM, DSC, XRD and HV investigation methods were employed to confirm the fine and ultrafine structure.

  10. Nonlinear Dynamic Behavior of Impact Damage in a Composite Skin-Stiffener Structure

    NASA Technical Reports Server (NTRS)

    Ooijevaar, T. H.; Rogge, M. D.; Loendersloot, R.; Warnet, L.; Akkerman, R.; deBoer, A.

    2013-01-01

    One of the key issues in composite structures for aircraft applications is the early identification of damage. Often, service induced damage does not involve visible plastic deformation, but internal matrix related damage, like delaminations. A wide range of technologies, comprising global vibration and local wave propagation methods can be employed for health monitoring purposes. Traditional low frequency modal analysis based methods are linear methods. The effectiveness of these methods is often limited since they rely on a stationary and linear approximation of the system. The nonlinear interaction between a low frequency wave field and a local impact induced skin-stiffener failure is experimentally demonstrated in this paper. The different mechanisms that are responsible for the nonlinearities (opening, closing and contact) of the distorted harmonic waveforms are separated with the help of phase portraits. A basic analytical model is employed to support the observations.
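
    The phase-portrait separation mentioned above can be illustrated with the minimal sketch below, which plots a signal against its time derivative; the clipped sine is a synthetic stand-in for a response distorted by contact nonlinearity, not the measured skin-stiffener data.

      # Sketch of a phase portrait: plot x(t) against dx/dt. A linear response
      # traces an ellipse; clipping (a crude model of contact nonlinearity)
      # distorts the orbit, which is how waveform distortion shows up.
      import numpy as np
      import matplotlib.pyplot as plt

      t = np.linspace(0, 1, 5000)
      x = np.sin(2 * np.pi * 50 * t)
      x = np.clip(x, -1.0, 0.6)        # synthetic nonlinearity (one-sided clipping)
      dxdt = np.gradient(x, t)

      plt.plot(x, dxdt, lw=0.5)
      plt.xlabel("x(t)")
      plt.ylabel("dx/dt")
      plt.title("Phase portrait: distortion appears as a non-elliptic orbit")
      plt.show()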

  11. Nano-sized Contrast Agents to Non-Invasively Detect Renal Inflammation by Magnetic Resonance Imaging

    PubMed Central

    Thurman, Joshua M.; Serkova, Natalie J.

    2013-01-01

    Several molecular imaging methods have been developed that employ nano-sized contrast agents to detect markers of inflammation within tissues. Renal inflammation contributes to disease progression in a wide range of autoimmune and inflammatory diseases, and a biopsy is currently the only method of definitively diagnosing active renal inflammation. However, the development of new molecular imaging methods that employ contrast agents capable of detecting particular immune cells or protein biomarkers will allow clinicians to evaluate inflammation throughout the kidneys, and to assess a patient's response to immunomodulatory drugs. These imaging tools will improve our ability to validate new therapies and to optimize the treatment of individual patients with existing therapies. This review describes the clinical need for new methods of monitoring renal inflammation, and recent advances in the development of nano-sized contrast agents for detection of inflammatory markers of renal disease. PMID:24206601

  12. A comprehensive review on green nanomaterials using biological systems: Recent perception and their future applications.

    PubMed

    Saratale, Rijuta Ganesh; Karuppusamy, Indira; Saratale, Ganesh Dattatraya; Pugazhendhi, Arivalagan; Kumar, Gopalakrishanan; Park, Yooheon; Ghodake, Gajanan S; Bharagava, Ram Naresh; Banu, J Rajesh; Shin, Han Seung

    2018-05-19

    Over the last few years, nanotechnology has been developing rapidly in the scientific sector and has attracted a great deal of interest because of its abundant applications in almost all areas. Green nanotechnology is a relatively new, multidisciplinary field that has emerged as a rapidly developing research area. It focuses on making procedures that are clean, safe and, in particular, environmentally friendly, in contrast with the chemical and physical methods currently employed for nanosynthesis. The present review recaps the existing knowledge on various biogenic synthesis methods relying on bacteria, fungi, yeast, algae, viruses and biomolecules. Green nanosynthesis refers to the employment of reducing and stabilizing agents from plants and other natural resources to fabricate nanomaterials. The green synthesis method does not involve the use of highly toxic chemicals or elevated energy inputs during synthesis. Nanoparticles (NPs) with distinct shapes, sizes and bioactivity can be produced by varying the bio-reducing agents employed for nanosynthesis. Hence, this review article summarizes the present information regarding the biological methods employed to establish greener, safer, and environmentally sustainable nanosynthesis routes, and mainly highlights the wide-scale fabrication of NPs via green synthesis for biomedical and agricultural applications. Copyright © 2018. Published by Elsevier B.V.

  13. Publishing biomedical journals on the World-Wide Web using an open architecture model.

    PubMed Central

    Shareck, E. P.; Greenes, R. A.

    1996-01-01

    BACKGROUND: In many respects, biomedical publications are ideally suited for distribution via the World-Wide Web, but economic concerns have prevented the rapid adoption of an on-line publishing model. PURPOSE: We report on our experiences with assisting biomedical journals in developing an online presence, issues that were encountered, and methods used to address these issues. Our approach is based on an open architecture that fosters adaptation and interconnection of biomedical resources. METHODS: We have worked with the New England Journal of Medicine (NEJM), as well as five other publishers. A set of tools and protocols was employed to develop a scalable and customizable solution for publishing journals on-line. RESULTS: In March, 1996, the New England Journal of Medicine published its first World-Wide Web issue. Explorations with other publishers have helped to generalize the model. CONCLUSIONS: Economic and technical issues play a major role in developing World-Wide Web publishing solutions. PMID:8947685

  14. Wide-field fluorescence diffuse optical tomography with epi-illumination of sinusoidal pattern

    NASA Astrophysics Data System (ADS)

    Li, Tongxin; Gao, Feng; Chen, Weiting; Qi, Caixia; Yan, Panpan; Zhao, Huijuan

    2017-02-01

    We present a wide-field fluorescence tomography scheme with epi-illumination of a sinusoidal pattern. In this scheme, a DMD projector is employed as a spatial light modulator to independently generate wide-field sinusoidal illumination patterns at varying spatial frequencies on a sample, and the photons emitted at the sample surface are captured with an EM-CCD camera. This method results in a significantly reduced number of optical field measurements compared to point-source-scanning approaches and thereby achieves the fast data acquisition desired for dynamic imaging applications. Fluorescence yield images are reconstructed using a normalized-Born formulated inversion of the diffusion model. Experimental reconstructions are presented for a phantom embedding fluorescent targets and compared for combinations of multiple frequencies. The results validate the ability of the method to determine the relative depth and quantification of the targets with increasing accuracy.

  15. Atomic force microscopy of model lipid membranes.

    PubMed

    Morandat, Sandrine; Azouzi, Slim; Beauvais, Estelle; Mastouri, Amira; El Kirat, Karim

    2013-02-01

    Supported lipid bilayers (SLBs) are biomimetic model systems that are now widely used to address the biophysical and biochemical properties of biological membranes. Two main methods are usually employed to form SLBs: the transfer of two successive monolayers by Langmuir-Blodgett or Langmuir-Schaefer techniques, and the fusion of preformed lipid vesicles. The transfer of lipid films on flat solid substrates offers the possibility to apply a wide range of surface analytical techniques that are very sensitive. Among them, atomic force microscopy (AFM) has opened new opportunities for determining the nanoscale organization of SLBs under physiological conditions. In this review, we first focus on the different protocols generally employed to prepare SLBs. Then, we describe AFM studies on the nanoscale lateral organization and mechanical properties of SLBs. Lastly, we survey recent developments in the AFM monitoring of bilayer alteration, remodeling, or digestion, by incubation with exogenous agents such as drugs, proteins, peptides, and nanoparticles.

  16. Analysis of Genome-Wide Association Studies with Multiple Outcomes Using Penalization

    PubMed Central

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2012-01-01

    Genome-wide association studies have been extensively conducted, searching for markers for biologically meaningful outcomes and phenotypes. Penalization methods have been adopted in the analysis of the joint effects of a large number of SNPs (single nucleotide polymorphisms) and marker identification. This study is partly motivated by the analysis of heterogeneous stock mice dataset, in which multiple correlated phenotypes and a large number of SNPs are available. Existing penalization methods designed to analyze a single response variable cannot accommodate the correlation among multiple response variables. With multiple response variables sharing the same set of markers, joint modeling is first employed to accommodate the correlation. The group Lasso approach is adopted to select markers associated with all the outcome variables. An efficient computational algorithm is developed. Simulation study and analysis of the heterogeneous stock mice dataset show that the proposed method can outperform existing penalization methods. PMID:23272092
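
    The joint-selection idea can be illustrated with scikit-learn's MultiTaskLasso, whose L2/L1 mixed-norm penalty keeps or drops each SNP for all outcomes together; this is a sketch of the general approach on simulated data, not the authors' algorithm or the heterogeneous stock mice dataset.

      # Sketch: joint marker selection across correlated phenotypes with a
      # group-type penalty (MultiTaskLasso). Data are simulated; a handful of
      # SNPs influence all traits and should be picked out jointly.
      import numpy as np
      from sklearn.linear_model import MultiTaskLasso

      rng = np.random.default_rng(1)
      n_samples, n_snps, n_traits = 200, 500, 3
      X = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)   # genotypes 0/1/2
      B = np.zeros((n_snps, n_traits))
      B[:5] = 2.0 * rng.normal(size=(5, n_traits))                      # 5 truly associated SNPs
      Y = X @ B + rng.normal(scale=1.0, size=(n_samples, n_traits))

      model = MultiTaskLasso(alpha=1.0).fit(X, Y)
      selected = np.flatnonzero(np.linalg.norm(model.coef_.T, axis=1) > 1e-8)
      print("SNPs selected jointly for all traits:", selected)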

  17. Efficient Terahertz Wide-Angle NUFFT-Based Inverse Synthetic Aperture Imaging Considering Spherical Wavefront.

    PubMed

    Gao, Jingkun; Deng, Bin; Qin, Yuliang; Wang, Hongqiang; Li, Xiang

    2016-12-14

    An efficient wide-angle inverse synthetic aperture imaging method considering the spherical wavefront effects and suitable for the terahertz band is presented. Firstly, the echo signal model under spherical wave assumption is established, and the detailed wavefront curvature compensation method accelerated by 1D fast Fourier transform (FFT) is discussed. Then, to speed up the reconstruction procedure, the fast Gaussian gridding (FGG)-based nonuniform FFT (NUFFT) is employed to focus the image. Finally, proof-of-principle experiments are carried out and the results are compared with the ones obtained by the convolution back-projection (CBP) algorithm. The results demonstrate the effectiveness and the efficiency of the presented method. This imaging method can be directly used in the field of nondestructive detection and can also be used to provide a solution for the calculation of the far-field RCSs (Radar Cross Section) of targets in the terahertz regime.

  18. Q Methodology for Post-Social-Turn Research in SLA

    ERIC Educational Resources Information Center

    Irie, Kay

    2014-01-01

    Q methodology, an approach to inquiry into subjective views about a complex phenomenon or issue that has been increasingly employed in a wide range of social science fields, has not yet been applied in language learning and teaching research. It is a unique approach that has characteristics of both qualitative and quantitative research methods. The…

  19. Silviculture and multi-resource management case studies for southwestern pinyon-juniper woodlands

    Treesearch

    Gerald J. Gottfried

    2008-01-01

    Southwestern pinyon-juniper and juniper woodlands cover large areas of the Western United States. The woodlands are heterogeneous, consisting of numerous combinations of tree, shrub, and herbaceous species and stand densities that are representative of the wide range of sites and habitat types they occupy. Silvicultural methods can be employed on better sites to meet...

  20. 76 FR 5830 - FBI Records Management Division; National Name Check Program Section; New User Fees Schedule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    ... study employed the same Activity Based Cost (ABC) accounting method detailed in the Final Rule establishing the process for setting fees (75 FR 24796 (May 6, 2010)). The ABC methodology is consistent with widely accepted accounting principles and complies with the provisions of 31 U.S.C. 9701 and other...

  1. Design and Hospital-Wide Implementation of a Standardized Discharge Summary in an Electronic Health Record

    PubMed Central

    Dean, Shannon M; Gilmore-Bykovskyi, Andrea; Buchanan, Joel; Ehlenfeldt, Brad; Kind, Amy JH

    2016-01-01

    Background: The hospital discharge summary is the primary method used to communicate a patient's plan of care to the next provider(s). Despite the existence of regulations and guidelines outlining the optimal content for the discharge summary and its importance in facilitating an effective transition to post-hospital care, incomplete discharge summaries remain a common problem that may contribute to poor post-hospital outcomes. Electronic health records (EHRs) are regularly used as a platform upon which standardization of content and format can be implemented. Objective: We describe here the design and hospital-wide implementation of a standardized discharge summary using an EHR. Methods: We employed the evidence-based Replicating Effective Programs implementation strategy to guide the development and implementation during this large-scale project. Results: Within 18 months, 90% of all hospital discharge summaries were written using the standardized format. Hospital providers found the template helpful and easy to use, and recipient providers perceived an improvement in the quality of discharge summaries compared to those sent from our hospital previously. Conclusions: Discharge summaries can be standardized and implemented hospital-wide with both author and recipient provider satisfaction, especially if evidence-based implementation strategies are employed. The use of EHR tools to guide clinicians in writing comprehensive discharge summaries holds promise in improving the existing deficits in communication at transitions of care. PMID:28334559

  2. A Complex Network Approach to Stylometry

    PubMed Central

    Amancio, Diego Raphael

    2015-01-01

    Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems proved useful to create several language models. Despite the large amount of studies devoted to represent texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex networks methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents. PMID:26313921
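
    A minimal sketch of the networked-text representation discussed above: build a word-adjacency graph over consecutive words and extract a few topological features that could complement traditional textual statistics; the preprocessing and feature choice here are illustrative, not the paper's.

      # Sketch: word-adjacency network for stylometry. Consecutive words are
      # linked; simple topological descriptors summarize the text's structure.
      import networkx as nx

      def adjacency_network(text):
          words = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
          words = [w for w in words if w]
          g = nx.Graph()
          g.add_edges_from(zip(words, words[1:]))   # link consecutive words
          return g

      def topological_features(g):
          return {
              "nodes": g.number_of_nodes(),
              "edges": g.number_of_edges(),
              "avg_clustering": nx.average_clustering(g),
              "avg_degree": sum(d for _, d in g.degree()) / g.number_of_nodes(),
          }

      sample = "the quick brown fox jumps over the lazy dog and the fox runs"
      print(topological_features(adjacency_network(sample)))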

  3. Confocal laser scanning microscopic photoconversion: a new method to stabilize fluorescently labeled cellular elements for electron microscopic analysis.

    PubMed

    Colello, Raymond J; Tozer, Jordan; Henderson, Scott C

    2012-01-01

    Photoconversion, the method by which a fluorescent dye is transformed into a stable, osmiophilic product that can be visualized by electron microscopy, is the most widely used method to enable the ultrastructural analysis of fluorescently labeled cellular structures. Nevertheless, the conventional method of photoconversion using widefield fluorescence microscopy requires long reaction times and results in low-resolution cell targeting. Accordingly, we have developed a photoconversion method that ameliorates these limitations by adapting confocal laser scanning microscopy to the procedure. We have found that this method greatly reduces photoconversion times, as compared to conventional wide field microscopy. Moreover, region-of-interest scanning capabilities of a confocal microscope facilitate the targeting of the photoconversion process to individual cellular or subcellular elements within a fluorescent field. This reduces the area of the cell exposed to light energy, thereby reducing the ultrastructural damage common to this process when widefield microscopes are employed. © 2012 by John Wiley & Sons, Inc.

  4. SuperDCA for genome-wide epistasis analysis.

    PubMed

    Puranen, Santeri; Pesonen, Maiju; Pensar, Johan; Xu, Ying Ying; Lees, John A; Bentley, Stephen D; Croucher, Nicholas J; Corander, Jukka

    2018-05-29

    The potential for genome-wide modelling of epistasis has recently surfaced given the possibility of sequencing densely sampled populations and the emerging families of statistical interaction models. Direct coupling analysis (DCA) has previously been shown to yield valuable predictions for single protein structures, and has recently been extended to genome-wide analysis of bacteria, identifying novel interactions in the co-evolution between resistance, virulence and core genome elements. However, earlier computational DCA methods have not been scalable to enable model fitting simultaneously to 10^4-10^5 polymorphisms, representing the amount of core genomic variation observed in analyses of many bacterial species. Here, we introduce a novel inference method (SuperDCA) that employs a new scoring principle, efficient parallelization, optimization and filtering on phylogenetic information to achieve scalability for up to 10^5 polymorphisms. Using two large population samples of Streptococcus pneumoniae, we demonstrate the ability of SuperDCA to make additional significant biological findings about this major human pathogen. We also show that our method can uncover signals of selection that are not detectable by genome-wide association analysis, even though our analysis does not require phenotypic measurements. SuperDCA, thus, holds considerable potential in building understanding about numerous organisms at a systems biological level.

  5. High order filtering methods for approximating hyperbolic systems of conservation laws

    NASA Technical Reports Server (NTRS)

    Lafon, F.; Osher, S.

    1991-01-01

    The essentially nonoscillatory (ENO) schemes, while potentially useful in the computation of discontinuous solutions of hyperbolic conservation-law systems, are computationally costly relative to simple central-difference methods. A filtering technique is presented which employs central differencing of arbitrarily high-order accuracy except where a local test detects the presence of spurious oscillations and calls upon the full ENO apparatus to remove them. A factor-of-three speedup is thus obtained over the full-ENO method for a wide range of problems, with high-order accuracy in regions of smooth flow.
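
    The filtering strategy described above (a cheap high-order central difference everywhere, with a robust scheme invoked only where a local test flags oscillations) can be illustrated on 1D linear advection. In the sketch below, a first-order upwind fallback is a stand-in for the full ENO apparatus, and all numerical parameters are illustrative choices.

      import numpy as np

      # Minimal sketch of the filtering idea: use a 4th-order central difference
      # everywhere, and fall back to a robust low-order upwind update only where a
      # crude local smoothness test flags possible spurious oscillations.

      def filtered_advection_step(u, a, dx, dt):
          un = u.copy()
          n = len(u)
          for i in range(n):
              im2, im1, ip1, ip2 = (i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n
              # 4th-order central approximation of du/dx
              dudx_c = (-u[ip2] + 8 * u[ip1] - 8 * u[im1] + u[im2]) / (12 * dx)
              # local oscillation test: sign change in consecutive slopes
              s_left = u[i] - u[im1]
              s_right = u[ip1] - u[i]
              if s_left * s_right < 0:   # local extremum -> switch to upwind
                  dudx = (u[i] - u[im1]) / dx if a > 0 else (u[ip1] - u[i]) / dx
              else:
                  dudx = dudx_c
              un[i] = u[i] - a * dt * dudx
          return un

      if __name__ == "__main__":
          x = np.linspace(0, 1, 200, endpoint=False)
          u = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)   # square pulse
          for _ in range(100):
              u = filtered_advection_step(u, a=1.0, dx=x[1] - x[0], dt=0.002)
          print("min/max after transport:", u.min(), u.max())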

  6. Defect Detection in Arc-Welding Processes by Means of the Line-to-Continuum Method and Feature Selection.

    PubMed

    Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M

    2009-01-01

    Plasma optical spectroscopy is widely employed in on-line welding diagnostics. The determination of the plasma electron temperature, which is typically selected as the output monitoring parameter, implies the identification of the atomic emission lines. As a consequence, additional processing stages are required, with a direct impact on the real-time performance of the technique. The line-to-continuum method is a feasible alternative spectroscopic approach and is particularly interesting in terms of its computational efficiency. However, the monitoring signal depends strongly on the chosen emission line. In this paper, a feature selection methodology is proposed to resolve the uncertainty regarding the selection of the optimum spectral band, which allows the line-to-continuum method to be employed for on-line welding diagnostics. Field tests have been conducted to demonstrate the feasibility of the solution.
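
    As a rough illustration of the line-to-continuum signal, the sketch below computes the ratio of an emission line's intensity to the nearby continuum level within chosen spectral bands. The synthetic spectrum, wavelengths and band edges are placeholders; selecting the optimal band is precisely what the feature-selection step in the record addresses.

      import numpy as np

      # Sketch of the line-to-continuum monitoring signal: integrated line
      # intensity divided by the mean continuum level in a neighbouring band.

      def line_to_continuum(wl, intensity, line_band, continuum_band):
          line = intensity[(wl >= line_band[0]) & (wl <= line_band[1])].sum()
          cont = intensity[(wl >= continuum_band[0]) & (wl <= continuum_band[1])].mean()
          return line / cont

      wl = np.linspace(500.0, 520.0, 400)
      # synthetic emission line (Gaussian) sitting on a flat continuum
      spectrum = 0.2 + 1.5 * np.exp(-0.5 * ((wl - 510.0) / 0.1) ** 2)
      print(line_to_continuum(wl, spectrum,
                              line_band=(509.5, 510.5), continuum_band=(515.0, 519.0)))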

  7. Methods for the analysis of azo dyes employed in food industry--A review.

    PubMed

    Yamjala, Karthik; Nainar, Meyyanathan Subramania; Ramisetti, Nageswara Rao

    2016-02-01

    A wide variety of azo dyes are added to color food products, not only to make them visually aesthetic but also to restore the original appearance lost during the production process. However, many countries have banned the use of most azo dyes in food, and their usage in domestic and export food supplies is strictly regulated. Regulatory authorities and food analysts adopt highly sensitive and selective analytical methods for monitoring as well as assuring the quality and safety of food products. This manuscript presents a comprehensive review of the various analytical techniques used in the analysis of azo dyes employed in food industries in different parts of the world. A brief description of the use of different extraction methods, such as liquid-liquid, solid-phase and membrane extraction, is also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Estimating the Octanol/Water Partition Coefficient for Aliphatic Organic Compounds Using Semi-Empirical Electrotopological Index

    PubMed Central

    Souza, Erica Silva; Zaramello, Laize; Kuhnen, Carlos Alberto; Junkes, Berenice da Silva; Yunes, Rosendo Augusto; Heinzen, Vilma Edite Fonseca

    2011-01-01

    A new possibility for estimating the octanol/water partition coefficient (log P) was investigated using only one descriptor, the semi-empirical electrotopological index (ISET). The predictability of four octanol/water partition coefficient (log P) calculation models was compared using a set of 131 aliphatic organic compounds from five different classes. Log P values were calculated employing atomic-contribution methods, as in the Ghose/Crippen approach and its later refinement, AlogP; using fragmental methods through the ClogP method; and employing an approach considering the whole molecule using topological indices with the MlogP method. The efficiency and the applicability of the ISET in terms of calculating log P were demonstrated through good statistical quality (r > 0.99; s < 0.18), high internal stability and good predictive ability for an external group of compounds in the same order as the widely used models based on the fragmental method, ClogP, and the atomic contribution method, AlogP, which are among the most used methods of predicting log P. PMID:22072945

  9. Estimating the octanol/water partition coefficient for aliphatic organic compounds using semi-empirical electrotopological index.

    PubMed

    Souza, Erica Silva; Zaramello, Laize; Kuhnen, Carlos Alberto; Junkes, Berenice da Silva; Yunes, Rosendo Augusto; Heinzen, Vilma Edite Fonseca

    2011-01-01

    A new possibility for estimating the octanol/water partition coefficient (log P) was investigated using only one descriptor, the semi-empirical electrotopological index (I(SET)). The predictability of four octanol/water partition coefficient (log P) calculation models was compared using a set of 131 aliphatic organic compounds from five different classes. Log P values were calculated employing atomic-contribution methods, as in the Ghose/Crippen approach and its later refinement, AlogP; using fragmental methods through the ClogP method; and employing an approach considering the whole molecule using topological indices with the MlogP method. The efficiency and the applicability of the I(SET) in terms of calculating log P were demonstrated through good statistical quality (r > 0.99; s < 0.18), high internal stability and good predictive ability for an external group of compounds in the same order as the widely used models based on the fragmental method, ClogP, and the atomic contribution method, AlogP, which are among the most used methods of predicting log P.
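
    Both records above describe a one-descriptor linear model of the form log P = a * I_SET + b, evaluated via the correlation coefficient r and the standard error of estimate s. The sketch below shows such a fit with entirely made-up I_SET and log P values, not the paper's data set.

      import numpy as np

      # Single-descriptor linear model log P = a * I_SET + b, with r and s
      # reported as in the records. All values below are placeholders.

      i_set = np.array([3.1, 4.2, 5.0, 6.3, 7.1, 8.4, 9.0, 10.2])
      log_p = np.array([0.8, 1.3, 1.7, 2.4, 2.8, 3.5, 3.8, 4.4])

      a, b = np.polyfit(i_set, log_p, 1)                 # least-squares slope/intercept
      pred = a * i_set + b
      r = np.corrcoef(log_p, pred)[0, 1]                 # correlation coefficient
      s = np.sqrt(np.sum((log_p - pred) ** 2) / (len(log_p) - 2))  # std. error of estimate

      print(f"log P = {a:.3f} * I_SET + {b:.3f}   (r = {r:.3f}, s = {s:.3f})")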

  10. Micropunching lithography for generating micro- and submicron-patterns on polymer substrates.

    PubMed

    Chakraborty, Anirban; Liu, Xinchuan; Luo, Cheng

    2012-07-02

    Conducting polymers have attracted great attention since the discovery of high conductivity in doped polyacetylene in 1977(1). They offer the advantages of low weight, easy tailoring of properties and a wide spectrum of applications(2,3). Due to sensitivity of conducting polymers to environmental conditions (e.g., air, oxygen, moisture, high temperature and chemical solutions), lithographic techniques present significant technical challenges when working with these materials(4). For example, current photolithographic methods, such as ultra-violet (UV), are unsuitable for patterning the conducting polymers due to the involvement of wet and/or dry etching processes in these methods. In addition, current micro/nanosystems mainly have a planar form(5,6). One layer of structures is built on the top surfaces of another layer of fabricated features. Multiple layers of these structures are stacked together to form numerous devices on a common substrate. The sidewall surfaces of the microstructures have not been used in constructing devices. On the other hand, sidewall patterns could be used, for example, to build 3-D circuits, modify fluidic channels and direct horizontal growth of nanowires and nanotubes. A macropunching method has been applied in the manufacturing industry to create macropatterns in a sheet metal for over a hundred years. Motivated by this approach, we have developed a micropunching lithography method (MPL) to overcome the obstacles of patterning conducting polymers and generating sidewall patterns. Like the macropunching method, the MPL also includes two operations (Fig. 1): (i) cutting; and (ii) drawing. The "cutting" operation was applied to pattern three conducting polymers(4), polypyrrole (PPy), Poly(3,4-ethylenedioxythiophen)-poly(4-styrenesulphonate) (PEDOT) and polyaniline (PANI). It was also employed to create Al microstructures(7). The fabricated microstructures of conducting polymers have been used as humidity(8), chemical(8), and glucose sensors(9). Combined microstructures of Al and conducting polymers have been employed to fabricate capacitors and various heterojunctions(9,10,11). The "cutting" operation was also applied to generate submicron-patterns, such as 100- and 500-nm-wide PPy lines as well as 100-nm-wide Au wires. The "drawing" operation was employed for two applications: (i) produce Au sidewall patterns on high density polyethylene (HDPE) channels which could be used for building 3D microsystems(12,13,14), and (ii) fabricate polydimethylsiloxane (PDMS) micropillars on HDPE substrates to increase the contact angle of the channel(15).

  11. The much exaggerated death of positivism

    NASA Astrophysics Data System (ADS)

    Kincheloe, Joe L.; Tobin, Kenneth

    2009-09-01

    Approaches to research in the social sciences often embrace schema that are consistent with positivism, even though it is widely held that positivism is discredited and essentially dead. Accordingly, many of the methods used in present day scholarship are supported by the tenets of positivism, and are sources of hegemony. We exhort researchers to employ reflexive methods to identify the epistemologies, ontologies and axiologies that are salient in their scholarship and, when necessary, transform practices such that forms of oppression associated with crypto-positivism are identified and extinguished.

  12. Enantioselective determination by capillary electrophoresis with cyclodextrins as chiral selectors.

    PubMed

    Fanali, S

    2000-04-14

    This review surveys the separation of enantiomers by capillary electrophoresis using cyclodextrins as chiral selectors. Cyclodextrins and their derivatives have been widely employed for the direct chiral resolution of a large number of enantiomers, mainly of pharmaceutical interest; selected examples are reported in the tables. For method optimisation, several parameters influencing the enantioresolution, e.g., cyclodextrin type and concentration, buffer pH and composition, and the presence of organic solvents or complexing additives in the buffer, were considered and discussed. Finally, selected applications to real samples such as pharmaceutical formulations and biological and medical samples are also discussed.

  13. An Optimized Control for LLC Resonant Converter with Wide Load Range

    NASA Astrophysics Data System (ADS)

    Xi, Xia; Qian, Qinsong

    2017-05-01

    This paper presents an optimized control that allows LLC resonant converters to operate over a wider load range with good closed-loop performance. The proposed control employs two paralleled digital compensators to guarantee good closed-loop performance over a wide load range during steady state; at load transients, an optimized trajectory control takes over to change the gate-driving signals immediately. Finally, the proposed control has been implemented and tested on a 150 W, 200 kHz, 400 V/24 V LLC resonant converter, and the results validate the proposed method.

  14. Method, Philosophy of Education and the Sphere of the Practico-Inert

    ERIC Educational Resources Information Center

    Papastephanou, Marianna

    2009-01-01

    This essay discusses a conception of the relation of philosophy to education that has come to be widely held in both general philosophy and philosophy of education. This view is approached here through the employment of Jean-Paul Sartre's notion of the "practico-inert" as the realm of consolidated social objects, part of which is the institution…

  15. Multiple laser pulse ignition method and apparatus

    DOEpatents

    Early, James W.

    1998-01-01

    Two or more laser light pulses with certain differing temporal lengths and peak pulse powers can be employed sequentially to regulate the rate and duration of laser energy delivery to fuel mixtures, thereby improving fuel ignition performance over a wide range of fuel parameters such as fuel/oxidizer ratios, fuel droplet size, number density and velocity within a fuel aerosol, and initial fuel temperatures.

  16. Perceived Sleepiness, Sleep Habits and Sleep Concerns of Public School Teachers, Administrators and Other Personnel

    ERIC Educational Resources Information Center

    Amschler, Denise H.; McKenzie, James F.

    2010-01-01

    Background: Sleep deprivation is a world-wide health concern. Few studies have examined the sleep behaviors of those employed in the education field. Purpose: To describe the sleep habits and concerns of school personnel in a Midwest school corporation. Methods: A cross-sectional survey design was used to collect data about demographics, the…

  17. Comparison of GOES Cloud Classification Algorithms Employing Explicit and Implicit Physics

    NASA Technical Reports Server (NTRS)

    Bankert, Richard L.; Mitrescu, Cristian; Miller, Steven D.; Wade, Robert H.

    2009-01-01

    Cloud-type classification based on multispectral satellite imagery data has been widely researched and demonstrated to be useful for distinguishing a variety of classes using a wide range of methods. The research described here is a comparison of the classifier output from two very different algorithms applied to Geostationary Operational Environmental Satellite (GOES) data over the course of one year. The first algorithm employs spectral channel thresholding and additional physically based tests. The second algorithm was developed through a supervised learning method with characteristic features of expertly labeled image samples used as training data for a 1-nearest-neighbor classification. The latter's ability to identify classes is also based in physics, but those relationships are embedded implicitly within the algorithm. A pixel-to-pixel comparison analysis was done for hourly daytime scenes within a region in the northeastern Pacific Ocean. Considerable agreement was found in this analysis, with many of the mismatches or disagreements providing insight to the strengths and limitations of each classifier. Depending upon user needs, a rule-based or other postprocessing system that combines the output from the two algorithms could provide the most reliable cloud-type classification.
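
    The two classification strategies compared in the record, explicit spectral-channel thresholding versus a 1-nearest-neighbor classifier trained on expertly labeled samples, can be sketched as follows. The features, thresholds and class names are illustrative placeholders rather than the operational GOES rules.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      # Toy comparison of (i) explicit threshold tests and (ii) a 1-NN classifier
      # trained on labeled samples; all numbers are placeholders.

      rng = np.random.default_rng(0)
      # training features: [11-um brightness temperature (K), visible reflectance]
      X_train = np.vstack([rng.normal([290.0, 0.08], [3.0, 0.02], (50, 2)),   # "clear"
                           rng.normal([255.0, 0.55], [3.0, 0.05], (50, 2))])  # "deep cloud"
      y_train = np.array(["clear"] * 50 + ["deep cloud"] * 50)

      def threshold_classifier(bt11, vis):
          # explicit, physically motivated rules (placeholder thresholds)
          if bt11 < 270.0 and vis > 0.4:
              return "deep cloud"
          return "clear"

      knn = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

      pixel = np.array([[258.0, 0.5]])
      print("threshold rule :", threshold_classifier(*pixel[0]))
      print("1-NN classifier:", knn.predict(pixel)[0])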

  18. Fast and Accurate Approximation to Significance Tests in Genome-Wide Association Studies

    PubMed Central

    Zhang, Yu; Liu, Jun S.

    2011-01-01

    Genome-wide association studies commonly involve simultaneous tests of millions of single nucleotide polymorphisms (SNP) for disease association. The SNPs in nearby genomic regions, however, are often highly correlated due to linkage disequilibrium (LD, a genetic term for correlation). Simple Bonferroni correction for multiple comparisons is therefore too conservative. Permutation tests, which are often employed in practice, are both computationally expensive for genome-wide studies and limited in scope. We present an accurate and computationally efficient method, based on Poisson de-clumping heuristics, for approximating genome-wide significance of SNP associations. Compared with permutation tests and other multiple comparison adjustment approaches, our method computes the most accurate and robust p-value adjustments for millions of correlated comparisons within seconds. We demonstrate analytically that the accuracy and the efficiency of our method are nearly independent of the sample size, the number of SNPs, and the scale of p-values to be adjusted. In addition, our method can be easily adopted to estimate false discovery rate. When applied to genome-wide SNP datasets, we observed highly variable p-value adjustment results evaluated from different genomic regions. The variation in adjustments along the genome, however, is well conserved between the European and the African populations. The p-value adjustments are significantly correlated with LD among SNPs, recombination rates, and SNP densities. Given the large variability of sequence features in the genome, we further discuss a novel approach of using SNP-specific (local) thresholds to detect genome-wide significant associations. This article has supplementary material online. PMID:22140288
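
    As a schematic of the general idea only (not the authors' algorithm), a Poisson-approximation adjustment treats the number of threshold exceedances along the genome as approximately Poisson, so a local p-value maps to a genome-wide one via an effective number of independent tests. The value of m_eff below is a placeholder for the LD-aware quantities the method estimates.

      import numpy as np

      # Schematic Poisson-clumping style adjustment:
      #   p_genomewide ~= 1 - exp(-m_eff * p_local)
      # m_eff is a made-up effective number of tests, not an estimated quantity.

      def poisson_adjusted_p(p_local, m_eff):
          return 1.0 - np.exp(-m_eff * np.asarray(p_local))

      p_local = np.array([1e-8, 1e-6, 1e-4])
      print(poisson_adjusted_p(p_local, m_eff=1.0e6))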

  19. Development of a novel controllable, multidirectional, reusable metallic port with a wide working space.

    PubMed

    Hosaka, Seiji; Ohdaira, Takeshi; Umemoto, Satoshi; Hashizume, Makoto; Kawamoto, Shunji

    2013-12-01

    Endoscopic surgery is currently a standard procedure in many countries. Furthermore, conventional four-port laparoscopic cholecystectomy is developing into a single-port procedure. However, in many developing countries, disposable medical products are expensive and adequate medical waste disposal facilities are absent. Advanced medical treatments such as laparoscopic or single-port surgeries are not readily available in many areas of developing countries, and there are often no sterilization methods other than autoclaving. Moreover, existing reusable metallic ports are impractical and are thus not widely used. We developed a novel controllable, multidirectional single-port device with a wide working space that can be autoclaved, and it was employed in five patients. In all patients, laparoscopic cholecystectomy was accomplished without complications. Our device facilitates single-port surgery in areas of the world with limited sterilization methods and offers a novel alternative to conventional tools by creating a smaller incision, decreasing postoperative pain, and improving cosmesis. This novel device can also lower the cost of medical treatment and offers a promising tool for major surgeries requiring a wide working space.

  20. Blood vessel segmentation in modern wide-field retinal images in the presence of additive Gaussian noise.

    PubMed

    Asem, Morteza Modarresi; Oveisi, Iman Sheikh; Janbozorgi, Mona

    2018-07-01

    Retinal blood vessels can indicate serious health conditions, such as cardiovascular disease and stroke. Thanks to modern imaging technology, high-resolution images provide detailed information to help analyze retinal vascular features before symptoms associated with such conditions fully develop. Additionally, these retinal images can be used by ophthalmologists to facilitate diagnosis and the procedures of eye surgery. A fuzzy noise reduction algorithm was employed to enhance color images corrupted by Gaussian noise. The present paper proposes employing contrast limited adaptive histogram equalization to enhance illumination and increase the contrast of retinal images captured from state-of-the-art cameras. Possessing directional properties, the multistructure elements method can lead to high-performance edge detection. Therefore, multistructure elements-based morphology operators are used to detect high-quality image ridges. Following this detection, the irrelevant ridges, which are not part of the vessel tree, are removed by morphological reconstruction operators, while attempting to keep the thin vessels preserved. A combined method of connected components analysis (CCA) in conjunction with a thresholding approach is further used to identify the ridges that correspond to vessels. The application of CCA yields higher efficiency when it is applied locally rather than to the whole image. The significance of our work lies in the way in which several methods are effectively combined and in the originality of the database employed, making this work unique in the literature. Computer simulation results on wide-field retinal images with up to a 200-deg field of view attest to the efficacy of the proposed approach, which achieves an accuracy of 0.9524.
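
    A rough sketch of the pipeline outlined above (CLAHE enhancement, multi-oriented morphological ridge detection, thresholding, and connected-components filtering) is given below using OpenCV. Kernel sizes, rotation angles and the area threshold are illustrative guesses, and the fuzzy noise-reduction stage is omitted.

      import cv2
      import numpy as np

      # Simplified vessel-segmentation sketch: CLAHE, oriented-line black-hat
      # responses, Otsu thresholding, then connected-components area filtering.

      def segment_vessels(bgr_image):
          green = bgr_image[:, :, 1]                      # vessels contrast best in green
          clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
          enhanced = clahe.apply(green)

          # multi-structure (oriented line) elements: black-hat picks up dark ridges
          ridge = np.zeros_like(enhanced)
          for angle in range(0, 180, 15):
              kern = np.zeros((15, 15), np.uint8)
              cv2.line(kern, (0, 7), (14, 7), 1, 1)
              rot = cv2.warpAffine(kern, cv2.getRotationMatrix2D((7, 7), angle, 1.0), (15, 15))
              response = cv2.morphologyEx(enhanced, cv2.MORPH_BLACKHAT, rot)
              ridge = np.maximum(ridge, response)

          _, mask = cv2.threshold(ridge, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

          # connected-components analysis: drop small ridges unlikely to be vessels
          n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
          cleaned = np.zeros_like(mask)
          for lab in range(1, n):
              if stats[lab, cv2.CC_STAT_AREA] >= 50:
                  cleaned[labels == lab] = 255
          return cleaned

      if __name__ == "__main__":
          img = cv2.imread("retina.png")                  # placeholder file name
          if img is not None:
              cv2.imwrite("vessel_mask.png", segment_vessels(img))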

  1. The influence of solvent processing on polyester bioabsorbable polymers.

    PubMed

    Manson, Joanne; Dixon, Dorian

    2012-01-01

    Solvent-based methods are commonly employed for the production of polyester-based samples and coatings in both medical device production and research. The influence of solvent casting and subsequent drying time was studied using thermal analysis, spectroscopy and weight measurement for four grades of 50 : 50 poly(lactic-co-glycolic acid) (PLGA) produced by using chloroform, dichloromethane, and acetone. The results demonstrate that solvent choice and PLGA molecular weight are critical factors in terms of solvent removal rate and maintaining sample integrity, respectively. The protocols widely employed result in high levels of residual solvent and a new protocol is presented together with solutions to commonly encountered problems.

  2. Comparison of methods for measuring cholinesterase inhibition by carbamates

    PubMed Central

    Wilhelm, K.; Vandekar, M.; Reiner, E.

    1973-01-01

    The Acholest and tintometric methods are used widely for measuring blood cholinesterase activity after exposure to organophosphorus compounds. However, if applied for measuring blood cholinesterase activity in persons exposed to carbamates, the accuracy of the methods requires verification since carbamylated cholinesterases are unstable. The spectrophotometric method was used as a reference method and the two field methods were employed under controlled conditions. Human blood cholinesterases were inhibited in vitro by four methylcarbamates that are used as insecticides. When plasma cholinesterase activity was measured by the Acholest and spectrophotometric methods, no difference was found. The enzyme activity in whole blood determined by the tintometric method was ≤ 11% higher than when the same sample was measured by the spectrophotometric method. PMID:4541147

  3. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves describing the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.

  4. System-wide identification of wild-type SUMO-2 conjugation sites

    PubMed Central

    Hendriks, Ivo A.; D'Souza, Rochelle C.; Chang, Jer-Gung; Mann, Matthias; Vertegaal, Alfred C. O.

    2015-01-01

    SUMOylation is a reversible post-translational modification (PTM) regulating all nuclear processes. Identification of SUMOylation sites by mass spectrometry (MS) has been hampered by bulky tryptic fragments, which thus far necessitated the use of mutated SUMO. Here we present a SUMO-specific protease-based methodology which circumvents this problem, dubbed Protease-Reliant Identification of SUMO Modification (PRISM). PRISM allows for detection of SUMOylated proteins as well as identification of specific sites of SUMOylation while using wild-type SUMO. The method is generic and could be widely applied to study lysine PTMs. We employ PRISM in combination with high-resolution MS to identify SUMOylation sites from HeLa cells under standard growth conditions and in response to heat shock. We identified 751 wild-type SUMOylation sites on endogenous proteins, including 200 dynamic SUMO sites in response to heat shock. Thus, we have developed a method capable of quantitatively studying wild-type mammalian SUMO at the site-specific and system-wide level. PMID:26073453

  5. Highlight removal based on the regional-projection fringe projection method

    NASA Astrophysics Data System (ADS)

    Qi, Zhaoshuai; Wang, Zhao; Huang, Junhui; Xing, Chao; Gao, Jianmin

    2018-04-01

    In fringe projection profilometry, highlight usually causes the saturation and blooming in captured fringes and reduces the measurement accuracy. To solve the problem, a regional-projection fringe projection (RP-FP) method is proposed. Regional projection patterns (RP patterns) are projected onto the tested object surface to avoid the saturation and blooming. Then, an image inpainting technique is employed to reconstruct the missing phases in the captured RP patterns and a complete surface of the tested object is obtained. Experiments verified the effectiveness of the proposed method. The method can be widely used in industrial inspections and quality controlling in mechanical and manufacturing industries.

  6. Wood Specific Gravity Variation with Height and Its Implications for Biomass Estimation

    Treesearch

    Michael C. Wiemann; G. Bruce Williamson

    2014-01-01

    Wood specific gravity (SG) is widely employed by ecologists as a key variable in estimates of biomass. When it is important to have nondestructive methods for sampling wood for SG measurements, cores are extracted with an increment borer. While boring is a relatively difficult task even at breast height sampling, it is impossible at ground level and arduous at heights...

  7. Multiple laser pulse ignition method and apparatus

    DOEpatents

    Early, J.W.

    1998-05-26

    Two or more laser light pulses with certain differing temporal lengths and peak pulse powers can be employed sequentially to regulate the rate and duration of laser energy delivery to fuel mixtures, thereby improving fuel ignition performance over a wide range of fuel parameters such as fuel/oxidizer ratios, fuel droplet size, number density and velocity within a fuel aerosol, and initial fuel temperatures. 18 figs.

  8. Auxiliary-field-based trial wave functions in quantum Monte Carlo calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Chia -Chen; Rubenstein, Brenda M.; Morales, Miguel A.

    2016-12-19

    Quantum Monte Carlo (QMC) algorithms have long relied on Jastrow factors to incorporate dynamic correlation into trial wave functions. While Jastrow-type wave functions have been widely employed in real-space algorithms, they have seen limited use in second-quantized QMC methods, particularly in projection methods that involve a stochastic evolution of the wave function in imaginary time. Here we propose a scheme for generating Jastrow-type correlated trial wave functions for auxiliary-field QMC methods. The method is based on decoupling the two-body Jastrow into one-body projectors coupled to auxiliary fields, which then operate on a single determinant to produce a multideterminant trial wave function. We demonstrate that intelligent sampling of the most significant determinants in this expansion can produce compact trial wave functions that reduce errors in the calculated energies. Lastly, our technique may be readily generalized to accommodate a wide range of two-body Jastrow factors and applied to a variety of model and chemical systems.

  9. Wide-field two-photon microscopy with temporal focusing and HiLo background rejection

    NASA Astrophysics Data System (ADS)

    Yew, Elijah Y. S.; Choi, Heejin; Kim, Daekeun; So, Peter T. C.

    2011-03-01

    Scanningless depth-resolved microscopy is achieved through spatial-temporal focusing and has been demonstrated previously. The advantage of this method is that a large area may be imaged without scanning resulting in higher throughput of the imaging system. Because it is a widefield technique, the optical sectioning effect is considerably poorer than with conventional spatial focusing two-photon microscopy. Here we propose wide-field two-photon microscopy based on spatio-temporal focusing and employing background rejection based on the HiLo microscope principle. We demonstrate the effects of applying HiLo microscopy to widefield temporally focused two-photon microscopy.

  10. Wide-aperture aspherical lens for high-resolution terahertz imaging

    NASA Astrophysics Data System (ADS)

    Chernomyrdin, Nikita V.; Frolov, Maxim E.; Lebedev, Sergey P.; Reshetov, Igor V.; Spektor, Igor E.; Tolstoguzov, Viktor L.; Karasik, Valeriy E.; Khorokhorov, Alexei M.; Koshelev, Kirill I.; Schadko, Aleksander O.; Yurchenko, Stanislav O.; Zaytsev, Kirill I.

    2017-01-01

    In this paper, we introduce wide-aperture aspherical lens for high-resolution terahertz (THz) imaging. The lens has been designed and analyzed by numerical methods of geometrical optics and electrodynamics. It has been made of high-density polyethylene by shaping at computer-controlled lathe and characterized using a continuous-wave THz imaging setup based on a backward-wave oscillator and Golay detector. The concept of image contrast has been implemented to estimate image quality. According to the experimental data, the lens allows resolving two points spaced at 0.95λ distance with a contrast of 15%. To highlight high resolution in the THz images, the wide-aperture lens has been employed for studying printed electronic circuit board containing sub-wavelength-scale elements. The observed results justify the high efficiency of the proposed lens design.

  11. Deducing noninductive current profile from surface voltage evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Litwin, C.; Wukitch, S.; Hershkowitz, N.

    Solving the resistive diffusion equation in the presence of a noninductive current source determines the time evolution of the surface voltage. By inverting the problem, the current drive profile can be determined from the surface voltage evolution. We show that under a wide range of conditions the deduced profile is unique. If the conductivity profile is known, this method can be employed to infer the noninductive current profile and, ipso facto, the profile of the total current. We discuss the application of this method to the analysis of the Alfven wave current drive experiments in Phaedrus-T.

  12. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
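
    The two information-theoretic quantities named in the record, the Shannon entropy of a methylation distribution and the Jensen-Shannon distance between a test and a reference sample, can be computed as in the sketch below. The probability vectors are made-up placeholders, not WGBS-derived distributions.

      import numpy as np
      from scipy.spatial.distance import jensenshannon

      # Shannon entropy of a methylation-level distribution, and the
      # Jensen-Shannon distance between a test and a reference sample.

      def shannon_entropy(p):
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          return -np.sum(p * np.log2(p))          # entropy in bits

      # placeholder distributions over discretized methylation levels in one region
      reference = np.array([0.70, 0.15, 0.10, 0.05])
      test = np.array([0.20, 0.20, 0.25, 0.35])

      print("entropy(reference) =", shannon_entropy(reference))
      print("entropy(test)      =", shannon_entropy(test))
      print("JS distance        =", jensenshannon(reference, test, base=2))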

  13. Employment outcome for people with schizophrenia in rural v. urban China: population-based study

    PubMed Central

    Yang, Lawrence H.; Phillips, Michael R.; Li, Xianyun; Yu, Gary; Zhang, Jingxuan; Shi, Qichang; Song, Zhiqiang; Ding, Zhijie; Pang, Shutao; Susser, Ezra

    2013-01-01

    Background Although outcomes among people with schizophrenia differ by social context, this has rarely been examined across rural v. urban settings. For individuals with schizophrenia, employment is widely recognised as a critical ingredient of social integration. Aims To compare employment for people with schizophrenia in rural v. urban settings in China. Method In a large community-based study in four provinces representing 12% of China’s population, we identified 393 people with schizophrenia (112 never treated). We used adjusted Poisson regression models to compare employment for those living in rural (n = 297) v. urban (n = 96) settings. Results Although rural and urban residents had similar impairments due to symptoms, rural residents were three times more likely to be employed (adjusted relative risk 3.27, 95% CI 2.11–5.07, P<0.001). Conclusions People with schizophrenia have greater opportunities to use their capacities for productive work in rural than urban settings in China. Contextual mechanisms that may explain this result offer a useful focus for future research. PMID:23258768

  14. Psychological Stress and Parenting Behavior among Chinese Families: Findings from a Study on Parent Education for Economically Disadvantaged Families

    ERIC Educational Resources Information Center

    Lam, Ching Man

    2011-01-01

    With the recognition of the crucial role of family and with the belief that parents have the greatest influence on a child's life, family and parent education has been widely practiced in Hong Kong and many other countries as a measure for poverty alleviation. A study employing the quantitative method of a cross-sectional parent survey (N = 10,386) was…

  15. Rapid End-Group Modification of Polysaccharides for Biomaterial Applications in Regenerative Medicine.

    PubMed

    Bondalapati, Somasekhar; Ruvinov, Emil; Kryukov, Olga; Cohen, Smadar; Brik, Ashraf

    2014-09-15

    Polysaccharides have emerged as important functional materials because of their unique properties such as biocompatibility, biodegradability, and availability of reactive sites for chemical modifications to optimize their properties. The overwhelming majority of the methods to modify polysaccharides employ random chemical modifications, which often improve certain properties while compromising others. On the other hand, the employed methods for selective modifications often require excess of coupling partners, long reaction times and are limited in their scope and wide applicability. To circumvent these drawbacks, aniline-catalyzed oxime formation is developed for selective modification of a variety of polysaccharides through their reducing end. Notably, it is found that for efficient oxime formation, different conditions are required depending on the composition of the specific polysaccharide. It is also shown how our strategy can be applied to improve the physical and functional properties of alginate hydrogels, which are widely used in tissue engineering and regenerative medicine applications. While the randomly and selectively modified alginate exhibits similar viscoelastic properties, the latter forms significantly more stable hydrogel and superior cell adhesive and functional properties. Our results show that the developed conjugation reaction is robust and should open new opportunities for preparing polysaccharide-based functional materials with unique properties. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Predicting conformational ensembles and genome-wide transcription factor binding sites from DNA sequences.

    PubMed

    Andrabi, Munazah; Hutchins, Andrew Paul; Miranda-Saavedra, Diego; Kono, Hidetoshi; Nussinov, Ruth; Mizuguchi, Kenji; Ahmad, Shandar

    2017-06-22

    DNA shape is emerging as an important determinant of transcription factor binding beyond just the DNA sequence. The only tool for large-scale DNA shape estimates, DNAshape, was derived from Monte Carlo simulations and predicts four broad and static DNA shape features: Propeller twist, Helical twist, Minor groove width and Roll. The contributions of other shape features, e.g. Shift, Slide and Opening, cannot be evaluated using DNAshape. Here, we report a novel method, DynaSeq, which predicts molecular dynamics-derived ensembles of a more exhaustive set of DNA shape features. We compared the DNAshape and DynaSeq predictions for the common features and applied both to predict the genome-wide binding sites of 1312 TFs available from protein interaction quantification (PIQ) data. The results indicate a good agreement between the two methods for the common shape features and point to advantages in using DynaSeq. Predictive models employing ensembles from individual conformational parameters revealed that base-pair opening, known to be important in strand separation, was the best predictor of transcription factor-binding sites (TFBS), followed by the features employed by DNAshape. Of note, TFBS could be predicted not only from the features at the target motif sites, but also from those as far as 200 nucleotides away from the motif.

  17. Rationalization of Hubbard U in CeOx from first principles: Unveiling the role of local structure in screening

    NASA Astrophysics Data System (ADS)

    Lu, Deyu; Liu, Ping

    2014-03-01

    DFT+U method has been widely employed in theoretical studies on various ceria systems to correct the delocalization bias in local and semi-local DFT functionals with moderate computational cost. To rationalize the Hubbard U of Ce 4f, we employed the first principles linear response method to compute Hubbard U for Ce in ceria clusters, bulks, and surfaces. We found that in contrast to the commonly used approach treating U as a constant, the Hubbard U varies in a wide range from 4.1 eV to 6.7 eV, and exhibits a strong correlation with the Ce coordination numbers and Ce-O bond lengths, rather than the Ce 4f valence state. The variation of the Hubbard U can be explained by the changes in the strength of local screening due to O --> Ce intersite transition. Our study represents a systematic, quantitative investigation of the relationship between the Hubbard U and the local atomic arrangement, enabling a DFT+environment-dependent U scheme that can have potential impact on catalysis research of strongly correlated systems. This work is supported by the U.S. Department of Energy, Office of Basic Energy Sciences, under Contract No. DE-AC02-98CH10886.

  18. Nonlinear Analyte Concentration Gradients for One-Step Kinetic Analysis Employing Optical Microring Resonators

    PubMed Central

    Marty, Michael T.; Kuhnline Sloan, Courtney D.; Bailey, Ryan C.; Sligar, Stephen G.

    2012-01-01

    Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics. PMID:22686186

  19. Nonlinear analyte concentration gradients for one-step kinetic analysis employing optical microring resonators.

    PubMed

    Marty, Michael T; Sloan, Courtney D Kuhnline; Bailey, Ryan C; Sligar, Stephen G

    2012-07-03

    Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes, and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics.
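
    The one-step idea in the two records above is that a time-varying analyte concentration C(t) drives the usual 1:1 binding kinetics, dR/dt = ka*C(t)*(Rmax - R) - kd*R, so a single injection probes many concentrations. The sketch below simulates such a sensorgram under an assumed quadratic gradient; the rate constants and gradient shape are illustrative values, not the paper's.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Simulated sensorgram under a nonlinear (quadratic) concentration gradient.
      # ka, kd, Rmax, C0 and the gradient shape are placeholder values.

      ka, kd, Rmax = 1.0e5, 1.0e-3, 100.0          # 1/(M*s), 1/s, response units
      C0, t_end = 50e-9, 600.0                     # peak concentration (M), duration (s)

      def concentration(t):
          return C0 * (t / t_end) ** 2             # nonlinear analyte gradient

      def binding_ode(t, R):
          return ka * concentration(t) * (Rmax - R[0]) - kd * R[0]

      sol = solve_ivp(binding_ode, (0.0, t_end), [0.0], dense_output=True, max_step=1.0)
      t = np.linspace(0.0, t_end, 7)
      print(np.round(sol.sol(t)[0], 2))            # sampled response values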

  20. Rapid analysis of pharmaceutical drugs using LIBS coupled with multivariate analysis.

    PubMed

    Tiwari, P K; Awasthi, S; Kumar, R; Anand, R K; Rai, P K; Rai, A K

    2018-02-01

    Type 2 diabetes drug tablets containing voglibose, with dose strengths of 0.2 and 0.3 mg and of various brands, have been examined using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methods such as principal component analysis (PCA) and partial least squares regression (PLSR) have been employed on the LIBS spectral data for classifying the drug samples and developing calibration models. We have developed a ratio-based calibration model applying PLSR, in which the relative spectral intensity ratios H/C, H/N and O/N are used. Further, the developed model has been employed to predict the relative concentration of elements in unknown drug samples. The experiment has been performed in air and argon atmospheres, and the obtained results have been compared. The present model provides a rapid spectroscopic method for drug analysis with high statistical significance for online control and measurement processes in a wide variety of pharmaceutical industrial applications.
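
    The chemometric workflow named above (PCA for classifying spectra and a PLSR calibration built on the H/C, H/N and O/N intensity ratios) is sketched below with scikit-learn on synthetic ratio values; none of the numbers correspond to measured LIBS data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      # Synthetic intensity ratios for tablets of two dose strengths; the linear
      # trends and noise levels are placeholders, not measured LIBS quantities.
      rng = np.random.default_rng(1)
      doses = np.repeat([0.2, 0.3], 20)                         # mg voglibose
      ratios = np.column_stack([
          1.0 + 2.0 * doses + rng.normal(0, 0.05, doses.size),  # H/C
          0.5 + 1.5 * doses + rng.normal(0, 0.05, doses.size),  # H/N
          0.8 + 1.0 * doses + rng.normal(0, 0.05, doses.size),  # O/N
      ])

      scores = PCA(n_components=2).fit_transform(ratios)        # classification view
      pls = PLSRegression(n_components=2).fit(ratios, doses)    # calibration model

      unknown = np.array([[1.5, 0.88, 1.05]])                   # ratios of an "unknown" tablet
      print("PCA score of first sample:", np.round(scores[0], 3))
      print("predicted dose (mg):", float(pls.predict(unknown)[0, 0]))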

  1. VO2 estimation using 6-axis motion sensor with sports activity classification.

    PubMed

    Nagata, Takashi; Nakamura, Naoteru; Miyatake, Masato; Yuuki, Akira; Yomo, Hiroyuki; Kawabata, Takashi; Hara, Shinsuke

    2016-08-01

    In this paper, we focus on oxygen consumption (VO2) estimation using a 6-axis motion sensor (3-axis accelerometer and 3-axis gyroscope) for people playing sports with diverse intensities. The VO2 estimated with a small motion sensor can be used to calculate energy expenditure; however, its accuracy depends on the intensities of various types of activities. In order to achieve high accuracy over a wide range of intensities, we employ an estimation framework that first classifies activities with a simple machine-learning-based classification algorithm. We prepare different coefficients of a linear regression model for different types of activities, which are determined with training data obtained from experiments. The best-suited model is used for each type of activity when VO2 is estimated. The accuracy of the employed framework depends on the trade-off between the degradation due to classification errors and the improvement brought by applying a separate, optimal model to VO2 estimation. Taking this trade-off into account, we evaluate the accuracy of the employed estimation framework using a set of experimental data consisting of VO2 and motion data of people over a wide range of exercise intensities, measured by a VO2 meter and a motion sensor, respectively. Our numerical results show that the employed framework can improve the estimation accuracy in comparison with a reference method that uses a common regression model for all types of activities.
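
    The two-stage framework described above, first classifying the activity from motion-sensor features and then applying an activity-specific linear regression for VO2, can be sketched as follows. The features, activity labels and coefficients are synthetic placeholders.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.linear_model import LinearRegression

      # Stage 1: classify the activity type. Stage 2: apply the per-activity
      # linear regression model. All data below are synthetic placeholders.

      rng = np.random.default_rng(2)
      n = 200
      activity = rng.integers(0, 2, n)                        # 0 = walking, 1 = running
      accel_mag = np.where(activity == 0, rng.normal(1.2, 0.1, n), rng.normal(2.5, 0.2, n))
      gyro_mag = np.where(activity == 0, rng.normal(0.6, 0.1, n), rng.normal(1.4, 0.2, n))
      X = np.column_stack([accel_mag, gyro_mag])
      vo2 = np.where(activity == 0, 8 + 4 * accel_mag, 15 + 9 * accel_mag) + rng.normal(0, 0.5, n)

      clf = DecisionTreeClassifier(max_depth=2).fit(X, activity)
      models = {a: LinearRegression().fit(X[activity == a], vo2[activity == a]) for a in (0, 1)}

      def estimate_vo2(sample):
          a = int(clf.predict(sample.reshape(1, -1))[0])            # 1) classify activity
          return float(models[a].predict(sample.reshape(1, -1))[0]) # 2) per-activity model

      print("estimated VO2:", round(estimate_vo2(np.array([2.4, 1.3])), 2), "ml/kg/min")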

  2. Eulerian adaptive finite-difference method for high-velocity impact and penetration problems

    NASA Astrophysics Data System (ADS)

    Barton, P. T.; Deiterding, R.; Meiron, D.; Pullin, D.

    2013-05-01

    Owing to the complex processes involved, faithful prediction of high-velocity impact events demands a simulation method delivering efficient calculations based on comprehensively formulated constitutive models. Such an approach is presented herein, employing a weighted essentially non-oscillatory (WENO) method within an adaptive mesh refinement (AMR) framework for the numerical solution of hyperbolic partial differential equations. Applied widely in computational fluid dynamics, these methods are well suited to the involved locally non-smooth finite deformations, circumventing any requirement for artificial viscosity functions for shock capturing. Application of the methods is facilitated through using a model of solid dynamics based upon hyper-elastic theory comprising kinematic evolution equations for the elastic distortion tensor. The model for finite inelastic deformations is phenomenologically equivalent to Maxwell's model of tangential stress relaxation. Closure relations tailored to the expected high-pressure states are proposed and calibrated for the materials of interest. Sharp interface resolution is achieved by employing level-set functions to track boundary motion, along with a ghost material method to capture the necessary internal boundary conditions for material interactions and stress-free surfaces. The approach is demonstrated for the simulation of high velocity impacts of steel projectiles on aluminium target plates in two and three dimensions.

  3. Rapid method for quantification of nine sulfonamides in bovine milk using HPLC/MS/MS and without using SPE.

    PubMed

    Nebot, Carolina; Regal, Patricia; Miranda, Jose Manuel; Fente, Cristina; Cepeda, Alberto

    2013-12-01

    Sulfonamides are antimicrobial agents widely employed in animal production and their residues in food could be an important risk to human health. In the dairy industry, large quantities of milk are monitored daily for the presence of sulfonamides. A simple and low-cost extraction protocol followed by a liquid chromatographic-tandem mass spectrometry method was developed for the simultaneous detection of nine sulfonamides in whole milk. The method was validated at the maximum residue limits established by European legislation. The limits of quantification obtained for most sulfonamides were between 12.5 and 25 μg kg(-1), detection capabilities ranged from 116 to 145 μg kg(-1), and recoveries, at 100 μg kg(-1), were greater than 89±12.5%. The method was employed to analyse 100 raw whole bovine milk samples collected from dairy farms in the northwest region of Spain. All of the samples were found to be compliant, but two were positive; one for sulfadiazine and the other for sulfamethoxipyridazine. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. A new nonlinear conjugate gradient coefficient under strong Wolfe-Powell line search

    NASA Astrophysics Data System (ADS)

    Mohamed, Nur Syarafina; Mamat, Mustafa; Rivaie, Mohd

    2017-08-01

    A nonlinear conjugate gradient (CG) method plays an important role in solving large-scale unconstrained optimization problems. This method is widely used due to its simplicity, and is known to possess the sufficient descent condition and global convergence properties. In this paper, a new nonlinear CG coefficient βk is presented, obtained by employing the strong Wolfe-Powell inexact line search. The performance of the new βk is tested based on the number of iterations and the central processing unit (CPU) time, using MATLAB software with an Intel Core i7-3470 CPU processor. Numerical experimental results show that the new βk converges rapidly compared with other classical CG methods.
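
    A generic nonlinear CG loop with a strong Wolfe-Powell line search is sketched below. Since the record does not give the new βk formula, the classical Fletcher-Reeves coefficient is used as a stand-in, and scipy's line_search (which enforces the strong Wolfe conditions) replaces the paper's MATLAB implementation.

      import numpy as np
      from scipy.optimize import line_search

      def f(x):      # Rosenbrock test function (illustrative objective)
          return 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2

      def grad(x):
          return np.array([-400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
                           200 * (x[1] - x[0] ** 2)])

      x = np.array([-1.2, 1.0])
      g = grad(x)
      d = -g
      for k in range(200):
          alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]  # strong Wolfe
          if alpha is None:
              alpha = 1e-3                      # crude fallback if the search fails
          x_new = x + alpha * d
          g_new = grad(x_new)
          if np.linalg.norm(g_new) < 1e-6:
              x = x_new
              break
          beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves stand-in for beta_k
          d = -g_new + beta * d
          x, g = x_new, g_new

      print("iterations:", k + 1, "solution:", np.round(x, 6))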

  5. Determining the Center Path of Ground Surface LIDAR Data

    DTIC Science & Technology

    2017-01-01

    ... LIDAR ground surveying. The methods employed include a preliminary approximate ordering of the LIDAR coordinates and color data, followed by the ... from a wide scan of a long walking trail, with hundreds to thousands of survey data per square meter. The output of the method is a uniformly ...

  6. Bandwidth correction for LED chromaticity based on Levenberg-Marquardt algorithm

    NASA Astrophysics Data System (ADS)

    Huang, Chan; Jin, Shiqun; Xia, Guo

    2017-10-01

    Light emitting diodes (LEDs) are widely employed in industrial applications and scientific research. With a spectrometer, the chromaticity of an LED can be measured. However, a chromaticity shift will occur due to the broadening effects of the spectrometer. In this paper, an approach to bandwidth correction of LED chromaticity based on the Levenberg-Marquardt algorithm is put forward. We compare the chromaticity of simulated LED spectra after bandwidth correction by the proposed method and by the differential operator method. The experimental results show that the proposed approach achieves excellent performance in bandwidth correction, which proves the effectiveness of the approach. The method has also been tested on measured blue LED spectra.
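
    One way to picture the bandwidth-correction idea is to model the measured spectrum as a true LED spectrum broadened by the spectrometer's bandpass and to recover the true parameters with a Levenberg-Marquardt fit, after which chromaticity would be recomputed from the corrected spectrum. The sketch below does this for a single-Gaussian LED model; the wavelengths, bandwidth and parameters are illustrative, and this is not the paper's specific formulation.

      import numpy as np
      from scipy.optimize import least_squares
      from scipy.ndimage import gaussian_filter1d

      wl = np.arange(400.0, 501.0, 1.0)                # nm
      slit_sigma_nm = 5.0                              # assumed spectrometer bandwidth

      def led_model(params, wl):
          peak, center, width = params
          return peak * np.exp(-0.5 * ((wl - center) / width) ** 2)

      def broadened(params, wl):
          # "true" spectrum broadened by the instrument bandpass
          return gaussian_filter1d(led_model(params, wl), slit_sigma_nm / (wl[1] - wl[0]))

      true_params = np.array([1.0, 450.0, 8.0])
      measured = broadened(true_params, wl) + np.random.default_rng(3).normal(0, 0.002, wl.size)

      # Levenberg-Marquardt fit of the unbroadened-spectrum parameters
      fit = least_squares(lambda p: broadened(p, wl) - measured,
                          x0=[0.8, 455.0, 12.0], method="lm")
      print("recovered (peak, center, width):", np.round(fit.x, 3))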

  7. Automatic Mosaicking of Satellite Imagery Considering the Clouds

    NASA Astrophysics Data System (ADS)

    Kang, Yifei; Pan, Li; Chen, Qi; Zhang, Tong; Zhang, Shasha; Liu, Zhang

    2016-06-01

    With the rapid development of high-resolution remote sensing for earth observation, satellite imagery is widely used in the fields of resource investigation, environmental protection, and agricultural research. Image mosaicking is an important part of satellite imagery production. However, the presence of clouds creates problems for automatic image mosaicking in two main respects: 1) image blurring may be caused during the process of image dodging, and 2) cloudy areas may be crossed by automatically generated seamlines. To address these problems, an automatic mosaicking method is proposed for cloudy satellite imagery in this paper. Firstly, modified Otsu thresholding and morphological processing are employed to extract cloudy areas and obtain the percentage of cloud cover. Then, the cloud detection results are used to optimize the dodging and mosaicking processes. Thus, the mosaic image can be composed of clear-sky areas instead of cloudy areas, and the clear-sky areas remain sharp and free of distortion. Chinese GF-1 wide-field-of-view orthoimages are employed as experimental data. The performance of the proposed approach is evaluated in four aspects: the effect of cloud detection, the sharpness of clear-sky areas, the rationality of seamlines, and efficiency. The evaluation results demonstrate that the mosaic image obtained by our method has fewer clouds, better internal color consistency and better visual clarity than that obtained by the traditional method. The time consumed by the proposed method for 17 scenes of GF-1 orthoimages is within 4 hours on a desktop computer. The efficiency can meet the general production requirements for massive satellite imagery.
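
    The cloud-masking step described above (Otsu-style thresholding followed by morphological processing and a cloud-cover percentage) is sketched below with OpenCV. The paper uses a modified Otsu variant, so plain Otsu and the chosen kernel size here are stand-ins.

      import cv2
      import numpy as np

      # Cloud mask from a single brightness band: Otsu threshold, morphological
      # opening and closing, then the percentage of cloudy pixels.

      def cloud_mask_and_cover(gray_band):
          _, mask = cv2.threshold(gray_band, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
          mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # drop bright specks
          mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)   # fill small holes
          cover = 100.0 * np.count_nonzero(mask) / mask.size
          return mask, cover

      if __name__ == "__main__":
          band = cv2.imread("gf1_scene_band.png", cv2.IMREAD_GRAYSCALE)  # placeholder name
          if band is not None:
              mask, cover = cloud_mask_and_cover(band)
              print(f"cloud cover: {cover:.1f}%")
              cv2.imwrite("cloud_mask.png", mask)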

  8. Measurement of the distribution of non-structural carbohydrate composition in onion populations by a high-throughput microplate enzymatic assay.

    PubMed

    Revanna, Roopashree; Turnbull, Matthew H; Shaw, Martin L; Wright, Kathryn M; Butler, Ruth C; Jameson, Paula E; McCallum, John A

    2013-08-15

    Non-structural carbohydrate (NSC; glucose, fructose, sucrose and fructan) composition of onions (Allium cepa L.) varies widely and is a key determinant of market usage. To analyse the physiology and genetics of onion carbohydrate metabolism and to enable selective breeding, an inexpensive, reliable and practicable sugar assay is required to phenotype large numbers of samples. A rapid, reliable and cost-effective microplate-based assay was developed for NSC analysis in onions and used to characterise variation in tissue hexose, sucrose and fructan content in open-pollinated breeding populations and in mapping populations developed from a wide onion cross. Sucrose measured in microplates employing maltase as a hydrolytic enzyme was in agreement with HPLC-PAD results. The method revealed significant variation in bulb fructan content within open-pollinated 'Pukekohe Longkeeper' breeding populations over a threefold range. Very wide segregation from 80 to 600 g kg(-1) in fructan content was observed in bulbs of F2 genetic mapping populations from the wide onion cross 'Nasik Red × CUDH2150'. The microplate enzymatic assay is a reliable and practicable method for onion sugar analysis for genetics, breeding and food technology. Open-pollinated onion populations may harbour extensive within-population variability in carbohydrate content, which may be quantified and exploited using this method. The phenotypic data obtained from genetic mapping populations show that the method is well suited to detailed genetic and physiological analysis. © 2013 Society of Chemical Industry.

  9. The need for and cost of mandating private insurance coverage of contraception.

    PubMed

    Gold, R B

    1998-08-01

    A public policy debate in the US is considering whether it is in the public interest to mandate that private, employment-related health insurance plans cover contraception. Industry representatives oppose mandates as unnecessary and costly, but women's health advocates point out that mandates were necessary to remove other health insurance disadvantages to women. For example, the Pregnancy Discrimination Act of 1978 was necessary to mandate coverage for maternity care. US women rely on contraception to avoid pregnancy for approximately 20 years during their reproductive lives, but health insurance policies vary widely in the amount of contraceptive coverage provided. Some fail to cover contraception but cover sterilization and abortion. Coverage is important because women cite cost as a consideration when choosing a method, and some of the more effective methods are more costly. Estimates show that the cost of covering the full range of approved reversible contraception would be a minimal $21.40/employee/year, of which employers would pay $17.12, a 0.6% increase in costs. The cost of plans that already cover some reversible methods would increase even less. Public opinion overwhelmingly favors mandated contraception coverage, even if employee costs were to increase. Congress is considering legislation to mandate coverage in private, employment-related plans, and the industry has indicated that it will not fight the legislation.

  10. A Review on Microdialysis Calibration Methods: the Theory and Current Related Efforts.

    PubMed

    Kho, Chun Min; Enche Ab Rahim, Siti Kartini; Ahmad, Zainal Arifin; Abdullah, Norazharuddin Shah

    2017-07-01

    Microdialysis is a sampling technique first introduced in the late 1950s. Although this technique was originally designed to study endogenous compounds in the animal brain, it was later modified for use in other organs. Additionally, microdialysis is not only able to collect the unbound concentration of compounds from tissue sites; the technique can also be used to deliver exogenous compounds to a designated area. Due to its versatility, the microdialysis technique is widely employed in a number of areas, including biomedical research. However, for most in vivo studies, the concentration of a substance obtained directly from the microdialysis technique does not accurately describe the concentration of the substance on-site. In order to relate the results collected from microdialysis to the actual in vivo condition, a calibration method is required. To date, various microdialysis calibration methods have been reported, with each method capable of providing valuable insights into the technique itself and its applications. This paper aims to provide a critical review of the various calibration methods used in microdialysis applications, including a detailed description of the microdialysis technique itself to start with. The article reviews in detail the various calibration methods employed, presents examples of related work, including clinical efforts, and discusses the advantages and disadvantages of each method.

  11. Department-Store Education: An Account of the Training Methods Developed at the Boston School of Salesmanship under the Direction of Lucinda Wyman Prince. Bulletin, 1917, No. 9

    ERIC Educational Resources Information Center

    Norton, Helen Rich

    1917-01-01

    Vocational training, as a part of the great movement for industrial betterment is now widely recognized as an advantageous measure for both the worker and the industry, but it is not many years since such applied education was looked upon with disfavor by employers and employees alike. This report will deal specifically with the development of…

  12. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
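
    The abstract does not name the flawed procedure, so as a hedged illustration only, the sketch below contrasts data-driven stepwise selection with a formal, pre-specified comparison of nested linear models via an F-test in statsmodels; the simulated data and variable names are assumptions.

      # Hedged illustration: compare two pre-specified nested linear models with a
      # formal F-test (statsmodels), rather than relying on automated stepwise selection.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      rng = np.random.default_rng(0)
      n = 200
      df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
      df["y"] = 1.0 + 0.5 * df.x1 + rng.normal(scale=1.0, size=n)  # x2 is truly irrelevant

      reduced = smf.ols("y ~ x1", data=df).fit()
      full = smf.ols("y ~ x1 + x2", data=df).fit()

      # Formal comparison of the two pre-specified models.
      print(anova_lm(reduced, full))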

  13. Lessons from Outreach: What works; what doesn't

    NASA Astrophysics Data System (ADS)

    Sadler, Philip M.

    2011-05-01

    Outreach to teachers in the form of professional development can help to inform college instructors as to the effectiveness of methods aimed at increasing subject matter and pedagogical content knowledge. College faculty employ a wide range of activities in summer institute programs, often in all-day, residential programs. Comparing such immersion experiences can tell us quite a bit about learning using a variety of systematic approaches to teaching physics and astronomy under ideal conditions.

  14. Method for detection and imaging over a broad spectral range

    DOEpatents

    Yefremenko, Volodymyr; Gordiyenko, Eduard; Pishko, legal representative, Olga; Novosad, Valentyn; Pishko, deceased; Vitalii

    2007-09-25

    A method of controlling the coordinate sensitivity in a superconducting microbolometer employs localized light, heating or magnetic field effects to form normal or mixed-state regions on a superconducting film and to control their spatial location. Electron beam lithography and wet chemical etching were applied as pattern transfer processes in epitaxial Y-Ba-Cu-O films. Two different sensor designs were tested: (i) a 3 millimeter long, 40 micrometer wide stripe and (ii) a 1.25 millimeter long, 50 micrometer wide meander-like structure. Scanning the laser beam along the stripe leads to physical displacement of the sensitive area and therefore may be used as a basis for imaging over a broad spectral range. Forming the superconducting film as a meandering structure provides the equivalent of a two-dimensional detector array. Advantages of this approach are the simplicity of detector fabrication and of the read-out process, which requires only two electrical terminals.

  15. Event time analysis of longitudinal neuroimage data.

    PubMed

    Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce

    2014-08-15

    This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.
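
    A hedged sketch of the two-step idea (a linear mixed-effects model for the serial measurements, followed by an extended Cox model with the time-dependent covariate) is given below; the synthetic data, column names and the use of the lifelines package are assumptions for illustration and do not reproduce the paper's analysis or its spatial extension.

      # Hedged two-step sketch: (1) LME model for serial imaging measures,
      # (2) extended (time-varying) Cox regression relating them to event timing.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from lifelines import CoxTimeVaryingFitter

      rng = np.random.default_rng(1)
      rows = []
      for sid in range(60):
          base = rng.normal(2.5, 0.2)              # subject-specific baseline thickness
          slope = rng.normal(-0.02, 0.01)          # subject-specific atrophy rate
          event_time = rng.exponential(scale=8.0)  # hypothetical conversion time (years)
          for start in range(0, 6):
              stop = start + 1
              if start >= event_time:
                  break
              thick = base + slope * start + rng.normal(0, 0.05)
              rows.append(dict(id=sid, start=float(start), stop=float(min(stop, event_time)),
                               time=float(start), thickness=thick,
                               event=int(event_time <= stop)))
      long_df = pd.DataFrame(rows)

      # Step 1: LME model capturing temporal variation (random intercept per subject).
      lme = smf.mixedlm("thickness ~ time", data=long_df, groups="id").fit()
      long_df["thickness_smooth"] = lme.fittedvalues

      # Step 2: extended Cox regression with the time-dependent imaging covariate.
      ctv = CoxTimeVaryingFitter()
      ctv.fit(long_df[["id", "start", "stop", "event", "thickness_smooth"]],
              id_col="id", start_col="start", stop_col="stop", event_col="event")
      ctv.print_summary()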

  16. Camouflage target reconnaissance based on hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Hua, Wenshen; Guo, Tong; Liu, Xun

    2015-08-01

    Efficient camouflaged target reconnaissance technology has a great influence on modern warfare. Hyperspectral images provide a large spectral range and high spectral resolution, which are invaluable in discriminating between camouflaged targets and backgrounds. Hyperspectral target detection and classification technologies are utilized to achieve single-class and multi-class camouflaged target reconnaissance, respectively. Constrained energy minimization (CEM), a widely used algorithm in hyperspectral target detection, is employed to achieve single-class camouflaged target reconnaissance. Then, a support vector machine (SVM), a classification method, is proposed to achieve multi-class camouflaged target reconnaissance. Experiments have been conducted to demonstrate the efficiency of the proposed method.
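
    For readers unfamiliar with CEM, a minimal numpy sketch of the filter w = R^-1 d / (d^T R^-1 d) and the per-pixel detection score is shown below; the random cube and target spectrum are placeholders, and this is not the authors' implementation.

      # Hedged sketch of the constrained energy minimization (CEM) detector.
      import numpy as np

      def cem_scores(cube, target):
          """cube: (rows, cols, bands); target: (bands,). Returns per-pixel CEM score."""
          h, w, b = cube.shape
          X = cube.reshape(-1, b).astype(float)
          R = X.T @ X / X.shape[0]              # sample correlation matrix
          Rinv_d = np.linalg.solve(R, target)
          filt = Rinv_d / (target @ Rinv_d)     # CEM filter vector
          return (X @ filt).reshape(h, w)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          cube = rng.random((50, 50, 30))       # placeholder hyperspectral cube
          target = rng.random(30)               # placeholder target spectrum
          scores = cem_scores(cube, target)
          print(scores.shape, float(scores.max()))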

  17. Dehalococcoides as a Potential Biomarker Evidence for Uncharacterized Organohalides in Environmental Samples

    PubMed Central

    Lu, Qihong; Yu, Ling; Liang, Zhiwei; Yan, Qingyun; He, Zhili; Luan, Tiangang; Liang, Dawei; Wang, Shanquan

    2017-01-01

    The massive production and improper disposal of organohalides have resulted in worldwide contamination of soil and water. However, environmental surveys based on chromatographic methods are hindered by the challenge of testing for the extremely wide variety of organohalides. Dehalococcoides, as obligate organohalide-respiring bacteria, exclusively use organohalides as electron acceptors to support their growth; their presence is therefore coupled to organohalides and could be employed as a biomarker of organohalide pollution. In this study, Dehalococcoides was screened in various samples from bioreactors and subsurface environments, showing the wide distribution of Dehalococcoides in sludge and sediment. Further laboratory cultivation confirmed the dechlorination activities of those Dehalococcoides. Among those samples, Dehalococcoides accounting for 1.8% of the total microbial community was found in an anaerobic granular sludge sample collected from a full-scale bioreactor treating petroleum wastewater. Experimental evidence suggested that the influent wastewater in the bioreactor contained bromomethane, which supported the growth of Dehalococcoides. This study demonstrated that Dehalococcoides could be employed as a promising biomarker to test for the presence of organohalides in waste streams or other environmental samples. PMID:28919889

  18. Rethinking UK Small Employers' Skills Policies and the Role of Workplace Learning

    ERIC Educational Resources Information Center

    Kitching, John

    2008-01-01

    Small business employers in the UK are widely perceived as adopting a reactive, ad hoc approach to employee skill formation. Employer reliance on workplace learning is often treated, explicitly or implicitly, as evidence of such an approach. Small employers' approaches to skill creation are investigated using data from two employer samples. Three…

  19. Is there an occupational therapy employment crisis within Australia? An investigation into two consecutive cohorts of occupational therapy graduates from a single Victorian University identifying trends in employment.

    PubMed

    Fay, Pearse; Adamson, Lynne

    2017-12-01

    Within the context of growing concerns about a potential oversupply of occupational therapists, this research examines when, where and how long new graduates take to gain employment, and identifies influences upon the health and university systems. A mixed-method research design, using an online survey, was adopted to investigate the topic. Two consecutive cohorts of graduates from a single university program were invited to participate. Seventy-five (58%) responses were received, with 63 (84%) currently employed in an occupational therapy role. Of the 12 (16%) not employed, only 3 (4%) described themselves as actively seeking employment in an occupational therapy role. A wide spread of employment settings and scope of practice areas was reported. Findings suggest that occupational therapy graduates are gaining employment in a range of settings and practice areas relatively quickly. This research adds evidence to the conversation around graduate employment within a region of Australia. Changes in the Australian population, health system and universities are possible factors influencing employment. The research reveals the difficulties in understanding the current situation, given limitations in the data collected, varied terminology and an ever-changing job-seeking environment. The research provides a starting point for the occupational therapy profession to further understand the directions the profession is taking. University programs may also benefit by using the research to tailor course content to assist graduates in gaining employment or to present students with the prospects of new employment opportunities. © 2017 Occupational Therapy Australia.

  20. Comparative study to develop a single method for retrieving wide class of recombinant proteins from classical inclusion bodies.

    PubMed

    Padhiar, Arshad Ahmed; Chanda, Warren; Joseph, Thomson Patrick; Guo, Xuefang; Liu, Min; Sha, Li; Batool, Samana; Gao, Yifan; Zhang, Wei; Huang, Min; Zhong, Mintao

    2018-03-01

    The formation of inclusion bodies (IBs) is considered an Achilles heel of heterologous protein expression in bacterial hosts. A wide array of techniques has been developed to recover biochemically challenging proteins from IBs. However, acquiring the active state, even for proteins of the same family, was found not to depend on any single established method. Here, we present a new strategy for the recovery of wide sub-classes of recombinant protein from harsh IBs. We found that numerous methods, and their combinations, for reducing IB formation and producing soluble proteins were not effective if the inclusion bodies were harsh in nature. On the other hand, different practices with mild solubilization buffers were able to solubilize IBs completely, yet the recovery of active protein required extensive screening of refolding buffers. By integrating previously reported mild solubilization techniques, we proposed an improved method comprising a low sarkosyl concentration (0.05 to 0.1%) coupled with slow freezing (-1 °C/min) and fast thawing (room temperature), resulting in greater solubility and integrity of the solubilized protein. A dilution method with a single buffer was employed to restore activity for every sub-class of recombinant protein. Results showed that the recovered protein's activity was significantly higher compared with the traditional solubilization/refolding approach. Solubilization of IBs by the described method proved milder in nature and restored native-like conformation of proteins within the IBs.

  1. Improved quality-by-design compliant methodology for method development in reversed-phase liquid chromatography.

    PubMed

    Debrus, Benjamin; Guillarme, Davy; Rudaz, Serge

    2013-10-01

    A complete strategy dedicated to quality-by-design (QbD) compliant method development using design of experiments (DOE), multiple linear regression response modelling and Monte Carlo simulations for error propagation was evaluated for liquid chromatography (LC). The proposed approach includes four main steps: (i) the initial screening of column chemistry, mobile phase pH and organic modifier, (ii) the selectivity optimization through changes in gradient time and mobile phase temperature, (iii) the adaptation of column geometry to reach sufficient resolution, and (iv) the robust resolution optimization and identification of the method design space. This procedure was employed to obtain a complex chromatographic separation of 15 widely prescribed basic antipsychotic drugs. To fully automate and expedite the QbD method development procedure, short columns packed with sub-2 μm particles were employed, together with a UHPLC system possessing column and solvent selection valves. Through this example, the possibilities of the proposed QbD method development workflow were exposed and the different steps of the automated strategy were critically discussed. A baseline separation of the mixture of antipsychotic drugs was achieved with an analysis time of less than 15 min, and the robustness of the method was demonstrated simultaneously with the method development phase. Copyright © 2013 Elsevier B.V. All rights reserved.
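
    As a hedged, much-simplified illustration of the modelling-plus-Monte-Carlo idea (not the authors' software), the sketch below fits a linear model of a critical resolution to two factors from a hypothetical DOE and propagates the parameter uncertainty to estimate the probability of meeting an Rs >= 1.5 criterion at candidate operating points.

      # Hedged sketch: multiple linear regression of a response + Monte Carlo
      # error propagation to map where a resolution criterion is likely met.
      import numpy as np

      # Hypothetical DOE results: gradient time (min), column temperature (deg C), critical resolution Rs
      X_raw = np.array([[10, 25], [10, 40], [20, 25], [20, 40], [15, 32.5], [15, 32.5]])
      rs = np.array([1.2, 1.6, 1.9, 2.4, 1.8, 1.7])

      X = np.column_stack([np.ones(len(rs)), X_raw])            # design matrix with intercept
      beta, res, rank, _ = np.linalg.lstsq(X, rs, rcond=None)   # multiple linear regression
      dof = len(rs) - X.shape[1]
      sigma2 = float(res[0]) / dof if res.size else 0.0
      cov = sigma2 * np.linalg.inv(X.T @ X)                     # parameter covariance

      rng = np.random.default_rng(0)
      draws = rng.multivariate_normal(beta, cov, size=5000)     # Monte Carlo error propagation

      def p_success(tg, temp, criterion=1.5):
          x = np.array([1.0, tg, temp])
          return float(np.mean(draws @ x >= criterion))

      for tg, temp in [(12, 30), (18, 35)]:
          print(f"tg={tg} min, T={temp} C -> P(Rs >= 1.5) ~ {p_success(tg, temp):.2f}")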

  2. A Systematic Review of Current Understandings of Employability

    ERIC Educational Resources Information Center

    Williams, Stella; Dodd, Lorna J.; Steele, Catherine; Randall, Raymond

    2016-01-01

    A theoretical framework is essential for the effective evaluation of employability. However, there are a wide range of definitions of employability coexisting in current literature. A review into existing ways in which employability has been conceptualised is needed to inform a better understanding of the nature of contributions made by various…

  3. Hydrothermally derived nanoporous titanium dioxide nanorods/nanoparticles and their influence in dye-sensitized solar cell as a photoanode

    NASA Astrophysics Data System (ADS)

    Rajamanickam, Govindaraj; Narendhiran, Santhosh; Muthu, Senthil Pandian; Mukhopadhyay, Sumita; Perumalsamy, Ramasamy

    2017-12-01

    Titanium dioxide is a promising wide band gap semiconducting material for dye-sensitized solar cells. Poor electron transport properties still remain a challenge with conventional nanoparticles. Here, we synthesized TiO2 nanorods/nanoparticles by a hydrothermal method to improve the charge transport properties. The structural and morphological characteristics of the prepared nanorods/nanoparticles were analysed with X-ray diffraction and electron microscopy, respectively. A high power conversion efficiency of 7.7% was achieved with the device employing the nanorods/nanoparticles under 100 mW/cm2 illumination. Electrochemical impedance analysis showed superior electron transport properties for the device employing the synthesized TiO2 nanorods/nanoparticles compared with the device based on commercial P25 nanoparticles.

  4. A general panel method for the analysis and design of arbitrary configurations in incompressible flows. [boundary value problem

    NASA Technical Reports Server (NTRS)

    Johnson, F. T.

    1980-01-01

    A method for solving the linear integral equations of incompressible potential flow in three dimensions is presented. Both analysis (Neumann) and design (Dirichlet) boundary conditions are treated in a unified approach to the general flow problem. The method is an influence coefficient scheme which employs source and doublet panels as boundary surfaces. Curved panels possessing singularity strengths which vary as polynomials are used, and all influence coefficients are derived in closed form. These and other features combine to produce an efficient scheme which is not only versatile but eminently suited to the practical realities of a user-oriented environment. A wide variety of numerical results demonstrating the method is presented.

  5. [The comparative assessment of the practical value of the currently employed methods for the recognition and species specificity of the blood].

    PubMed

    Grezina, N Iu; Suleĭmenova, G M

    2011-01-01

    The objective of the present study was to evaluate sensitivity and specificity of the HemDirect method on test-plates (Seratec) for detecting human hemoglobin (HHb). These characteristics were compared with those of other widely used methods designed for the detection of blood traces, viz. thin layer chromatography, hemotest, spectrofluorimetry, and identification of blood species specificity (by countercurrent immunoelectrophoresis in agar and on the acetate-cellulose film). It was shown that the HemDirect test is highly specific and far more sensitive than other techniques used for the same purpose in the practical work. It can be recommended as the method of choice for the detection of blood microtraces.

  6. Methods to examine reproductive biology in free-ranging, fully-marine mammals.

    PubMed

    Lanyon, Janet M; Burgess, Elizabeth A

    2014-01-01

    Historical overexploitation of marine mammals, combined with present-day pressures, has resulted in severely depleted populations, with many species listed as threatened or endangered. Understanding breeding patterns of threatened marine mammals is crucial to assessing population viability, potential recovery and conservation actions. However, determining reproductive parameters of wild fully-marine mammals (cetaceans and sirenians) is challenging due to their wide distributions, high mobility, inaccessible habitats, cryptic lifestyles and in many cases, large body size and intractability. Consequently, reproductive biologists employ an innovative suite of methods to collect useful information from these species. This chapter reviews historic, recent and state-of-the-art methods to examine diverse aspects of reproduction in fully-aquatic mammals.

  7. A survey of analytical methods employed for monitoring of Advanced Oxidation/Reduction Processes for decomposition of selected perfluorinated environmental pollutants.

    PubMed

    Trojanowicz, Marek; Bobrowski, Krzysztof; Szostek, Bogdan; Bojanowska-Czajka, Anna; Szreder, Tomasz; Bartoszewicz, Iwona; Kulisa, Krzysztof

    2018-01-15

    The monitoring of Advanced Oxidation/Reduction Processes (AO/RPs) for the evaluation of the yield and mechanisms of decomposition of perfluorinated compounds (PFCs) is often a more difficult task than their determination in environmental, biological or food samples with complex matrices. This is mostly due to the formation of hundreds, or even thousands, of both intermediate and final products. The considered AO/RPs, involving free radical reactions, include photolytic and photocatalytic processes, Fenton reactions, sonolysis, ozonation, application of ionizing radiation and several wet oxidation processes. The main attention is paid to the PFCs most commonly occurring in the environment, namely PFOA and PFOS. The most powerful and widely exploited method for this purpose is without a doubt LC/MS/MS, which allows the identification and trace quantitation of all species, with detectability and resolution power depending on the particular instrumental configuration. GC/MS is often employed for the monitoring of volatile fluorocarbons, confirming the formation of radicals in the processes of C‒C and C‒S bond cleavage. For the direct monitoring of radicals participating in the reactions of PFC decomposition, molecular spectroscopy is employed, especially electron paramagnetic resonance (EPR). UV/Vis spectrophotometry as a detection method is of special importance in the evaluation of the kinetics of radical reactions with the use of pulse radiolysis methods. Ion chromatography is most commonly employed for the determination of the yield of mineralization of PFCs, but potentiometry with ion-selective electrodes and measurements of general parameters such as Total Organic Carbon and Total Organic Fluoride are also used. The presented review is based on about 100 original papers published in both analytical and environmental journals. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  9. Specific and sensitive detection of the conifer pathogen Gremmeniella abietina by nested PCR

    PubMed Central

    Zeng, Qing-Yin; Hansson, Per; Wang, Xiao-Ru

    2005-01-01

    Background Gremmeniella abietina (Lagerb.) Morelet is an ascomycete fungus that causes stem canker and shoot dieback in many conifer species. The fungus is widespread and causes severe damage to forest plantations in Europe, North America and Asia. To facilitate early diagnosis and improve measures to control the spread of the disease, rapid, specific and sensitive detection methods for G. abietina in conifer hosts are needed. Results We designed two pairs of specific primers for G. abietina based on the 18S rDNA sequence variation pattern. These primers were validated against a wide range of fungi and 14 potential conifer hosts. Based on these specific primers, two nested PCR systems were developed. The first system employed universal fungal primers to enrich the fungal DNA targets in the first round, followed by a second round selective amplification of the pathogen. The other system employed G. abietina-specific primers in both PCR steps. Both approaches can detect the presence of G. abietina in composite samples with high sensitivity, as little as 7.5 fg G. abietina DNA in the host genomic background. Conclusion The methods described here are rapid and can be applied directly to a wide range of conifer species, without the need for fungal isolation and cultivation. Therefore, it represents a promising alternative to disease inspection in forest nurseries, plantations and quarantine control facilities. PMID:16280082

  10. Supported employment for persons with serious mental illness: current status and future directions.

    PubMed

    Mueser, K T; McGurk, S R

    2014-06-01

    The individual placement and support (IPS) model of supported employment is the most empirically validated model of vocational rehabilitation for persons with schizophrenia or another serious mental illness. Over 18 randomized controlled trials have been conducted throughout the world demonstrating the effectiveness of supported employment at improving competitive work compared to other vocational programs. IPS supported employment is defined by the following principles: 1) inclusion of all clients who want to work; 2) integration of vocational and clinical services; 3) focus on competitive employment; 4) rapid job search and no required prevocational skills training; 5) job development by the employment specialist; 6) attention to client preferences about desired work and disclosure of mental illness to prospective employers; 7) benefits counseling; and 8) follow-along supports after a job is obtained. Supported employment has been successfully implemented in a wide range of cultural and clinical populations, although challenges to implementation are also encountered. Common challenges are related to problems such as the failure to access technical assistance, system issues, negative beliefs and attitudes of providers, funding restrictions, and poor leadership. These challenges can be overcome by tapping expertise in IPS supported employment, including standardized and tested models of training and consultation. Efforts are underway to increase the efficiency of training methods for supported employment and the overall program, and to improve its effectiveness for those clients who do not benefit. Progress in IPS supported employment offers people with a serious mental illness realistic hope for achieving their work goals, and taking greater control over their lives. Copyright © 2014. Published by Elsevier Masson SAS.

  11. Conserving the linear momentum in stochastic dynamics: Dissipative particle dynamics as a general strategy to achieve local thermostatization in molecular dynamics simulations.

    PubMed

    Passler, Peter P; Hofer, Thomas S

    2017-02-15

    Stochastic dynamics is a widely employed strategy to achieve local thermostatization in molecular dynamics simulation studies; however, it suffers from an inherent violation of momentum conservation. Although this short-coming has little impact on structural and short-time dynamic properties, it can be shown that dynamics in the long-time limit such as diffusion is strongly dependent on the respective thermostat setting. Application of the methodically similar dissipative particle dynamics (DPD) provides a simple, effective strategy to ensure the advantages of local, stochastic thermostatization while at the same time the linear momentum of the system remains conserved. In this work, the key parameters to employ the DPD thermostats in the framework of periodic boundary conditions are investigated, in particular the dependence of the system properties on the size of the DPD-region as well as the treatment of forces near the cutoff. Structural and dynamical data for light and heavy water as well as a Lennard-Jones fluid have been compared to simulations executed via stochastic dynamics as well as via use of the widely employed Nose-Hoover chain and Berendsen thermostats. It is demonstrated that a small size of the DPD region is sufficient to achieve local thermalization, while at the same time artifacts in the self-diffusion characteristic for stochastic dynamics are eliminated. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
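
    A minimal sketch of the DPD pair forces that provide momentum-conserving local thermostatting is shown below; the parameter values, the weight-function choice and the function name are illustrative assumptions rather than the simulation code used in the study.

      # Hedged sketch: dissipative + random DPD pair forces along the interparticle
      # axis, obeying the fluctuation-dissipation relation sigma^2 = 2*gamma*kB*T.
      import numpy as np

      def dpd_pair_force(ri, rj, vi, vj, gamma=4.5, kT=1.0, rc=1.0, dt=0.01, rng=None):
          """Dissipative + random DPD force on particle i from particle j."""
          rng = rng if rng is not None else np.random.default_rng()
          rij = ri - rj
          r = np.linalg.norm(rij)
          if r >= rc or r == 0.0:
              return np.zeros(3)
          e = rij / r
          w_r = 1.0 - r / rc              # common choice: w_R(r) = 1 - r/rc, w_D = w_R^2
          w_d = w_r ** 2
          sigma = np.sqrt(2.0 * gamma * kT)
          theta = rng.normal()            # one zero-mean, unit-variance number per pair
          f_diss = -gamma * w_d * np.dot(e, vi - vj) * e
          f_rand = sigma * w_r * theta * e / np.sqrt(dt)   # 1/sqrt(dt) scaling in the discrete update
          return f_diss + f_rand

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          f_i = dpd_pair_force(np.array([0.0, 0.0, 0.0]), np.array([0.5, 0.0, 0.0]),
                               np.array([0.1, 0.0, 0.0]), np.array([-0.1, 0.0, 0.0]), rng=rng)
          # The force on j is the exact negative (same theta per pair), so linear momentum is conserved.
          print("force on i:", f_i)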

  12. Automation of sample processing for ICP-MS determination of 90Sr radionuclide at ppq level for nuclear technology and environmental purposes.

    PubMed

    Kołacińska, Kamila; Chajduk, Ewelina; Dudek, Jakub; Samczyński, Zbigniew; Łokas, Edyta; Bojanowska-Czajka, Anna; Trojanowicz, Marek

    2017-07-01

    90Sr is a radionuclide widely determined for environmental purposes and nuclear waste control, and it can also be monitored in coolants of nuclear reactor plants. In the developed method, ICP-MS detection was employed together with sample processing in a sequential injection analysis (SIA) setup equipped with a lab-on-valve with mechanized renewal of the sorbent bed for solid-phase extraction. The optimized conditions of determination included preconcentration of 90Sr on a cation-exchange column and removal of different types of interferences using an extraction Sr-resin. The limit of detection of the developed procedure depends essentially on the configuration of the employed ICP-MS spectrometer and on the available volume of the sample to be analyzed. For a 1 L initial sample volume, the method detection limit (MDL) was evaluated as 2.9 ppq (14.5 Bq L-1). The developed method was applied to analyze spiked river water samples, water reference materials, and also simulated and real samples of nuclear reactor coolant. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Temperature Control in a Franz Diffusion Cell Skin Sonoporation Setup

    NASA Astrophysics Data System (ADS)

    Robertson, Jeremy; Becker, Sid

    2017-11-01

    In vitro experimental studies that investigate ultrasound enhanced transdermal drug delivery employ Franz diffusion cells. Because of absorption, the temperature of the coupling fluid often increases drastically during the ultrasound application. The current methodologies for controlling the coupling fluid temperature require either replacement of the coupling fluid during the experiment or the application of a time consuming duty cycle. This paper introduces a novel method for temperature control that allows for a wide variety of coupling fluid temperatures to be maintained. This method employs a peristaltic pump to circulate the coupling fluid through a thermoelectric cooling device. This temperature control method allowed for an investigation into the role of coupling fluid temperature on the inertial cavitation that impacts the skin aperture (inertial cavitation is thought to be the main cause of ultrasound induced skin permeability increase). Both foil pitting and passive cavitation detection experiments indicated that effective inertial cavitation activity decreases with increasing coupling fluid temperature. This finding suggests that greater skin permeability enhancement can be achieved if a lower coupling fluid temperature is maintained during skin insonation.

  14. Theoretical and numerical investigations towards a new geoid model for the Mediterranean Sea - The GEOMED2 project

    NASA Astrophysics Data System (ADS)

    Barzaghi, Riccardo; Vergos, Georgios S.; Albertella, Alberta; Carrion, Daniela; Cazzaniga, Noemi; Tziavos, Ilias N.; Grigoriadis, Vassilios N.; Natsiopoulos, Dimitrios A.; Bruinsma, Sean; Bonvalot, Sylvain; Lequentrec-Lalancette, Marie-Françoise; Bonnefond, Pascal; Knudsen, Per; Andersen, Ole; Simav, Mehmet; Yildiz, Hasan; Basic, Tomislav; Gil, Antonio J.

    2016-04-01

    The unique features of the Mediterranean Sea, with its large gravity variations, complex circulation, and geodynamic peculiarities, have always made this semi-enclosed sea a unique geodetic, geodynamic and ocean laboratory. The main scope of the GEOMED 2 project is the collection of all available gravity, topography/bathymetry and satellite altimetry data in order to improve the representation of the marine geoid and estimate the Mean Dynamic sea surface Topography (MDT) and the circulation with higher accuracy and resolution. Within GEOMED2, the data employed are land and marine gravity data, GOCE/GRACE-based Global Geopotential Models and a combination, after proper validation, of the MISTRAL, HOMONIM and SRTM/bathymetry terrain models. In this work we present the results achieved for an inner test region spanning the Adriatic Sea area, bounded by 36° < φ < 48° and 10° < λ < 22°. Within this test region, the available terrain/bathymetry models have been evaluated in terms of their contribution to geoid modeling, the processing methodologies have been tested in terms of the provided geoid accuracy, and finally some preliminary results on the MDT determination have been compiled. These results will serve as a guide for the Mediterranean-wide marine geoid estimation. The processing methodology was based on the well-known remove-compute-restore method, following both stochastic and spectral approaches. Classic least-squares collocation (LSC) with errors has been employed, along with fast Fourier transform (FFT)-based techniques, the Least-Squares Modification of Stokes' Formula (KTH) method and windowed LSC. All methods have been evaluated against in-situ collocated GPS/Levelling geoid heights, using EGM2008 as a reference, in order to conclude on the one(s) to be used for the basin-wide geoid evaluation.

  15. A fully automated temperature-dependent resistance measurement setup using van der Pauw method

    NASA Astrophysics Data System (ADS)

    Pandey, Shivendra Kumar; Manivannan, Anbarasu

    2018-03-01

    The van der Pauw (VDP) method is widely used to determine the resistance of planar homogeneous samples with four contacts placed on their periphery. We have developed a fully automated thin film resistance measurement setup using the VDP method, capable of precisely measuring a wide range of thin film resistances from a few mΩ up to 10 GΩ under controlled temperatures from room temperature up to 600 °C. The setup utilizes a robust, custom-designed switching network board (SNB) for automatically measuring current-voltage characteristics at four different source-measure configurations based on the VDP method. Moreover, the SNB is connected with low-noise shielded coaxial cables that reduce the effect of leakage current as well as the capacitance in the circuit, thereby enhancing the accuracy of measurement. In order to enable precise and accurate resistance measurement of the sample, a wide range of sourcing currents/voltages is pre-determined, with the capability of auto-tuning over ~12 orders of magnitude of variation in the resistances. Furthermore, the setup has been calibrated with standard samples and also employed to investigate the temperature-dependent resistance (a few Ω to 10 GΩ) of various chalcogenide-based phase change thin films (Ge2Sb2Te5, Ag5In5Sb60Te30, and In3SbTe2). This setup would be highly helpful for measuring the temperature-dependent resistance of a wide range of materials, i.e., metals, semiconductors and insulators, illuminating structural changes with temperature as reflected by changes in resistance, which are useful for numerous applications.
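
    The sheet resistance itself follows from the two measured VDP resistances by solving the implicit van der Pauw relation; a hedged numerical sketch (with made-up resistance values) is given below.

      # Hedged sketch: solve exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1 for the sheet resistance R_s.
      import numpy as np
      from scipy.optimize import brentq

      def sheet_resistance(r_a, r_b):
          f = lambda rs: np.exp(-np.pi * r_a / rs) + np.exp(-np.pi * r_b / rs) - 1.0
          lo = 1e-6 * min(r_a, r_b) + 1e-12                 # f < 0 here
          hi = 10.0 * np.pi * max(r_a, r_b) / np.log(2.0)   # generous upper bracket, f > 0 here
          return brentq(f, lo, hi)

      if __name__ == "__main__":
          r_a, r_b = 11.2, 13.5            # ohms, from the two source-measure configurations (made up)
          rs = sheet_resistance(r_a, r_b)
          print(f"sheet resistance ~ {rs:.2f} ohm/sq")
          # Resistivity would follow as rho = rs * film_thickness.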

  16. Meshless Method for Simulation of Compressible Flow

    NASA Astrophysics Data System (ADS)

    Nabizadeh Shahrebabak, Ebrahim

    In the present age, rapid development in computing technology and high-speed supercomputers has made numerical analysis and computational simulation more practical than ever before for large and complex cases. Numerical simulations have also become an essential means of analyzing engineering problems and cases where experimental analysis is not practical. There are many sophisticated and accurate numerical schemes that perform these simulations. The finite difference method (FDM) has been used to solve differential equation systems for decades. Additional numerical methods based on finite volume and finite element techniques are widely used in solving problems with complex geometry. All of these methods are mesh-based techniques. Mesh generation is an essential preprocessing step to discretize the computation domain for these conventional methods. However, when dealing with complex geometries these conventional mesh-based techniques can become troublesome, difficult to implement, and prone to inaccuracies. In this study, a more robust yet simple numerical approach is used to simulate problems in an easier manner, even for complex cases. The meshless, or meshfree, method is one such development that has become the focus of much research in recent years. The biggest advantage of meshfree methods is that they circumvent mesh generation. Many algorithms have now been developed to help make this method more popular and accessible. These algorithms have been employed over a wide range of problems in computational analysis with various levels of success. Since there is no connectivity between the nodes in this method, the challenge is considerable. The most fundamental issue is the lack of conservation, which can be a source of unpredictable errors in the solution process. This problem is particularly evident in the presence of steep gradient regions and discontinuities, such as shocks that frequently occur in high-speed compressible flow problems. To address this discontinuity problem, this research deals with the implementation of a conservative meshless method and its applications in computational fluid dynamics (CFD). One of the most common types of collocating meshless methods, the RBF-DQ, is used to approximate the spatial derivatives. The issue with meshless methods in highly convective cases is that they cannot distinguish the influence of fluid flow from upstream or downstream, and some methodology is needed to make the scheme stable. Therefore, an upwinding scheme similar to the one used in the finite volume method is added to capture steep gradients or shocks. This scheme creates a flexible algorithm within which a wide range of numerical flux schemes, such as those commonly used in the finite volume method, can be employed. In addition, a blended RBF is used to decrease the dissipation ensuing from the use of a low shape parameter. All of these steps are formulated for the Euler equations, and a series of test problems is used to confirm convergence of the algorithm. The present scheme was first employed on several incompressible benchmarks to validate the framework. The application of this algorithm is illustrated by solving a set of incompressible Navier-Stokes problems. Results for the compressible problem are compared with the exact solution for flow over a ramp and with solutions from finite volume discretization and the discontinuous Galerkin method, both of which require a mesh. The applicability and robustness of the algorithm are demonstrated on complex problems.
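
    As a hedged, one-dimensional illustration of the RBF-DQ idea only (multiquadric basis, scattered support nodes; not the thesis solver), the sketch below computes derivative weights by requiring exactness for the radial basis functions and checks them on a smooth test function.

      # Hedged sketch: RBF-DQ weights w such that f'(xi) ~ sum_j w_j f(x_j),
      # obtained by enforcing exactness for multiquadric RBFs centred at the nodes.
      import numpy as np

      def rbf_dq_first_derivative_weights(nodes, xi, c=0.5):
          phi = lambda x, xj: np.sqrt((x - xj) ** 2 + c ** 2)            # multiquadric
          dphi = lambda x, xj: (x - xj) / np.sqrt((x - xj) ** 2 + c ** 2)
          A = phi(nodes[:, None], nodes[None, :])    # A[k, j] = phi(x_k; centre x_j)
          b = dphi(xi, nodes)                        # exact derivative of each basis function at xi
          return np.linalg.solve(A.T, b)

      if __name__ == "__main__":
          nodes = np.array([-0.3, -0.1, 0.0, 0.15, 0.4])   # scattered local support
          w = rbf_dq_first_derivative_weights(nodes, xi=0.0)
          f = np.sin(nodes)
          print("approx f'(0):", float(w @ f), " exact:", np.cos(0.0))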

  17. Retail Trade. Industry Training Monograph No. 7.

    ERIC Educational Resources Information Center

    Dumbrell, Tom

    Australia's retailing sector is the largest single industry of employment, with more than 1.2 million workers. It is characterized by high levels of part-time and casual employment; a young work force, including many young people still in full-time education; and employment widely distributed geographically. Over the past 10 years, employment has…

  18. Fourier transform methods in local gravity modeling

    NASA Technical Reports Server (NTRS)

    Harrison, J. C.; Dickinson, M.

    1989-01-01

    New algorithms were derived for computing terrain corrections, all components of the attraction of the topography at the topographic surface and the gradients of these attractions. These algorithms utilize fast Fourier transforms, but, in contrast to methods currently in use, all divergences of the integrals are removed during the analysis. Sequential methods employing a smooth intermediate reference surface were developed to avoid the very large transforms necessary when making computations at high resolution over a wide area. A new method for the numerical solution of Molodensky's problem was developed to mitigate the convergence difficulties that occur at short wavelengths with methods based on a Taylor series expansion. A trial field on a level surface is continued analytically to the topographic surface, and compared with that predicted from gravity observations. The difference is used to compute a correction to the trial field and the process iterated. Special techniques are employed to speed convergence and prevent oscillations. Three different spectral methods for fitting a point-mass set to a gravity field given on a regular grid at constant elevation are described. Two of the methods differ in the way that the spectrum of the point-mass set, which extends to infinite wave number, is matched to that of the gravity field, which is band-limited. The third method is essentially a space-domain technique in which Fourier methods are used to solve a set of simultaneous equations.

  19. Assessment of Pansharpening Methods Applied to WorldView-2 Imagery Fusion.

    PubMed

    Li, Hui; Jing, Linhai; Tang, Yunwei

    2017-01-05

    Since WorldView-2 (WV-2) images are widely used in various fields, there is a high demand for high-quality pansharpened WV-2 images for different application purposes. With respect to the novelty of the WV-2 multispectral (MS) and panchromatic (PAN) bands, the performances of eight state-of-the-art pansharpening methods for WV-2 imagery, evaluated on six datasets from three WV-2 scenes, were assessed in this study using both quality indices and information indices, along with visual inspection. The normalized difference vegetation index, normalized difference water index, and morphological building index, which are widely used in applications related to land cover classification and the extraction of vegetation areas, buildings, and water bodies, were employed in this work to evaluate the performance of the different pansharpening methods in terms of information presentation ability. The experimental results show that the Haze- and Ratio-based method, adaptive Gram-Schmidt, and the Generalized Laplacian pyramid (GLP) methods using the enhanced spectral distortion minimal model and the enhanced context-based decision model are good choices for producing fused WV-2 images for image interpretation and the extraction of urban buildings. The two GLP-based methods are better choices than the other methods if the fused images will be used for applications related to vegetation and water bodies.
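
    The spectral indices used for the information-based evaluation are simple band ratios; a hedged numpy sketch with placeholder band arrays (using the common green/NIR and red/NIR definitions, not necessarily the exact WV-2 band choices of the paper) is shown below.

      # Hedged sketch of the NDVI and NDWI indices named above, applied to band arrays.
      import numpy as np

      def ndvi(nir, red):
          return (nir - red) / (nir + red + 1e-12)

      def ndwi(green, nir):
          return (green - nir) / (green + nir + 1e-12)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          green, red, nir = (rng.random((64, 64)) for _ in range(3))   # placeholder reflectance bands
          print("NDVI range:", float(ndvi(nir, red).min()), float(ndvi(nir, red).max()))
          print("NDWI range:", float(ndwi(green, nir).min()), float(ndwi(green, nir).max()))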

  20. Assessment of Pansharpening Methods Applied to WorldView-2 Imagery Fusion

    PubMed Central

    Li, Hui; Jing, Linhai; Tang, Yunwei

    2017-01-01

    Since WorldView-2 (WV-2) images are widely used in various fields, there is a high demand for high-quality pansharpened WV-2 images for different application purposes. With respect to the novelty of the WV-2 multispectral (MS) and panchromatic (PAN) bands, the performances of eight state-of-the-art pansharpening methods for WV-2 imagery, evaluated on six datasets from three WV-2 scenes, were assessed in this study using both quality indices and information indices, along with visual inspection. The normalized difference vegetation index, normalized difference water index, and morphological building index, which are widely used in applications related to land cover classification and the extraction of vegetation areas, buildings, and water bodies, were employed in this work to evaluate the performance of the different pansharpening methods in terms of information presentation ability. The experimental results show that the Haze- and Ratio-based method, adaptive Gram-Schmidt, and the Generalized Laplacian pyramid (GLP) methods using the enhanced spectral distortion minimal model and the enhanced context-based decision model are good choices for producing fused WV-2 images for image interpretation and the extraction of urban buildings. The two GLP-based methods are better choices than the other methods if the fused images will be used for applications related to vegetation and water bodies. PMID:28067770

  1. Introduction to methodology of dose-response meta-analysis for binary outcome: With application on software.

    PubMed

    Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang

    2018-05-01

    Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA. We summarize the commonly used regression models and pooling methods in DRMA, and we use an example to illustrate how to carry out a DRMA with these methods. Five regression models are illustrated for fitting the dose-response relationship: linear regression, piecewise regression, natural polynomial regression, fractional polynomial regression, and restricted cubic spline regression. Two types of pooling approaches, the one-stage approach and the two-stage approach, are illustrated for pooling the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can be used for investigating the relationship between exposure level and the risk of an outcome; however, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
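
    As a hedged illustration of the two-stage approach only, the sketch below estimates a per-study linear trend in log relative risk per unit dose by weighted least squares through the origin and pools the slopes by inverse-variance weighting; the numbers are invented and, for simplicity, the within-study correlation that full DRMA methods account for is ignored.

      # Hedged two-stage sketch: study-specific linear dose-response slopes, then
      # fixed-effect inverse-variance pooling of the slopes across studies.
      import numpy as np

      def study_slope(doses, log_rr, var_log_rr):
          """WLS slope of log RR on dose through the origin (reference dose = 0, log RR = 0)."""
          x = np.asarray(doses, float)
          y = np.asarray(log_rr, float)
          w = 1.0 / np.asarray(var_log_rr, float)
          slope = np.sum(w * x * y) / np.sum(w * x * x)
          var = 1.0 / np.sum(w * x * x)
          return slope, var

      studies = [  # invented non-reference dose levels per study
          dict(doses=[5, 10], log_rr=[0.10, 0.22], var=[0.01, 0.02]),
          dict(doses=[8, 16], log_rr=[0.18, 0.30], var=[0.02, 0.03]),
      ]

      slopes, variances = zip(*(study_slope(s["doses"], s["log_rr"], s["var"]) for s in studies))
      w = 1.0 / np.asarray(variances)
      pooled = float(np.sum(w * np.asarray(slopes)) / np.sum(w))
      print(f"pooled log RR per unit dose ~ {pooled:.4f}")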

  2. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
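
    A hedged sketch of a Rayleigh-type power statistic of the kind discussed (power approximately exponential with unit mean under the null hypothesis of no periodicity) is given below with purely synthetic data; it illustrates the statistic itself, not the authors' analysis.

      # Hedged sketch: for each trial period P, power = |sum_j exp(2*pi*i*x_j/P)|^2 / N.
      import numpy as np

      def psa_power(x, periods):
          phases = 2.0 * np.pi * x[None, :] / periods[:, None]
          return np.abs(np.exp(1j * phases).sum(axis=1)) ** 2 / x.size

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          x = rng.uniform(0.0, 100.0, size=400)        # synthetic, non-periodic data
          periods = np.linspace(2.0, 20.0, 200)
          power = psa_power(x, periods)
          print("mean power (expect ~1 under the null):", float(power.mean()))
          print("max power at period:", float(periods[np.argmax(power)]))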

  3. Geology

    NASA Technical Reports Server (NTRS)

    Stewart, R. K.; Sabins, F. F., Jr.; Rowan, L. C.; Short, N. M.

    1975-01-01

    Papers from private industry reporting applications of remote sensing to oil and gas exploration were presented. Digitally processed LANDSAT images were successfully employed in several geologic interpretations. A growing interest in digital image processing among the geologic user community was shown. The papers covered a wide geographic range and a wide technical and application range. Topics included: (1) oil and gas exploration, by use of radar and multisensor studies as well as by use of LANDSAT imagery or LANDSAT digital data, (2) mineral exploration, by mapping from LANDSAT and Skylab imagery and by LANDSAT digital processing, (3) geothermal energy studies with Skylab imagery, (4) environmental and engineering geology, by use of radar or LANDSAT and Skylab imagery, (5) regional mapping and interpretation, and digital and spectral methods.

  4. Initial Assessment of U.S. Refineries for Purposes of Potential Bio-Based Oil Insertions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Charles J.; Jones, Susanne B.; Padmaperuma, Asanga B.

    2013-04-01

    In order to meet U.S. biofuel objectives over the coming decade, the conversion of a broad range of biomass feedstocks, using diverse processing options, will be required. Further, the production of both gasoline and diesel biofuels will employ biomass conversion methods that produce wide boiling range intermediate oils requiring treatment similar to conventional refining processes (i.e. fluid catalytic cracking, hydrocracking, and hydrotreating). As such, it is widely recognized that leveraging existing U.S. petroleum refining infrastructure is key to reducing overall capital demands. This study examines how existing U.S. refining locations, capacities and conversion capabilities match in geography and processing capabilities with the needs projected from anticipated biofuels production.

  5. Multiple sequence alignment using multi-objective based bacterial foraging optimization algorithm.

    PubMed

    Rani, R Ranjani; Ramyachitra, D

    2016-12-01

    Multiple sequence alignment (MSA) is a widespread approach in computational biology and bioinformatics. MSA deals with how sequences of nucleotides and amino acids are aligned with the best possible alignment and a minimum number of gaps between them, which points to the functional, evolutionary and structural relationships among the sequences. Still, computing an MSA that provides efficient accuracy and statistically significant alignment results is a challenging task. In this work, the Bacterial Foraging Optimization Algorithm was employed to align the biological sequences, which resulted in a non-dominated optimal solution. It employs multiple objectives: maximization of similarity, non-gap percentage and conserved blocks, and minimization of gap penalty. The BAliBASE 3.0 benchmark database was utilized to examine the proposed algorithm against other methods. In this paper, two algorithms have been proposed: a Hybrid Genetic Algorithm with Artificial Bee Colony (GA-ABC) and the Bacterial Foraging Optimization Algorithm. It was found that the Hybrid Genetic Algorithm with Artificial Bee Colony performed better than the existing optimization algorithms, but conserved blocks were still not obtained using GA-ABC. BFO was then used for the alignment and the conserved blocks were obtained. The proposed Multi-Objective Bacterial Foraging Optimization Algorithm (MO-BFO) was compared with the widely used MSA methods Clustal Omega, Kalign, MUSCLE, MAFFT, Genetic Algorithm (GA), Ant Colony Optimization (ACO), Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Hybrid Genetic Algorithm with Artificial Bee Colony (GA-ABC). The final results show that the proposed MO-BFO algorithm yields better alignment than most widely used methods. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
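
    Two of the objectives named above are easy to state in code; the toy sketch below computes a sum-of-pairs identity score and the non-gap percentage for a small alignment and is only an illustration, not the MO-BFO implementation.

      # Hedged sketch of two alignment objectives: sum-of-pairs identity and non-gap percentage.
      from itertools import combinations

      def sum_of_pairs_identity(alignment):
          """Count of identical, non-gap residue pairs summed over all columns."""
          score = 0
          for col in zip(*alignment):
              for a, b in combinations(col, 2):
                  if a != "-" and a == b:
                      score += 1
          return score

      def non_gap_percentage(alignment):
          total = sum(len(s) for s in alignment)
          gaps = sum(s.count("-") for s in alignment)
          return 100.0 * (total - gaps) / total

      aln = ["ACT-GT", "ACTAGT", "AC--GT"]   # toy alignment
      print("sum-of-pairs identity:", sum_of_pairs_identity(aln))
      print("non-gap percentage:", round(non_gap_percentage(aln), 1))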

  6. Measurement of thermal conductivity and thermal diffusivity using a thermoelectric module

    NASA Astrophysics Data System (ADS)

    Beltrán-Pitarch, Braulio; Márquez-García, Lourdes; Min, Gao; García-Cañadas, Jorge

    2017-04-01

    A proof of concept of using a thermoelectric module to measure both the thermal conductivity and thermal diffusivity of bulk disc samples at room temperature is demonstrated. The method involves the calculation of the integral area from an impedance spectrum, which empirically correlates with the thermal properties of the sample through an exponential relationship. This relationship was obtained employing different reference materials. The impedance spectroscopy measurements are performed in a very simple setup, comprising a thermoelectric module which is soldered at its bottom side to a Cu block (heat sink) and thermally connected with the sample at its top side employing thermal grease. Random and systematic errors of the method were calculated for the thermal conductivity (18.6% and 10.9%, respectively) and thermal diffusivity (14.2% and 14.7%, respectively) employing a BCR724 standard reference material. Although the errors are somewhat high, the technique could be useful for screening purposes or high-throughput measurements in its current state. This new method establishes a new application for thermoelectric modules as thermal property sensors. It involves the use of a very simple setup in conjunction with a frequency response analyzer, which provides a low-cost alternative to most of the apparatus currently available on the market. In addition, impedance analyzers are reliable and widespread equipment, which facilitates the sometimes difficult access to thermal conductivity measurement facilities.

  7. Real-time point-of-care measurement of impaired renal function in a rat acute injury model employing exogenous fluorescent tracer agents

    NASA Astrophysics Data System (ADS)

    Dorshow, Richard B.; Fitch, Richard M.; Galen, Karen P.; Wojdyla, Jolette K.; Poreddy, Amruta R.; Freskos, John N.; Rajagopalan, Raghavan; Shieh, Jeng-Jong; Demirjian, Sevag G.

    2013-02-01

    Renal function assessment is needed for the detection of acute kidney injury and chronic kidney disease. Glomerular filtration rate (GFR) is now widely accepted as the best indicator of renal function, and current clinical guidelines advocate its use in the staging of kidney disease. The optimum measure of GFR is by the use of exogenous tracer agents. However, currently employed clinical agents lack sensitivity or are cumbersome to use. An exogenous fluorescent GFR tracer agent whose elimination rate could be monitored noninvasively through the skin would provide a substantial improvement over currently available methods. We developed a series of novel aminopyrazine analogs for use as exogenous fluorescent GFR tracer agents that emit light in the visible region for monitoring GFR noninvasively over the skin. In rats, these compounds are eliminated by the kidney with urine recovery greater than 90% of the injected dose, are not broken down or metabolized in vivo, are not secreted by the renal tubules, and have clearance values similar to those of a GFR reference compound, iothalamate. In addition, the biological half-life of these compounds measured in rats by noninvasive optical methods correlated with that obtained by plasma-derived methods. In this study, we show that this noninvasive methodology with our novel fluorescent tracer agents can detect impaired renal function. A 5/6th nephrectomy rat model is employed.
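
    The kind of half-life estimate being compared can be illustrated with a simple log-linear fit to a single-exponential decay; the simulated trace and function below are assumptions for illustration, not study data.

      # Hedged sketch: estimate a biological half-life from a fluorescence decay trace
      # by fitting signal ~ A*exp(-k*t) via linear regression on log(signal).
      import numpy as np

      def half_life_from_decay(t, signal):
          slope, intercept = np.polyfit(t, np.log(signal), 1)
          return np.log(2.0) / -slope

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          t = np.linspace(0, 120, 25)               # minutes after injection (simulated)
          true_half_life = 30.0
          signal = 100.0 * np.exp(-np.log(2.0) * t / true_half_life) * rng.lognormal(0, 0.02, t.size)
          print(f"estimated half-life ~ {half_life_from_decay(t, signal):.1f} min (true {true_half_life} min)")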

  8. Effect of immobilization technique on performance ZnO nanorods based enzymatic electrochemical glucose biosensor

    NASA Astrophysics Data System (ADS)

    Shukla, Mayoorika; Pramila; Palani, I. A.; Singh, Vipul

    2017-11-01

    In this paper, ZnO nanorods (ZNRs) have been synthesized over platinum (Pt)-coated glass substrates with in-situ addition of KMnO4 during the hydrothermal growth process. Significant variation in the ZnO nanostructures was observed upon KMnO4 addition during the growth. Glucose oxidase was later immobilized over the ZNRs. The as-prepared ZNRs were further utilized for glucose detection by employing an amperometric electrochemical transduction method. In order to optimize the performance of the prepared biosensor, two different immobilization techniques, i.e. physical adsorption and cross-linking, have been employed and compared. Further investigations suggest that immobilization via the cross-linking method resulted in an improvement of the biosensor performance, significantly affecting the sensitivity and linear range of the fabricated biosensor. Among the two types of biosensors fabricated using ZNRs, the best performance was shown by the cross-linked electrodes. The sensitivity was found to be 17.7 mA cm-2 M-1, along with a wide linear range of 0.5-8.5 mM.

  9. Integrated Method for Purification and Single-Particle Characterization of Lentiviral Vector Systems by Size Exclusion Chromatography and Tunable Resistive Pulse Sensing.

    PubMed

    Heider, Susanne; Muzard, Julien; Zaruba, Marianne; Metzner, Christoph

    2017-07-01

    Elements derived from lentiviral particles such as viral vectors or virus-like particles are commonly used for biotechnological and biomedical applications, for example in mammalian protein expression, gene delivery or therapy, and vaccine development. Preparations of high purity are necessary in most cases, especially for clinical applications. For purification, a wide range of methods are available, from density gradient centrifugation to affinity chromatography. In this study we have employed size exclusion columns specifically designed for the easy purification of extracellular vesicles including exosomes. In addition to viral marker protein and total protein analysis, a well-established single-particle characterization technology, termed tunable resistive pulse sensing, was employed to analyze fractions of highest particle load and purity and characterize the preparations by size and surface charge/electrophoretic mobility. With this study, we propose an integrated platform combining size exclusion chromatography and tunable resistive pulse sensing for monitoring production and purification of viral particles.

  10. Convective flows in enclosures with vertical temperature or concentration gradients

    NASA Technical Reports Server (NTRS)

    Wang, L. W.; Chai, A. T.; Sun, D. J.

    1988-01-01

    The transport process in the fluid phase during the growth of a crystal has a profound influence on the structure and quality of the solid phase. In vertical growth techniques the fluid phase is often subjected to vertical temperature and concentration gradients. The main objective is to obtain more experimental data on convective flows in enclosures with vertical temperature or concentration gradients. Among actual crystal systems the parameters vary widely. The parametric ranges studied for mass transfer are mainly dictated by the electrochemical system employed to impose concentration gradients. Temperature or concentration difference are maintained between two horizontal end walls. The other walls are kept insulated. Experimental measurements and observations were made of the heat transfer or mass transfer, flow patterns, and the mean and fluctuating temperature distribution. The method used to visualize the flow pattern in the thermal cases is an electrochemical pH-indicator method. Laser shadowgraphs are employed to visualize flow patterns in the solutal cases.

  11. Convective flows in enclosures with vertical temperature or concentration gradients

    NASA Technical Reports Server (NTRS)

    Wang, L. W.; Chai, A. T.; Sun, D. J.

    1989-01-01

    The transport process in the fluid phase during the growth of a crystal has a profound influence on the structure and quality of the solid phase. In vertical growth techniques, the fluid phase is often subjected to vertical temperature and concentration gradients. The main objective is to obtain more experimental data on convective flows in enclosures with vertical temperature or concentration gradients. Among actual crystal systems the parameters vary widely. The parametric ranges studied for mass transfer are mainly dictated by the electrochemical system employed to impose concentration gradients. Temperature or concentration differences are maintained between the two horizontal end walls. The other walls are kept insulated. Experimental measurements and observations were made of the heat transfer or mass transfer, flow patterns, and the mean and fluctuating temperature distribution. The method used to visualize the flow pattern in the thermal cases is an electrochemical pH-indicator method. Laser shadowgraphs are employed to visualize flow patterns in the solutal cases.

  12. Employer Engagement in Education: Literature Review

    ERIC Educational Resources Information Center

    Mann, Anthony; Dawkins, James

    2014-01-01

    The subject of this paper is employer engagement in education. In this, the authors consider the range of different ways that employers can support the learning and progression of young people in British schools. The paper draws on a wide range of source material to ask: What are the typical benefits of different types of employer engagement? Do…

  13. Employability Skills, the Student Path, and the Role of the Academic Library and Partners

    ERIC Educational Resources Information Center

    Tyrer, Gwyneth; Ives, Joanne; Corke, Charlotte

    2013-01-01

    This case study explores the introduction of a university wide employability program by the World of Work Careers Centre (WOWCC) at Liverpool John Moores University (LJMU). The article reports the background against which an employability program was implemented; the justification and growing demand for more emphasis on employability skills in…

  14. The Senior Community Service Employment Program: The First 25 Years.

    ERIC Educational Resources Information Center

    Salisbury, Karen, Ed.

    The Senior Community Service Employment Program (SCSEP) provides subsidized, part-time employment to low-income persons age 55 and older. Participants work an average of 20 hours a week and are employed in a wide variety of community service activities and facilities, including home health care, adult day care, and nutritional services. The 11…

  15. Direct one-pot reductive amination of aldehydes with nitroarenes in a domino fashion: catalysis by gum-acacia-stabilized palladium nanoparticles.

    PubMed

    Sreedhar, B; Reddy, P Surendra; Devi, D Keerthi

    2009-11-20

    This note describes the direct reductive amination of carbonyl compounds with nitroarenes using gum acacia-palladium nanoparticles, employing molecular hydrogen as the reductant. This methodology is found to be applicable to both aliphatic and aromatic aldehydes and a wide range of nitroarenes. The operational simplicity and the mild reaction conditions add to the value of this method as a practical alternative to the reductive amination of carbonyl compounds.

  16. High-pressure torsion for new hydrogen storage materials.

    PubMed

    Edalati, Kaveh; Akiba, Etsuo; Horita, Zenji

    2018-01-01

    High-pressure torsion (HPT) is widely used as a severe plastic deformation technique to create ultrafine-grained structures with promising mechanical and functional properties. Since 2007, the method has been employed to enhance the hydrogenation kinetics in different Mg-based hydrogen storage materials. Recent studies showed that the method is effective not only for increasing the hydrogenation kinetics but also for improving the hydrogenation activity, for enhancing the air resistivity and more importantly for synthesizing new nanostructured hydrogen storage materials with high densities of lattice defects. This manuscript reviews some major findings on the impact of the HPT process on the hydrogen storage performance of different titanium-based and magnesium-based materials.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Suvam; Naghma, Rahla; Kaur, Jaspreet

    The total and ionization cross sections for electron scattering by benzene, halobenzenes, toluene, aniline, and phenol are reported over a wide energy domain. The multi-scattering centre spherical complex optical potential method has been employed to find the total elastic and inelastic cross sections. The total ionization cross section is estimated from total inelastic cross section using the complex scattering potential-ionization contribution method. In the present article, the first theoretical calculations for electron impact total and ionization cross section have been performed for most of the targets having numerous practical applications. A reasonable agreement is obtained compared to existing experimental observations for all the targets reported here, especially for the total cross section.

  18. β-L-Arabinofuranosylation Conducted by 5-O-(2-pyridinecarbonyl)-L-arabinofuranosyl Trichloroacetimidate.

    PubMed

    Li, Hong-Zhan; Ding, Jie; Cheng, Chun-Ru; Chen, Yue; Liang, Xing-Yong

    2018-05-02

    We describe a β-L-arabinofuranosylation method by employing the 5-O-(2-pyridinecarbonyl)-L-arabinofuranosyl trichloroacetimidate 10 as a donor. This approach allows a wide range of acceptor substrates, especially amino acid acceptors, to be used. Stereoselective synthesis of β-(1,4)-L-arabinofuranosyl-(2S,4R)-4-hydroxy-L-proline (β-L-Araf-L-Hyp4) and its dimer is achieved readily by this method. Both the stereoselectivities and yields of the reactions are excellent. To demonstrate the utility of this methodology, the preparation of a trisaccharide in a one-pot manner was carried out. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. A time series modeling approach in risk appraisal of violent and sexual recidivism.

    PubMed

    Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E

    2010-10-01

    For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information into risk assessment and management of violent offenders.

  20. Fitting ordinary differential equations to short time course data.

    PubMed

    Brewer, Daniel; Barenco, Martino; Callard, Robin; Hubank, Michael; Stark, Jaroslav

    2008-02-28

    Ordinary differential equations (ODEs) are widely used to model many systems in physics, chemistry, engineering and biology. Often one wants to compare such equations with observed time course data, and use this to estimate parameters. Surprisingly, practical algorithms for doing this are relatively poorly developed, particularly in comparison with the sophistication of numerical methods for solving both initial and boundary value problems for differential equations, and for locating and analysing bifurcations. A lack of good numerical fitting methods is particularly problematic in the context of systems biology where only a handful of time points may be available. In this paper, we present a survey of existing algorithms and describe the main approaches. We also introduce and evaluate a new efficient technique for estimating ODEs linear in parameters particularly suited to situations where noise levels are high and the number of data points is low. It employs a spline-based collocation scheme and alternates linear least squares minimization steps with repeated estimates of the noise-free values of the variables. This is reminiscent of expectation-maximization methods widely used for problems with nuisance parameters or missing data.
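    A rough, illustrative sketch of the spline-based idea described above follows. This is a minimal gradient-matching example, not the authors' full scheme of alternating least squares with noise-free-value estimates; the toy model dx/dt = theta*x, the function name, and the smoothing settings are assumptions made only for this example.

    ```python
    # Minimal sketch (see note above): smooth short, noisy time-course data with
    # a spline, then estimate the parameter of an ODE that is linear in its
    # parameters by least squares on the spline derivative.
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    def fit_linear_ode(t, x_obs, smoothing=None):
        """Estimate theta in dx/dt = theta * x from noisy samples x_obs(t)."""
        spline = UnivariateSpline(t, x_obs, k=3, s=smoothing)
        x_hat = spline(t)                  # noise-reduced state estimates
        dxdt_hat = spline.derivative()(t)  # collocation-style derivative estimates
        # Linear least squares: dxdt_hat ~ theta * x_hat
        theta, *_ = np.linalg.lstsq(x_hat[:, None], dxdt_hat, rcond=None)
        return theta[0]

    # Usage on simulated data: x(t) = exp(-0.5 t) + noise, so theta should be near -0.5.
    t = np.linspace(0.0, 4.0, 8)  # only a handful of time points
    rng = np.random.default_rng(0)
    x_obs = np.exp(-0.5 * t) + 0.02 * rng.standard_normal(t.size)
    print(fit_linear_ode(t, x_obs))
    ```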

  1. Using GIS-based methods and lidar data to estimate rooftop solar technical potential in US cities

    DOE PAGES

    Margolis, Robert; Gagnon, Pieter; Melius, Jennifer; ...

    2017-07-06

    Here, we estimate the technical potential of rooftop solar photovoltaics (PV) for select US cities by combining light detection and ranging (lidar) data, a validated analytical method for determining rooftop PV suitability employing geographic information systems, and modeling of PV electricity generation. We find that rooftop PV's ability to meet estimated city electricity consumption varies widely - from meeting 16% of annual consumption (in Washington, DC) to meeting 88% (in Mission Viejo, CA). Important drivers include average rooftop suitability, household footprint/per-capita roof space, the quality of the solar resource, and the city's estimated electricity consumption. In addition to city-wide results, we also estimate the ability of aggregations of households to offset their electricity consumption with PV. In a companion article, we will use statistical modeling to extend our results and estimate national rooftop PV technical potential. In addition, our publicly available data and methods may help policy makers, utilities, researchers, and others perform customized analyses to meet their specific needs.

  2. Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.

    PubMed

    Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si

    2017-07-01

    Water quality assessment is crucial for assessment of marine eutrophication, prediction of harmful algal blooms, and environmental protection. Previous studies have developed many numeric modeling methods and data driven approaches for water quality assessment. Cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have always been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment with hierarchical cluster analysis based on Mahalanobis distance. Further, we cluster water quality data collected from coastal waters of the Bohai Sea and North Yellow Sea of China, and apply the clustering results to evaluate water quality. To evaluate the validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, which is widely adopted in previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, it is the first attempt to apply Mahalanobis distance for coastal water quality assessment.
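    As a hedged illustration of the clustering approach described above, the sketch below performs agglomerative clustering with the Mahalanobis distance via SciPy; the random data and all names are placeholders, not the Bohai Sea or Yellow Sea measurements.

    ```python
    # Hierarchical clustering of water-quality samples using the Mahalanobis
    # distance instead of the Euclidean distance, so that correlations between
    # variables are taken into account. The synthetic data stand in for real
    # measurements; all names are placeholders.
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    samples = rng.multivariate_normal(
        mean=[0.0, 0.0, 0.0],
        cov=[[1.0, 0.8, 0.2], [0.8, 1.0, 0.1], [0.2, 0.1, 1.0]],  # correlated variables
        size=50,
    )

    # Inverse covariance supplied explicitly so pdist uses the Mahalanobis metric.
    VI = np.linalg.inv(np.cov(samples, rowvar=False))
    dists = pdist(samples, metric="mahalanobis", VI=VI)

    tree = linkage(dists, method="average")       # agglomerative clustering
    labels = fcluster(tree, t=3, criterion="maxclust")
    print(labels)
    ```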

  3. Minimally-invasive Sampling of Interleukin-1α and Interleukin-1 Receptor Antagonist from the Skin: A Systematic Review of In vivo Studies in Humans.

    PubMed

    Falcone, Denise; Spee, Pieter; van de Kerkhof, Peter C M; van Erp, Piet E J

    2017-10-02

    Interleukin-1α (IL-1α) and its receptor antagonist IL-1RA play a pivotal role in skin homeostasis and disease. Although the use of biopsies to sample these cytokines from human skin is widely employed in dermatological practice, knowledge about less invasive, in vivo sampling methods is scarce. The aim of this study was to provide an overview of such methods by systematically reviewing studies in Medline, EMBASE, Web of Science and Cochrane Library using combinations of the terms "IL-1α", "IL-1RA", "skin", "human", including all possible synonyms. Quality was assessed using the STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist. The search, performed on 14 October 2016, revealed 10 different sampling methods, with varying degrees of invasiveness and wide application spectrum, including assessment of both normal and diseased skin, from several body sites. The possibility to sample quantifiable amounts of cytokines from human skin with no or minimal discomfort holds promise for linking clinical outcomes to molecular profiles of skin inflammation.

  4. Using GIS-based methods and lidar data to estimate rooftop solar technical potential in US cities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margolis, Robert; Gagnon, Pieter; Melius, Jennifer

    Here, we estimate the technical potential of rooftop solar photovoltaics (PV) for select US cities by combining light detection and ranging (lidar) data, a validated analytical method for determining rooftop PV suitability employing geographic information systems, and modeling of PV electricity generation. We find that rooftop PV's ability to meet estimated city electricity consumption varies widely - from meeting 16% of annual consumption (in Washington, DC) to meeting 88% (in Mission Viejo, CA). Important drivers include average rooftop suitability, household footprint/per-capita roof space, the quality of the solar resource, and the city's estimated electricity consumption. In addition to city-wide results, we also estimate the ability of aggregations of households to offset their electricity consumption with PV. In a companion article, we will use statistical modeling to extend our results and estimate national rooftop PV technical potential. In addition, our publicly available data and methods may help policy makers, utilities, researchers, and others perform customized analyses to meet their specific needs.

  5. Soft X-ray-assisted detection method for airborne molecular contaminations (AMCs)

    NASA Astrophysics Data System (ADS)

    Kim, Changhyuk; Zuo, Zhili; Finger, Hartmut; Haep, Stefan; Asbach, Christof; Fissan, Heinz; Pui, David Y. H.

    2015-03-01

    Airborne molecular contaminations (AMCs) represent a wide range of gaseous contaminants in cleanrooms. Due to the unintentional nanoparticle or haze formation, as well as doping, caused by AMCs, improved methods for monitoring and controlling AMCs are urgently needed in the semiconductor industry. However, measuring ultra-low concentrations of AMCs in cleanrooms is difficult, especially behind a gas filter. In this study, a novel detection method for AMCs, which is on-line, economical, and applicable to diverse AMCs, was developed by employing gas-to-particle conversion with soft X-rays and then measuring the generated nanoparticles. A feasibility study of this method was conducted through evaluations of granular activated carbons (GACs), which are widely used AMC filter media, with sulfur dioxide (SO2) as the test AMC. Using this method, ultra-low concentrations of SO2 behind GACs were determined in terms of the concentrations of the generated sulfuric acid (H2SO4) nanoparticles. By calculating SO2 concentrations from the nanoparticle concentrations using empirical correlation equations, the method showed remarkable sensitivity to SO2, down to parts-per-trillion levels, which are too low to detect using commercial gas sensors. The calculated SO2 concentrations also showed good agreement with those measured simultaneously by a commercial SO2 monitor at parts-per-billion levels.

  6. An asymptotic theory for cross-correlation between auto-correlated sequences and its application on neuroimaging data.

    PubMed

    Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng

    2018-04-20

    Functional connectivity is among the most important tools to study the brain. The correlation coefficient between time series of different brain areas is the most popular method to quantify functional connectivity. In practice, the correlation coefficient assumes the data to be temporally independent; however, brain time series can manifest significant temporal auto-correlation. A widely applicable method is proposed for correcting temporal auto-correlation. We considered two types of time series models: (1) the auto-regressive-moving-average model and (2) the nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are the most commonly used in neuroscience studies, and we show that their respective asymptotic distributions share a unified expression. We verified the validity of our method and showed that it exhibits sufficient statistical power for detecting true correlation in numerical experiments. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlation in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.
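    The abstract's asymptotic distributions are not reproduced here; instead, the sketch below shows a commonly used related idea, a Bartlett-type effective-sample-size correction for the correlation between two auto-correlated series, to illustrate why ignoring auto-correlation overstates significance. All parameter choices are illustrative assumptions.

    ```python
    # Illustrative only: correct the effective sample size of a correlation test
    # between two auto-correlated series using their sample autocorrelations.
    import numpy as np

    def autocorr(x, max_lag):
        x = x - x.mean()
        denom = np.dot(x, x)
        return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

    def effective_n(x, y, max_lag=20):
        """Approximate effective sample size for corr(x, y) under auto-correlation."""
        n = len(x)
        rx, ry = autocorr(x, max_lag), autocorr(y, max_lag)
        correction = 1.0 + 2.0 * np.sum((1.0 - np.arange(1, max_lag + 1) / n) * rx * ry)
        return n / max(correction, 1e-12)

    # Two independent AR(1) series: the naive n overstates the evidence,
    # while the effective n is considerably smaller.
    rng = np.random.default_rng(2)
    def ar1(phi, n):
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.standard_normal()
        return x

    x, y = ar1(0.8, 500), ar1(0.8, 500)
    print(len(x), effective_n(x, y))
    ```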

  7. The employment impacts of economy-wide investments in renewable energy and energy efficiency

    NASA Astrophysics Data System (ADS)

    Garrett-Peltier, Heidi

    This dissertation examines the employment impacts of investments in renewable energy and energy efficiency in the U.S. A broad expansion of the use of renewable energy in place of carbon-based energy, in addition to investments in energy efficiency, comprises a prominent strategy to slow or reverse the effects of anthropogenic climate change. This study first explores the literature on the employment impacts of these investments. This literature to date consists mainly of input-output (I-O) studies or case studies of renewable energy and energy efficiency (REEE). Researchers are constrained, however, in their ability to use the I-O model to study REEE, since current industrial codes do not recognize this industry as such. I develop and present two methods to use the I-O framework to overcome this constraint: the synthetic and integrated approaches. In the former, I proxy the REEE industry by creating a vector of final demand based on the industrial spending patterns of REEE firms as found in the secondary literature. In the integrated approach, I collect primary data through a nationwide survey of REEE firms and integrate these data into the existing I-O tables to explicitly identify the REEE industry and estimate the employment impacts resulting from both upstream and downstream linkages with other industries. The size of the REEE employment multiplier is sensitive to the choice of method, and is higher using the synthetic approach than using the integrated approach. I find that, using both methods, the employment level per $1 million of demand is approximately three times greater for the REEE industry than for fossil fuel (FF) industries. This implies that a shift to clean energy will result in positive net employment impacts. The positive effects stem mainly from the higher labor intensity of REEE in relation to FF, as well as from higher domestic content and lower average wages. The findings suggest that as we transition away from a carbon-based energy system to more sustainable and low-carbon energy sources, approximately three jobs will be created in clean energy sectors for each job lost in the fossil fuel sector.
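    For readers unfamiliar with the input-output mechanics referred to above, the following is a minimal Leontief sketch of how employment per unit of final demand is computed; the three-industry coefficients and job intensities are invented for illustration and are not the dissertation's survey data.

    ```python
    # Leontief input-output sketch: employment supported by final demand via
    # x = (I - A)^{-1} f and jobs = e . x. The matrix A and intensities e below
    # are invented placeholders, not data from the study.
    import numpy as np

    A = np.array([            # technical coefficients (inputs per $ of output)
        [0.10, 0.05, 0.02],
        [0.20, 0.15, 0.10],
        [0.05, 0.10, 0.05],
    ])
    e = np.array([6.0, 9.0, 4.0])   # jobs per $1M of gross output, by industry

    def employment_impact(final_demand):
        """Total jobs supported by a final-demand vector (in $M), all linkages."""
        leontief_inverse = np.linalg.inv(np.eye(A.shape[0]) - A)
        gross_output = leontief_inverse @ final_demand
        return float(e @ gross_output)

    # $1M of demand on industry 1 (a stand-in for a clean-energy bundle) versus
    # $1M on industry 2 (a stand-in for a fossil-fuel bundle).
    print(employment_impact(np.array([1.0, 0.0, 0.0])))
    print(employment_impact(np.array([0.0, 1.0, 0.0])))
    ```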

  8. In vivo evaluation of drug delivery after ultrasound application: A new use for the photoacoustic technique

    NASA Astrophysics Data System (ADS)

    Barja, P. R.; Acosta-Avalos, D.; Rompe, P. C. B.; Dos Anjos, F. H.; Marciano, F. R.; da Silva, M. D.

    2005-06-01

    Ultrasound application is a therapeutic resource widely employed in physiotherapy. One of its applications is phonophoresis, a technique in which ultrasound radiation is utilized to deliver drugs through the skin to soft tissues. The purpose of our study was to employ the photoacoustic technique to evaluate the efficacy of such treatment, analyzing whether phonophoresis could enhance drug delivery through the skin when compared to the more traditional method of manual massage. The configuration of the system employed was such that it was possible to perform in vivo measurements, which is a pre-requisite for this kind of study. The changes observed in the photoacoustic signal amplitude after each form of drug application were attributed to changes in the thermal effusivity of the system, due to penetration of the drug. The technique was able to detect differences in drug delivery between the specified physiotherapy treatments, indicating that phonophoresis enhances drug absorption by the tissue.

  9. The Sociology of Discrimination: Racial Discrimination in Employment, Housing, Credit, and Consumer Markets

    PubMed Central

    Pager, Devah; Shepherd, Hana

    2010-01-01

    Persistent racial inequality in employment, housing, and a wide range of other social domains has renewed interest in the possible role of discrimination. And yet, unlike in the pre–civil rights era, when racial prejudice and discrimination were overt and widespread, today discrimination is less readily identifiable, posing problems for social scientific conceptualization and measurement. This article reviews the relevant literature on discrimination, with an emphasis on racial discrimination in employment, housing, credit markets, and consumer interactions. We begin by defining discrimination and discussing relevant methods of measurement. We then provide an overview of major findings from studies of discrimination in each of the four domains; and, finally, we turn to a discussion of the individual, organizational, and structural mechanisms that may underlie contemporary forms of discrimination. This discussion seeks to orient readers to some of the key debates in the study of discrimination and to provide a roadmap for those interested in building upon this long and important line of research. PMID:20689680

  10. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    PubMed Central

    Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-01-01

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem. PMID:29597286

  11. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    PubMed

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  12. Application of Jacobian-free Newton–Krylov method in implicitly solving two-fluid six-equation two-phase flow problems: Implementation, validation and benchmark

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2016-03-09

    This work represents a first-of-its-kind successful application of advanced numerical methods to solving realistic two-phase flow problems with the two-fluid six-equation two-phase flow model. These advanced numerical methods include a high-resolution spatial discretization scheme with staggered grids, high-order fully implicit time integration schemes, and the Jacobian-free Newton–Krylov (JFNK) method as the nonlinear solver. The computer code developed in this work has been extensively validated with existing experimental flow boiling data in vertical pipes and rod bundles, which cover wide ranges of experimental conditions, such as pressure, inlet mass flux, wall heat flux and exit void fraction. An additional code-to-code benchmark with the RELAP5-3D code further verifies the correct code implementation. The combined methods employed in this work exhibit strong robustness in solving two-phase flow problems even when phase appearance (boiling) and realistic discrete flow regimes are considered. Transitional flow regimes used in existing system analysis codes, normally introduced to overcome numerical difficulty, were completely removed in this work. As a result, this in turn provides the possibility to utilize more sophisticated flow regime maps in the future to further improve simulation accuracy.

  13. Object-Oriented Image Clustering Method Using UAS Photogrammetric Imagery

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Larson, A.; Schultz-Fellenz, E. S.; Sussman, A. J.; Swanson, E.; Coppersmith, R.

    2016-12-01

    Unmanned Aerial Systems (UAS) have been widely used as an imaging modality to obtain remotely sensed multi-band surface imagery, and are growing in popularity due to their efficiency, ease of use, and affordability. Los Alamos National Laboratory (LANL) has employed UAS for geologic site characterization and change detection studies at a variety of field sites. The deployed UAS was equipped with a standard visible-band camera to collect imagery datasets. Based on the imagery collected, we use deep sparse algorithmic processing to detect and discriminate subtle topographic features created or impacted by subsurface activities. In this work, we develop an object-oriented remote sensing imagery clustering method for land cover classification. To improve the clustering and segmentation accuracy, instead of using conventional pixel-based clustering methods, we integrate the spatial information from neighboring regions to create super-pixels, avoiding salt-and-pepper noise and subsequent over-segmentation. To further improve the robustness of our clustering method, we also incorporate a custom digital elevation model (DEM) dataset generated using a structure-from-motion (SfM) algorithm together with the red, green, and blue (RGB) band data for clustering. In particular, we first employ agglomerative clustering to create an initial segmentation map, in which every object is treated as a single (new) pixel. Based on the new pixels obtained, we generate new features to implement another level of clustering. We apply our clustering method to the RGB+DEM datasets collected at the field site. Through binary clustering and multi-object clustering tests, we verify that our method can accurately separate vegetation from non-vegetation regions and is also able to differentiate object features on the surface.

  14. A New Approach of evaluating the damage in simply-supported reinforced concrete beam by Local mean decomposition (LMD)

    NASA Astrophysics Data System (ADS)

    Zhang, Xuebing; Liu, Ning; Xi, Jiaxin; Zhang, Yunqi; Zhang, Wenchun; Yang, Peipei

    2017-08-01

    How to analyze nonstationary response signals and obtain vibration characteristics is extremely important in vibration-based structural diagnosis methods. In this work, we introduce a more reasonable time-frequency decomposition method, termed local mean decomposition (LMD), in place of the widely used empirical mode decomposition (EMD). By employing the LMD method, one can derive a group of component signals, each of which is more stationary, and then analyze the vibration state and assess the structural damage of a construction or building. We illustrate the effectiveness of LMD with synthetic data and with experimental data recorded on a simply-supported reinforced concrete beam. Based on the decomposition results, an elementary method of damage diagnosis is proposed.

  15. Induction slag reduction process for purifying metals

    DOEpatents

    Traut, Davis E.; Fisher, II, George T.; Hansen, Dennis A.

    1991-01-01

    A continuous method is provided for purifying and recovering transition metals, such as neodymium and zirconium, that become reactive at temperatures above about 500 °C. The method comprises the steps of contacting the metal ore with an appropriate fluorinating agent, such as an alkaline earth metal fluosilicate, to form a fluometallic compound, and reducing the fluometallic compound with a suitable alkaline earth or alkali metal compound under molten conditions, such as provided in an induction slag metal furnace. The method of the invention is advantageous in that it is simpler and less expensive than methods used previously to recover pure metals, and it may be employed with a wide range of transition metals that were reactive with the enclosures used in prior art methods and were hard to obtain in uncontaminated form.

  16. Pressure balance cross-calibration method using a pressure transducer as transfer standard

    PubMed Central

    Olson, D; Driver, R. G.; Yang, Y

    2016-01-01

    Piston gauges or pressure balances are widely used to realize the SI unit of pressure, the pascal, and to calibrate pressure sensing devices. However, their calibration is time-consuming and requires considerable technical expertise. In this paper, we propose an alternative method of performing a piston gauge cross-calibration that incorporates a pressure transducer as an immediate in-situ transfer standard. For a sufficiently linear transducer, the requirement to exactly balance the weights on the two pressure gauges under consideration is greatly relaxed. Our results indicate that this method can be employed without a significant increase in measurement uncertainty. Indeed, in the test case explored here, our results agreed with the traditional method within standard uncertainty, which was less than 6 parts per million. PMID:28303167

  17. Next-generation genome-scale models for metabolic engineering.

    PubMed

    King, Zachary A; Lloyd, Colton J; Feist, Adam M; Palsson, Bernhard O

    2015-12-01

    Constraint-based reconstruction and analysis (COBRA) methods have become widely used tools for metabolic engineering in both academic and industrial laboratories. By employing a genome-scale in silico representation of the metabolic network of a host organism, COBRA methods can be used to predict optimal genetic modifications that improve the rate and yield of chemical production. A new generation of COBRA models and methods, encompassing many biological processes and simulation strategies, is now being developed, and these next-generation models enable new types of predictions. Here, three key examples of applying COBRA methods to strain optimization are presented and discussed. Then, an outlook is provided on the next generation of COBRA models and the new types of predictions they will enable for systems metabolic engineering. Copyright © 2014 Elsevier Ltd. All rights reserved.
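    As a hedged illustration of the constraint-based idea, the sketch below sets up a toy flux balance analysis (FBA), the linear program underlying many COBRA methods; the two-metabolite, three-reaction network is invented, and genome-scale work would normally use a dedicated package such as COBRApy.

    ```python
    # Toy flux balance analysis: maximize a "biomass" flux subject to steady
    # state S v = 0 and flux bounds. The network below is invented for illustration.
    import numpy as np
    from scipy.optimize import linprog

    # Reactions: R1 uptake (-> A), R2 conversion (A -> B), R3 biomass (B ->)
    S = np.array([
        [1.0, -1.0,  0.0],   # metabolite A
        [0.0,  1.0, -1.0],   # metabolite B
    ])
    bounds = [(0.0, 10.0), (0.0, 8.0), (0.0, None)]   # per-reaction flux bounds
    c = np.array([0.0, 0.0, -1.0])                    # maximize v3 (minimize -v3)

    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x)   # expected to be limited by R2's bound of 8
    ```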

  18. Simplified welding distortion analysis for fillet welding using composite shell elements

    NASA Astrophysics Data System (ADS)

    Kim, Mingyu; Kang, Minseok; Chung, Hyun

    2015-09-01

    This paper presents a simplified welding distortion analysis method to predict the welding deformation of both the plate and the stiffener in fillet welds. Currently, methods based on equivalent thermal strain, such as Strain as Direct Boundary (SDB), are widely used due to their effective prediction of welding deformation. Regarding fillet welding, however, those methods cannot represent the deformation of both members at once, since the temperature degree of freedom is shared at the intersection nodes of both members. In this paper, we propose a new approach to simulate the deformation of both members. The method simulates fillet weld deformations by employing composite shell elements and using different thermal expansion coefficients through the thickness, with a fixed temperature at the intersection nodes. For verification purposes, we compare results from experiments, 3D thermo-elastic-plastic analysis, the SDB method, and the proposed method. Compared with the experimental results, the proposed method can effectively predict welding deformation for fillet welds.

  19. Light field rendering with omni-directional camera

    NASA Astrophysics Data System (ADS)

    Todoroki, Hiroshi; Saito, Hideo

    2003-06-01

    This paper presents an approach to capture the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by building a light field with an omni-directional camera, which can capture a wide circumferential view. The omni-directional camera used in this technique is a special camera with a hyperbolic mirror mounted above the camera body, so that luminosity over 360 degrees of the surrounding environment can be captured in a single image. We apply the light field method, one technique of Image-Based Rendering (IBR), for generating the arbitrary viewpoint images. The light field is a kind of database that records the luminosity information in the object space. We employ the omni-directional camera for constructing the light field, so that many view directions are collected in the light field. Thus our method allows the user to explore a wide scene and achieves a realistic representation of the virtual environment. To demonstrate the proposed method, we captured an image sequence of our lab's interior environment with an omni-directional camera and successfully generated arbitrary viewpoint images for a virtual tour of the environment.

  20. Mass spectrometry applied to the identification of Mycobacterium tuberculosis and biomarker discovery.

    PubMed

    López-Hernández, Y; Patiño-Rodríguez, O; García-Orta, S T; Pinos-Rodríguez, J M

    2016-12-01

    An adequate and effective tuberculosis (TB) diagnosis system has been identified by the World Health Organization as a priority in the fight against this disease. Over the years, several methods have been developed to identify the bacillus, but bacterial culture remains one of the most affordable methods for most countries. For rapid and accurate identification, however, it is more feasible to implement molecular techniques, taking advantage of the availability of public databases containing protein sequences. Mass spectrometry (MS) has become an interesting technique for the identification of TB. Here, we review some of the most widely employed methods for identifying Mycobacterium tuberculosis and present an update on MS applied for the identification of mycobacterial species. © 2016 The Society for Applied Microbiology.

  1. A Gross Error Elimination Method for Point Cloud Data Based on Kd-Tree

    NASA Astrophysics Data System (ADS)

    Kang, Q.; Huang, G.; Yang, S.

    2018-04-01

    Point cloud data is one of the most widely used data sources in the field of remote sensing. Key steps in the pre-processing of point cloud data focus on gross error elimination and quality control. Owing to the sheer volume of point cloud data, existing gross error elimination methods consume large amounts of memory and time. This paper employs a new method that constructs a Kd-tree, searches it with the k-nearest neighbor algorithm, and applies an appropriate threshold to judge whether a target point is an outlier. Experimental results show that the proposed algorithm removes gross errors from point cloud data while decreasing memory consumption and improving efficiency.
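    A minimal sketch of the Kd-tree / k-nearest-neighbor idea described above follows; the mean-plus-n-sigma threshold rule and the parameter values are illustrative assumptions, not the paper's settings.

    ```python
    # Build a Kd-tree over the point cloud, compute each point's mean distance to
    # its k nearest neighbors, and flag points whose mean distance exceeds a
    # global threshold as gross errors.
    import numpy as np
    from scipy.spatial import cKDTree

    def remove_gross_errors(points, k=8, n_sigma=2.5):
        tree = cKDTree(points)
        # Query k+1 neighbors because each point is its own nearest neighbor.
        dists, _ = tree.query(points, k=k + 1)
        mean_knn_dist = dists[:, 1:].mean(axis=1)
        threshold = mean_knn_dist.mean() + n_sigma * mean_knn_dist.std()
        keep = mean_knn_dist <= threshold
        return points[keep], points[~keep]

    # Usage: a dense synthetic cloud plus a few far-away outliers.
    rng = np.random.default_rng(3)
    cloud = rng.normal(size=(1000, 3))
    outliers = rng.normal(loc=20.0, size=(10, 3))
    inliers, removed = remove_gross_errors(np.vstack([cloud, outliers]))
    print(len(inliers), "kept,", len(removed), "removed")
    ```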

  2. Texture based segmentation method to detect atherosclerotic plaque from optical tomography images

    NASA Astrophysics Data System (ADS)

    Prakash, Ammu; Hewko, Mark; Sowa, Michael; Sherif, Sherif

    2013-06-01

    Optical coherence tomography (OCT) imaging has been widely employed in assessing cardiovascular disease. Atherosclerosis is one of the major causes of cardiovascular disease. However, visual detection of atherosclerotic plaque from OCT images is often limited and further complicated by high frame rates. We developed a texture-based segmentation method to automatically detect plaque and non-plaque regions in OCT images. To verify our results, we compared them to photographs of the vascular tissue with atherosclerotic plaque that was used to generate the OCT images. Our results show a close match with the photographs of vascular tissue with atherosclerotic plaque. Our texture-based segmentation method could potentially be used for plaque detection in clinical cardiovascular OCT imaging.

  3. Workforce Issues. Employment Practices in Selected Large Private Companies. Report to Congressional Committees.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. General Government Div.

    The General Accounting Office examined employment practices in 130 selected large private U.S. companies with at least 100 workers in each of 10 or more employment locations and at least 25,000 employees. Of the 130 companies surveyed, 83 (64%) returned usable responses. The respondents reported using a wide range of employment practices for…

  4. Training and Jobs Programs in Action: Case Studies in Private-Sector Initiatives for the Hard-to-Employ.

    ERIC Educational Resources Information Center

    Robison, David

    This book contains fifty-three case studies covering a wide variety of private-sector activities and public-private partnerships designed to increase training and employment opportunities for the hard-to-employ and speed the transition of the unemployed from government support and subsidized jobs to permanent private employment. Compiled from a…

  5. Improving the detection of pathways in genome-wide association studies by combined effects of SNPs from Linkage Disequilibrium blocks.

    PubMed

    Zhao, Huiying; Nyholt, Dale R; Yang, Yuanhao; Wang, Jihua; Yang, Yuedong

    2017-06-14

    Genome-wide association studies (GWAS) have successfully identified single variants associated with diseases. To increase the power of GWAS, gene-based and pathway-based tests are commonly employed to detect more risk factors. However, the gene- and pathway-based association tests may be biased towards genes or pathways containing a large number of single-nucleotide polymorphisms (SNPs) with small P-values caused by high linkage disequilibrium (LD) correlations. To address such bias, numerous pathway-based methods have been developed. Here we propose a novel method, DGAT-path, to divide all SNPs assigned to genes in each pathway into LD blocks, and to sum the chi-square statistics of LD blocks for assessing the significance of the pathway by permutation tests. The method was proven robust, with a type I error rate more than 1.6 times lower than that of other methods. Meanwhile, the method displays a higher power and is not biased by the pathway size. The applications to the GWAS summary statistics for schizophrenia and breast cancer indicate that the detected top pathways contain more genes close to associated SNPs than other methods. As a result, the method identified 17 and 12 significant pathways containing 20 and 21 novel associated genes, respectively, for the two diseases. The method is available online at http://sparks-lab.org/server/DGAT-path .
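    A rough, heavily hedged sketch of the block-summation idea follows; the exact block statistic and permutation scheme of DGAT-path are not reproduced here, and treating each LD block's contribution as a single chi-square value drawn from a genome-wide pool is an assumption made only for this example.

    ```python
    # Illustrative permutation test for a pathway statistic formed by summing
    # per-LD-block chi-square values (simplified stand-in for the described method).
    import numpy as np

    def pathway_pvalue(block_stats_pathway, block_stats_genomewide, n_perm=10000, seed=0):
        """Permutation p-value for the sum of per-block chi-square statistics."""
        rng = np.random.default_rng(seed)
        observed = np.sum(block_stats_pathway)
        m = len(block_stats_pathway)
        null = np.array([
            np.sum(rng.choice(block_stats_genomewide, size=m, replace=False))
            for _ in range(n_perm)
        ])
        return (1 + np.sum(null >= observed)) / (n_perm + 1)

    # Usage with simulated block-level chi-square values (1 df).
    rng = np.random.default_rng(4)
    genomewide = rng.chisquare(df=1, size=5000)
    pathway = rng.chisquare(df=1, size=30) + 0.5   # mildly enriched pathway
    print(pathway_pvalue(pathway, genomewide))
    ```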

  6. Evaluation of Shifted Excitation Raman Difference Spectroscopy and Comparison to Computational Background Correction Methods Applied to Biochemical Raman Spectra.

    PubMed

    Cordero, Eliana; Korinth, Florian; Stiebing, Clara; Krafft, Christoph; Schie, Iwan W; Popp, Jürgen

    2017-07-27

    Raman spectroscopy provides label-free biochemical information from tissue samples without complicated sample preparation. The clinical capability of Raman spectroscopy has been demonstrated in a wide range of in vitro and in vivo applications. However, a challenge for in vivo applications is the simultaneous excitation of auto-fluorescence in the majority of tissues of interest, such as liver, bladder, brain, and others. Raman bands are then superimposed on a fluorescence background, which can be several orders of magnitude larger than the Raman signal. To eliminate the disturbing fluorescence background, several approaches are available. Among instrumental methods, shifted excitation Raman difference spectroscopy (SERDS) has been widely applied and studied. Similarly, computational techniques, for instance extended multiplicative scatter correction (EMSC), have also been employed to remove undesired background contributions. Here, we present a theoretical and experimental evaluation and comparison of fluorescence background removal approaches for Raman spectra based on SERDS and EMSC.

  7. Evaluation of Shifted Excitation Raman Difference Spectroscopy and Comparison to Computational Background Correction Methods Applied to Biochemical Raman Spectra

    PubMed Central

    Cordero, Eliana; Korinth, Florian; Stiebing, Clara; Krafft, Christoph; Schie, Iwan W.; Popp, Jürgen

    2017-01-01

    Raman spectroscopy provides label-free biochemical information from tissue samples without complicated sample preparation. The clinical capability of Raman spectroscopy has been demonstrated in a wide range of in vitro and in vivo applications. However, a challenge for in vivo applications is the simultaneous excitation of auto-fluorescence in the majority of tissues of interest, such as liver, bladder, brain, and others. Raman bands are then superimposed on a fluorescence background, which can be several orders of magnitude larger than the Raman signal. To eliminate the disturbing fluorescence background, several approaches are available. Among instrumental methods, shifted excitation Raman difference spectroscopy (SERDS) has been widely applied and studied. Similarly, computational techniques, for instance extended multiplicative scatter correction (EMSC), have also been employed to remove undesired background contributions. Here, we present a theoretical and experimental evaluation and comparison of fluorescence background removal approaches for Raman spectra based on SERDS and EMSC. PMID:28749450

  8. GEOGRAPHIC FACTORS IN EMPLOYMENT AND MANPOWER DEVELOPMENT.

    ERIC Educational Resources Information Center

    Department of Labor, Washington, DC.

    This country must face the economic and social consequences of changing patterns of employment location which result from shifting currents of technological change, product demand, and job and profit seeking. Economic development programs of the last 7 years, employing a wide variety of approaches to their common goal of economic development and…

  9. The Contribution of Work-Integrated Learning to Undergraduate Employability Skill Outcomes

    ERIC Educational Resources Information Center

    Jackson, Denise

    2013-01-01

    WIL has attracted considerable attention as an instrument for enhancing professional practice and developing work-readiness in new graduates. It is widely considered as a point of difference in developing graduate employability by enhancing skill outcomes, such as team-work, communication, self-management and problem solving, employment prospects…

  10. Employability Skills among Graduates of Estate Management in Nigeria

    ERIC Educational Resources Information Center

    Egbenta, Idu Robert

    2015-01-01

    It is widely claimed that employers have a high level of dissatisfaction with graduates of Nigerian higher institutions of learning. This paper examines whether graduates of estate management in Nigerian higher institutions have employability skills for productive employment. The study randomly sampled 59 principal partners or heads of…

  11. Method for determination of aflatoxin M₁ in cheese and butter by HPLC using an immunoaffinity column.

    PubMed

    Sakuma, Hisako; Kamata, Yoichi; Sugita-Konishi, Yoshiko; Kawakami, Hiroshi

    2011-01-01

    A rapid, sensitive, and convenient method for determination of aflatoxin M₁ (AFM₁) in cheese and butter by HPLC was developed and validated. The method employs a safe extraction solution (mixture of acetonitrile, methanol and water) and an immunoaffinity column (IAC) for clean-up. Compared with the widely used method employing chloroform and a Florisil column, the IAC method has a short analytical time and there are no interference peaks. The limits of quantification (LOQ) of the IAC method were 0.12 and 0.14 µg/kg, while those of the Florisil column method were 0.47 and 0.23 µg/kg in cheese and butter, respectively. The recovery and relative standard deviation (RSD) for cheese (spiked at 0.5 µg/kg) in the IAC method were 92% and 7%, respectively, while for the Florisil column method the corresponding values were 76% and 10%. The recovery and RSD for butter (spiked at 0.5 µg/kg) in the IAC method were 97% and 9%, and those in the Florisil method were 74% and 9%, respectively. In the IAC method, the values of in-house precision (n=2, day=5) of cheese and butter (spiked at 0.5 µg/kg) were 9% and 13%, respectively. The IAC method is superior to the Florisil column method in terms of safety, ease of handling, sensitivity and reliability. A survey of AFM₁ contamination in imported cheese and butter in Japan was conducted by the IAC method. AFM₁ was not detected in 60 samples of cheese and 30 samples of butter.
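    As a small worked illustration of how recovery and relative standard deviation (RSD) figures like those above are derived from replicate spiked-sample measurements (the numbers below are invented, not the study's data):

    ```python
    # Recovery and RSD from replicate determinations of a spiked sample.
    import numpy as np

    spiked_level = 0.5  # µg/kg AFM1 added to the blank matrix
    measured = np.array([0.47, 0.45, 0.49, 0.44, 0.46])  # replicate determinations

    recovery = 100.0 * measured.mean() / spiked_level
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()
    print(f"recovery = {recovery:.0f}%, RSD = {rsd:.0f}%")
    ```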

  12. Analysis of operator splitting errors for near-limit flame simulations

    NASA Astrophysics Data System (ADS)

    Lu, Zhen; Zhou, Hua; Li, Shan; Ren, Zhuyin; Lu, Tianfeng; Law, Chung K.

    2017-04-01

    High-fidelity simulations of ignition, extinction and oscillatory combustion processes are of practical interest in a broad range of combustion applications. Splitting schemes, widely employed in reactive flow simulations, could fail for stiff reaction-diffusion systems exhibiting near-limit flame phenomena. The present work first employs a model perfectly stirred reactor (PSR) problem with an Arrhenius reaction term and a linear mixing term to study the effects of splitting errors on the near-limit combustion phenomena. Analysis shows that the errors induced by decoupling of the fractional steps may result in unphysical extinction or ignition. The analysis is then extended to the prediction of ignition, extinction and oscillatory combustion in unsteady PSRs of various fuel/air mixtures with a 9-species detailed mechanism for hydrogen oxidation and an 88-species skeletal mechanism for n-heptane oxidation, together with a Jacobian-based analysis for the time scales. The tested schemes include the Strang splitting, the balanced splitting, and a newly developed semi-implicit midpoint method. Results show that the semi-implicit midpoint method can accurately reproduce the dynamics of the near-limit flame phenomena and it is second-order accurate over a wide range of time step size. For the extinction and ignition processes, both the balanced splitting and midpoint method can yield accurate predictions, whereas the Strang splitting can lead to significant shifts on the ignition/extinction processes or even unphysical results. With an enriched H radical source in the inflow stream, a delay of the ignition process and the deviation on the equilibrium temperature are observed for the Strang splitting. On the contrary, the midpoint method that solves reaction and diffusion together matches the fully implicit accurate solution. The balanced splitting predicts the temperature rise correctly but with an over-predicted peak. For the sustainable and decaying oscillatory combustion from cool flames, both the Strang splitting and the midpoint method can successfully capture the dynamic behavior, whereas the balanced splitting scheme results in significant errors.
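    To make the splitting structure concrete, the sketch below applies Strang splitting to a scalar toy problem dy/dt = R(y) + M(y) with an Arrhenius-like reaction term and a linear mixing term; the toy terms and parameters are illustrative assumptions, not the paper's PSR model or chemical mechanisms.

    ```python
    # Strang splitting on a toy reaction + mixing ODE, compared with a fully
    # coupled reference solution.
    import numpy as np
    from scipy.integrate import solve_ivp

    R = lambda t, y: 5.0 * np.exp(-2.0 / np.maximum(y, 1e-8)) * (1.5 - y)  # "reaction"
    M = lambda t, y: 1.0 * (0.2 - y)                                       # "mixing" toward inflow

    def strang_step(y, t, dt):
        # Half step of mixing, full step of reaction, half step of mixing.
        y = solve_ivp(M, (t, t + dt / 2), [y]).y[0, -1]
        y = solve_ivp(R, (t, t + dt), [y]).y[0, -1]
        y = solve_ivp(M, (t + dt / 2, t + dt), [y]).y[0, -1]
        return y

    def fully_coupled(y0, t_end):
        both = lambda t, y: R(t, y) + M(t, y)
        return solve_ivp(both, (0.0, t_end), [y0], rtol=1e-8).y[0, -1]

    y, dt, t_end = 1.0, 0.05, 2.0
    for i in range(int(t_end / dt)):
        y = strang_step(y, i * dt, dt)
    print("Strang split:", y, " coupled reference:", fully_coupled(1.0, t_end))
    ```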

  13. Analysis of operator splitting errors for near-limit flame simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Zhen; Zhou, Hua; Li, Shan

    High-fidelity simulations of ignition, extinction and oscillatory combustion processes are of practical interest in a broad range of combustion applications. Splitting schemes, widely employed in reactive flow simulations, could fail for stiff reaction–diffusion systems exhibiting near-limit flame phenomena. The present work first employs a model perfectly stirred reactor (PSR) problem with an Arrhenius reaction term and a linear mixing term to study the effects of splitting errors on the near-limit combustion phenomena. Analysis shows that the errors induced by decoupling of the fractional steps may result in unphysical extinction or ignition. The analysis is then extended to the prediction of ignition, extinction and oscillatory combustion in unsteady PSRs of various fuel/air mixtures with a 9-species detailed mechanism for hydrogen oxidation and an 88-species skeletal mechanism for n-heptane oxidation, together with a Jacobian-based analysis for the time scales. The tested schemes include the Strang splitting, the balanced splitting, and a newly developed semi-implicit midpoint method. Results show that the semi-implicit midpoint method can accurately reproduce the dynamics of the near-limit flame phenomena and it is second-order accurate over a wide range of time step size. For the extinction and ignition processes, both the balanced splitting and midpoint method can yield accurate predictions, whereas the Strang splitting can lead to significant shifts on the ignition/extinction processes or even unphysical results. With an enriched H radical source in the inflow stream, a delay of the ignition process and the deviation on the equilibrium temperature are observed for the Strang splitting. On the contrary, the midpoint method that solves reaction and diffusion together matches the fully implicit accurate solution. The balanced splitting predicts the temperature rise correctly but with an over-predicted peak. For the sustainable and decaying oscillatory combustion from cool flames, both the Strang splitting and the midpoint method can successfully capture the dynamic behavior, whereas the balanced splitting scheme results in significant errors.

  14. Post processing for offline Chinese handwritten character string recognition

    NASA Astrophysics Data System (ADS)

    Wang, YanWei; Ding, XiaoQing; Liu, ChangSong

    2012-01-01

    Offline Chinese handwritten character string recognition is one of the most important research fields in pattern recognition. Due to the free writing style, large variability in character shapes and different geometric characteristics, Chinese handwritten character string recognition is a challenging problem. However, among current methods, the over-segmentation and merging method, which integrates geometric information, character recognition information and contextual information, shows promising results. It is found experimentally that a large proportion of errors are segmentation errors and mainly occur around non-Chinese characters. In a Chinese character string, there are not only wide characters, namely Chinese characters, but also narrow characters such as digits and letters of the alphabet. The segmentation errors are mainly caused by the uniform geometric model imposed on all segmented candidate characters. To solve this problem, post processing is employed to improve the recognition accuracy of narrow characters. On one hand, multi-geometric models are established for wide characters and narrow characters respectively. Under multi-geometric models, narrow characters are less prone to be merged. On the other hand, top-ranked recognition results of candidate paths are integrated to boost the final recognition of narrow characters. The post processing method is investigated on two datasets, totaling 1405 handwritten address strings. The wide character recognition accuracy has been improved slightly, and the narrow character recognition accuracy has been increased by 10.41% and 10.03%, respectively. This indicates that the post processing method is effective in improving the recognition accuracy of narrow characters.

  15. Use of statistical and pharmacokinetic-pharmacodynamic modeling and simulation to improve decision-making: A section summary report of the trends and innovations in clinical trial statistics conference.

    PubMed

    Kimko, Holly; Berry, Seth; O'Kelly, Michael; Mehrotra, Nitin; Hutmacher, Matthew; Sethuraman, Venkat

    2017-01-01

    The application of modeling and simulation (M&S) methods to improve decision-making was discussed during the Trends & Innovations in Clinical Trial Statistics Conference held in Durham, North Carolina, USA on May 1-4, 2016. Uses of both pharmacometric and statistical M&S were presented during the conference, highlighting the diversity of the methods employed by pharmacometricians and statisticians to address a broad range of quantitative issues in drug development. Five presentations are summarized herein, which cover the development strategy of employing M&S to drive decision-making; European initiatives on best practice in M&S; case studies of pharmacokinetic/pharmacodynamics modeling in regulatory decisions; estimation of exposure-response relationships in the presence of confounding; and the utility of estimating the probability of a correct decision for dose selection when prior information is limited. While M&S has been widely used during the last few decades, it is expected to play an essential role as more quantitative assessments are employed in the decision-making process. By integrating M&S as a tool to compile the totality of evidence collected throughout the drug development program, more informed decisions will be made.

  16. Implementation of a Flipped Classroom for Nuclear Medicine Physician CME.

    PubMed

    Komarraju, Aparna; Bartel, Twyla B; Dickinson, Lisa A; Grant, Frederick D; Yarbrough, Tracy L

    2018-06-21

    Increasingly, emerging technologies are expanding instructional possibilities, with new methods being adopted to improve knowledge acquisition and retention. Within medical education, many new techniques have been employed in the undergraduate setting, with less utilization thus far in the continuing medical education (CME) sphere. This paper discusses the use of a new method for CME: the "flipped classroom," widely used in undergraduate medical education. This method engages learners by providing content before the live ("in class") session that aids in preparation and fosters in-class engagement. A flipped classroom method was employed using an online image-rich case-based module and quiz prior to a live CME session at a national nuclear medicine meeting. The preparatory material provided a springboard for in-depth discussion at the live session, a case-based activity utilizing audience response technology. Study participants completed a survey regarding their initial experience with this new instructional method. In addition, focus group interviews were conducted with session attendees who had or had not completed the presession material; transcripts were qualitatively analyzed. Quantitative survey data (completed by two-thirds of the session attendees) suggested that the flipped method was highly valuable and met attendee educational objectives. Analysis of focus group data yielded six themes broadly related to two categories: benefits of the flipped method for CME, and programmatic considerations for successfully implementing the flipped method in CME. Data from this study have proven encouraging and support further investigations around the incorporation of this innovative teaching method into CME for nuclear imaging specialists.

  17. Evaluation of an automatic brain segmentation method developed for neonates on adult MR brain images

    NASA Astrophysics Data System (ADS)

    Moeskops, Pim; Viergever, Max A.; Benders, Manon J. N. L.; Išgum, Ivana

    2015-03-01

    Automatic brain tissue segmentation is of clinical relevance in images acquired at all ages. The literature presents a clear distinction between methods developed for MR images of infants, and methods developed for images of adults. The aim of this work is to evaluate a method developed for neonatal images in the segmentation of adult images. The evaluated method employs supervised voxel classification in subsequent stages, exploiting spatial and intensity information. Evaluation was performed using images available within the MRBrainS13 challenge. The obtained average Dice coefficients were 85.77% for grey matter, 88.66% for white matter, 81.08% for cerebrospinal fluid, 95.65% for cerebrum, and 96.92% for intracranial cavity, currently resulting in the best overall ranking. The possibility of applying the same method to neonatal as well as adult images can be of great value in cross-sectional studies that include a wide age range.
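
    For reference, the Dice coefficient reported above measures the overlap between an automatic segmentation and a reference mask. A minimal sketch (plain NumPy, not the MRBrainS13 evaluation code) might look like this:

      import numpy as np

      def dice_percent(seg, ref):
          """Dice overlap between two binary masks, as a percentage: 200*|A∩B| / (|A|+|B|)."""
          seg = np.asarray(seg, dtype=bool)
          ref = np.asarray(ref, dtype=bool)
          total = seg.sum() + ref.sum()
          if total == 0:
              return 100.0            # both masks empty: perfect agreement by convention
          return 200.0 * np.logical_and(seg, ref).sum() / total

      # usage with hypothetical volumes of identical shape:
      # print(dice_percent(gm_auto, gm_manual))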

  18. Use of World Wide Web and NCSA Mosaic at Langley

    NASA Technical Reports Server (NTRS)

    Nelson, Michael

    1994-01-01

    A brief history of the use of the World Wide Web at Langley Research Center is presented along with architecture of the Langley Web. Benefits derived from the Web and some Langley projects that have employed the World Wide Web are discussed.

  19. Verification of a non-hydrostatic dynamical core using horizontally spectral element vertically finite difference method: 2-D aspects

    NASA Astrophysics Data System (ADS)

    Choi, S.-J.; Giraldo, F. X.; Kim, J.; Shin, S.

    2014-06-01

    The non-hydrostatic (NH) compressible Euler equations of dry atmosphere are solved in a simplified two dimensional (2-D) slice framework employing a spectral element method (SEM) for the horizontal discretization and a finite difference method (FDM) for the vertical discretization. The SEM uses high-order nodal basis functions associated with Lagrange polynomials based on Gauss-Lobatto-Legendre (GLL) quadrature points. The FDM employs a third-order upwind biased scheme for the vertical flux terms and a centered finite difference scheme for the vertical derivative terms and quadrature. The Euler equations used here are in a flux form based on the hydrostatic pressure vertical coordinate, which are the same as those used in the Weather Research and Forecasting (WRF) model, but a hybrid sigma-pressure vertical coordinate is implemented in this model. We verified the model by conducting widely used standard benchmark tests: the inertia-gravity wave, rising thermal bubble, density current wave, and linear hydrostatic mountain wave. The results from those tests demonstrate that the horizontally spectral element vertically finite difference model is accurate and robust. By using the 2-D slice model, we effectively show that the combined spatial discretization method of the spectral element and finite difference method in the horizontal and vertical directions, respectively, offers a viable method for the development of a NH dynamical core.
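
    The SEM above builds its nodal basis on Gauss-Lobatto-Legendre (GLL) quadrature points. As a small, generic illustration (not code from the authors' dynamical core), the GLL nodes on the reference element [-1, 1] can be computed with NumPy as the roots of the derivative of the Legendre polynomial plus the endpoints:

      import numpy as np
      from numpy.polynomial import legendre

      def gll_nodes(order):
          """Gauss-Lobatto-Legendre nodes for polynomial order `order` (order+1 points)."""
          coeffs = np.zeros(order + 1)
          coeffs[-1] = 1.0                                          # coefficients of Legendre P_N
          interior = legendre.legroots(legendre.legder(coeffs))     # roots of P_N'
          return np.concatenate(([-1.0], np.sort(interior), [1.0]))

      print(gll_nodes(4))   # e.g. [-1, -0.6547, 0, 0.6547, 1]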

  20. Simultaneous Genotype Calling and Haplotype Phasing Improves Genotype Accuracy and Reduces False-Positive Associations for Genome-wide Association Studies

    PubMed Central

    Browning, Brian L.; Yu, Zhaoxia

    2009-01-01

    We present a novel method for simultaneous genotype calling and haplotype-phase inference. Our method employs the computationally efficient BEAGLE haplotype-frequency model, which can be applied to large-scale studies with millions of markers and thousands of samples. We compare genotype calls made with our method to genotype calls made with the BIRDSEED, CHIAMO, GenCall, and ILLUMINUS genotype-calling methods, using genotype data from the Illumina 550K and Affymetrix 500K arrays. We show that our method has higher genotype-call accuracy and yields fewer uncalled genotypes than competing methods. We perform single-marker analysis of data from the Wellcome Trust Case Control Consortium bipolar disorder and type 2 diabetes studies. For bipolar disorder, the genotype calls in the original study yield 25 markers with apparent false-positive association with bipolar disorder at a p < 10^-7 significance level, whereas genotype calls made with our method yield no associated markers at this significance threshold. Conversely, for markers with replicated association with type 2 diabetes, there is good concordance between genotype calls used in the original study and calls made by our method. Results from single-marker and haplotypic analysis of our method's genotype calls for the bipolar disorder study indicate that our method is highly effective at eliminating genotyping artifacts that cause false-positive associations in genome-wide association studies. Our new genotype-calling methods are implemented in the BEAGLE and BEAGLECALL software packages. PMID:19931040

  1. Application of redundancy in the Saturn 5 guidance and control system

    NASA Technical Reports Server (NTRS)

    Moore, F. B.; White, J. B.

    1976-01-01

    The Saturn launch vehicle's guidance and control system is so complex that the reliability of a simplex system is not adequate to fulfill mission requirements. Thus, to achieve the desired reliability, redundancy encompassing a wide range of types and levels was employed. At one extreme, the lowest level, basic components (resistors, capacitors, relays, etc.) are employed in series, parallel, or quadruplex arrangements to insure continued system operation in the presence of possible failure conditions. At the other extreme, the highest level, complete subsystem duplication is provided so that a backup subsystem can be employed in case the primary system malfunctions. In between these two extremes, many other redundancy schemes and techniques are employed at various levels. Basic redundancy concepts are covered to gain insight into the advantages obtained with various techniques. Points and methods of application of these techniques are included. The theoretical gain in reliability resulting from redundancy is assessed and compared to a simplex system. Problems and limitations encountered in the practical application of redundancy are discussed as well as techniques verifying proper operation of the redundant channels. As background for the redundancy application discussion, a basic description of the guidance and control system is included.
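
    To make the reliability comparison concrete, the standard independence-based formulas behind such an assessment can be sketched as below; the component reliability used is a hypothetical number, not a Saturn V figure.

      def series(reliabilities):
          """All elements must work (simplex chain)."""
          p = 1.0
          for r in reliabilities:
              p *= r
          return p

      def parallel(r, n):
          """At least one of n identical redundant elements must work."""
          return 1.0 - (1.0 - r) ** n

      r = 0.99   # hypothetical single-component reliability
      print(f"simplex:    {r:.6f}")
      print(f"duplex:     {parallel(r, 2):.6f}")
      print(f"quadruplex: {parallel(r, 4):.6f}")
      print(f"two in series: {series([r, r]):.6f}")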

  2. Combined imaging and chemical sensing using a single optical imaging fiber.

    PubMed

    Bronk, K S; Michael, K L; Pantano, P; Walt, D R

    1995-09-01

    Despite many innovations and developments in the field of fiber-optic chemical sensors, optical fibers have not been employed to both view a sample and concurrently detect an analyte of interest. While chemical sensors employing a single optical fiber or a noncoherent fiberoptic bundle have been applied to a wide variety of analytical determinations, they cannot be used for imaging. Similarly, coherent imaging fibers have been employed only for their originally intended purpose, image transmission. We herein report a new technique for viewing a sample and measuring surface chemical concentrations that employs a coherent imaging fiber. The method is based on the deposition of a thin, analyte-sensitive polymer layer on the distal surface of a 350-microns-diameter imaging fiber. We present results from a pH sensor array and an acetylcholine biosensor array, each of which contains approximately 6000 optical sensors. The acetylcholine biosensor has a detection limit of 35 microM and a fast (< 1 s) response time. In association with an epifluorescence microscope and a charge-coupled device, these modified imaging fibers can display visual information of a remote sample with 4-microns spatial resolution, allowing for alternating acquisition of both chemical analysis and visual histology.

  3. Phlebotomy skills expected of career entry CLS/CLT graduates: a Missouri hospital perspective.

    PubMed

    Millstead, C

    2000-01-01

    OBJECTIVE: To determine how much, what type, and what proficiency of phlebotomy experience CLS/CLT students should have during the training program to be prepared to meet the needs of the majority of Missouri hospital employers. DESIGN: Survey to determine the role healthcare professionals, inside and outside the laboratory, play in today's blood collection patterns and phlebotomy management. The Missouri Organization of Clinical Laboratory Science mailed 204 surveys to the Missouri Hospital Association member laboratories. MAIN OUTCOMES/CONCLUSIONS: This research examined the need for modifying phlebotomy skills of clinical laboratory science students. Data gathered from employers support the premise that entry-level competencies of CLS/CLT graduates will vary according to clinical facility size. CLS/CLT programs may use data from this study to plan phlebotomy practicums. It can be extrapolated that Missouri employers who are most likely to employ career entry graduates expect them to draw blood from 9.3 patients within one hour. Fifty-three percent of 40- to 400-bed hospitals expect graduates to perform difficult draws in at least eight types of hospital units. Laboratories are the major managers of hospital-wide phlebotomy services; thus, CLS/CLT curricula should include phlebotomy management methods.

  4. Awareness and Perceptions of Emergency Contraceptive Pills Among Women in Kinshasa, Democratic Republic of the Congo.

    PubMed

    Hernandez, Julie H; Muanda, Mbadu; Garcia, Mélissa; Matawa, Grace

    2017-09-01

    Despite the commitment of the Democratic Republic of the Congo (DRC) to expand the family planning method mix and increase access to services, awareness of emergency contraception is low among women, and the method remains underused and poorly integrated in family planning programming. Data from 15 focus group discussions conducted in 2016 among women aged 15-35 were used to examine awareness and perceptions of, and attitudes toward, emergency contraceptives. After facilitators explained emergency contraceptive pills' mechanism of action and other characteristics, participants were asked about the potential benefits and risks of making the method more widely available. Transcripts were analyzed using an iterative approach. Women reported employing a wide range of postcoital contraceptive behaviors, albeit often using inappropriate products, and generally agreed that emergency contraceptive pills seemed to be a potentially effective solution to their family planning needs. Perceived benefits and limitations of the method were almost always framed in reference to other, better-known contraceptives, and women expressed strong preferences for pharmacy-based provision that aligned with their usual behaviors for obtaining contraceptives. Participants were reluctant to see the method available for free. Emergency contraceptive pills have the potential to address gaps in the family planning method mix in the DRC. Assessing whether women have incomplete or erroneous information about family planning methods can provide better understanding of women's contraceptive choices in low-income countries.

  5. GENOME-WIDE COMPARATIVE ANALYSIS OF PHYLOGENETIC TREES: THE PROKARYOTIC FOREST OF LIFE

    PubMed Central

    Puigbò, Pere; Wolf, Yuri I.; Koonin, Eugene V.

    2013-01-01

    Genome-wide comparison of phylogenetic trees is becoming an increasingly common approach in evolutionary genomics, and a variety of approaches for such comparison have been developed. In this article we present several methods for comparative analysis of large numbers of phylogenetic trees. To compare phylogenetic trees taking into account the bootstrap support for each internal branch, the Boot-Split Distance (BSD) method is introduced as an extension of the previously developed Split Distance (SD) method for tree comparison. The BSD method implements the straightforward idea that comparison of phylogenetic trees can be made more robust by treating tree splits differentially depending on the bootstrap support. Approaches are also introduced for detecting tree-like and net-like evolutionary trends in the phylogenetic Forest of Life (FOL), i.e., the entirety of the phylogenetic trees for conserved genes of prokaryotes. The principal method employed for this purpose includes mapping quartets of species onto trees to calculate the support of each quartet topology and so to quantify the tree and net contributions to the distances between species. We describe the application of these methods to analyze the FOL and the results obtained with these methods. These results support the concept of the Tree of Life (TOL) as a central evolutionary trend in the FOL as opposed to the traditional view of the TOL as a ‘species tree'. PMID:22399455

  6. Genome-wide comparative analysis of phylogenetic trees: the prokaryotic forest of life.

    PubMed

    Puigbò, Pere; Wolf, Yuri I; Koonin, Eugene V

    2012-01-01

    Genome-wide comparison of phylogenetic trees is becoming an increasingly common approach in evolutionary genomics, and a variety of approaches for such comparison have been developed. In this article, we present several methods for comparative analysis of large numbers of phylogenetic trees. To compare phylogenetic trees taking into account the bootstrap support for each internal branch, the Boot-Split Distance (BSD) method is introduced as an extension of the previously developed Split Distance method for tree comparison. The BSD method implements the straightforward idea that comparison of phylogenetic trees can be made more robust by treating tree splits differentially depending on the bootstrap support. Approaches are also introduced for detecting tree-like and net-like evolutionary trends in the phylogenetic Forest of Life (FOL), i.e., the entirety of the phylogenetic trees for conserved genes of prokaryotes. The principal method employed for this purpose includes mapping quartets of species onto trees to calculate the support of each quartet topology and so to quantify the tree and net contributions to the distances between species. We describe the application of these methods to analyze the FOL and the results obtained with these methods. These results support the concept of the Tree of Life (TOL) as a central evolutionary trend in the FOL as opposed to the traditional view of the TOL as a "species tree."
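
    A minimal sketch of the split-weighting idea behind the BSD measure: represent each tree by its internal splits together with their bootstrap support, and weight disagreements by that support. The specific weighting below is an illustrative assumption; the exact BSD definition is the one given in the article.

      def boot_split_distance(splits_a, splits_b):
          """splits_*: dict mapping a frozenset of taxa (one side of a split)
          to its bootstrap support in [0, 1]."""
          all_splits = set(splits_a) | set(splits_b)
          mismatch = total = 0.0
          for s in all_splits:
              w = max(splits_a.get(s, 0.0), splits_b.get(s, 0.0))
              total += w
              if s not in splits_a or s not in splits_b:
                  mismatch += w          # disagreement, weighted by support
          return mismatch / total if total else 0.0

      t1 = {frozenset({"A", "B"}): 0.95, frozenset({"C", "D"}): 0.40}
      t2 = {frozenset({"A", "B"}): 0.90, frozenset({"B", "C"}): 0.30}
      print(boot_split_distance(t1, t2))   # 0.70 / 1.65 ≈ 0.42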

  7. Preferred Materials and Methods Employed for Endodontic Treatment by Iranian General Practitioners

    PubMed Central

    Raoof, Maryam; Zeini, Negar; Haghani, Jahangir; Sadr, Saeedeh; Mohammadalizadeh, Sakineh

    2015-01-01

    Introduction: The aim of this study was to gather information on the materials and methods employed in root canal treatment (RCT) by general dental practitioners (GDPs) in Iran. Methods and Materials: A questionnaire was distributed among 450 dentists who attended the 53rd Iranian Dental Association congress. Participants were asked to consider demographic variables and answer the questions regarding the materials and methods commonly used in RCT. Descriptive statistics were given as absolute frequencies and valid percentages. The chi-square test was used to investigate the influence of gender and the years of professional activity on the employed materials and techniques. Results: The response rate was 84.88%. The results showed that 61.5% of the participants did not perform pulp sensitivity tests prior to RCT. Less than half of the general dental practitioners (47.4%) said that they would trace a sinus tract before starting the treatment. Nearly 16% of practitioners preferred the rubber dam isolation method. Over 36% of the practitioners reported using formocresol for pulpotomy. The combined approach of working length (WL) radiographs and electronic apex locators was used by 35.2% of the practitioners. Most of the respondents used K-file hand instruments for canal preparation and the technique of choice was step-back (43.5%), while 40.1% of respondents used NiTi rotary files, mostly ProTaper and RaCe. The most widely used irrigant was normal saline (61.8%). Calcium hydroxide was the most commonly used inter-appointment medicament (84.6%). The most popular obturation technique was cold lateral condensation (81.7%) with 51% using zinc oxide-eugenol-based sealers. Conclusions: The majority of Iranian GDPs who participated in the present survey do not comply with quality guidelines of endodontic treatment. PMID:25834595

  8. Vibration isolation design for periodically stiffened shells by the wave finite element method

    NASA Astrophysics Data System (ADS)

    Hong, Jie; He, Xueqing; Zhang, Dayi; Zhang, Bing; Ma, Yanhong

    2018-04-01

    Periodically stiffened shell structures are widely used due to their excellent specific strength, in particular for aeronautical and astronautical components. This paper presents an improved Wave Finite Element Method (FEM) that can be employed to predict the band-gap characteristics of stiffened shell structures efficiently. An aero-engine casing, which is a typical periodically stiffened shell structure, was employed to verify the validity and efficiency of the Wave FEM. Good agreement has been found between the Wave FEM and the classical FEM for different boundary conditions. One effective wave selection method based on the Wave FEM has thus been put forward to filter the radial modes of a shell structure. Furthermore, an optimisation strategy combining the Wave FEM and a genetic algorithm was presented for periodically stiffened shell structures. The optimal out-of-plane band gap and the mass of the whole structure can be achieved by the optimisation strategy under an aerodynamic load. Results also indicate that geometric parameters of stiffeners can be properly selected so that the out-of-plane vibration attenuates significantly in the frequency band of interest. This study can provide valuable references for band-gap design in vibration isolation.

  9. A method of transmissibility design for dual-chamber pneumatic vibration isolator

    NASA Astrophysics Data System (ADS)

    Lee, Jeung-Hoon; Kim, Kwang-Joon

    2009-06-01

    Dual-chamber pneumatic vibration isolators have a wide range of applications for vibration isolation of vibration-sensitive equipment. Recent advances in precision machine tools and instruments such as medical devices and those related to nano-technology require better isolation performance, which can be efficiently achieved by precise modeling and design of the isolation system. This paper discusses an efficient transmissibility design method for a pneumatic vibration isolator wherein a complex stiffness model of a dual-chamber pneumatic spring developed in our previous study is employed. Three design parameters, the volume ratio between the two pneumatic chambers, the geometry of the capillary tube connecting the two pneumatic chambers, and, finally, the stiffness of the diaphragm employed for prevention of air leakage, were found to be important factors in transmissibility design. Based on a design technique that maximizes damping of the dual-chamber pneumatic spring, trade-offs among the resonance frequency of transmissibility, peak transmissibility, and transmissibility in the high-frequency range were found, which had not been reported in previous research. Furthermore, this paper discusses the negative role of the diaphragm in transmissibility design. The design method proposed in this paper is illustrated through experimental measurements.
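
    For orientation, the transmissibility of a mass supported on a mount described by a complex stiffness k*(w) follows the standard relation T(w) = |k*(w) / (k*(w) - m w^2)|. The sketch below evaluates this relation with a hypothetical constant-loss-factor stiffness standing in for the paper's dual-chamber model:

      import numpy as np

      def transmissibility(freq_hz, mass, k_complex):
          """|X/Y| for base excitation of a mass on a complex-stiffness mount."""
          w = 2.0 * np.pi * np.asarray(freq_hz)
          kc = np.array([k_complex(wi) for wi in w], dtype=complex)
          return np.abs(kc / (kc - mass * w**2))

      k_demo = lambda w: 1.0e5 * (1.0 + 0.2j)    # hypothetical: 1e5 N/m, 20% loss factor
      f = np.linspace(0.5, 30.0, 300)
      T = transmissibility(f, mass=100.0, k_complex=k_demo)
      print(f"peak transmissibility {T.max():.2f} at {f[T.argmax()]:.2f} Hz")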

  10. Gastroenterology Curriculum in the Canadian Medical School System.

    PubMed

    Dang, ThucNhi Tran; Wong, Clarence; Bistritz, Lana

    2017-01-01

    Background and Purpose. Gastroenterology is a diverse subspecialty that covers a wide array of topics. The preclinical gastroenterology curriculum is often the only formal training that medical students receive prior to becoming residents. There is no Canadian consensus on learning objectives or instructional methods and a general lack of awareness of curriculum at other institutions. This results in variable background knowledge for residents and lack of guidance for course development. Objectives. (1) Elucidate gastroenterology topics being taught at the preclinical level. (2) Determine instructional methods employed to teach gastroenterology content. Results. A curriculum map of gastroenterology topics was constructed from 10 of the medical schools that responded. Topics often not taught included pediatric GI diseases, surgery and trauma, food allergies/intolerances, and obesity. Gastroenterology was taught primarily by gastroenterologists and surgeons. Didactic and small-group teaching were the most commonly employed teaching methods. Conclusion. This study is the first step in examining the Canadian gastroenterology curriculum at a preclinical level. The data can be used to inform curriculum development so that topics generally lacking are better incorporated in the curriculum. The study can also be used as a guide for further curriculum design and alignment across the country.

  11. Quality Indicators for Competitive Employment Outcomes: What Special Education Teachers Need to Know in Transition Planning

    ERIC Educational Resources Information Center

    Brooke, Valerie Ann; Revell, Grant; Wehman, Paul

    2009-01-01

    The quality of job outcomes achieved by youth with disabilities who are transitioning into employment varies widely across the country. Special education teachers, youth with disabilities, families, community rehabilitation program (CRP) staff providing employment services, and others involved in assisting transitioning youth can benefit from a…

  12. The Employability Advantage: Embedding Skills through a University-Wide Language Programme

    ERIC Educational Resources Information Center

    Cervi-Wilson, Tiziana; Brick, Billy

    2016-01-01

    As the employment of graduates appears among the performance indicators of institutions in higher education, universities are focussing more and more upon the development of employability related skills to enhance students' prospects in the job market. All UK universities are measured on the first jobs that their students acquire after graduation.…

  13. Chinese International Students' Perspective and Strategies in Preparing for Their Future Employability

    ERIC Educational Resources Information Center

    Huang, Rong; Turner, Rebecca; Chen, Qian

    2014-01-01

    Graduate employability and the contribution graduates make to the UK economy have been widely debated by policy-makers; however, little attention has been paid to the employability of international students. Given the growing significance of international students to the UK economy this is an interesting oversight; this article addresses this…

  14. Employers and Family Day Care.

    ERIC Educational Resources Information Center

    Ward, Pat

    This paper provides employers with critical information about family day care, the most widely used type of out-of-home care for infants and toddlers in the United States. Employers who are concerned about honoring parents' choice of child care, committed to high quality child care, and dedicated to using resources efficiently, will be pleasantly…

  15. Language, Employment, and Settlement: Temporary Meat Workers in Australia

    ERIC Educational Resources Information Center

    Piller, Ingrid; Lising, Loy

    2014-01-01

    Australia is one of the world's largest beef exporters. However, meat processing jobs are widely considered undesirable and are increasingly filled with employer-sponsored migrant workers on temporary long-stay visas. Against this background, our paper explores the role of language in the employment and migration trajectories of a group of meat…

  16. The Contested Curriculum: Academic Learning and Employability in Higher Education

    ERIC Educational Resources Information Center

    Speight, Sarah; Lackovic, Natasa; Cooker, Lucy

    2013-01-01

    This article explores the discourse of employability in higher education by investigating the understanding of different stakeholder groups (students, staff, employers) of the University of Nottingham (UK and China), and their fit to each other and to the educational literature. It finds that, while theories of life-long or life-wide learning…

  17. Wireless Wide Area Networks for School Districts.

    ERIC Educational Resources Information Center

    Nair, Prakash

    This paper considers a basic question that many schools districts face in attempting to develop affordable, expandable district-wide computer networks that are resistant to obsolescence: Should these wide area networks (WANs) employ wireless technology, stick to venerable hard-wired solutions, or combine both. This publication explores the…

  18. Coupling Matched Molecular Pairs with Machine Learning for Virtual Compound Optimization.

    PubMed

    Turk, Samo; Merget, Benjamin; Rippmann, Friedrich; Fulle, Simone

    2017-12-26

    Matched molecular pair (MMP) analyses are widely used in compound optimization projects to gain insights into structure-activity relationships (SAR). The analysis is traditionally done via statistical methods but can also be employed together with machine learning (ML) approaches to extrapolate to novel compounds. The MMP/ML method introduced here combines a fragment-based MMP implementation with different machine learning methods to obtain automated SAR decomposition and prediction. To test the prediction capabilities and model transferability, two different compound optimization scenarios were designed: (1) "new fragments," which occurs when exploring new fragments for a defined compound series, and (2) "new static core and transformations," which resembles, for instance, the identification of a new compound series. Very good results were achieved by all employed machine learning methods, especially for the new fragments case, but overall deep neural network models performed best, allowing reliable predictions also for the new static core and transformations scenario, where comprehensive SAR knowledge of the compound series is missing. Furthermore, we show that models trained on all available data have a higher generalizability compared to models trained on focused series and can extend beyond the chemical space covered in the training data. Thus, coupling MMP with deep neural networks provides a promising approach to make high-quality predictions on various data sets and in different compound optimization scenarios.

  19. Diversity in Genetic In Vivo Methods for Protein-Protein Interaction Studies: from the Yeast Two-Hybrid System to the Mammalian Split-Luciferase System

    PubMed Central

    Stynen, Bram; Tournu, Hélène; Tavernier, Jan

    2012-01-01

    Summary: The yeast two-hybrid system pioneered the field of in vivo protein-protein interaction methods and undisputedly gave rise to a palette of ingenious techniques that are constantly pushing further the limits of the original method. Sensitivity and selectivity have improved because of various technical tricks and experimental designs. Here we present an exhaustive overview of the genetic approaches available to study in vivo binary protein interactions, based on two-hybrid and protein fragment complementation assays. These methods have been engineered and employed successfully in microorganisms such as Saccharomyces cerevisiae and Escherichia coli, but also in higher eukaryotes. From single binary pairwise interactions to whole-genome interactome mapping, the self-reassembly concept has been employed widely. Innovative studies report the use of proteins such as ubiquitin, dihydrofolate reductase, and adenylate cyclase as reconstituted reporters. Protein fragment complementation assays have extended the possibilities in protein-protein interaction studies, with technologies that enable spatial and temporal analyses of protein complexes. In addition, one-hybrid and three-hybrid systems have broadened the types of interactions that can be studied and the findings that can be obtained. Applications of these technologies are discussed, together with the advantages and limitations of the available assays. PMID:22688816

  20. A new real-time method for investigation of affinity properties and binding kinetics of magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Orlov, Alexey V.; Nikitin, Maxim P.; Bragina, Vera A.; Znoyko, Sergey L.; Zaikina, Marina N.; Ksenevich, Tatiana I.; Gorshkov, Boris G.; Nikitin, Petr I.

    2015-04-01

    A method for quantitative investigation of affinity constants of receptors immobilized on magnetic nanoparticles (MP) is developed based on spectral correlation interferometry (SCI). SCI records, with picometer resolution, the thickness changes of a layer of molecules or nanoparticles due to a biochemical reaction on a cover slip, averaged over the sensing area. The method is compatible with other types of sensing surfaces employed in biosensing. The measured values of kinetic association constants of magnetic nanoparticles are 4 orders of magnitude higher than those of molecular antibody association with antigen. The developed method also enables highly sensitive detection of antigens over a wide dynamic range. A limit of detection of 92 pg/ml has been demonstrated for prostate-specific antigen (PSA) with 50-nm MP employed as labels, which produce a three-order-of-magnitude amplification of the SCI signals. The calibration curve features a high sensitivity (slope) of a 3-fold signal rise per 10-fold increase in PSA concentration within a 4-order dynamic range, which is an attractive compromise for precise quantitative and highly sensitive immunoassay. The proposed biosensing technique offers inexpensive disposable sensor chips of cover slips and represents an economically sound alternative to traditional immunoassays for disease diagnostics, detection of pathogens in food and environmental monitoring.
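
    The quoted sensitivity (a 3-fold signal rise per 10-fold increase in concentration) corresponds to a log-log calibration slope of about log10(3), roughly 0.48. A small sketch with hypothetical calibration points (not the authors' data) shows how such a curve is fitted and inverted:

      import numpy as np

      conc   = np.array([1e2, 1e3, 1e4, 1e5])        # hypothetical PSA concentrations, pg/ml
      signal = np.array([1.0, 3.1, 8.9, 27.0])       # hypothetical SCI signals, a.u.

      slope, intercept = np.polyfit(np.log10(conc), np.log10(signal), 1)
      print(f"log-log slope: {slope:.2f}  (log10(3) = {np.log10(3):.2f})")

      def concentration_from_signal(sig):
          """Invert the fitted power-law calibration."""
          return 10 ** ((np.log10(sig) - intercept) / slope)

      print(f"signal 5.0 a.u. -> ~{concentration_from_signal(5.0):.0f} pg/ml")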

  1. Use of modified atmosphere packaging to preserve mushroom quality during storage.

    PubMed

    Palacios, Irene; Moro, Carlos; Lozano, Miguel; D'Arrigo, Matilde; Guillamón, Eva; García-Lafuente, Ana; Villares, Ana

    2011-09-01

    Mushrooms have attracted much attention due to their excellent nutritional and sensory properties. However, they are highly perishable and rapidly lose their organoleptic characteristics. Many methods have been employed for mushroom storage, such as packaging, blanching, canning, or freeze drying. Among them, modified atmosphere packaging (MAP) has been widely employed for preserving fresh mushrooms. MAP provides an affordable packaging system that partly avoids enzymatic browning, fermentation and other biochemical processes by maintaining a controlled gas atmosphere. Several factors, including optimum CO2 and O2 partial pressures, permeability, package material, thickness, or product weight, must be considered in order to design a suitable modified atmosphere package for mushrooms. Thus, different strategies are available to preserve mushroom quality after harvest. The article presents some promising patents on use of modified atmosphere packaging to preserve mushroom quality during storage.

  2. Genome-wide scans of genetic variants for psychophysiological endophenotypes: a methodological overview.

    PubMed

    Iacono, William G; Malone, Stephen M; Vaidyanathan, Uma; Vrieze, Scott I

    2014-12-01

    This article provides an introductory overview of the investigative strategy employed to evaluate the genetic basis of 17 endophenotypes examined as part of a 20-year data collection effort from the Minnesota Center for Twin and Family Research. Included are characterization of the study samples, descriptive statistics for key properties of the psychophysiological measures, and rationale behind the steps taken in the molecular genetic study design. The statistical approach included (a) biometric analysis of twin and family data, (b) heritability analysis using 527,829 single nucleotide polymorphisms (SNPs), (c) genome-wide association analysis of these SNPs and 17,601 autosomal genes, (d) follow-up analyses of candidate SNPs and genes hypothesized to have an association with each endophenotype, (e) rare variant analysis of nonsynonymous SNPs in the exome, and (f) whole genome sequencing association analysis using 27 million genetic variants. These methods were used in the accompanying empirical articles comprising this special issue, Genome-Wide Scans of Genetic Variants for Psychophysiological Endophenotypes. Copyright © 2014 Society for Psychophysiological Research.

  3. Creating a RAW264.7 CRISPR-Cas9 Genome Wide Library

    PubMed Central

    Napier, Brooke A; Monack, Denise M

    2017-01-01

    The bacterial clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 genome editing tools are used in mammalian cells to knock-out specific genes of interest to elucidate gene function. The CRISPR-Cas9 system requires that the mammalian cell expresses Cas9 endonuclease, guide RNA (gRNA) to lead the endonuclease to the gene of interest, and the PAM sequence that links the Cas9 to the gRNA. CRISPR-Cas9 genome wide libraries are used to screen the effect of each gene in the genome on the cellular phenotype of interest, in an unbiased high-throughput manner. In this protocol, we describe our method of creating a CRISPR-Cas9 genome wide library in a transformed murine macrophage cell-line (RAW264.7). We have employed this library to identify novel mediators in the caspase-11 cell death pathway (Napier et al., 2016); however, this library can then be used to screen the importance of specific genes in multiple murine macrophage cellular pathways. PMID:28868328

  4. Full potential methods for analysis/design of complex aerospace configurations

    NASA Technical Reports Server (NTRS)

    Shankar, Vijaya; Szema, Kuo-Yen; Bonner, Ellwood

    1986-01-01

    The steady form of the full potential equation, in conservative form, is employed to analyze and design a wide variety of complex aerodynamic shapes. The nonlinear method is based on the theory of characteristic signal propagation coupled with novel flux biasing concepts and body-fitted mapping procedures. The resulting codes are vectorized for the CRAY XMP and the VPS-32 supercomputers. Use of the full potential nonlinear theory is demonstrated for a single-point supersonic wing design and a multipoint design for transonic maneuver/supersonic cruise/maneuver conditions. Achievement of high aerodynamic efficiency through numerical design is verified by wind tunnel tests. Other studies reported include analyses of a canard/wing/nacelle fighter geometry.

  5. The 'sniffer-patch' technique for detection of neurotransmitter release.

    PubMed

    Allen, T G

    1997-05-01

    A wide variety of techniques have been employed for the detection and measurement of neurotransmitter release from biological preparations. Whilst many of these methods offer impressive levels of sensitivity, few are able to combine sensitivity with the necessary temporal and spatial resolution required to study quantal release from single cells. One detection method that is seeing a revival of interest and has the potential to fill this niche is the so-called 'sniffer-patch' technique. In this article, specific examples of the practical aspects of using this technique are discussed along with the procedures involved in calibrating these biosensors to extend their applications to provide quantitative, in addition to simple qualitative, measurements of quantal transmitter release.

  6. RapGene: a fast and accurate strategy for synthetic gene assembly in Escherichia coli

    PubMed Central

    Zampini, Massimiliano; Stevens, Pauline Rees; Pachebat, Justin A.; Kingston-Smith, Alison; Mur, Luis A. J.; Hayes, Finbarr

    2015-01-01

    The ability to assemble DNA sequences de novo through efficient and powerful DNA fabrication methods is one of the foundational technologies of synthetic biology. Gene synthesis, in particular, has been considered the main driver for the emergence of this new scientific discipline. Here we describe RapGene, a rapid gene assembly technique which was successfully tested for the synthesis and cloning of both prokaryotic and eukaryotic genes through a ligation-independent approach. The method developed in this study is a complete bacterial gene synthesis platform for the quick, accurate and cost-effective fabrication and cloning of gene-length sequences, employing the widely used host Escherichia coli. PMID:26062748

  7. High-pressure torsion for new hydrogen storage materials

    PubMed Central

    Edalati, Kaveh; Akiba, Etsuo; Horita, Zenji

    2018-01-01

    High-pressure torsion (HPT) is widely used as a severe plastic deformation technique to create ultrafine-grained structures with promising mechanical and functional properties. Since 2007, the method has been employed to enhance the hydrogenation kinetics in different Mg-based hydrogen storage materials. Recent studies showed that the method is effective not only for increasing the hydrogenation kinetics but also for improving the hydrogenation activity, for enhancing the air resistivity and, more importantly, for synthesizing new nanostructured hydrogen storage materials with high densities of lattice defects. This manuscript reviews some major findings on the impact of the HPT process on the hydrogen storage performance of different titanium-based and magnesium-based materials. PMID:29511396

  8. On modelling three-dimensional piezoelectric smart structures with boundary spectral element method

    NASA Astrophysics Data System (ADS)

    Zou, Fangxin; Aliabadi, M. H.

    2017-05-01

    The computational efficiency of the boundary element method in elastodynamic analysis can be significantly improved by employing high-order spectral elements for boundary discretisation. In this work, for the first time, the so-called boundary spectral element method is utilised to formulate the piezoelectric smart structures that are widely used in structural health monitoring (SHM) applications. The resultant boundary spectral element formulation has been validated by the finite element method (FEM) and physical experiments. The new formulation has demonstrated a lower demand on computational resources and a higher numerical stability than commercial FEM packages. Compared to the conventional boundary element formulation, a significant reduction in computational expenses has been achieved. In summary, the boundary spectral element formulation presented in this paper provides a highly efficient and stable mathematical tool for the development of SHM applications.

  9. Mach-Zehnder interferometer implementation for thermo-optical and Kerr effect study

    NASA Astrophysics Data System (ADS)

    Bundulis, Arturs; Nitiss, Edgars; Busenbergs, Janis; Rutkis, Martins

    2018-04-01

    In this paper, we propose a Mach-Zehnder interferometric method for third-order nonlinear optical and thermo-optical studies. Both effects manifest themselves as a dependence of the refractive index on the incident light intensity and are widely employed for multiple opto-optical and thermo-optical applications. With the implemented method, we have measured the Kerr and thermo-optical coefficients of chloroform under CW, ns and ps laser irradiance. The application of lasers with different wavelengths, pulse durations and energies allowed us to distinguish the processes responsible for refractive index changes in the investigated solution. The presented setup was also used to demonstrate opto-optical switching. Results from the Mach-Zehnder experiment were compared with Z-scan data obtained in our previous studies. On this basis, the quality of both methods was compared and the advantages and disadvantages of each were analyzed.
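
    Both effects enter the interferometer as an intensity-dependent phase shift in one arm; for the Kerr contribution this is the textbook relation dphi = 2*pi*n2*I*L/lambda. The sketch below evaluates it with placeholder values; the measured chloroform coefficients from the study are not reproduced here.

      import math

      def kerr_phase_shift(n2, intensity, length, wavelength):
          """Nonlinear phase accumulated over a path: 2*pi*n2*I*L/lambda (SI units)."""
          return 2.0 * math.pi * n2 * intensity * length / wavelength

      # placeholder values: n2 ~ 1e-19 m^2/W, 1 GW/cm^2 irradiance, 1 mm cell, 1064 nm
      dphi = kerr_phase_shift(1e-19, 1e13, 1e-3, 1064e-9)
      print(f"nonlinear phase shift: {dphi*1e3:.1f} mrad")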

  10. Demosaicing images from colour cameras for digital image correlation

    NASA Astrophysics Data System (ADS)

    Forsey, A.; Gungor, S.

    2016-11-01

    Digital image correlation is not the intended use for consumer colour cameras, but with care they can be successfully employed in such a role. The main obstacle is the sparsely sampled colour data caused by the use of a colour filter array (CFA) to separate the colour channels. It is shown that the method used to convert consumer camera raw files into a monochrome image suitable for digital image correlation (DIC) can have a significant effect on the DIC output. A number of widely available software packages and two in-house methods are evaluated in terms of their performance when used with DIC. Using an in-plane rotating disc to produce a highly constrained displacement field, it was found that the bicubic spline based in-house demosaicing method outperformed the other methods in terms of accuracy and aliasing suppression.
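
    As a minimal illustration of the raw-to-monochrome step, the sketch below keeps the measured green samples of an assumed RGGB Bayer layout and fills the remaining positions from their immediate neighbours. The in-house method discussed above is bicubic-spline based; this neighbour-averaging version is only a simplified stand-in.

      import numpy as np
      from scipy.ndimage import convolve

      def green_channel_mono(raw):
          """Monochrome image from the green sites of an RGGB Bayer raw frame."""
          mask = np.zeros(raw.shape, dtype=bool)
          mask[0::2, 1::2] = True        # green positions in even rows (RGGB assumed)
          mask[1::2, 0::2] = True        # green positions in odd rows
          green = np.where(mask, raw, 0.0)
          cross = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
          summed = convolve(green, cross, mode="mirror")
          counts = convolve(mask.astype(float), cross, mode="mirror")
          return np.where(mask, raw, summed / np.maximum(counts, 1.0))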

  11. [Current knowledge on the strain typing of the pathogenic fungus Histoplasma capsulatum var. capsulatum: a review of the findings].

    PubMed

    Reyes-Montes, M del R; Taylor, M L; Curiel-Quesada, E; Mesa-Arango, A C

    2000-12-01

    The classification of microbial strains is currently based on different typing methods, which must meet certain criteria in order to be widely used. Phenotypic and genotypic methods are being employed in the epidemiology of several fungal diseases. However, some problems associated with the phenotypic methods have fostered genotyping procedures, from DNA polymorphic diversity to gene sequencing studies, all aiming to differentiate and to relate fungal isolates or strains. Through these studies, it is possible to identify outbreaks, to detect nosocomial infection transmission, and to determine the source of infection, as well as to recognize virulent isolates. This paper is aimed at analyzing the methods recently used to type Histoplasma capsulatum, the causative agent of the systemic mycosis known as histoplasmosis, in order to recommend those that yield reproducible and accurate results.

  12. Fast Image Restoration for Spatially Varying Defocus Blur of Imaging Sensor

    PubMed Central

    Cheong, Hejin; Chae, Eunjung; Lee, Eunsung; Jo, Gwanghyun; Paik, Joonki

    2015-01-01

    This paper presents a fast adaptive image restoration method for removing spatially varying out-of-focus blur of a general imaging sensor. After estimating the parameters of space-variant point-spread-function (PSF) using the derivative in each uniformly blurred region, the proposed method performs spatially adaptive image restoration by selecting the optimal restoration filter according to the estimated blur parameters. Each restoration filter is implemented in the form of a combination of multiple FIR filters, which guarantees the fast image restoration without the need of iterative or recursive processing. Experimental results show that the proposed method outperforms existing space-invariant restoration methods in the sense of both objective and subjective performance measures. The proposed algorithm can be employed to a wide area of image restoration applications, such as mobile imaging devices, robot vision, and satellite image processing. PMID:25569760

  13. Application of Fiber-Optical Techniques in the Access Transmission and Backbone Transport of Mobile Networks

    NASA Astrophysics Data System (ADS)

    Hilt, Attila; Pozsonyi, László

    2012-09-01

    Fixed access networks widely employ fiber-optical techniques due to the extremely wide bandwidth offered to subscribers. In the last decade, there has also been an enormous increase in the user data carried by mobile systems. The importance of fiber-optical techniques within the fixed transmission/transport networks of mobile systems is therefore inevitably increasing. This article summarizes a few reasons why, and gives examples of how, fiber-optic techniques are employed efficiently in second-generation networks.

  14. Monolithic multigrid methods for two-dimensional resistive magnetohydrodynamics

    DOE PAGES

    Adler, James H.; Benson, Thomas R.; Cyr, Eric C.; ...

    2016-01-06

    Magnetohydrodynamic (MHD) representations are used to model a wide range of plasma physics applications and are characterized by a nonlinear system of partial differential equations that strongly couples a charged fluid with the evolution of electromagnetic fields. The resulting linear systems that arise from discretization and linearization of the nonlinear problem are generally difficult to solve. In this paper, we investigate multigrid preconditioners for this system. We consider two well-known multigrid relaxation methods for incompressible fluid dynamics: Braess-Sarazin relaxation and Vanka relaxation. We first extend these to the context of steady-state one-fluid viscoresistive MHD. Then we compare the two relaxation procedures within a multigrid-preconditioned GMRES method employed within Newton's method. To isolate the effects of the different relaxation methods, we use structured grids, inf-sup stable finite elements, and geometric interpolation. Furthermore, we present convergence and timing results for a two-dimensional, steady-state test problem.

  15. Recent trends in the determination of vitamin D.

    PubMed

    Gomes, Fabio P; Shaw, P Nicholas; Whitfield, Karen; Koorts, Pieter; Hewavitharana, Amitha K

    2013-12-01

    The occurrence of vitamin D deficiency has become an issue of serious concern in the worldwide population. As a result numerous analytical methods have been developed, for a variety of matrices, during the last few years to measure vitamin D analogs and metabolites. This review employs a comprehensive search of all vitamin D methods developed during the last 5 years for all applications, using ISI Web of Science(®), Scifinder(®), Science Direct, Scopus and PubMed. Particular emphasis is given to sample-preparation methods and the different forms of vitamin D measured across different fields of applications such as biological fluids, food and pharmaceutical preparations. This review compares and critically evaluates a wide range of approaches and methods, and hence it will enable readers to access developments across a number of applications and to select or develop the optimal analytical method for vitamin D for their particular application.

  16. Reconstructing Spatial Distributions from Anonymized Locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horey, James L; Forrest, Stephanie; Groat, Michael

    2012-01-01

    Devices such as mobile phones, tablets, and sensors are often equipped with GPS that accurately report a person's location. Combined with wireless communication, these devices enable a wide range of new social tools and applications. These same qualities, however, leave location-aware applications vulnerable to privacy violations. This paper introduces the Negative Quad Tree, a privacy protection method for location-aware applications. The method is broadly applicable to applications that use spatial density information, such as social applications that measure the popularity of social venues. The method employs a simple anonymization algorithm running on mobile devices, and a more complex reconstruction algorithm on a central server. This strategy is well suited to low-powered mobile devices. The paper analyzes the accuracy of the reconstruction method in a variety of simulated and real-world settings and demonstrates that the method is accurate enough to be used in many real-world scenarios.

  17. High-resolution method for evolving complex interface networks

    NASA Astrophysics Data System (ADS)

    Pan, Shucheng; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2018-04-01

    In this paper we describe a high-resolution transport formulation of the regional level-set approach for an improved prediction of the evolution of complex interface networks. The novelty of this method is twofold: (i) construction of local level sets and reconstruction of a global level set, (ii) local transport of the interface network by employing high-order spatial discretization schemes for improved representation of complex topologies. Various numerical test cases of multi-region flow problems, including triple-point advection, single vortex flow, mean curvature flow, normal driven flow, dry foam dynamics and shock-bubble interaction show that the method is accurate and suitable for a wide range of complex interface-network evolutions. Its overall computational cost is comparable to the Semi-Lagrangian regional level-set method while the prediction accuracy is significantly improved. The approach thus offers a viable alternative to previous interface-network level-set method.

  18. Variable neighborhood search for reverse engineering of gene regulatory networks.

    PubMed

    Nicholson, Charles; Goodwin, Leslie; Clark, Corey

    2017-01-01

    A new search heuristic, Divided Neighborhood Exploration Search, designed to be used with inference algorithms such as Bayesian networks to improve on the reverse engineering of gene regulatory networks is presented. The approach systematically moves through the search space to find topologies representative of gene regulatory networks that are more likely to explain microarray data. In empirical testing it is demonstrated that the novel method is superior to the widely employed greedy search techniques in both the quality of the inferred networks and computational time. Copyright © 2016 Elsevier Inc. All rights reserved.
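
    For context, the greedy baseline that the new heuristic is compared against can be sketched as a simple hill climb over single-edge changes scored against the data. The scoring function here is a placeholder; the acyclicity constraint of Bayesian networks and the neighbourhood partitioning of the proposed method are omitted.

      import itertools

      def greedy_edge_search(n_genes, score):
          """First-improvement hill climb over directed networks: toggle single
          edges while any toggle improves `score(edges)` (a placeholder scorer)."""
          edges, best = set(), score(set())
          improved = True
          while improved:
              improved = False
              for i, j in itertools.permutations(range(n_genes), 2):
                  candidate = edges ^ {(i, j)}       # add or remove one edge
                  s = score(candidate)
                  if s > best:
                      edges, best, improved = candidate, s, True
          return edges, best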

  19. SBA-15 Mesoporous Silica as Catalytic Support for Hydrodesulfurization Catalysts—Review

    PubMed Central

    Huirache-Acuña, Rafael; Nava, Rufino; Peza-Ledesma, Carmen L.; Lara-Romero, Javier; Alonso-Núñez, Gabriel; Pawelec, Barbara; Rivera-Muñoz, Eric M.

    2013-01-01

    SBA-15 is an interesting mesoporous silica material having highly ordered nanopores and a large surface area, which is widely employed as catalyst supports, absorbents, drug delivery materials, etc. Since it has a lack of functionality, heteroatoms and organic functional groups have been incorporated by direct or post-synthesis methods in order to modify their functionality. The aim of this article is to review the state-of-the-art related to the use of SBA-15-based mesoporous systems as supports for hydrodesulfurization (HDS) catalysts. PMID:28788323

  20. Time-resolved fluorescence decay measurements for flowing particles

    DOEpatents

    Deka, C.; Steinkamp, J.A.

    1999-06-01

    Time-resolved fluorescence decay measurements are disclosed for flowing particles. An apparatus and method for the measurement and analysis of fluorescence for individual cells and particles in flow are described, wherein the rapid measurement capabilities of flow cytometry and the robust measurement and analysis procedures of time-domain fluorescence lifetime spectroscopy are combined. A pulse-modulated CW laser is employed for excitation of the particles. The characteristics and the repetition rate of the excitation pulses can be readily adjusted to accommodate for fluorescence decays having a wide range of lifetimes. 12 figs.

  1. Time-resolved fluorescence decay measurements for flowing particles

    DOEpatents

    Deka, Chiranjit; Steinkamp, John A.

    1999-01-01

    Time-resolved fluorescence decay measurements for flowing particles. An apparatus and method for the measurement and analysis of fluorescence for individual cells and particles in flow are described, wherein the rapid measurement capabilities of flow cytometry and the robust measurement and analysis procedures of time-domain fluorescence lifetime spectroscopy are combined. A pulse-modulated cw laser is employed for excitation of the particles. The characteristics and the repetition rate of the excitation pulses can be readily adjusted to accommodate for fluorescence decays having a wide range of lifetimes.

  2. Real-time data reduction capabilities at the Langley 7 by 10 foot high speed tunnel

    NASA Technical Reports Server (NTRS)

    Fox, C. H., Jr.

    1980-01-01

    The 7 by 10 foot high speed tunnel performs a wide range of tests employing a variety of model installation methods. To support the reduction of static data from this facility, a generalized wind tunnel data reduction program had been developed for use on the Langley central computer complex. The capabilities of a version of this generalized program adapted for real time use on a dedicated on-site computer are discussed. The input specifications, instructions for the console operator, and full descriptions of the algorithms are included.

  3. Review on SERS of Bacteria

    PubMed Central

    Mosier-Boss, Pamela A.

    2017-01-01

    Surface enhanced Raman spectroscopy (SERS) has been widely used for chemical detection. Moreover, the inherent richness of the spectral data has made SERS attractive for use in detecting biological materials, including bacteria. This review discusses methods that have been used to obtain SERS spectra of bacteria. The kinds of SERS substrates employed to obtain SERS spectra are discussed as well as how bacteria interact with silver and gold nanoparticles. The role of capping agents on Ag/Au NPs in obtaining SERS spectra is examined as well as the interpretation of the spectral data. PMID:29137201

  4. A Three-Dimensional Solution of Flows over Wings with Leading-Edge Vortex Separation. Part 1: Engineering Document

    NASA Technical Reports Server (NTRS)

    Brune, G. W.; Weber, J. A.; Johnson, F. T.; Lu, P.; Rubbert, P. E.

    1975-01-01

    A method of predicting forces, moments, and detailed surface pressures on thin, sharp-edged wings with leading-edge vortex separation in incompressible flow is presented. The method employs an inviscid flow model in which the wing and the rolled-up vortex sheets are represented by piecewise, continuous quadratic doublet sheet distributions. The Kutta condition is imposed on all wing edges. Computed results are compared with experimental data and with the predictions of the leading-edge suction analogy for a selected number of wing planforms over a wide range of angle of attack. These comparisons show the method to be very promising, capable of producing not only force predictions, but also accurate predictions of detailed surface pressure distributions, loads, and moments.

  5. The boundary element method applied to 3D magneto-electro-elastic dynamic problems

    NASA Astrophysics Data System (ADS)

    Igumnov, L. A.; Markov, I. P.; Kuznetsov, Iu A.

    2017-11-01

    Due to their coupling properties, magneto-electro-elastic materials have a wide range of applications. They exhibit general anisotropic behaviour. Three-dimensional transient analyses of magneto-electro-elastic solids can hardly be found in the literature. A 3D direct boundary element formulation based on the weakly singular boundary integral equations in the Laplace domain is presented in this work for solving dynamic linear magneto-electro-elastic problems. Integral expressions of the three-dimensional fundamental solutions are employed. Spatial discretization is based on a collocation method with mixed boundary elements. The convolution quadrature method is used as a numerical inverse Laplace transform scheme to obtain time-domain solutions. Numerical examples are provided to illustrate the capability of the proposed approach to treat highly dynamic problems.

  6. Incinerator technology overview

    NASA Astrophysics Data System (ADS)

    Santoleri, Joseph J.

    1993-03-01

    Many of the major chemical companies in the U.S. that regarded a safe environment as their responsibility installed waste treatment and disposal facilities on their plant sites in the last two decades. Many of these plants elected to use incinerators as the treatment process. This was not always the most economical method, but in many cases it was the only disposal method that provided safe and certain maximum destruction. Environmental concern over contamination from uncontrolled land disposal sites, together with the emergence of tougher regulations for land disposal, provides incentives for industry to employ a wide variety of traditional and advanced technologies for managing hazardous wastes. Incineration systems with proper design, operation, and maintenance provide the safest, and in the long run the most economical, avenue to the maximum level of destruction of organic hazardous wastes.

  7. MALDI Imaging Mass Spectrometry—A Mini Review of Methods and Recent Developments

    PubMed Central

    Eriksson, Cecilia; Masaki, Noritaka; Yao, Ikuko; Hayasaka, Takahiro; Setou, Mitsutoshi

    2013-01-01

    Imaging Mass Spectrometry (IMS) is the only imaging method that can determine both the identity and the distribution of hundreds of molecules on tissue sections, all in one single run. IMS is becoming an established research technology, and due to recent technical and methodological improvements the interest in this technology is increasing steadily and within a wide range of scientific fields. Of the different IMS methods available, matrix-assisted laser desorption/ionization (MALDI) IMS is the most commonly employed. The course at IMSC 2012 in Kyoto covered the fundamental principles and techniques of MALDI-IMS, assuming no previous experience in IMS. This mini review summarizes the content of the one-day course and describes some of the most recent work performed within this research field. PMID:24349941

  8. COMPUTATIONAL METHODOLOGIES for REAL-SPACE STRUCTURAL REFINEMENT of LARGE MACROMOLECULAR COMPLEXES

    PubMed Central

    Goh, Boon Chong; Hadden, Jodi A.; Bernardi, Rafael C.; Singharoy, Abhishek; McGreevy, Ryan; Rudack, Till; Cassidy, C. Keith; Schulten, Klaus

    2017-01-01

    The rise of the computer as a powerful tool for model building and refinement has revolutionized the field of structure determination for large biomolecular systems. Despite the wide availability of robust experimental methods capable of resolving structural details across a range of spatiotemporal resolutions, computational hybrid methods have the unique ability to integrate the diverse data from multimodal techniques such as X-ray crystallography and electron microscopy into consistent, fully atomistic structures. Here, commonly employed strategies for computational real-space structural refinement are reviewed, and their specific applications are illustrated for several large macromolecular complexes: ribosome, virus capsids, chemosensory array, and photosynthetic chromatophore. The increasingly important role of computational methods in large-scale structural refinement, along with current and future challenges, is discussed. PMID:27145875

  9. Mono-isotope Prediction for Mass Spectra Using Bayes Network.

    PubMed

    Li, Hui; Liu, Chunmei; Rwebangira, Mugizi Robert; Burge, Legand

    2014-12-01

    Mass spectrometry is one of the most widely utilized methods for studying protein functions and components. The challenge of mono-isotope pattern recognition from large-scale protein mass spectral data requires computational algorithms and tools to speed up the analysis and improve the analytic results. We utilized a naïve Bayes network as the classifier, with the assumption that the selected features are independent, to predict mono-isotope patterns from mass spectra. Mono-isotopes detected from validated theoretical spectra were used as prior information in the Bayes method. Three main features extracted from the dataset were employed as independent variables in our model. The application of the proposed algorithm to a public Mo dataset demonstrates that our naïve Bayes classifier is advantageous over existing methods in both accuracy and sensitivity.
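
    A minimal sketch of the kind of naïve Bayes classification described above, assuming a hypothetical feature matrix with three columns and binary mono-isotope labels (the features and data are placeholders, not the authors' actual features or dataset):

        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score, recall_score

        # Hypothetical data: rows are candidate isotope patterns, columns are three
        # extracted features; y marks whether the pattern is a true mono-isotope.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 3))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        # Gaussian naive Bayes treats the features as conditionally independent,
        # mirroring the independence assumption stated in the abstract.
        clf = GaussianNB().fit(X_train, y_train)
        pred = clf.predict(X_test)
        print("accuracy:", accuracy_score(y_test, pred))
        print("sensitivity:", recall_score(y_test, pred))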

  10. Wide Field and Planetary Camera for Space Telescope

    NASA Technical Reports Server (NTRS)

    Lockhart, R. F.

    1982-01-01

    The Space Telescope's Wide Field and Planetary Camera instrument, presently under construction, will be used to map the observable universe and to study the outer planets. It will be able to see 1000 times farther than any previously employed instrument. The Wide Field system will be located in a radial bay, receiving its signals via a pick-off mirror centered on the optical axis of the telescope assembly. The external thermal radiator employed by the instrument for cooling will be part of the exterior surface of the Space Telescope. In addition to having a larger (1200-12,000 A) wavelength range than any of the other Space Telescope instruments, its data rate, at 1 Mb/sec, exceeds that of the other instruments. Attention is given to the operating modes and projected performance levels of the Wide Field Camera and Planetary Camera.

  11. Identification of polymorphic inversions from genotypes

    PubMed Central

    2012-01-01

    Background Polymorphic inversions are a source of genetic variability with a direct impact on recombination frequencies. Given the difficulty of their experimental study, computational methods have been developed to infer their existence in a large number of individuals using genome-wide data of nucleotide variation. Methods based on haplotype tagging of known inversions attempt to classify individuals as having a normal or inverted allele. Other methods that measure differences in linkage disequilibrium attempt to identify regions with inversions but are unable to classify subjects accurately, an essential requirement for association studies. Results We present a novel method to both identify polymorphic inversions from genome-wide genotype data and classify individuals as containing a normal or inverted allele. Our method, a generalization of a published method for haplotype data [1], utilizes linkage between groups of SNPs to partition a set of individuals into normal and inverted subpopulations. We employ a sliding window scan to identify regions likely to have an inversion, and accumulation of evidence from neighboring SNPs is used to accurately determine the inversion status of each subject. Further, our approach detects inversions directly from genotype data, thus increasing its usability for current genome-wide association studies (GWAS). Conclusions We demonstrate the accuracy of our method to detect inversions and classify individuals on principled-simulated genotypes, produced by the evolution of an inversion event within a coalescent model [2]. We applied our method to real genotype data from HapMap Phase III to characterize the inversion status of two known inversions within the regions 17q21 and 8p23 across 1184 individuals. Finally, we scan the full genomes of the European Origin (CEU) and Yoruba (YRI) HapMap samples. We find population-based evidence for 9 out of 15 well-established autosomal inversions, and for 52 regions previously predicted by independent experimental methods in ten (9+1) individuals [3,4]. We provide efficient implementations of both genotype and haplotype methods as a unified R package inveRsion. PMID:22321652
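
    As a rough illustration of the sliding-window idea, the sketch below scans a 0/1/2-coded genotype matrix and scores each window by how cleanly the individuals separate into two groups. The window size, step, and the PCA/k-means/silhouette choices are assumptions made for the example; they are a stand-in, not the statistic implemented in the inveRsion package:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        def sliding_window_scan(genotypes, window=50, step=10):
            """Score each SNP window by how well individuals split into two groups
            (a crude proxy for an inverted/non-inverted partition)."""
            n_ind, n_snp = genotypes.shape
            scores = []
            for start in range(0, n_snp - window + 1, step):
                block = genotypes[:, start:start + window].astype(float)
                proj = PCA(n_components=2).fit_transform(block)
                labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(proj)
                scores.append((start, silhouette_score(proj, labels)))
            return scores

        # Toy genotype matrix: 100 individuals x 300 SNPs, additively coded 0/1/2.
        rng = np.random.default_rng(1)
        toy = rng.integers(0, 3, size=(100, 300))
        print(max(sliding_window_scan(toy), key=lambda t: t[1]))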

  12. Irreversible bonding of polyimide and polydimethylsiloxane (PDMS) based on a thiol-epoxy click reaction

    NASA Astrophysics Data System (ADS)

    Hoang, Michelle V.; Chung, Hyun-Joong; Elias, Anastasia L.

    2016-10-01

    Polyimide is one of the most popular substrate materials for the microfabrication of flexible electronics, while polydimethylsiloxane (PDMS) is the most widely used stretchable substrate/encapsulant material. These two polymers are essential in fabricating devices for microfluidics, bioelectronics, and the internet of things; bonding these materials together is a crucial challenge. In this work, we employ click chemistry at room temperature to irreversibly bond polyimide and PDMS through thiol-epoxy bonds using two different methods. In the first method, we functionalize the surfaces of the PDMS and polyimide substrates with mercaptosilanes and epoxysilanes, respectively, for the formation of a thiol-epoxy bond in the click reaction. In the second method, we functionalize one or both surfaces with mercaptosilane and introduce an epoxy adhesive layer between the two surfaces. When the surfaces are bonded using the epoxy adhesive without any surface functionalization, an extremely small peel strength (<0.01 N mm-1) is measured with a peel test, and adhesive failure occurs at the PDMS surface. With surface functionalization, however, remarkably higher peel strengths of ~0.2 N mm-1 (method 1) and  >0.3 N mm-1 (method 2) are observed, and failure occurs by tearing of the PDMS layer. We envision that the novel processing route employing click chemistry can be utilized in various cases of stretchable and flexible device fabrication.

  13. Recent advances in analytical methods for the determination of 4-alkylphenols and bisphenol A in solid environmental matrices: A critical review.

    PubMed

    Salgueiro-González, N; Castiglioni, S; Zuccato, E; Turnes-Carou, I; López-Mahía, P; Muniategui-Lorenzo, S

    2018-09-18

    The problem of endocrine disrupting compounds (EDCs) in the environment has become a worldwide concern in recent decades. Because of their toxicological effects at low concentrations and their widespread use in industrial and household applications, these pollutants pose a risk to non-target organisms and to public safety. Analytical methods to determine these compounds at trace levels in different matrices are urgently needed. This review critically discusses trends in analytical methods for well-known EDCs like alkylphenols and bisphenol A in solid environmental matrices, including sediment and aquatic biological samples (from 2006 to 2018). Information about extraction, clean-up and determination is covered in detail, including analytical quality parameters (QA/QC). Conventional and novel analytical techniques are compared, with their advantages and drawbacks. Ultrasound assisted extraction followed by solid phase extraction clean-up is the most widely used procedure for sediment and aquatic biological samples, although softer extraction conditions have been employed for the latter. The use of liquid chromatography followed by tandem mass spectrometry has greatly increased in the last five years. The majority of these methods have been employed for the analysis of river sediments and bivalve molluscs because of their usefulness in aquatic ecosystem (bio)monitoring programs. Green, simple, fast analytical methods are now needed to determine these compounds in complex matrices. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Cross-entropy clustering framework for catchment classification

    NASA Astrophysics Data System (ADS)

    Tongal, Hakan; Sivakumar, Bellie

    2017-09-01

    There is an increasing interest in catchment classification and regionalization in hydrology, as they are useful for identification of appropriate model complexity and transfer of information from gauged catchments to ungauged ones, among others. This study introduces a nonlinear cross-entropy clustering (CEC) method for classification of catchments. The method specifically considers embedding dimension (m), sample entropy (SampEn), and coefficient of variation (CV) to represent dimensionality, complexity, and variability of the time series, respectively. The method is applied to daily streamflow time series from 217 gauging stations across Australia. The results suggest that a combination of linear and nonlinear parameters (i.e. m, SampEn, and CV), representing different aspects of the underlying dynamics of streamflows, could be useful for determining distinct patterns of flow generation mechanisms within a nonlinear clustering framework. For the 217 streamflow time series, nine hydrologically homogeneous clusters that have distinct patterns of flow regime characteristics and specific dominant hydrological attributes with different climatic features are obtained. Comparison of the results with those obtained using the widely employed k-means clustering method (which results in five clusters, with the loss of some information about the features of the clusters) suggests the superiority of the cross-entropy clustering method. The outcomes from this study provide a useful guideline for employing the nonlinear dynamic approaches based on hydrologic signatures and for gaining an improved understanding of streamflow variability at a large scale.
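
    The descriptors named above are straightforward to compute from a flow series. The sketch below extracts the coefficient of variation and a plain sample-entropy estimate and then clusters the resulting feature vectors; a Gaussian mixture is used purely as a stand-in for the cross-entropy clustering algorithm, the embedding-dimension feature is omitted, and all parameter choices are assumptions:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def sample_entropy(x, m=2, r_factor=0.2):
            """Plain O(N^2) sample entropy estimate of a 1-D series."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()
            def matches(dim):
                t = np.array([x[i:i + dim] for i in range(len(x) - dim)])
                c = 0
                for i in range(len(t)):
                    d = np.max(np.abs(t - t[i]), axis=1)
                    c += np.sum(d <= r) - 1      # exclude the self-match
                return c
            a, b = matches(m + 1), matches(m)
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        def features(flow):
            cv = flow.std() / flow.mean()        # variability
            return [cv, sample_entropy(flow)]    # variability + complexity

        # Toy example: cluster feature vectors from synthetic "catchments".
        rng = np.random.default_rng(2)
        series = [rng.gamma(shape=2.0, scale=s, size=365) for s in rng.uniform(0.5, 3.0, 40)]
        X = np.array([features(s) for s in series])
        labels = GaussianMixture(n_components=3, random_state=0).fit_predict(X)
        print(labels)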

  15. Quantification of methionine and selenomethionine in biological samples using multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS).

    PubMed

    Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel

    2018-05-01

    Quantification of selenated amino-acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino-acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of the selenium/sulfur substitution rate in Met. Moreover, the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, a low limit of quantification, and a wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.

  16. Pressure in an exactly solvable model of active fluid

    NASA Astrophysics Data System (ADS)

    Marini Bettolo Marconi, Umberto; Maggi, Claudio; Paoluzzi, Matteo

    2017-07-01

    We consider the pressure in the steady-state regime of three stochastic models characterized by self-propulsion and persistent motion and widely employed to describe the behavior of active particles, namely, the Active Brownian particle (ABP) model, the Gaussian colored noise (GCN) model, and the unified colored noise approximation (UCNA) model. Whereas in the limit of short but finite persistence time, the pressure in the UCNA model can be obtained by different methods which have an analog in equilibrium systems, in the remaining two models only the virial route is, in general, possible. According to this method, although each model obeys its own specific microscopic law of evolution, the pressure displays a certain universal behavior. For generic interparticle and confining potentials, we derive a formula which establishes a correspondence between the GCN and the UCNA pressures. In order to provide explicit formulas and examples, we specialize the discussion to the case of an assembly of elastic dumbbells confined to a parabolic well. By employing the UCNA we find that, for this model, the pressure determined by the thermodynamic method coincides with the pressures obtained by the virial and mechanical methods. The three methods when applied to the GCN give a pressure identical to that obtained via the UCNA. Finally, we find that the ABP virial pressure exactly agrees with the UCNA and GCN results.
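
    For orientation, the virial route mentioned above generalizes the familiar expression for a passive interacting fluid, given here only as the baseline (the active models add a persistence-dependent, or swim, contribution whose form differs between the ABP, GCN and UCNA models):

        P = \frac{N k_B T}{V} + \frac{1}{d\,V}\Big\langle \sum_{i<j} \mathbf{r}_{ij}\cdot\mathbf{F}_{ij} \Big\rangle ,

    where d is the spatial dimension, \mathbf{r}_{ij} the interparticle separation and \mathbf{F}_{ij} the pair force.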

  17. Toward a New Taxonomy for Understanding the Nature and Consequences of Contingent Employment

    ERIC Educational Resources Information Center

    Feldman, Daniel C.

    2006-01-01

    Purpose: The main goal of this article is to present a new taxonomy of contingent employment that better represents the wide variety of part-time, temporary, and contract employment arrangements that have emerged since Feldman's review. Design/methodology/approach: Reviews the literature over the past 15 years. Findings: The paper suggests that…

  18. Catalytic Chemistry on Oxide Nanostructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asthagiri, Aravind; Dixon, David A.; Dohnalek, Zdenek

    2016-05-29

    Metal oxides represent one of the most important and widely employed materials in catalysis. Extreme variability of their chemistry provides a unique opportunity to tune their properties and to utilize them for the design of highly active and selective catalysts. For bulk oxides, this can be achieved by varying their stoichiometry, phase, exposed surface facets, defect and dopant densities, and in numerous other ways. Further, properties distinct from those of bulk oxides can be attained by restricting the oxide dimensionality and preparing them in the form of ultrathin films and nanoclusters, as discussed throughout this book. In this chapter we focus on demonstrating such unique catalytic properties brought about by oxide nanoscaling. In the highlighted studies, planar models are carefully designed to achieve minimal dispersion of structural motifs and to attain detailed mechanistic understanding of targeted chemical transformations. The detailed level of morphological and structural characterization necessary to achieve this goal is accomplished by employing both high-resolution imaging via scanning probe methods and ensemble-averaged surface sensitive spectroscopic methods. Three prototypical examples illustrating different properties of nanoscaled oxides in different classes of reactions are selected.

  19. An Autonomous Satellite Time Synchronization System Using Remotely Disciplined VC-OCXOs.

    PubMed

    Gu, Xiaobo; Chang, Qing; Glennon, Eamonn P; Xu, Baoda; Dempster, Andrew G; Wang, Dun; Wu, Jiapeng

    2015-07-23

    An autonomous remote clock control system is proposed to provide time synchronization and frequency syntonization for satellite-to-satellite or ground-to-satellite time transfer, with the system comprising on-board voltage controlled oven controlled crystal oscillators (VC-OCXOs) that are disciplined to a remote master atomic clock or oscillator. The synchronization loop aims to provide autonomous operation over extended periods, to be widely applicable to a variety of scenarios, and to be robust. A new architecture is employed comprising frequency division duplex (FDD), synchronous time division duplex (STDD), and code division multiple access (CDMA) with a centralized topology. This new design utilizes dual one-way ranging methods to precisely measure the clock error, adopts least squares (LS) methods to predict the clock error, and employs a third-order phase-locked loop (PLL) to generate the voltage control signal. A general functional model for this system is proposed and the error sources and delays that affect the time synchronization are discussed. Related algorithms for estimating and correcting these errors are also proposed. The performance of the proposed system is simulated and guidance for selecting the clock is provided.
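
    The dual one-way ranging measurement mentioned above is conventionally reduced to a clock offset by differencing the two one-way pseudoranges. As background (the textbook two-way time transfer relation, not the paper's full error model), the offset of clock B relative to clock A is

        \Delta t_{BA} = \tfrac{1}{2}\,\big[ (T_2 - T_1) - (T_4 - T_3) \big],

    where T_1 is the transmit time at A, T_2 the receive time at B, T_3 the transmit time at B, and T_4 the receive time at A, all read from the local clocks. Symmetric path delays cancel in the difference, while asymmetric delays remain and correspond to the error sources the paper estimates and corrects.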

  20. Geophysical experiments for the pre-reclamation assessment of industrial and municipal waste landfills

    NASA Astrophysics Data System (ADS)

    Balia, R.; Littarru, B.

    2010-03-01

    Two examples of combined application of geophysical techniques for the pre-reclamation study of old waste landfills in Sardinia, Italy, are illustrated. The first one concerned a mine tailings basin and the second one a municipal solid waste landfill; both disposal sites date back to the 1970-80s. The gravity, shallow reflection, resistivity and induced polarization methods were employed in different combinations at the two sites, and in both cases useful information on the landfill's geometry has been obtained. The gravity method proved effective for locating the boundaries of the landfill and the shallow reflection seismic technique proved effective for the precise imaging of the landfill's bottom; conversely the electrical techniques, though widely employed for studying waste landfills, provided mainly qualitative and debatable results. The overall effectiveness of the surveys has been highly improved through the combined use of different techniques, whose individual responses, being strongly dependent on their specific basic physical characteristic and the complexity of the situation to be studied, did not show the same effectiveness at the two places.

  1. Towards exaggerated emphysema stereotypes

    NASA Astrophysics Data System (ADS)

    Chen, C.; Sørensen, L.; Lauze, F.; Igel, C.; Loog, M.; Feragen, A.; de Bruijne, M.; Nielsen, M.

    2012-03-01

    Classification is widely used in the context of medical image analysis, and in order to illustrate the mechanism of a classifier we introduce the notion of an exaggerated image stereotype based on the training data and a trained classifier. The stereotype of an image class of interest should emphasize and exaggerate the characteristic patterns of that class and visualize the information the employed classifier relies on. This is useful for gaining insight into the classification and allows comparison with biological models of disease. In this work, we build exaggerated image stereotypes by optimizing an objective function which consists of a discriminative term based on the classification accuracy and a generative term based on the class distributions. A gradient descent method based on iterated conditional modes (ICM) is employed for optimization. We use this idea with Fisher's linear discriminant rule and assume a multivariate normal distribution for samples within a class. The proposed framework is applied to computed tomography (CT) images of lung tissue with emphysema. The synthesized stereotypes illustrate the exaggerated patterns of lung tissue with emphysema, which is underpinned by three different quantitative evaluation methods.

  2. A comparison of transport algorithms for premixed, laminar steady state flames

    NASA Technical Reports Server (NTRS)

    Coffee, T. P.; Heimerl, J. M.

    1980-01-01

    The effects of different methods of approximating multispecies transport phenomena in models of premixed, laminar, steady state flames were studied. Five approximation methods that span a wide range of computational complexity were developed. Identical data for individual species properties were used for each method. Each approximation method is employed in the numerical solution of a set of five H2-O2-N2 flames. For each flame the computed species and temperature profiles, as well as the computed flame speeds, are found to be very nearly independent of the approximation method used. This does not indicate that transport phenomena are unimportant, but rather that the selection of the input values for the individual species transport properties is more important than the selection of the method used to approximate the multispecies transport. Based on these results, a sixth approximation method was developed that is computationally efficient and provides results extremely close to the most sophisticated and precise method used.

  3. Speckle reduction in optical coherence tomography using two-step iteration method (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wang, Xianghong; Liu, Xinyu; Wang, Nanshuo; Yu, Xiaojun; Bo, En; Chen, Si; Liu, Linbo

    2017-02-01

    Optical coherence tomography (OCT) provides high-resolution, cross-sectional images of biological tissue and is widely used for the diagnosis of ocular diseases. However, OCT images suffer from speckle noise, which is typically considered multiplicative in nature and reduces image resolution and contrast. In this study, we propose a two-step iteration (TSI) method to suppress this noise. We first utilize the augmented Lagrange method to recover a low-rank OCT image and remove additive Gaussian noise, and then employ the simple and efficient split Bregman method to solve the total-variation denoising model. We validated the proposed method using images of swine, rabbit and human retina. Results demonstrate that our TSI method outperforms other popular methods, achieving higher peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) while preserving important structural details, such as tiny capillaries and thin layers in retinal OCT images. In addition, the results of our TSI method show clearer boundaries and maintain high image contrast, which facilitates better image interpretation and analysis.
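
    A highly simplified sketch in the spirit of the two-step pipeline: singular-value soft-thresholding is used here as a stand-in for the augmented Lagrange low-rank step, and scikit-image's split-Bregman TV denoiser for the second step. The log transform reflects the multiplicative nature of speckle, and all parameter values and the synthetic image are assumptions:

        import numpy as np
        from skimage.restoration import denoise_tv_bregman

        def tsi_denoise_sketch(oct_image, sv_frac=0.5, tv_weight=2.0):
            """Two-step speckle reduction sketch: low-rank recovery, then TV denoising."""
            img = np.log1p(oct_image.astype(float))    # multiplicative -> additive noise
            img /= img.max()

            # Step 1: crude low-rank recovery by soft-thresholding singular values.
            U, s, Vt = np.linalg.svd(img, full_matrices=False)
            s = np.maximum(s - sv_frac * s.mean(), 0.0)
            low_rank = (U * s) @ Vt

            # Step 2: total-variation denoising solved with the split Bregman method;
            # the weight trades data fidelity against smoothness.
            return denoise_tv_bregman(low_rank, weight=tv_weight)

        # Toy usage on a synthetic speckled image.
        rng = np.random.default_rng(3)
        clean = np.tile(np.linspace(0.2, 1.0, 128), (128, 1))
        speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
        denoised = tsi_denoise_sketch(speckled)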

  4. Analysis of several Boolean operation based trajectory generation strategies for automotive spray applications

    NASA Astrophysics Data System (ADS)

    Gao, Guoyou; Jiang, Chunsheng; Chen, Tao; Hui, Chun

    2018-05-01

    Industrial robots are widely used in various surface manufacturing processes, such as thermal spraying. Established robot programming methods are highly time-consuming and not accurate enough to meet current market demands, and many off-line programming methods have been developed to reduce the robot programming effort. This work introduces the principle of several Boolean operation based robot trajectory generation strategies on planar and curved surfaces. Since off-line programming software is widely used, facilitates robot programming, and improves trajectory accuracy, the analysis in this work is based on secondary development of the off-line programming software RobotStudio™. To meet the requirements of the automotive paint industry, this kind of software extension provides special functions according to user-defined operation parameters. The presented planning strategy generates the robot trajectory by moving an orthogonal surface according to the geometry of the coating surface; a series of intersection curves are then employed to generate the trajectory points. The simulation results show that the path curve created with this method is continuous and smooth, which corresponds to the requirements of automotive spray applications.

  5. Analytical methods for the determination of personal care products in human samples: an overview.

    PubMed

    Jiménez-Díaz, I; Zafra-Gómez, A; Ballesteros, O; Navalón, A

    2014-11-01

    Personal care products (PCPs) are organic chemicals widely used in everyday human life. Nowadays, preservatives, UV-filters, antimicrobials and musk fragrances are widely used PCPs. Different studies have shown that some of these compounds can cause adverse health effects, such as genotoxicity, which could even lead to mutagenic or carcinogenic effects, or estrogenicity because of their endocrine disruption activity. Due to the absence of official monitoring protocols, there is an increasing demand for analytical methods that allow the determination of these compounds in human samples in order to obtain more information regarding their behavior and fate in the human body. The complexity of the biological matrices and the low concentration levels of these compounds make it necessary to use advanced sample treatment procedures that afford both sample clean-up, to remove potentially interfering matrix components, and concentration of the analytes. In the present work, a review of the more recent analytical methods published in the scientific literature for the determination of PCPs in human fluids and tissue samples is presented. The work focused on sample preparation and the analytical techniques employed. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Multivariate data analysis and machine learning in Alzheimer's disease with a focus on structural magnetic resonance imaging.

    PubMed

    Falahati, Farshad; Westman, Eric; Simmons, Andrew

    2014-01-01

    Machine learning algorithms and multivariate data analysis methods have been widely utilized in the field of Alzheimer's disease (AD) research in recent years. Advances in medical imaging and medical image analysis have provided a means to generate and extract valuable neuroimaging information. Automatic classification techniques provide tools to analyze this information and observe inherent disease-related patterns in the data. In particular, these classifiers have been used to discriminate AD patients from healthy control subjects and to predict conversion from mild cognitive impairment to AD. In this paper, recent studies are reviewed that have used machine learning and multivariate analysis in the field of AD research. The main focus is on studies that used structural magnetic resonance imaging (MRI), but studies that included positron emission tomography and cerebrospinal fluid biomarkers in addition to MRI are also considered. A wide variety of materials and methods has been employed in different studies, resulting in a range of different outcomes. Influential factors such as classifiers, feature extraction algorithms, feature selection methods, validation approaches, and cohort properties are reviewed, as well as key MRI-based and multi-modal based studies. Current and future trends are discussed.

  7. Differential network analysis reveals the genome-wide landscape of estrogen receptor modulation in hormonal cancers

    PubMed Central

    Hsiao, Tzu-Hung; Chiu, Yu-Chiao; Hsu, Pei-Yin; Lu, Tzu-Pin; Lai, Liang-Chuan; Tsai, Mong-Hsun; Huang, Tim H.-M.; Chuang, Eric Y.; Chen, Yidong

    2016-01-01

    Several mutual information (MI)-based algorithms have been developed to identify dynamic gene-gene and function-function interactions governed by key modulators (genes, proteins, etc.). Due to intensive computation, however, these methods rely heavily on prior knowledge and are limited in genome-wide analysis. We present the modulated gene/gene set interaction (MAGIC) analysis to systematically identify genome-wide modulation of interaction networks. Based on a novel statistical test employing conjugate Fisher transformations of correlation coefficients, MAGIC features fast computation and adaptation to variations in clinical cohorts. In simulated datasets MAGIC achieved greatly improved computational efficiency and overall superior performance compared with the MI-based method. We applied MAGIC to construct the estrogen receptor (ER) modulated gene and gene set (representing biological function) interaction networks in breast cancer. Several novel interaction hubs and functional interactions were discovered. The ER+-dependent interaction between TGFβ and NFκB was further shown to be associated with patient survival. The findings were verified in independent datasets. Using MAGIC, we also assessed the essential roles of ER modulation in another hormonal cancer, ovarian cancer. Overall, MAGIC is a systematic framework for comprehensively identifying and constructing the modulated interaction networks in a whole-genome landscape. MATLAB implementation of MAGIC is available for academic uses at https://github.com/chiuyc/MAGIC. PMID:26972162
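
    The building block behind the test is the Fisher transformation of a correlation coefficient. A minimal sketch of the classical two-sample comparison it generalizes (illustrative only; the MAGIC statistic itself further adjusts for cohort variation) is:

        import numpy as np
        from scipy.stats import norm

        def fisher_z_compare(r1, n1, r2, n2):
            """Test whether two Pearson correlations differ, via Fisher's z-transform."""
            z1, z2 = np.arctanh(r1), np.arctanh(r2)
            se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
            stat = (z1 - z2) / se
            return stat, 2.0 * norm.sf(abs(stat))

        # E.g., a gene-gene correlation in ER-high vs ER-low tumours (hypothetical numbers).
        print(fisher_z_compare(r1=0.62, n1=180, r2=0.18, n2=170))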

  8. Identifying Pleiotropic Genes in Genome-Wide Association Studies for Multivariate Phenotypes with Mixed Measurement Scales

    PubMed Central

    Williams, L. Keoki; Buu, Anne

    2017-01-01

    We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales so that the empirical distribution of Fisher's combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains power at a level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches, such as dichotomizing all observed phenotypes or treating them as continuous variables, could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis on the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power to identify markers that may not otherwise be chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
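
    The Fisher's combination statistic referred to above is, in its textbook form, X = -2 * sum(ln p_i), which follows a chi-square distribution with 2k degrees of freedom only when the k phenotype-wise p-values are independent; the abstract's contribution is to estimate this null distribution empirically when the phenotypes are correlated. A quick sketch of the textbook version:

        import numpy as np
        from scipy.stats import chi2, combine_pvalues

        pvals = np.array([0.03, 0.20, 0.008])      # hypothetical per-phenotype p-values

        # Manual Fisher combination, assuming independent p-values.
        stat = -2.0 * np.sum(np.log(pvals))
        p_combined = chi2.sf(stat, df=2 * len(pvals))

        # The same result via SciPy's built-in helper.
        stat_sp, p_sp = combine_pvalues(pvals, method="fisher")
        print(stat, p_combined, stat_sp, p_sp)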

  9. Medicinal Plants Used by Traditional Healers in Sangurur, Elgeyo Marakwet County, Kenya

    PubMed Central

    Kigen, Gabriel; Kipkore, Wilson; Wanjohi, Bernard; Haruki, Boniface; Kemboi, Jemutai

    2017-01-01

    Background: Although herbal medical products are still widely used in Kenya, many of the medicinal plants used by traditional medical practitioners (TMPs) have not been documented, despite several challenges that are now threatening the sustainability of the practice. Objective: To document the medicinal plants and healing methods used by TMPs in a region of Kenya with several recognized herbalists for potential research. Materials and Methods: Semi-structured interviews, group discussions, and direct observations were used to collect ethnopharmacological information. The participant's bio-data, clinical conditions treated, methods of treatment, medicinal plants used, methods of preparation and administration, and dosage forms were recorded. Results: A total of 99 medicinal plants and 12 complementary preparations employed in the treatment of 64 medical conditions were identified. The most widely used plant was Rotala tenella which was used to treat nine medicinal conditions; seven each for Aloe tweediae and Dovyalis abyssinica; and six each for Basella alba and Euclea divinorum. The plants belonged to 55 families with Fabaceae family being the most frequently used (10), followed by Apocynaceae and Solanaceae, each with six species, respectively. We identified plants used to determine the sex of an unborn baby and those used to treat several conditions including anthrax and cerebral malaria and herbs used to detoxify meat from an animal that has died from anthrax. Of special interest was R. tenella which is used to prevent muscle injury. Conclusions: We have documented several plants with potential therapeutic effects. Further research may be conducted to determine their efficacy. SUMMARY The medicinal plants used by traditional healers in a community which still practices herbal medicine in Kenya were documented. A total of 99 medicinal plants and 12 complementary preparations employed in the treatment of 64 medical conditions were identified. Further research may be carried out in order to determine their therapeutic efficacies. Abbreviations Used: Fic: Informant consensus factor, Nur: Number of use reports in each category, Ns: Number of reported species, TMPs: Traditional medical practitioners. PMID:29263626

  10. Combining the 3D model generated from point clouds and thermography to identify the defects presented on the facades of a building

    NASA Astrophysics Data System (ADS)

    Huang, Yishuo; Chiang, Chih-Hung; Hsu, Keng-Tsang

    2018-03-01

    Defects present on the facades of a building have a profound impact on its life cycle. How to identify these defects is a crucial issue; destructive and non-destructive methods are usually employed to do so. Destructive methods always cause permanent damage to the examined objects; non-destructive testing (NDT) methods, on the other hand, have been widely applied to detect defects on the exterior layers of a building. However, NDT methods cannot provide efficient and reliable information for identifying the defects because of the huge examination areas. Infrared thermography is often applied to quantitative energy performance measurements for building envelopes. Defects on the exterior layer of buildings may be caused by several factors: ventilation losses, conduction losses, thermal bridging, defective services, moisture condensation, moisture ingress, and structural defects. Analyzing the collected thermal images can be quite difficult when the spatial variations of surface temperature are small. In this paper the authors employ image segmentation to cluster pixels with similar surface temperatures so that the processed thermal images are composed of a limited number of groups. The surface temperature distribution within each segmented group is homogeneous, and the boundaries of the segmented regions can be identified and extracted. A terrestrial laser scanner (TLS) is widely used to collect the point clouds of a building, and those point clouds are applied to reconstruct the 3D model of the building. A mapping model is constructed such that the segmented thermal images can be projected onto the 2D image of the specified 3D building. In this paper, the administrative building on the Chaoyang University campus is used as an example. The experimental results not only provide the defect information but also offer the corresponding spatial locations in the 3D model.
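
    A minimal sketch of the segmentation step described above, grouping thermal-image pixels by surface temperature so that each segment is thermally homogeneous. K-means on pixel temperature is used here as a generic choice; the number of groups, the clustering method and the toy data are assumptions, not necessarily the authors' settings:

        import numpy as np
        from sklearn.cluster import KMeans

        def segment_thermal(thermal, n_groups=4):
            """Cluster pixel temperatures into a limited number of homogeneous groups."""
            h, w = thermal.shape
            labels = KMeans(n_clusters=n_groups, n_init=10, random_state=0) \
                .fit_predict(thermal.reshape(-1, 1))
            return labels.reshape(h, w)

        # Toy thermal image: a warmer rectangular anomaly on a cooler facade.
        rng = np.random.default_rng(4)
        facade = 18.0 + rng.normal(0.0, 0.3, size=(80, 120))
        facade[30:50, 40:80] += 4.0                # simulated thermal anomaly
        segments = segment_thermal(facade, n_groups=3)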

  11. A simple method using two-step hot embossing technique with shrinking for fabrication of cross microchannels on PMMA substrate and its application to electrophoretic separation of amino acids in functional drinks.

    PubMed

    Wiriyakun, Natta; Nacapricha, Duangjai; Chantiwas, Rattikan

    2016-12-01

    This work presents a simple hot embossing method with a shrinking procedure to produce cross-shaped microchannels on a poly(methyl methacrylate) (PMMA) substrate for the fabrication of an electrophoresis chip. The proposed method employed a simple two-step hot embossing technique, carried out consecutively on the same piece of substrate to make the crossing channels. Studies of embossing conditions, i.e. temperature, pressure and time, were carried out to investigate their effects on the dimensions of the microchannels. Applying a simple shrinking procedure reduced the size of the channels from 700±20µm wide×150±5µm deep to 250±10µm wide×30±2µm deep, i.e. 80% and 64% reduction in the depth and width, respectively. Thermal fusion was employed to bond the PMMA substrate with a PMMA cover plate to produce the microfluidic device. Replication of the microchip was achieved by precise control of conditions in the fabrication process (pressure, temperature and time), resulting in less than 7% RSD in channel dimensions (width and depth; n = 10 devices). The method is simple and robust and does not require expensive equipment to construct the microstructure on a thermoplastic substrate. The PMMA microchip was used for demonstration of amine functionalization on the PMMA surface, measurement of electroosmotic flow, and electrophoretic separation of amino acids in functional drink samples. The precisions of migration time and peak area for the amino acids Lys, Ile and Phe at 125-500 μM were in the ranges 3.2-4.2% RSD and 4.5-5.3% RSD (n = 9 devices), respectively. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Fumagillin: an overview of recent scientific advances and their significance for apiculture.

    PubMed

    van den Heever, Johan P; Thompson, Thomas S; Curtis, Jonathan M; Ibrahim, Abdullah; Pernal, Stephen F

    2014-04-02

    Fumagillin is a potent fungal metabolite first isolated from Aspergillus fumigatus. It is widely used in apiculture and human medicine against a variety of microsporidian fungal infections. It has been the subject of research in cancer treatments by employing its angiogenesis inhibitory properties. The toxicity of fumagillin has limited its use for human applications and spurred the development of analogues using structure-activity relationships relating to its angiogenesis properties. These discoveries may hold the key to the development of alternative chemical treatments for use in apiculture. The toxicity of fumagillin to humans is important for beekeeping, because any residues remaining in hive products pose a direct risk to the consumer. The analytical methods published to date measure fumagillin and its decomposition products but overlook the dicyclohexylamine counterion of the salt form widely used in apiculture.

  13. Stellar photometry with the Wide Field/Planetary Camera of the Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Holtzman, Jon A.

    1990-07-01

    Simulations of Wide Field/Planetary Camera (WF/PC) images are analyzed in order to discover the most effective techniques for stellar photometry and to evaluate the accuracy and limitations of these techniques. The capabilities and operation of the WF/PC and the simulations employed in the study are described. The basic techniques of stellar photometry and methods to improve these techniques for the WF/PC are discussed. The correct parameters for star detection, aperture photometry, and point-spread function (PSF) fitting with the DAOPHOT software of Stetson (1987) are determined. Consideration is given to undersampling of the stellar images by the detector; variations in the PSF; and the crowding of the stellar images. It is noted that, with some changes, DAOPHOT is able to generate photometry almost to the level of photon statistics.

  14. Substance Use, Education, Employment, and Criminal Activity Outcomes of Adolescents in Outpatient Chemical Dependency Programs

    PubMed Central

    Balsa, Ana I.; Homer, Jenny F.; French, Michael T.; Weisner, Constance M.

    2010-01-01

    Although the primary outcome of interest in clinical evaluations of addiction treatment programs is usually abstinence, participation in these programs can have a wide range of consequences. This study evaluated the effects of treatment initiation on substance use, school attendance, employment, and involvement in criminal activity at 12 months post-admission for 419 adolescents (aged 12 to 18) enrolled in chemical dependency recovery programs in a large managed care health plan. Instrumental variables estimation methods were used to account for unobserved selection into treatment by jointly modeling the likelihood of participation in treatment and the odds of attaining a certain outcome or level of an outcome. Treatment initiation significantly increased the likelihood of attending school, promoted abstinence, and decreased the probability of adolescent employment, but it did not significantly affect participation in criminal activity at the 12-month follow-up. These findings highlight the need to address selection in a non-experimental study and demonstrate the importance of considering multiple outcomes when assessing the effectiveness of adolescent treatment. PMID:18064572

  15. Wide-Field Megahertz OCT Imaging of Patients with Diabetic Retinopathy

    PubMed Central

    Reznicek, Lukas; Kolb, Jan P.; Klein, Thomas; Mohler, Kathrin J.; Huber, Robert; Kernt, Marcus; Märtz, Josef; Neubauer, Aljoscha S.

    2015-01-01

    Purpose. To evaluate the feasibility of wide-field Megahertz (MHz) OCT imaging in patients with diabetic retinopathy. Methods. A consecutive series of 15 eyes of 15 patients with diagnosed diabetic retinopathy were included. All patients underwent Megahertz OCT imaging, a close clinical examination, slit lamp biomicroscopy, and funduscopic evaluation. To acquire densely sampled, wide-field volumetric datasets, an ophthalmic 1050 nm OCT prototype system based on a Fourier-domain mode-locked (FDML) laser source with a 1.68 MHz A-scan rate was employed. Results. We were able to obtain OCT volume scans from all 15 included patients. Acquisition time was 1.8 seconds. The obtained volume datasets consisted of 2088 × 1044 A-scans over a 60° field of view. Thus, reconstructed en face images had a resolution of 34.8 pixels per degree along the x-axis and 17.4 pixels per degree along the y-axis. Due to the densely sampled OCT volume dataset, postprocessed customized cross-sectional B-frames through pathologic changes, such as an individual microaneurysm or a retinal neovascularization, could be imaged. Conclusions. Wide-field Megahertz OCT can successfully image patients with diabetic retinopathy at high scanning rates and over a wide angle of view, providing information in all three axes. The Megahertz OCT is a useful tool to screen diabetic patients for diabetic retinopathy. PMID:26273665

  16. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    PubMed

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

    For high-dimensional genome-wide association (GWA) case-control data of complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forest, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative SNPs. However, this is too time-consuming and not favorable in GWA for high-dimensional data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs while avoiding the very high computational cost of an exhaustive search for an optimal mtry, and it maintains the randomness of the random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate a better random forest with higher accuracy and a lower error bound than Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and warrant further biological investigation.
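
    A compact sketch of the stratified feature-subspace idea described above: SNPs are first binned into equal-width informativeness groups, and each tree then draws the same number of SNPs from every group. The informativeness score used here (a chi-square association statistic) and the group/subspace sizes are illustrative assumptions:

        import numpy as np
        from sklearn.feature_selection import chi2
        from sklearn.tree import DecisionTreeClassifier

        def stratified_subspace(informativeness, n_groups=5, per_group=20, rng=None):
            """Bin SNPs into equal-width informativeness groups and sample evenly."""
            rng = rng or np.random.default_rng()
            edges = np.linspace(informativeness.min(), informativeness.max(), n_groups + 1)
            groups = np.digitize(informativeness, edges[1:-1])
            chosen = []
            for g in range(n_groups):
                idx = np.flatnonzero(groups == g)
                if idx.size:
                    chosen.append(rng.choice(idx, size=min(per_group, idx.size), replace=False))
            return np.concatenate(chosen)

        # Toy GWA-like data: 200 cases/controls x 1000 SNPs, additively coded 0/1/2.
        rng = np.random.default_rng(5)
        X = rng.integers(0, 3, size=(200, 1000))
        y = rng.integers(0, 2, size=200)
        score, _ = chi2(X, y)                      # per-SNP informativeness proxy

        subspace = stratified_subspace(score, rng=rng)
        tree = DecisionTreeClassifier(random_state=0).fit(X[:, subspace], y)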

  17. A Microstrip Patch-Fed Short Backfire Antenna for the Tracking and Data Relay Satellite System-Continuation (TDRSS-C) Multiple Access (MA) Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.; Kory, Carol L.; Lambert, Kevin M.; Acosta, Roberto J.

    2006-01-01

    Short Backfire Antennas (SBAs) are widely utilized for mobile satellite communications, tracking, telemetry, and wireless local area network (WLAN) applications due to their compact structure and excellent radiation characteristics [1-3]. Typically, these SBAs consist of an excitation element (i.e., a half-wavelength dipole), a reflective bottom plane, a planar sub-reflector located above the "exciter", and an outer circular rim. This configuration is capable of achieving gains on the order of 13-15 dBi, but with relatively narrow bandwidths (approx. 3%-5%), making it incompatible with the requirements of the next generation enhanced Tracking and Data Relay Satellite System-Continuation (TDRSS-C) Multiple Access (MA) array [1]. Several attempts have been made to enhance the bandwidth performance of the common dipole-fed SBA by employing various other feeding mechanisms (e.g., waveguide, slot) with moderate success [4-5]. In this paper, a novel method of using a microstrip patch is employed for the first time to excite an SBA. The patch element is fed via two H-shaped slots electromagnetically coupled to a broadband hybrid coupler to maintain a wide bandwidth, as well as provide for dual circular polarization capabilities.

  18. Annual Survey of Public Employment & Payroll Summary Report: 2013. Economy-Wide Statistics Briefs: Public Sector

    ERIC Educational Resources Information Center

    Willhide, Robert Jesse

    2014-01-01

    This report is part of a series of reports that provides information on the structure, function, finances, taxation, employment, and pension systems of the United States' approximately 90,000 state and local governments. This report presents data on state and local government employment and payroll based on information collected by the 2013 Annual…

  19. Voices from the Field: Developing Employability Skills for Archaeological Students Using a Project Based Learning Approach

    ERIC Educational Resources Information Center

    Wood, Gaynor

    2016-01-01

    Graduate employment statistics are receiving considerable attention in UK universities. This paper looks at how a wide range of employability attributes can be developed with students, through the innovative use of the Project Based Learning (PjBL) approach. The case study discussed here involves a group of archaeology students from the University…

  20. Employment-based health insurance is failing: now what?

    PubMed

    Enthoven, Alain C

    2003-01-01

    Employment-based health insurance is failing. Costs are out of control. Employers have no effective strategy to deal with this. They must think strategically about fundamental change. This analysis explains how employers' purchasing policies contribute to rising costs and block growth of economical care. Single-source managed care is ineffective, and effective managed care cannot be a single source. Employers should create exchanges through which they can offer employees wide, responsible, individual, multiple choices among health care delivery systems and create serious competition based on value for money. Recently introduced technology can assist this process.

  1. [Polymeric drug carriers activated by ultrasounds energy].

    PubMed

    Kik, Krzysztof; Lwow, Felicja; Szmigiero, Leszek

    2007-01-01

    In the last two decades, extensive research on the use of ultrasound in anticancer therapy has been carried out. Ultrasound has so far been widely used in medicine for diagnostic purposes (ultrasonography), but its great therapeutic potential and the development of polymer-based antineoplastic drug carriers have persuaded many investigators to study its use in anticancer therapy. A new therapeutic concept based on the controlled release of drug molecules from their transporting polymer carriers has been proposed. Cavitation, a phenomenon characteristic of ultrasound, is used to destroy polymeric drug carriers and to release the drug at target sites. Sonodynamic therapy (SDT), which utilizes ultrasonic waves for "acoustic drug activation" leading to the enhancement of the cytotoxic activity of some drugs, has also been developed. Furthermore, long-standing research on ultrasound has resulted in a new concept based on hyperthermia; this method of cancer treatment does not require any chemotherapeutic agent to be applied.

  2. Performance analysis of ‘Perturb and Observe’ and ‘Incremental Conductance’ MPPT algorithms for PV system

    NASA Astrophysics Data System (ADS)

    Lodhi, Ehtisham; Lodhi, Zeeshan; Noman Shafqat, Rana; Chen, Fieda

    2017-07-01

    Photovoltaic (PV) systems usually employ maximum power point tracking (MPPT) techniques to increase their efficiency. The performance of a PV system can be boosted by operating it at its maximum power point, so that maximal power is delivered to the load. The efficiency of a PV system usually depends on irradiance, temperature and array architecture. A PV array exhibits a non-linear V-I curve, and the maximum power point on the V-P curve varies with changing environmental conditions. MPPT methods ensure that a PV module is regulated at the reference voltage and that the maximum output power is fully utilized. This paper compares two widely employed MPPT techniques, Perturb and Observe (P&O) and Incremental Conductance (INC). Their performance is evaluated and compared through theoretical analysis and digital simulation on the basis of response time and efficiency under varying irradiance and temperature conditions using Matlab/Simulink.
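
    A bare-bones sketch of the P&O update rule compared in the paper (the INC rule instead compares the incremental conductance dI/dV with -I/V). The step size and the toy P-V curve are assumptions chosen only to keep the example self-contained:

        def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
            """One P&O iteration: keep perturbing in the direction that increased power."""
            if p - p_prev >= 0:
                return v + step if v - v_prev >= 0 else v - step
            return v - step if v - v_prev >= 0 else v + step

        def toy_pv_power(v, v_oc=40.0, i_sc=8.0):
            """Very rough P-V curve of a module, for demonstration only."""
            i = max(i_sc * (1.0 - (v / v_oc) ** 8), 0.0)
            return v * i

        # Track the maximum power point of the toy curve.
        v_prev, p_prev = 20.0, toy_pv_power(20.0)
        v = 21.0
        for _ in range(50):
            p = toy_pv_power(v)
            v_next = perturb_and_observe(v, p, v_prev, p_prev)
            v_prev, p_prev, v = v, p, v_next
        print(f"operating point ~ {v:.1f} V, {toy_pv_power(v):.1f} W")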

  3. [Work-related stress and psychological distress assessment in urban and suburban public transportation companies].

    PubMed

    Romeo, L; Lazzarini, G; Farisè, E; Quintarelli, E; Riolfi, A; Perbellini, L

    2012-01-01

    The risk of work-related stress has been determined in bus drivers and workers employed in the service department of two urban and suburban public transportation companies. The INAIL evaluation method (Check list and HSE indicator tool) was used. The GHQ-12 questionnaire, which is widely used to assess the level of psychological distress, was also employed. 81.9% of workers involved in the survey answered both the HSE indicator tool and the GHQ-12 questionnaire. The Check list evaluation showed an increase in quantifiable company stress indicators while close examination using the HSE indicator tool demonstrated critical situations for all the subscales, with the control subscales more problematic in bus drivers. The demand, manager's support, relationships and change subscales were most associated with psychological distress in bus drivers, while relationships, role, change and demand subscales were negatively related in workers of the service department.

  4. Stochastic convergence of renewable energy consumption in OECD countries: a fractional integration approach.

    PubMed

    Solarin, Sakiru Adebola; Gil-Alana, Luis Alberiko; Al-Mulali, Usama

    2018-04-13

    In this article, we have examined the hypothesis of convergence of renewable energy consumption in 27 OECD countries. However, instead of relying on classical techniques, which are based on the dichotomy between stationarity I(0) and nonstationarity I(1), we consider a more flexible approach based on fractional integration. We employ both parametric and semiparametric techniques. Using parametric methods, evidence of convergence is found in the cases of Mexico, Switzerland and Sweden along with the USA, Portugal, the Czech Republic, South Korea and Spain, and employing semiparametric approaches, we found evidence of convergence in all these eight countries along with Australia, France, Japan, Greece, Italy and Poland. For the remaining 13 countries, even though the orders of integration of the series are smaller than one in all cases except Germany, the confidence intervals are so wide that we cannot reject the hypothesis of unit roots thus not finding support for the hypothesis of convergence.

  5. Noise Estimation in Electroencephalogram Signal by Using Volterra Series Coefficients

    PubMed Central

    Hassani, Malihe; Karami, Mohammad Reza

    2015-01-01

    The Volterra model is widely used for nonlinearity identification in practical applications. In this paper, we employed the Volterra model to find the nonlinear relation between the electroencephalogram (EEG) signal and the noise, which is a novel approach to estimating noise in EEG signals. We show that by employing this method we can considerably improve the signal-to-noise ratio, by a factor of at least 1.54. An important issue in implementing the Volterra model is its computational complexity, especially when the degree of nonlinearity is increased, so in many applications it is important to reduce the computational cost. In this paper, we exploit a property of the EEG signal and propose a new approximation of the delayed input signal by its adjacent samples in order to reduce the computation required to find the Volterra series coefficients. The computational complexity is reduced by a ratio of at least 1/3 when the filter memory is 3. PMID:26284176
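
    For concreteness, a small sketch of fitting truncated (second-order) Volterra series coefficients by ordinary least squares. The memory length, model order and synthetic signals are assumptions, and the sketch does not include the adjacent-sample approximation proposed in the paper:

        import numpy as np

        def volterra_design(x, memory=3):
            """Build the regression matrix of linear and quadratic lagged terms."""
            n = len(x)
            lags = np.column_stack([x[memory - 1 - k: n - k] for k in range(memory)])
            quads = [lags[:, i] * lags[:, j]
                     for i in range(memory) for j in range(i, memory)]
            return np.column_stack([lags] + quads)

        # Synthetic example: a signal passed through a known quadratic nonlinearity.
        rng = np.random.default_rng(6)
        x = rng.normal(size=2000)
        y = 0.8 * x[2:] - 0.3 * x[1:-1] + 0.2 * x[2:] * x[:-2] + 0.05 * rng.normal(size=1998)

        H = volterra_design(x, memory=3)
        coeffs, *_ = np.linalg.lstsq(H, y, rcond=None)
        print(coeffs.round(2))    # recovered first- and second-order kernel coefficients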

  6. Analyzing the Heterogeneous Hierarchy of Cultural Heritage Materials: Analytical Imaging.

    PubMed

    Trentelman, Karen

    2017-06-12

    Objects of cultural heritage significance are created using a wide variety of materials, or mixtures of materials, and often exhibit heterogeneity on multiple length scales. The effective study of these complex constructions thus requires the use of a suite of complementary analytical technologies. Moreover, because of the importance and irreplaceability of most cultural heritage objects, researchers favor analytical techniques that can be employed noninvasively, i.e., without having to remove any material for analysis. As such, analytical imaging has emerged as an important approach for the study of cultural heritage. Imaging technologies commonly employed, from the macroscale through the micro- to nanoscale, are discussed with respect to how the information obtained helps us understand artists' materials and methods, the cultures in which the objects were created, how the objects may have changed over time, and importantly, how we may develop strategies for their preservation.

  7. Wide Linear Corticotomy and Anterior Segmental Osteotomy Under Local Anesthesia Combined Corticision for Correcting Severe Anterior Protrusion With Insufficient Alveolar Housing.

    PubMed

    Noh, Min-Ki; Lee, Baek-Soo; Kim, Shin-Yeop; Jeon, Hyeran Helen; Kim, Seong-Hun; Nelson, Gerald

    2017-11-01

    This article presents an alternative surgical treatment method to correct a severe anterior protrusion in an adult patient with an extremely thin alveolus. To accomplish an effective and efficient anterior segmental retraction without periodontal complications, the authors performed, under local anesthesia, a wide linear corticotomy and corticision in the maxilla and an anterior segmental osteotomy in the mandible. In the maxilla, a wide section of cortical bone was removed in the first premolar area. Retraction forces were applied buccolingually with the aid of temporary skeletal anchorage devices. Corticision was later performed to close the residual extraction space. In the mandible, an anterior segmental osteotomy was performed and the first premolars were extracted under local anesthesia. In the maxilla, the wide linear corticotomy facilitated a bony block movement with temporary skeletal anchorage devices, without complications. The remaining extraction space after the bony block movement was closed effectively, accelerated by corticision. In the mandible, anterior segmental retraction was facilitated by the anterior segmental osteotomy performed under local anesthesia. Corticision was later employed to accelerate individual tooth movements. A wide linear corticotomy and an anterior segmental osteotomy combined with corticision can be an effective and efficient alternative to conventional orthodontic treatment in bialveolar protrusion patients with extremely thin alveolar housing.

  8. Parental employment, family structure, and child's health insurance.

    PubMed

    Rolett, A; Parker, J D; Heck, K E; Makuc, D M

    2001-01-01

    To examine the impact of family structure on the relationship between parental employment characteristics and employer-sponsored health insurance coverage among children with employed parents in the United States. National Health Interview Survey data for 1993-1995 were used to estimate the proportions of children without employer-sponsored health insurance, by family structure, separately according to maternal and paternal employment characteristics. In addition, relative odds of being without employer-sponsored insurance were estimated, controlling for family structure and child's age, race, and poverty status. Children with 2 employed parents were more likely to have employer-sponsored health insurance coverage than children with 1 employed parent, even among children in 2-parent families. However, among children with employed parents, the percentage with employer-sponsored health insurance coverage varied widely, depending on hours worked, employment sector, occupation, industry, and firm size. Employer-sponsored health insurance coverage for children is extremely variable, depending on the employment characteristics and marital status of the parents.

  9. A nonrecursive 'Order N' preconditioned conjugate gradient/range space formulation of MDOF dynamics

    NASA Technical Reports Server (NTRS)

    Kurdila, A. J.; Menon, R.; Sunkel, John

    1991-01-01

    This paper addresses the requirements of present-day mechanical system simulations for algorithms that induce parallelism on a fine scale and for transient simulation methods that are automatically load balancing for a wide collection of system topologies and hardware configurations. To this end, a combined range space/preconditioned conjugate gradient formulation of multidegree-of-freedom dynamics is developed, which, by employing regular ordering of the system connectivity graph, makes it possible to derive an extremely efficient preconditioner from the range space metric (as opposed to the system coefficient matrix). Because of the effectiveness of the preconditioner, the method can achieve performance rates that depend linearly on the number of substructures. The method, termed 'Order N', does not require the assembly of system mass or stiffness matrices, and is therefore amenable to implementation on workstations. Using this method, a 13-substructure model of the Space Station was constructed.
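
    The range-space metric preconditioner itself is specific to the paper, but the surrounding iteration is the standard matrix-free preconditioned conjugate gradient loop sketched below; the operator and preconditioner callables are placeholders into which such a preconditioner would plug.

        import numpy as np

        def pcg(apply_A, b, apply_Minv, tol=1e-8, max_iter=200):
            """Generic matrix-free preconditioned conjugate gradient for A x = b.
            apply_A and apply_Minv are caller-supplied operators (placeholders);
            in an 'Order N' setting apply_Minv would encode the range-space
            metric preconditioner."""
            b = np.asarray(b, dtype=float)
            x = np.zeros_like(b)
            r = b - apply_A(x)
            z = apply_Minv(r)
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = apply_A(p)
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) <= tol * np.linalg.norm(b):
                    break
                z = apply_Minv(r)
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x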

  10. Accessible methods for the dynamic time-scale decomposition of biochemical systems.

    PubMed

    Surovtsova, Irina; Simus, Natalia; Lorenz, Thomas; König, Artjom; Sahle, Sven; Kummer, Ursula

    2009-11-01

    The growing complexity of biochemical models calls for means to rationally dissect the networks into meaningful and rather independent subnetworks. Such a dissection should ensure an understanding of the system without any heuristics being employed. Important for the success of such an approach are its accessibility and the clarity of the presentation of the results. In order to achieve this goal, we developed a method which is a modification of the classical approach of time-scale separation. This modified method, as well as the more classical approach, has been implemented for time-dependent application within the widely used software COPASI. The implementation includes different possibilities for the representation of the results, including 3D visualization. Availability: the methods are included in COPASI, which is free for academic use and available at www.copasi.org. Contact: irina.surovtsova@bioquant.uni-heidelberg.de. Supplementary data are available at Bioinformatics online.

  11. A simple, multidimensional approach to high-throughput discovery of catalytic reactions.

    PubMed

    Robbins, Daniel W; Hartwig, John F

    2011-09-09

    Transition metal complexes catalyze many important reactions that are employed in medicine, materials science, and energy production. Although high-throughput methods for the discovery of catalysts that would mirror related approaches for the discovery of medicinally active compounds have been the focus of much attention, these methods have not been sufficiently general or accessible to typical synthetic laboratories to be adopted widely. We report a method to evaluate a broad range of catalysts for potential coupling reactions with the use of simple laboratory equipment. Specifically, we screen an array of catalysts and ligands with a diverse mixture of substrates and then use mass spectrometry to identify reaction products that, by design, exceed the mass of any single substrate. With this method, we discovered a copper-catalyzed alkyne hydroamination and two nickel-catalyzed hydroarylation reactions, each of which displays excellent functional-group tolerance.

  12. MATLAB algorithm to implement soil water data assimilation with the Ensemble Kalman Filter using HYDRUS.

    PubMed

    Valdes-Abellan, Javier; Pachepsky, Yakov; Martinez, Gonzalo

    2018-01-01

    Data assimilation is becoming a promising technique in hydrologic modelling to update not only model states but also to infer model parameters, specifically soil hydraulic properties in Richards-equation-based soil water models. The Ensemble Kalman Filter (EnKF) is one of the most widely employed methods among the different data assimilation alternatives. In this study, the complete MATLAB© code used to study soil data assimilation efficiency under different soil and climatic conditions is shown. The code shows how data assimilation through the EnKF was implemented. The Richards equation was solved with the Hydrus-1D software, which was run from MATLAB. Highlights:
    • MATLAB routines are released to be used/modified without restrictions by other researchers.
    • Data assimilation Ensemble Kalman Filter method code.
    • Soil water Richards equation flow solved by Hydrus-1D.
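
    The released routines are in MATLAB and coupled to Hydrus-1D; purely as a hedged illustration of the analysis step they implement, a stochastic EnKF update with a linear observation operator can be sketched as follows (the forecast step, i.e., running the Richards-equation model between updates, is not shown).

        import numpy as np

        def enkf_update(ensemble, obs, H, obs_err_std):
            """Stochastic EnKF analysis step (hedged sketch).
            ensemble: (n_state, n_members) prior states, possibly augmented with
            soil hydraulic parameters; H: (n_obs, n_state) linear obs operator."""
            n_state, n_mem = ensemble.shape
            obs = np.asarray(obs, dtype=float)
            A = ensemble - ensemble.mean(axis=1, keepdims=True)   # anomalies
            HA = H @ A
            P_hh = HA @ HA.T / (n_mem - 1) + (obs_err_std ** 2) * np.eye(len(obs))
            P_xh = A @ HA.T / (n_mem - 1)
            K = P_xh @ np.linalg.inv(P_hh)                        # Kalman gain
            # perturbed observations, one realization per ensemble member
            obs_pert = obs[:, None] + obs_err_std * np.random.randn(len(obs), n_mem)
            return ensemble + K @ (obs_pert - H @ ensemble)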

  13. Database and new models based on a group contribution method to predict the refractive index of ionic liquids.

    PubMed

    Wang, Xinxin; Lu, Xingmei; Zhou, Qing; Zhao, Yongsheng; Li, Xiaoqian; Zhang, Suojiang

    2017-08-02

    Refractive index is an important physical property that is widely used in separation and purification. In this study, refractive index data of ILs were collected to establish a comprehensive database, which included about 2138 data points published from 1996 to 2014. A Group Contribution-Artificial Neural Network (GC-ANN) model and a Group Contribution (GC) method were employed to predict the refractive index of ILs at temperatures from 283.15 K to 368.15 K. The average absolute relative deviations (AARD) of the GC-ANN model and the GC method were 0.179% and 0.628%, respectively. The results showed that the GC-ANN model provided an effective way to estimate the refractive index of ILs, whereas the GC method was simple and extensive. In summary, both models are accurate and efficient approaches for estimating refractive indices of ILs.

  14. Computational plasticity algorithm for particle dynamics simulations

    NASA Astrophysics Data System (ADS)

    Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

    2018-01-01

    The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.

  15. Automated variance reduction for MCNP using deterministic methods.

    PubMed

    Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B

    2005-01-01

    In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
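
    As a sketch of how an adjoint (importance) flux is typically turned into mesh-based weight windows, the CADIS-style normalization below can be used; the exact normalization and energy treatment in the MCNP5 capability may differ, so treat this only as an illustration.

        import numpy as np

        def weight_windows_from_adjoint(adjoint_flux, source_cell, beta=5.0):
            """CADIS-style mesh weight windows from a deterministic adjoint flux
            (hedged sketch). Normalized so that a weight-1 source particle is born
            at the window centre; beta is the assumed upper-to-lower bound ratio."""
            phi = np.asarray(adjoint_flux, dtype=float)
            w_centre = phi[source_cell] / phi       # target weight ~ 1/importance
            w_low = 2.0 * w_centre / (beta + 1.0)   # lower window bound per mesh cell
            w_high = beta * w_low                   # upper window bound
            return w_low, w_high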

  16. Approximating high-dimensional dynamics by barycentric coordinates with linear programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirata, Yoshito, E-mail: yoshito@sat.t.u-tokyo.ac.jp; Aihara, Kazuyuki; Suzuki, Hideyuki

    The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability for accurate modeling and predictions. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to be fitted for relatively short high-dimensional time series observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends the barycentric coordinates to high-dimensional phase space by employing linear programming, and allowing the approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the radial basis function model that is widely used. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, and to comprehensively understanding complex biological data.

  17. Multibody model reduction by component mode synthesis and component cost analysis

    NASA Technical Reports Server (NTRS)

    Spanos, J. T.; Mingori, D. L.

    1990-01-01

    The classical assumed-modes method is widely used in modeling the dynamics of flexible multibody systems. According to the method, the elastic deformation of each component in the system is expanded in a series of spatial and temporal functions known as modes and modal coordinates, respectively. This paper focuses on the selection of component modes used in the assumed-modes expansion. A two-stage component modal reduction method is proposed combining Component Mode Synthesis (CMS) with Component Cost Analysis (CCA). First, each component model is truncated such that the contribution of the high frequency subsystem to the static response is preserved. Second, a new CMS procedure is employed to assemble the system model and CCA is used to further truncate component modes in accordance with their contribution to a quadratic cost function of the system output. The proposed method is demonstrated with a simple example of a flexible two-body system.

  18. Approximating high-dimensional dynamics by barycentric coordinates with linear programming.

    PubMed

    Hirata, Yoshito; Shiro, Masanori; Takahashi, Nozomu; Aihara, Kazuyuki; Suzuki, Hideyuki; Mas, Paloma

    2015-01-01

    The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability for accurate modeling and predictions. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to be fitted for relatively short high-dimensional time series observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends the barycentric coordinates to high-dimensional phase space by employing linear programming, and allowing the approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the radial basis function model that is widely used. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, and to comprehensively understanding complex biological data.
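
    The core of the approach, finding convex barycentric weights over neighboring delay vectors while penalizing explicit approximation errors, can be posed as a small linear program; the sketch below (variable names, the error penalty, and the use of scipy's linprog are assumptions) shows one way to set it up.

        import numpy as np
        from scipy.optimize import linprog

        def barycentric_weights(query, neighbors, err_cost=1.0):
            """Convex (barycentric) weights w >= 0, sum(w) = 1 over neighboring
            delay vectors that reproduce `query`, with explicit approximation
            errors e+ and e- penalized in the objective. Hypothetical layout:
            `neighbors` is a (d, k) matrix with one neighbor per column."""
            d, k = neighbors.shape
            # decision variables: [w (k), e_plus (d), e_minus (d)]
            c = np.concatenate([np.zeros(k), err_cost * np.ones(2 * d)])
            A_eq = np.zeros((d + 1, k + 2 * d))
            A_eq[:d, :k] = neighbors
            A_eq[:d, k:k + d] = np.eye(d)      # + error slack
            A_eq[:d, k + d:] = -np.eye(d)      # - error slack
            A_eq[d, :k] = 1.0                  # weights sum to one
            b_eq = np.concatenate([query, [1.0]])
            res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                          bounds=[(0, None)] * (k + 2 * d), method="highs")
            w = res.x[:k]
            # a one-step free-running prediction would then be: next_neighbors @ w
            return w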

  19. Automated detection of age-related macular degeneration in OCT images using multiple instance learning

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Liu, Xiaoming; Yang, Zhou

    2017-07-01

    Age-related Macular Degeneration (AMD) is a kind of macular disease which mostly occurs in old people,and it may cause decreased vision or even lead to permanent blindness. Drusen is an important clinical indicator for AMD which can help doctor diagnose disease and decide the strategy of treatment. Optical Coherence Tomography (OCT) is widely used in the diagnosis of ophthalmic diseases, include AMD. In this paper, we propose a classification method based on Multiple Instance Learning (MIL) to detect AMD. Drusen can exist in a few slices of OCT images, and MIL is utilized in our method. We divided the method into two phases: training phase and testing phase. We train the initial features and clustered to create a codebook, and employ the trained classifier in the test set. Experiment results show that our method achieved high accuracy and effectiveness.

  20. Wide wavelength range tunable one-dimensional silicon nitride nano-grating guided mode resonance filter based on azimuthal rotation

    NASA Astrophysics Data System (ADS)

    Yukino, Ryoji; Sahoo, Pankaj K.; Sharma, Jaiyam; Takamura, Tsukasa; Joseph, Joby; Sandhu, Adarsh

    2017-01-01

    We describe wavelength tuning in a one dimensional (1D) silicon nitride nano-grating guided mode resonance (GMR) structure under conical mounting configuration of the device. When the GMR structure is rotated about the axis perpendicular to the surface of the device (azimuthal rotation) for light incident at oblique angles, the conditions for resonance are different than for conventional GMR structures under classical mounting. These resonance conditions enable tuning of the GMR peak position over a wide range of wavelengths. We experimental demonstrate tuning over a range of 375 nm between 500 nm˜875 nm. We present a theoretical model to explain the resonance conditions observed in our experiments and predict the peak positions with show excellent agreement with experiments. Our method for tuning wavelengths is simpler and more efficient than conventional procedures that employ variations in the design parameters of structures or conical mounting of two-dimensional (2D) GMR structures and enables a single 1D GMR device to function as a high efficiency wavelength filter over a wide range of wavelengths. We expect tunable filters based on this technique to be applicable in a wide range of fields including astronomy and biomedical imaging.

  1. Protocol matters: which methylome are you actually studying?

    PubMed Central

    Robinson, Mark D; Statham, Aaron L; Speed, Terence P; Clark, Susan J

    2011-01-01

    The field of epigenetics is now capitalizing on the vast number of emerging technologies, largely based on second-generation sequencing, which interrogate DNA methylation status and histone modifications genome-wide. However, getting an exhaustive and unbiased view of a methylome at a reasonable cost is proving to be a significant challenge. In this article, we take a closer look at the impact of the DNA sequence and bias effects introduced to datasets by genome-wide DNA methylation technologies and, where possible, explore the bioinformatics tools that deconvolve them. There remains much to be learned about the performance of genome-wide technologies, the data we mine from these assays and how it reflects the actual biology. While there are several methods to interrogate the DNA methylation status genome-wide, our opinion is that no single technique suitably covers the minimum criteria of high coverage and high resolution at a reasonable cost. In fact, the fraction of the methylome that is studied currently depends entirely on the inherent biases of the protocol employed. There is promise for this to change, as the third generation of sequencing technologies is expected to again ‘revolutionize’ the way that we study genomes and epigenomes. PMID:21566704

  2. An improved set of standards for finding cost for cost-effectiveness analysis.

    PubMed

    Barnett, Paul G

    2009-07-01

    Guidelines have helped standardize methods of cost-effectiveness analysis, allowing different interventions to be compared and enhancing the generalizability of study findings. There is agreement that all relevant services be valued from the societal perspective using a long-term time horizon and that more exact methods be used to cost services most affected by the study intervention. Guidelines are not specific enough with respect to costing methods, however. The literature was reviewed to identify the problems associated with the 4 principal methods of cost determination. Microcosting requires direct measurement and is ordinarily reserved to cost novel interventions. Analysts should include nonwage labor cost, person-level and institutional overhead, and the cost of development, set-up activities, supplies, space, and screening. Activity-based cost systems have promise of finding accurate costs of all services provided, but are not widely adopted. Quality must be evaluated and the generalizability of cost estimates to other settings must be considered. Administrative cost estimates, chiefly cost-adjusted charges, are widely used, but the analyst must consider items excluded from the available system. Gross costing methods determine quantity of services used and employ a unit cost. If the intervention will affect the characteristics of a service, the method should not assume that the service is homogeneous. Questions are posed for future reviews of the quality of costing methods. The analyst must avoid inappropriate assumptions, especially those that bias the analysis by exclusion of costs that are affected by the intervention under study.

  3. Development of an Inverse Algorithm for Resonance Inspection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Canhai; Xu, Wei; Sun, Xin

    2012-10-01

    Resonance inspection (RI), which employs the natural frequency spectra shift between the good and the anomalous part populations to detect defects, is a non-destructive evaluation (NDE) technique with many advantages, such as low inspection cost, high testing speed, and broad applicability to structures with complex geometry, compared to other contemporary NDE methods. It has already been widely used in the automobile industry for quality inspections of safety-critical parts. Unlike some conventionally used NDE methods, the current RI technology is unable to provide details, i.e., location, dimension, or type, of the flaws in the discrepant parts. This limitation severely hinders its widespread application and further development. In this study, an inverse RI algorithm based on a maximum correlation function is proposed to quantify the location and size of flaws in a discrepant part. Dog-bone shaped stainless steel samples with and without controlled flaws are used for algorithm development and validation. The results show that multiple flaws can be accurately pinpointed using the algorithms developed, and that the prediction accuracy decreases with increasing flaw number and decreasing distance between flaws.
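
    The abstract does not give the precise maximum-correlation formulation, so the sketch below only illustrates the general idea: match a measured resonance-frequency-shift vector against a precomputed library of simulated flaw signatures and report the best-correlated candidate. The library and its keys are hypothetical.

        import numpy as np

        def locate_flaw(measured_shift, signature_library):
            """Pick the candidate flaw whose simulated frequency-shift signature
            correlates best with the measured shift vector. signature_library is
            a hypothetical dict mapping (location, size) -> signature array."""
            best_key, best_corr = None, -np.inf
            for key, signature in signature_library.items():
                corr = np.corrcoef(measured_shift, signature)[0, 1]
                if corr > best_corr:
                    best_key, best_corr = key, corr
            return best_key, best_corr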

  4. Calculation of total and ionization cross sections for electron scattering by primary benzene compounds

    NASA Astrophysics Data System (ADS)

    Singh, Suvam; Naghma, Rahla; Kaur, Jaspreet; Antony, Bobby

    2016-07-01

    The total and ionization cross sections for electron scattering by benzene, halobenzenes, toluene, aniline, and phenol are reported over a wide energy domain. The multi-scattering centre spherical complex optical potential method has been employed to find the total elastic and inelastic cross sections. The total ionization cross section is estimated from total inelastic cross section using the complex scattering potential-ionization contribution method. In the present article, the first theoretical calculations for electron impact total and ionization cross section have been performed for most of the targets having numerous practical applications. A reasonable agreement is obtained compared to existing experimental observations for all the targets reported here, especially for the total cross section.

  5. Two-Relaxation-Time Lattice Boltzmann Method for Advective-Diffusive-Reactive Transport

    NASA Astrophysics Data System (ADS)

    Yan, Z.; Hilpert, M.

    2016-12-01

    The lattice Boltzmann method (LBM) has been applied to study a wide range of reactive transport problems in porous and fractured media. The single-relaxation-time (SRT) LBM, which employs a single relaxation time, is the most popular LBM because it is simple to understand and implement. Nevertheless, the SRT LBM may suffer from numerical instability for small values of the relaxation time. By contrast, the multiple-relaxation-time (MRT) LBM, which employs multiple relaxation times, can improve numerical stability through tuning of those relaxation times, but the complexity of implementing this method restricts its applications. The two-relaxation-time (TRT) LBM, which employs two relaxation times, combines the advantages of the SRT and MRT LBMs: it can produce simulations with better accuracy and stability than the SRT method, and is easier to implement than the MRT method. This work evaluated the numerical accuracy and stability of the TRT method by comparing simulation results with analytical solutions for Gaussian hill transport and Taylor dispersion under different advective velocities. The accuracy generally increased with the tunable relaxation time τ, while the stability first increased and then decreased as τ increased, indicating an intermediate τ that gives the best numerical stability. The free selection of τ enabled the TRT LBM to simulate Gaussian hill transport and Taylor dispersion under relatively high advective velocity, where the SRT LBM suffered from numerical instability. Finally, the TRT method was applied to study contaminant degradation by chemotactic microorganisms in porous media, which served as a representative reactive transport problem in this study, and it predicted well the evolution of microorganisms and degradation of contaminants for different transport scenarios. To sum up, the TRT LBM produced simulation results with good accuracy and stability for various advective-diffusive-reactive transport problems through tuning of the relaxation time τ, illustrating its potential to study various biogeochemical processes in the subsurface environment.
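
    To make the TRT idea concrete, a minimal D2Q9 TRT collision step is sketched below: the non-equilibrium distribution is split into symmetric and antisymmetric parts, each relaxed with its own rate. A standard quadratic equilibrium is assumed here; an advection-diffusion scheme as in the study would typically use a simpler equilibrium for the scalar field.

        import numpy as np

        # D2Q9 lattice: discrete velocities, weights, and opposite-direction index
        C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])
        W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
        OPP = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])

        def trt_collision(f, rho, u, tau_plus, tau_minus):
            """TRT collision: symmetric (+) and antisymmetric (-) parts of the
            non-equilibrium distribution relax with separate rates.
            f: (9, nx, ny); rho: (nx, ny); u: (2, nx, ny)."""
            cu = np.tensordot(C, u, axes=([1], [0]))          # (9, nx, ny)
            usq = np.sum(u * u, axis=0)
            feq = W[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
            f_plus, f_minus = 0.5 * (f + f[OPP]), 0.5 * (f - f[OPP])
            feq_plus, feq_minus = 0.5 * (feq + feq[OPP]), 0.5 * (feq - feq[OPP])
            return (f - (f_plus - feq_plus) / tau_plus
                      - (f_minus - feq_minus) / tau_minus)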

  6. Accurate small and wide angle x-ray scattering profiles from atomic models of proteins and nucleic acids

    NASA Astrophysics Data System (ADS)

    Nguyen, Hung T.; Pabit, Suzette A.; Meisburger, Steve P.; Pollack, Lois; Case, David A.

    2014-12-01

    A new method is introduced to compute X-ray solution scattering profiles from atomic models of macromolecules. The three-dimensional version of the Reference Interaction Site Model (RISM) from liquid-state statistical mechanics is employed to compute the solvent distribution around the solute, including both water and ions. X-ray scattering profiles are computed from this distribution together with the solute geometry. We describe an efficient procedure for performing this calculation employing a Lebedev grid for the angular averaging. The intensity profiles (which involve no adjustable parameters) match experiment and molecular dynamics simulations up to wide angle for two proteins (lysozyme and myoglobin) in water, as well as the small-angle profiles for a dozen biomolecules taken from the BioIsis.net database. The RISM model is especially well-suited for studies of nucleic acids in salt solution. Use of fiber-diffraction models for the structure of duplex DNA in solution yields close agreement with the observed scattering profiles in both the small and wide angle scattering (SAXS and WAXS) regimes. In addition, computed profiles of anomalous SAXS signals (for Rb+ and Sr2+) emphasize the ionic contribution to scattering and are in reasonable agreement with experiment. In cases where an absolute calibration of the experimental data at q = 0 is available, one can extract a count of the excess number of waters and ions; computed values depend on the closure that is assumed in the solution of the Ornstein-Zernike equations, with results from the Kovalenko-Hirata closure being closest to experiment for the cases studied here.

  7. Health Status After Cancer: Does It Matter Which Hospital You Belong To?

    PubMed Central

    2010-01-01

    Background Survival rates are widely used to compare the quality of cancer care. However, the extent to which cancer survivors regain full physical or cognitive functioning is not captured by this statistic. To address this concern we introduce post-diagnosis employment as a supplemental measure of the quality of cancer care. Methods This study is based on individual level data from the Norwegian Cancer Registry (n = 46,720) linked with data on labor market outcomes and socioeconomic status from Statistics Norway. We study variation across Norwegian hospital catchment areas (n = 55) with respect to survival and employment five years after cancer diagnosis. To handle the selection problem, we exploit the fact that cancer patients in Norway (until 2001) have been allocated to local hospitals based on their place of residence. Results We document substantial differences across catchment areas with respect to patients' post-diagnosis employment rates. Conventional quality indicators based on survival rates indicate smaller differences. The two sets of indicators are only moderately correlated. Conclusions This analysis shows that indicators based on survival and post-diagnosis employment may capture different parts of the health status distribution, and that using only one of them to capture quality of care may be insufficient. PMID:20626866

  8. Occupational Characteristics of Adults with Pediatric-Onset Spinal Cord Injury

    PubMed Central

    Zebracki, Kathy; Vogel, Lawrence C.

    2015-01-01

    Background: Employment rates among individuals with spinal cord injury (SCI) are lower than in the general population, and little is known about the specific occupations in which they are employed. Objectives: To describe specific occupations of adults with pediatric-onset SCI using the 2010 Standard Occupational Classification (SOC) system and to determine associations between SOC occupations and demographic factors. Methods: Cross-sectional data specific to education and employment were collected from the last interviews of a larger longitudinal study. Occupations were categorized according to the 2010 SOC system. SOC groups were compared by gender, level of injury, and final education. Results: Of the 461 total participants, 219 (47.5%) were employed, and specific occupations were available for 179. Among the SOC groups, Education, Law, Community Service, Arts, and Media Occupations were most prevalent (30.2%), followed by Management, Business, and Finance Occupations (21.1%); Computer, Engineering, and Science Occupations (10.6%); Administrative and Office Support Occupations (10.0%); Service Occupations (7.3%); Healthcare Practitioners and Technical Occupations (3.9%); and Production Occupations (3.4%). Differences were found in the distribution of SOC groups between gender, level-of-injury, and final-education groups. Conclusion: A wide variety of occupations were reported in adults with pediatric-onset SCI, generally in concordance with final education and functional ability levels. PMID:25762856

  9. Polychlorinated biphenyl exposure, diabetes and endogenous hormones: a cross-sectional study in men previously employed at a capacitor manufacturing plant

    PubMed Central

    2012-01-01

    Background Studies have shown associations of diabetes and endogenous hormones with exposure to a wide variety of organochlorines. We have previously reported positive associations of polychlorinated biphenyls (PCBs) and inverse associations of selected steroid hormones with diabetes in postmenopausal women previously employed in a capacitor manufacturing plant. Methods This paper examines associations of PCBs with diabetes and endogenous hormones in 63 men previously employed at the same plant who in 1996 underwent surveys of their exposure and medical history and collection of bloods and urine for measurements of PCBs, lipids, liver function, hematologic markers and endogenous hormones. Results PCB exposure was positively associated with diabetes and age and inversely associated with thyroid stimulating hormone and triiodothyronine-uptake. History of diabetes was significantly related to total PCBs and all PCB functional groupings, but not to quarters worked and job score, after control for potential confounders. None of the exposures were related to insulin resistance (HOMA-IR) in non-diabetic men. Conclusions Associations of PCBs with specific endogenous hormones differ in some respects from previous findings in postmenopausal women employed at the capacitor plant. Results from this study, however, do confirm previous reports relating PCB exposure to diabetes and suggest that these associations are not mediated by measured endogenous hormones. PMID:22931295

  10. Unraveling the Affordances of "Silas Marner" in a Japanese University EFL Context

    ERIC Educational Resources Information Center

    Canning, Nicholas Alexander; Nelson, Mark Evan

    2018-01-01

    Graded readers, simplified versions of literature and other texts at graduated levels of difficulty, are widely employed in contexts of foreign language pedagogy and are widely considered to be a form of written-language input ostensibly suitable for a wide array of developmental stages. However, the efficacy of graded readers is not unchallenged,…

  11. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.

  12. Application of Canonical Effective Methods to Background-Independent Theories

    NASA Astrophysics Data System (ADS)

    Buyukcam, Umut

    Effective formalisms play an important role in analyzing phenomena above some given length scale when complete theories are not accessible. In diverse exotic but physically important cases, the usual path-integral techniques used in a standard Quantum Field Theory approach seldom serve as adequate tools. This thesis presents a new effective method for quantum systems, called the Canonical Effective Method, which has particularly wide applicability in background-independent theories, as in the case of gravitational phenomena. The central purpose of this work is to employ these techniques to obtain semi-classical dynamics from canonical quantum gravity theories. An application to non-associative quantum mechanics is developed and testable results are obtained. Types of non-associative algebras relevant for magnetic-monopole systems are discussed. Possible modifications of the hypersurface deformation algebra and the emergence of effective space-times are presented.

  13. A simple method to extract DNA from hair shafts using enzymatic laundry powder.

    PubMed

    Guan, Zheng; Zhou, Yu; Liu, Jinchuan; Jiang, Xiaoling; Li, Sicong; Yang, Shuming; Chen, Ailiang

    2013-01-01

    A simple method to extract DNA from hair shafts was developed by using enzymatic laundry powder in the first step of the process. The whole extraction can be finished in less than 2 hours. The simple extraction reagent proposed here contains only two cheap components: ordinary enzymatic laundry powder and PCR buffer. After extraction, an ultra-sensitive fluorescent nucleic acid stain, PicoGreen, was used to quantify the trace amounts of double-stranded DNA in the extracted solution. For further validation of the DNA extraction, four primers were employed to amplify DNA microsatellite loci. Both fluorescence spectroscopy and PCR results suggested that this method can extract DNA from hair shafts with good efficiency and repeatability. The study will greatly facilitate the future use of hair shafts for DNA analyses on a genome-wide scale.

  14. Dual-wavelength digital holographic imaging with phase background subtraction

    NASA Astrophysics Data System (ADS)

    Khmaladze, Alexander; Matz, Rebecca L.; Jasensky, Joshua; Seeley, Emily; Holl, Mark M. Banaszak; Chen, Zhan

    2012-05-01

    Three-dimensional digital holographic microscopic phase imaging of objects that are thicker than the wavelength of the imaging light is ambiguous and results in phase wrapping. In recent years, several unwrapping methods that employed two or more wavelengths were introduced. These methods compare the phase information obtained from each of the wavelengths and extend the range of unambiguous height measurements. A straightforward dual-wavelength phase imaging method is presented which allows for a flexible tradeoff between the maximum height of the sample and the amount of noise the method can tolerate. For highly accurate phase measurements, phase unwrapping of objects with heights higher than the beat (synthetic) wavelength (i.e. the product of the original two wavelengths divided by their difference), can be achieved. Consequently, three-dimensional measurements of a wide variety of biological systems and microstructures become technically feasible. Additionally, an effective method of removing phase background curvature based on slowly varying polynomial fitting is proposed. This method allows accurate volume measurements of several small objects with the same image frame.
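
    The beat-wavelength idea can be illustrated with a short sketch that converts two wrapped phase maps into a coarse height map; a reflection geometry (factor of 2 in the optical path) is assumed, and the fine-phase correction and background-curvature removal described in the record are not shown.

        import numpy as np

        def dual_wavelength_height(phi1, phi2, lam1, lam2):
            """Coarse height map from two wrapped phase maps (radians) measured at
            wavelengths lam1 and lam2; valid for heights up to the beat wavelength."""
            lam_beat = lam1 * lam2 / abs(lam1 - lam2)   # synthetic (beat) wavelength
            dphi = np.mod(phi1 - phi2, 2.0 * np.pi)     # wrapped phase difference
            return dphi * lam_beat / (4.0 * np.pi)      # h = delta_phi * Lambda / (4 pi)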

  15. Solutions of the two-dimensional Hubbard model: Benchmarks and results from a wide range of numerical algorithms

    DOE PAGES

    LeBlanc, J. P. F.; Antipov, Andrey E.; Becca, Federico; ...

    2015-12-14

    Numerical results for ground-state and excited-state properties (energies, double occupancies, and Matsubara-axis self-energies) of the single-orbital Hubbard model on a two-dimensional square lattice are presented, in order to provide an assessment of our ability to compute accurate results in the thermodynamic limit. Many methods are employed, including auxiliary-field quantum Monte Carlo, bare and bold-line diagrammatic Monte Carlo, method of dual fermions, density matrix embedding theory, density matrix renormalization group, dynamical cluster approximation, diffusion Monte Carlo within a fixed-node approximation, unrestricted coupled cluster theory, and multireference projected Hartree-Fock methods. Comparison of results obtained by different methods allows for the identification of uncertainties and systematic errors. The importance of extrapolation to converged thermodynamic-limit values is emphasized. Furthermore, cases where agreement between different methods is obtained establish benchmark results that may be useful in the validation of new approaches and the improvement of existing methods.

  16. Re-Entry of Women to the Labour Market After an Interruption in Employment.

    ERIC Educational Resources Information Center

    Seear, B. N.

    The problems involved in the re-entry of women into employment were studied, and the extent to which there exists a demand for employment for re-entry women was examined. A growing number of women are seeking re-entry in a wide range of income levels. The demand for part-time work appears to exceed supply. Official machinery for assisting re-entry…

  17. Analytical methods applied to diverse types of Brazilian propolis

    PubMed Central

    2011-01-01

    Propolis is a bee product, composed mainly of plant resins and beeswax, and therefore its chemical composition varies with the geographic and plant origins of these resins, as well as with the species of bee. Brazil is an important supplier of propolis to the world market and, although the green-colored propolis from the southeast is the best known and most studied, several other types of propolis from Apis mellifera and native stingless bees (also called cerumen) can be found. Propolis is usually consumed as an extract, so the type of solvent and the extraction procedures employed further affect its composition. Methods used for extraction; for analysis of the percentages of resins, wax and insoluble material in crude propolis; and for determination of phenolic, flavonoid, amino acid and heavy metal contents are reviewed herein. Different chromatographic methods applied to the separation, identification and quantification of Brazilian propolis components, and their relative strengths, are discussed, as well as direct-insertion mass spectrometry fingerprinting. Propolis has been used as a popular remedy for several centuries for a wide array of ailments. Its antimicrobial properties, present in propolis from different origins, have been extensively studied. More recently, the anti-parasitic, anti-viral/immune-stimulating, healing, anti-tumor, anti-inflammatory, antioxidant and analgesic activities of diverse types of Brazilian propolis have been evaluated. The most common methods employed and overviews of their relative results are presented. PMID:21631940

  18. Online identification of chlorogenic acids, sesquiterpene lactones, and flavonoids in the Brazilian arnica Lychnophora ericoides Mart. (Asteraceae) leaves by HPLC-DAD-MS and HPLC-DAD-MS/MS and a validated HPLC-DAD method for their simultaneous analysis.

    PubMed

    Gobbo-Neto, Leonardo; Lopes, Norberto P

    2008-02-27

    Lychnophora ericoides Mart. (Asteraceae, Vernonieae) is a plant, endemic to Brazil, with occurrence restricted to the "cerrado" biome. Traditional medicine employs alcoholic and aqueous-alcoholic preparations of leaves from this species for the treatment of wounds, inflammation, and pain. Furthermore, leaves of L. ericoides are also widely used as flavorings for the Brazilian traditional spirit "cachaça". A method has been developed for the extraction and HPLC-DAD analysis of the secondary metabolites of L. ericoides leaves. This analytical method was validated with 11 secondary metabolites chosen to represent the different classes and polarities of secondary metabolites occurring in L. ericoides leaves, and good responses were obtained for each validation parameter analyzed. The same HPLC analytical method was also employed for online secondary metabolite identification by HPLC-DAD-MS and HPLC-DAD-MS/MS, leading to the identification of di-C-glucosylflavones, coumaroylglucosylflavonols, flavone, flavanones, flavonols, chalcones, goyazensolide, and eremantholide-type sesquiterpene lactones and positional isomeric series of chlorogenic acids possessing caffeic and/or ferulic moieties. Among the 52 chromatographic peaks observed, 36 were fully identified and 8 were attributed to compounds belonging to series of caffeoylferuloylquinic and diferuloylquinic acids that could not be individualized from each other.

  19. De novo assembly of the transcriptome of the non-model plant Streptocarpus rexii employing a novel heuristic to recover locus-specific transcript clusters.

    PubMed

    Chiara, Matteo; Horner, David S; Spada, Alberto

    2013-01-01

    De novo transcriptome characterization from Next Generation Sequencing data has become an important approach in the study of non-model plants. Despite notable advances in the assembly of short reads, the clustering of transcripts into unigene-like (locus-specific) clusters remains a somewhat neglected subject. Indeed, closely related paralogous transcripts are often merged into single clusters by current approaches. Here, a novel heuristic method for locus-specific clustering is compared to that implemented in the de novo assembler Oases, using the same initial transcript collections, derived from Arabidopsis thaliana and the developmental model Streptocarpus rexii. We show that the proposed approach improves cluster specificity in the A. thaliana dataset for which the reference genome is available. Furthermore, for the S. rexii data our filtered transcript collection matches a larger number of distinct annotated loci in reference genomes than the Oases set, while containing a reduced overall number of loci. A detailed discussion of advantages and limitations of our approach in processing de novo transcriptome reconstructions is presented. The proposed method should be widely applicable to other organisms, irrespective of the transcript assembly method employed. The S. rexii transcriptome is available as a sophisticated and augmented publicly available online database.

  20. Optimal Multiple Surface Segmentation With Shape and Context Priors

    PubMed Central

    Bai, Junjie; Garvin, Mona K.; Sonka, Milan; Buatti, John M.; Wu, Xiaodong

    2014-01-01

    Segmentation of multiple surfaces in medical images is a challenging problem, further complicated by the frequent presence of weak boundary evidence, large object deformations, and mutual influence between adjacent objects. This paper reports a novel approach to multi-object segmentation that incorporates both shape and context prior knowledge in a 3-D graph-theoretic framework to help overcome the stated challenges. We employ an arc-based graph representation to incorporate a wide spectrum of prior information through pair-wise energy terms. In particular, a shape-prior term is used to penalize local shape changes and a context-prior term is used to penalize local surface-distance changes from a model of the expected shape and surface distances, respectively. The globally optimal solution for multiple surfaces is obtained by computing a maximum flow in low-order polynomial time. The proposed method was validated on intraretinal layer segmentation of optical coherence tomography images and demonstrated statistically significant improvement of segmentation accuracy compared to our earlier graph-search method that was not utilizing shape and context priors. The mean unsigned surface positioning errors obtained by the conventional graph-search approach (6.30 ± 1.58 μm) was improved to 5.14 ± 0.99 μm when employing our new method with shape and context priors. PMID:23193309

  1. The Molecular Genetic Architecture of Self-Employment

    PubMed Central

    van der Loos, Matthijs J. H. M.; Rietveld, Cornelius A.; Eklund, Niina; Koellinger, Philipp D.; Rivadeneira, Fernando; Abecasis, Gonçalo R.; Ankra-Badu, Georgina A.; Baumeister, Sebastian E.; Benjamin, Daniel J.; Biffar, Reiner; Blankenberg, Stefan; Boomsma, Dorret I.; Cesarini, David; Cucca, Francesco; de Geus, Eco J. C.; Dedoussis, George; Deloukas, Panos; Dimitriou, Maria; Eiriksdottir, Guðny; Eriksson, Johan; Gieger, Christian; Gudnason, Vilmundur; Höhne, Birgit; Holle, Rolf; Hottenga, Jouke-Jan; Isaacs, Aaron; Järvelin, Marjo-Riitta; Johannesson, Magnus; Kaakinen, Marika; Kähönen, Mika; Kanoni, Stavroula; Laaksonen, Maarit A.; Lahti, Jari; Launer, Lenore J.; Lehtimäki, Terho; Loitfelder, Marisa; Magnusson, Patrik K. E.; Naitza, Silvia; Oostra, Ben A.; Perola, Markus; Petrovic, Katja; Quaye, Lydia; Raitakari, Olli; Ripatti, Samuli; Scheet, Paul; Schlessinger, David; Schmidt, Carsten O.; Schmidt, Helena; Schmidt, Reinhold; Senft, Andrea; Smith, Albert V.; Spector, Timothy D.; Surakka, Ida; Svento, Rauli; Terracciano, Antonio; Tikkanen, Emmi; van Duijn, Cornelia M.; Viikari, Jorma; Völzke, Henry; Wichmann, H. -Erich; Wild, Philipp S.; Willems, Sara M.; Willemsen, Gonneke; van Rooij, Frank J. A.; Groenen, Patrick J. F.; Uitterlinden, André G.; Hofman, Albert; Thurik, A. Roy

    2013-01-01

    Economic variables such as income, education, and occupation are known to affect mortality and morbidity, such as cardiovascular disease, and have also been shown to be partly heritable. However, very little is known about which genes influence economic variables, although these genes may have both a direct and an indirect effect on health. We report results from the first large-scale collaboration that studies the molecular genetic architecture of an economic variable, entrepreneurship, that was operationalized using self-employment, a widely-available proxy. Our results suggest that common SNPs when considered jointly explain about half of the narrow-sense heritability of self-employment estimated in twin data (σg²/σP² = 25%, h² = 55%). However, a meta-analysis of genome-wide association studies across sixteen studies comprising 50,627 participants did not identify genome-wide significant SNPs. 58 SNPs with p < 10⁻⁵ were tested in a replication sample (n = 3,271), but none replicated. Furthermore, a gene-based test shows that none of the genes that were previously suggested in the literature to influence entrepreneurship reveal significant associations. Finally, SNP-based genetic scores that use results from the meta-analysis capture less than 0.2% of the variance in self-employment in an independent sample (p ≥ 0.039). Our results are consistent with a highly polygenic molecular genetic architecture of self-employment, with many genetic variants of small effect. Although self-employment is a multi-faceted, heavily environmentally influenced, and biologically distal trait, our results are similar to those for other genetically complex and biologically more proximate outcomes, such as height, intelligence, personality, and several diseases. PMID:23593239

  2. Age Discrimination in Employment Act; retiree health benefits. Final rule.

    PubMed

    2007-12-26

    The Equal Employment Opportunity Commission is publishing this final rule so that employers may create, adopt, and maintain a wide range of retiree health plan designs, such as Medicare bridge plans and Medicare wrap-around plans, without violating the Age Discrimination in Employment Act of 1967 (ADEA). To address concerns that the ADEA may be construed to create an incentive for employers to eliminate or reduce retiree health benefits, EEOC is creating a narrow exemption from the prohibitions of the ADEA for the practice of coordinating employer-sponsored retiree health benefits with eligibility for Medicare or a comparable State health benefits program. The rule does not otherwise affect an employer's ability to offer health or other employment benefits to retirees, consistent with the law.

  3. Federal Policies and Programs to Expand Employment Services Among Individuals with Serious Mental Illnesses.

    PubMed

    Karakus, Mustafa; Riley, Jarnee; Goldman, Howard

    2017-05-01

    Previous studies suggest that providing employment services to individuals with serious mental illnesses can help them obtain competitive, real-world employment. However, these services are still not easily accessible to this population. This paper provides a brief summary of recent federal initiatives that may influence widespread implementation of employment services. While there is increasing recognition of the need to remove barriers and provide supported employment services to individuals with mental illnesses, widespread coordination across federal policies, financing, and regulatory changes is necessary to promote measurable and lasting effects on the broad availability of employment services for this population.

  4. Price returns efficiency of the Shanghai A-Shares

    NASA Astrophysics Data System (ADS)

    Long, Wang Jiang; Jaaman, Saiful Hafizah; Samsudin, Humaida Banu

    2014-06-01

    Beta, measured from the capital asset pricing model (CAPM), is the most widely used risk measure for estimating expected returns. In this paper, factors that influence Shanghai A-share stock returns based on the CAPM are explored and investigated. Price data of 312 companies listed on the Shanghai Stock Exchange (SSE) from 2000 to 2011 are investigated. This study employed the Fama-MacBeth cross-sectional method to avoid the weaknesses of traditional CAPM tests. In addition, this study improves the model by adjusting for missing data. The findings of this study confirm that systematic risk can explain the portfolios' returns in the China SSE stock market.
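
    A generic two-pass Fama-MacBeth estimate of the market risk premium looks roughly like the sketch below; portfolio formation and the missing-data adjustment used in the study are not reproduced, and the input layout is an assumption.

        import numpy as np

        def fama_macbeth(excess_returns, market_excess):
            """Two-pass Fama-MacBeth estimate of the market risk premium.
            excess_returns: (T, N) asset excess returns; market_excess: (T,)."""
            T, N = excess_returns.shape
            # Pass 1: time-series regressions give each asset's beta
            X = np.column_stack([np.ones(T), market_excess])
            betas = np.array([np.linalg.lstsq(X, excess_returns[:, i], rcond=None)[0][1]
                              for i in range(N)])
            # Pass 2: cross-sectional regression each period, then average the slopes
            Z = np.column_stack([np.ones(N), betas])
            lambdas = np.array([np.linalg.lstsq(Z, excess_returns[t], rcond=None)[0]
                                for t in range(T)])
            premia = lambdas.mean(axis=0)
            se = lambdas.std(axis=0, ddof=1) / np.sqrt(T)
            return premia, se   # [lambda_0, lambda_mkt] and Fama-MacBeth standard errors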

  5. Determination of calibration constants for the hole-drilling residual stress measurement technique applied to orthotropic composites. I - Theoretical considerations

    NASA Technical Reports Server (NTRS)

    Prasad, C. B.; Prabhakaran, R.; Tompkins, S.

    1987-01-01

    The hole-drilling technique for the measurement of residual stresses using electrical resistance strain gages has been widely used for isotropic materials and has been adopted by the ASTM as a standard method. For thin isotropic plates, with a hole drilled through the thickness, the idealized hole-drilling calibration constants are obtained by making use of the well-known Kirsch's solution. In this paper, an analogous attempt is made to theoretically determine the three idealized hole-drilling calibration constants for thin orthotropic materials by employing Savin's (1961) complex stress function approach.

  6. Utilizing DNA analysis to combat the world wide plague of present day slavery – trafficking in persons

    PubMed Central

    Palmbach, Timothy; Blom, Jeffrey; Hoynes, Emily; Primorac, Dragan; Gaboury, Mario

    2014-01-01

    A study was conducted to determine if modern forensic DNA typing methods can be properly employed throughout the world with a final goal of increasing arrests, prosecutions, and convictions of perpetrators of modern day trafficking in persons while concurrently reducing the burden of victim testimony in legal proceedings. Without interruption of investigations, collection of samples containing DNA was conducted in a variety of settings. Evidentiary samples were analyzed on the ANDE Rapid DNA system. Many of the collected swabs yielded informative short tandem repeat profiles with Rapid DNA technology. PMID:24577820

  7. Medical Image Compression Using a New Subband Coding Method

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Smith, Mark J. T.; Scales, Allen; Tucker, Doug

    1995-01-01

    A recently introduced iterative complexity- and entropy-constrained subband quantization design algorithm is generalized and applied to medical image compression. In particular, the corresponding subband coder is used to encode Computed Tomography (CT) axial slice head images, where statistical dependencies between neighboring image subbands are exploited. Inter-slice conditioning is also employed for further improvements in compression performance. The subband coder features many advantages such as relatively low complexity and operation over a very wide range of bit rates. Experimental results demonstrate that the performance of the new subband coder is relatively good, both objectively and subjectively.

  8. Utilizing DNA analysis to combat the world wide plague of present day slavery--trafficking in persons.

    PubMed

    Palmbach, Timothy M; Blom, Jeffrey; Hoynes, Emily; Primorac, Dragan; Gaboury, Mario

    2014-02-01

    A study was conducted to determine if modern forensic DNA typing methods can be properly employed throughout the world with a final goal of increasing arrests, prosecutions, and convictions of perpetrators of modern day trafficking in persons while concurrently reducing the burden of victim testimony in legal proceedings. Without interruption of investigations, collection of samples containing DNA was conducted in a variety of settings. Evidentiary samples were analyzed on the ANDE Rapid DNA system. Many of the collected swabs yielded informative short tandem repeat profiles with Rapid DNA technology.

  9. Detection of elemental mercury by multimode diode laser correlation spectroscopy.

    PubMed

    Lou, Xiutao; Somesfalean, Gabriel; Svanberg, Sune; Zhang, Zhiguo; Wu, Shaohua

    2012-02-27

    We demonstrate a method for elemental mercury detection based on correlation spectroscopy employing UV laser radiation generated by sum-frequency mixing of two visible multimode diode lasers. Resonance matching of the multimode UV laser is achieved in a wide wavelength range and with good tolerance for various operating conditions. Large mode-hops provide an off-resonance baseline, eliminating interferences from other gas species with broadband absorption. A sensitivity of 1 μg/m3 is obtained for a 1-m path length and 30-s integration time. The performance of the system shows promise for mercury monitoring in industrial applications.

  10. Comparison of two surface temperature measurement using thermocouples and infrared camera

    NASA Astrophysics Data System (ADS)

    Michalski, Dariusz; Strąk, Kinga; Piasecka, Magdalena

    This paper compares two methods applied to measure surface temperatures in an experimental setup designed to analyse flow boiling heat transfer. The temperature measurements were performed in two parallel rectangular minichannels, both 1.7 mm deep, 16 mm wide and 180 mm long. The heating element for the fluid flowing in each minichannel was a thin foil made of Haynes-230. The two measurement methods employed to determine the surface temperature of the foil were the contact method, which involved mounting thermocouples at several points in one minichannel, and the contactless method used in the other minichannel, where the results were provided by an infrared camera. Calculations were necessary to compare the temperature results. Two sets of measurement data obtained for different values of the heat flux were analysed using basic statistical methods, the method error and the method accuracy. The experimental error and the method accuracy were taken into account. The comparative analysis showed that the values and distributions of the surface temperatures obtained with the two methods were similar, but both methods had certain limitations.

  11. Waveform inversion with source encoding for breast sound speed reconstruction in ultrasound computed tomography.

    PubMed

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the sound speed distribution by solving a stochastic optimization problem by use of a stochastic gradient descent algorithm. Both computer simulation and experimental phantom studies are conducted to demonstrate the use of the WISE method. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.
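    As a rough sketch of the source-encoding idea (not the authors' implementation), all transducer excitations can be collapsed into one random ±1-weighted "supershot" so that each stochastic gradient step costs only a single wave simulation. In the Python sketch below, `forward` and `adjoint_grad` are placeholder callables standing in for a wave-equation solver and its adjoint; both are assumptions introduced purely for illustration.

```python
import numpy as np

def encoded_gradient(c, sources, data, forward, adjoint_grad, rng):
    """One stochastic sample of the source-encoded data-misfit gradient.

    c            : current sound-speed estimate
    sources      : list of per-transducer source terms
    data         : list of corresponding measured waveforms
    forward      : forward(c, src) -> simulated waveform (wave-equation solve)
    adjoint_grad : adjoint_grad(c, src, residual) -> gradient of the misfit
    """
    # Random +/-1 encoding vector collapses all sources into one "supershot".
    w = rng.choice([-1.0, 1.0], size=len(sources))
    enc_src = sum(wi * s for wi, s in zip(w, sources))
    enc_dat = sum(wi * d for wi, d in zip(w, data))
    residual = forward(c, enc_src) - enc_dat        # one simulation per iteration
    return adjoint_grad(c, enc_src, residual)

def wise_sgd(c0, sources, data, forward, adjoint_grad, step=1e-3, n_iter=200, seed=0):
    """Plain stochastic gradient descent over freshly drawn encoding vectors."""
    rng = np.random.default_rng(seed)
    c = c0.copy()
    for _ in range(n_iter):
        c -= step * encoded_gradient(c, sources, data, forward, adjoint_grad, rng)
    return c
```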

  12. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in microscope or macroscope form to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

  13. Complexities in Ferret Influenza Virus Pathogenesis and Transmission Models

    PubMed Central

    Eckert, Alissa M.; Tumpey, Terrence M.; Maines, Taronna R.

    2016-01-01

    SUMMARY Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. PMID:27412880

  14. Complexities in Ferret Influenza Virus Pathogenesis and Transmission Models.

    PubMed

    Belser, Jessica A; Eckert, Alissa M; Tumpey, Terrence M; Maines, Taronna R

    2016-09-01

    Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  15. In vivo measurement of muscle output in intact Drosophila.

    PubMed

    Elliott, Christopher J H; Sparrow, John C

    2012-01-01

    We describe our methods for analysing muscle function in a whole intact small insect, taking advantage of a simple flexible optical beam to produce an inexpensive transducer with wide application. We review our previous data measuring the response to a single action potential driven muscle twitch to explore jumping behaviour in Drosophila melanogaster. In the fruitfly, where the sophisticated and powerful genetic toolbox is being widely employed to investigate neuromuscular function, we further demonstrate the use of the apparatus to analyse in detail, within whole flies, neuronal and muscle mutations affecting activation of muscle contraction in the jump muscle. We have now extended the use of the apparatus to record the muscle forces during larval and other aspects of adult locomotion. The robustness, simplicity and versatility of the apparatus are key to these measurements. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Research and application of microbial enzymes--India's contribution.

    PubMed

    Chand, Subhash; Mishra, Prashant

    2003-01-01

    Enzymes have attracted the attention of scientists the world over due to their wide range of physiological, analytical and industrial applications. Although enzymes have been isolated, purified and studied from microbial, animal and plant sources, microorganisms represent the most common source of enzymes due to their broad biochemical diversity, feasibility of mass culture and ease of genetic manipulation. With the advent of molecular biology techniques, a number of genes of industrially important enzymes have been cloned and expressed in order to improve the production of enzymes, substrate utilization and other commercially useful properties. Special attention has been focused on enzymes isolated from thermophiles due to their inherent stability and industrial applications. In addition, a variety of methods have been employed to modify enzymes for their industrial usage including strain improvement, chemical modifications, modification of reaction environment, immobilization and protein engineering. A wide range of applications of enzymes in different bioprocess industries is discussed.

  17. Biomimetic plasmonic color generated by the single-layer coaxial honeycomb nanostructure arrays

    NASA Astrophysics Data System (ADS)

    Zhao, Jiancun; Gao, Bo; Li, Haoyong; Yu, Xiaochang; Yang, Xiaoming; Yu, Yiting

    2017-07-01

    We proposed a periodic coaxial honeycomb nanostructure array patterned in a silver film to realize the plasmonic structural color, inspired by natural honeybee hives. The spectral characteristics of the structure with variant geometrical parameters are investigated by employing a finite-difference time-domain method, and the corresponding colors are then derived by calculating XYZ tristimulus values from the transmission spectra. The study demonstrates that the suggested structure with only a single layer has high transmission, narrow full-width at half-maximum, and wide color tunability by changing geometrical parameters. Therefore, the plasmonic colors realized possess high brightness and saturation, as well as a wide color gamut. In addition, the strong polarization independence makes it more attractive for practical applications. These results indicate that the recommended color-generating plasmonic structure has various potential applications in highly integrated optoelectronic devices, such as color filters and high-definition displays.
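    The colour-derivation step can be illustrated generically: a transmission spectrum is weighted by an illuminant and the CIE 1931 colour-matching functions and integrated to give XYZ tristimulus values. The illuminant and colour-matching-function arrays in the sketch below are assumed inputs, and the recipe is the standard colorimetric calculation rather than the authors' exact code.

```python
import numpy as np

def transmission_to_xyz(wavelengths, transmission, illuminant, cmf):
    """CIE XYZ tristimulus values of a filter with transmittance T(lambda).

    wavelengths  : sampled wavelengths in nm
    transmission : spectral transmittance, 0..1, on the same grid
    illuminant   : illuminant spectral power distribution (e.g. D65) on the same grid
    cmf          : (N, 3) array of CIE 1931 colour-matching functions x, y, z
    """
    d_lambda = np.gradient(wavelengths)                       # integration weights
    k = 100.0 / np.sum(illuminant * cmf[:, 1] * d_lambda)     # normalise illuminant Y to 100
    return k * np.array([np.sum(transmission * illuminant * cmf[:, i] * d_lambda)
                         for i in range(3)])

# Chromaticity coordinates follow as x = X/(X+Y+Z), y = Y/(X+Y+Z).
```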

  18. A Widely Applicable Silver Sol for TLC Detection with Rich and Stable SERS Features.

    PubMed

    Zhu, Qingxia; Li, Hao; Lu, Feng; Chai, Yifeng; Yuan, Yongfang

    2016-12-01

    Thin-layer chromatography (TLC) coupled with surface-enhanced Raman spectroscopy (SERS) has gained tremendous popularity in the study of various complex systems. However, the detection of hydrophobic analytes is difficult, and the specificity still needs to be improved. In this study, a SERS-active non-aqueous silver sol which could activate the analytes to produce rich and stable spectral features was rapidly synthesized. Then, the optimized silver nanoparticles (AgNPs)-DMF sol was employed for TLC-SERS detection of hydrophobic (and also hydrophilic) analytes. SERS performance of this sol was superior to that of traditional Lee-Meisel AgNPs due to its high specificity, acceptable stability, and wide applicability. The non-aqueous AgNPs would be suitable for the TLC-SERS method, which shows great promise for applications in food safety assurance, environmental monitoring, medical diagnoses, and many other fields.

  19. A Widely Applicable Silver Sol for TLC Detection with Rich and Stable SERS Features

    NASA Astrophysics Data System (ADS)

    Zhu, Qingxia; Li, Hao; Lu, Feng; Chai, Yifeng; Yuan, Yongfang

    2016-04-01

    Thin-layer chromatography (TLC) coupled with surface-enhanced Raman spectroscopy (SERS) has gained tremendous popularity in the study of various complex systems. However, the detection of hydrophobic analytes is difficult, and the specificity still needs to be improved. In this study, a SERS-active non-aqueous silver sol which could activate the analytes to produce rich and stable spectral features was rapidly synthesized. Then, the optimized silver nanoparticles (AgNPs)-DMF sol was employed for TLC-SERS detection of hydrophobic (and also hydrophilic) analytes. SERS performance of this sol was superior to that of traditional Lee-Meisel AgNPs due to its high specificity, acceptable stability, and wide applicability. The non-aqueous AgNPs would be suitable for the TLC-SERS method, which shows great promise for applications in food safety assurance, environmental monitoring, medical diagnoses, and many other fields.

  20. Conformational properties of two exopolysaccharides produced by Inquilinus limosus, a cystic fibrosis lung pathogen.

    PubMed

    Kuttel, Michelle; Ravenscroft, Neil; Foschiatti, Michela; Cescutti, Paola; Rizzo, Roberto

    2012-03-01

    Inquilinus limosus is a multi-resistant bacterium found in the respiratory tract of patients with cystic fibrosis. This bacterium produces two unique fully pyruvylated exopolysaccharides in similar quantities: an α-(1→2)-linked mannan and a β-(1→3)-linked glucan. We employed molecular modelling methods to probe the characteristic conformations and dynamics of these polysaccharides, with corroboration from potentiometric titrations and circular dichroism experiments. Our calculations reveal different structural motifs for the mannan and glucan polysaccharides: the glucan forms primarily right-handed helices with a wide range of extensions, while the mannan forms only left-handed helices. This finding is supported by our circular dichroism experiments. Our calculations also show that the (1→3)-β-d-Glcp linkage is more dynamically flexible than the (1→2)-α-d-Manp: the glucan characteristically forms a range of wide helices with large central cavities. In contrast, the mannan forms rigid regular 'bottlebrush' helices with a minimal central cavity. The widely different character of these two polymers suggests a possible differentiation of biological roles. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    PubMed Central

    Sheu, Jonathan; Beltzer, Jim; Fury, Brian; Wilczek, Katarzyna; Tobin, Steve; Falconer, Danny; Nolta, Jan; Bauer, Gerhard

    2015-01-01

    Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research has demanded their production on a much larger scale, a task that can be difficult to manage with the numbers of producer cell culture flasks required for large volumes of vector. To generate a large scale, partially closed system method for the manufacturing of clinical grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm2 flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation. PMID:26151065

  2. Motion Trajectories for Wide-area Surveying with a Rover-based Distributed Spectrometer

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward; Anderson, Gary; Wilson, Edmond

    2006-01-01

    A mobile ground survey application that employs remote sensing as a primary means of area coverage is highlighted. It is distinguished from mobile robotic area coverage problems that employ contact or proximity-based sensing. The focus is on a specific concept for performing mobile surveys in search of biogenic gases on planetary surfaces using a distributed spectrometer -- a rover-based instrument designed for wide measurement coverage of promising search areas. Navigation algorithms for executing circular and spiral survey trajectories are presented for wide-area distributed spectroscopy and evaluated based on area covered and distance traveled.

  3. Multiple Testing in the Context of Gene Discovery in Sickle Cell Disease Using Genome-Wide Association Studies.

    PubMed

    Kuo, Kevin H M

    2017-01-01

    The issue of multiple testing, also termed multiplicity, is ubiquitous in studies where multiple hypotheses are tested simultaneously. Genome-wide association study (GWAS), a type of genetic association study that has gained popularity in the past decade, is most susceptible to the issue of multiple testing. Different methodologies have been employed to address the issue of multiple testing in GWAS. The purpose of the review is to examine the methodologies employed in dealing with multiple testing in the context of gene discovery using GWAS in sickle cell disease complications.
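    Two correction procedures commonly discussed in this context are family-wise error control (Bonferroni, which underlies the conventional 5e-8 genome-wide significance threshold for roughly a million tests) and false discovery rate control (Benjamini-Hochberg). The sketch below gives generic implementations of both; it is illustrative only and not taken from the review.

```python
import numpy as np

def bonferroni_reject(pvals, alpha=0.05):
    """Family-wise error control: reject when p < alpha / (number of tests)."""
    pvals = np.asarray(pvals)
    return pvals < alpha / pvals.size

def benjamini_hochberg_reject(pvals, q=0.05):
    """FDR control: step-up procedure on the sorted p-values."""
    pvals = np.asarray(pvals)
    m = pvals.size
    order = np.argsort(pvals)
    below = pvals[order] <= (np.arange(1, m + 1) / m) * q
    reject = np.zeros(m, dtype=bool)
    if below.any():
        # reject every hypothesis up to the largest rank that passes the step-up test
        reject[order[: below.nonzero()[0].max() + 1]] = True
    return reject
```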

  4. The Costs of Employing Older Workers. An Information Paper Prepared for Use by the Special Committee on Aging, United States Senate.

    ERIC Educational Resources Information Center

    Morrison, Malcolm; Rappaport, Anna

    Analysis of the costs of employing older workers indicates that some types of employment costs do vary by age and that overall compensation costs increase with age, largely because of increasing employee benefit costs. There is, however, no statistical evidence that direct salary costs increase by age on an economy-wide basis. The belief that…

  5. A laser pointer driven microheater for precise local heating and conditional gene regulation in vivo. Microheater driven gene regulation in zebrafish.

    PubMed

    Placinta, Mike; Shen, Meng-Chieh; Achermann, Marc; Karlstrom, Rolf O

    2009-12-30

    Tissue heating has been employed to study a variety of biological processes, including the study of genes that control embryonic development. Conditional regulation of gene expression is a particularly powerful approach for understanding gene function. One popular method for mis-expressing a gene of interest employs heat-inducible heat shock protein (hsp) promoters. Global heat shock of hsp-promoter-containing transgenic animals induces gene expression throughout all tissues, but does not allow for spatial control. Local heating allows for spatial control of hsp-promoter-driven transgenes, but methods for local heating are cumbersome and variably effective. We describe a simple, highly controllable, and versatile apparatus for heating biological tissue and other materials on the micron-scale. This microheater employs micron-scale fiber optics and uses an inexpensive laser-pointer as a power source. Optical fibers can be pulled on a standard electrode puller to produce tips of varying sizes that can then be used to reliably heat 20-100 μm targets. We demonstrate precise spatiotemporal control of hsp70l:GFP transgene expression in a variety of tissue types in zebrafish embryos and larvae. We also show how this system can be employed as part of a new method for lineage tracing that would greatly facilitate the study of organogenesis and tissue regulation at any time in the life cycle. This versatile and simple local heater has broad utility for the study of gene function and for lineage tracing. This system could be used to control hsp-driven gene expression in any organism simply by bringing the fiber optic tip in contact with the tissue of interest. Beyond these uses for the study of gene function, this device has wide-ranging utility in materials science and could easily be adapted for therapeutic purposes in humans.

  6. A direct method to solve optimal knots of B-spline curves: An application for non-uniform B-spline curves fitting.

    PubMed

    Dung, Van Than; Tjahjowidodo, Tegoeh

    2017-01-01

    B-spline functions are widely used in many industrial applications such as computer graphic representations, computer aided design, computer aided manufacturing, computer numerical control, etc. Recently, there have been demands, e.g. in the reverse engineering (RE) area, to employ B-spline curves for non-trivial cases that include curves with discontinuous points, cusps or turning points in the sampled data. The most challenging task in these cases is the identification of the number of knots and their respective locations in non-uniform space at the lowest possible computational cost. This paper presents a new strategy for fitting any form of curve with B-spline functions via a local algorithm. A new two-step method for fast knot calculation is proposed. In the first step, the data is split using a bisecting method with a predetermined allowable error to obtain coarse knots. Secondly, the knots are optimized, for both locations and continuity levels, by employing a non-linear least squares technique. The B-spline function is, therefore, obtained by solving the ordinary least squares problem. The performance of the proposed method is validated by using various numerical experimental data, with and without simulated noise, which were generated by a B-spline function and deterministic parametric functions. This paper also discusses the benchmarking of the proposed method against existing methods in the literature. The proposed method is shown to be able to reconstruct B-spline functions from sampled data within acceptable tolerance. It is also shown that the proposed method can be applied to fitting any type of curve, ranging from smooth ones to discontinuous ones. In addition, the method does not require excessive computational cost, which allows it to be used in automatic reverse engineering applications.
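    A minimal sketch of the two-step idea follows: coarse knots are found by recursively bisecting the data until a local cubic fit meets an error tolerance, after which a least-squares B-spline is fitted with those knots held fixed. The subsequent nonlinear optimisation of knot locations and continuity levels described in the paper is omitted, and SciPy's `LSQUnivariateSpline` is used here only as a convenient stand-in fitting routine.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def coarse_knots_by_bisection(x, y, tol, k=3):
    """Recursively bisect the data; segment midpoints become coarse interior knots."""
    knots = []
    def split(lo, hi):
        if hi - lo <= 2 * (k + 1):                 # too few points to split further
            return
        seg_x, seg_y = x[lo:hi], y[lo:hi]
        err = np.max(np.abs(np.polyval(np.polyfit(seg_x, seg_y, k), seg_x) - seg_y))
        if err > tol:
            mid = (lo + hi) // 2
            knots.append(x[mid])
            split(lo, mid)
            split(mid, hi)
    split(0, len(x))
    return np.sort(np.array(knots))

# Curve with a jump: the bisection concentrates knots near the discontinuity.
x = np.linspace(0.0, 1.0, 400)
y = np.sin(8.0 * x) + np.where(x > 0.5, 0.5, 0.0)
t = coarse_knots_by_bisection(x, y, tol=0.05)
spline = LSQUnivariateSpline(x, y, t, k=3)        # ordinary least-squares B-spline fit
```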

  7. Mass Spectrometry Immuno Assay (MSIA™) Streptavidin Disposable Automation Research Tips (D.A.R.T's®) Antibody Phage Display Biopanning.

    PubMed

    Chin, Chai Fung; Choong, Yee Siew; Lim, Theam Soon

    2018-01-01

    Antibody phage display has been widely established as the method of choice to generate monoclonal antibodies with various efficacies post hybridoma technology. The technique is popular for its ease of methodology and its time and cost savings, with outcomes comparable to conventional methods. Phage display technology manipulates the genome of the M13 bacteriophage to display a large, diverse collection of antibodies capable of binding to various targets (nucleic acids, peptides, proteins, and carbohydrates). This subsequently leads to the discovery of target-related antibody binders. There have been several different approaches adapted for antibody phage display over the years. This chapter focuses on the semi-automated phage display antibody biopanning method utilizing the MSIA™ streptavidin D.A.R.T's ® system. The system employs the use of electronic multichannel pipettes with predefined programs to carry out the panning process. The method should also be adaptable to larger liquid handling instrumentations for higher throughput.

  8. Recent theoretical developments and experimental studies pertinent to vortex flow aerodynamics - With a view towards design

    NASA Technical Reports Server (NTRS)

    Lamar, J. E.; Luckring, J. M.

    1978-01-01

    A review is presented of recent progress in a research program directed towards the development of an improved vortex-flow technology base. It is pointed out that separation induced vortex-flows from the leading and side edges play an important role in the high angle-of-attack aerodynamic characteristics of a wide range of modern aircraft. In the analysis and design of high-speed aircraft, a detailed knowledge of this type of separation is required, particularly with regard to critical wind loads and the stability and performance at various off-design conditions. A description of analytical methods is presented. The theoretical methods employed are divided into two classes which are dependent upon the underlying aerodynamic assumptions. One conical flow method is considered along with three different nonconical flow methods. Comparisons are conducted between the described methods and available aerodynamic data. Attention is also given to a vortex flow drag study and a vortex flow wing design using suction analogy.

  9. A study of education and KSAOs on career entry for product engineers: What employers really want

    NASA Astrophysics Data System (ADS)

    Thornburgh, James

    The purpose of the study was to investigate the ways that employers of product engineers evaluate potential employees' job readiness, and which theories related to the education-work transaction are supported by practice. This study used a mixed methods approach and consisted of a state-wide survey (N=106) and local interviews (N=8). The results of the research indicate that attributes of both the Theory of Individual Differences and Credentialing Theory are present in the hiring practices of product engineers. Consistent with the Theory of Individual Differences, employers indicate they look for evidence of various job-related Knowledge, Skills, Abilities, and Other attributes (KSAOs) and they indicate they are willing to hire applicants who have less than a bachelor's degree. Consistent with Credentialing Theory, employers advertise a formal education minimum which represents only one way that individuals may learn to be an engineer. This study also confirmed prior research that most employers use primarily non-evidence based predictors to evaluate applicants. The primary initial screening predictors were experience, GPA, and major, while the primary finalist selection predictors were unstructured interviews and applications, followed by structured interviews, job knowledge tests, and work sample tests. Contrary to previous findings, this study did not find any major differences between what HR professionals, engineering managers, or other managers look for in terms of qualifications or what predictors they use when evaluating applicants for product engineer positions.

  10. Towards an SEMG-based tele-operated robot for masticatory rehabilitation.

    PubMed

    Kalani, Hadi; Moghimi, Sahar; Akbarzadeh, Alireza

    2016-08-01

    This paper proposes a real-time trajectory generation method for a masticatory rehabilitation robot based on surface electromyography (SEMG) signals. We used two Gough-Stewart robots. The first robot was used as a rehabilitation robot while the second robot was developed to model the human jaw system. The legs of the rehabilitation robot were controlled by the SEMG signals of a tele-operator to reproduce the masticatory motion in the human jaw, supposedly mounted on the moving platform, through predicting the location of a reference point. Actual jaw motions and the SEMG signals from the masticatory muscles were recorded and used as output and input, respectively. Three different methods, namely time-delayed neural networks, time-delayed fast orthogonal search, and the time-delayed Laguerre expansion technique, were employed and compared to predict the kinematic parameters. The optimal model structures as well as the input delays were obtained for each model and each subject through a genetic algorithm. Equations of motion were obtained by the virtual work method. A fuzzy method was employed to develop a fuzzy impedance controller. Moreover, a jaw model was developed to demonstrate the time-varying behavior of the muscle lengths during the rehabilitation process. The three modeling methods were capable of providing reasonably accurate estimations of the kinematic parameters, although the accuracy and training/validation speed of time-delayed fast orthogonal search were higher than those of the other two aforementioned methods. Also, during a simulation study, the fuzzy impedance scheme proved successful in controlling the moving platform for the accurate navigation of the reference point in the desired trajectory. SEMG has been widely used as a control command for prostheses and exoskeleton robots. However, in the current study, by employing the proposed rehabilitation robot, the complete continuous profile of the clenching motion was reproduced in the sagittal plane. Copyright © 2016. Published by Elsevier Ltd.

  11. Numerical Analysis of Crack Tip Plasticity and History Effects under Mixed Mode Conditions

    NASA Astrophysics Data System (ADS)

    Lopez-Crespo, Pablo; Pommier, Sylvie

    The plastic behaviour in the crack tip region has a strong influence on the fatigue life of engineering components. In general, residual stresses developed as a consequence of the plasticity being constrained around the crack tip have a significant role in both the direction of crack propagation and the propagation rate. Finite element methods (FEM) are commonly employed in order to model plasticity. However, if millions of cycles need to be modelled to predict the fatigue behaviour of a component, the method becomes computationally too expensive. By employing a multiscale approach, very precise analyses computed by FEM can be brought to a global scale. The data generated using the FEM enable us to identify a global cyclic elastic-plastic model for the crack tip region. Once this model is identified, it can be employed directly, with no need of additional FEM computations, resulting in fast computations. This is done by partitioning local displacement fields computed by FEM into intensity factors (global data) and spatial fields. A Karhunen-Loeve algorithm developed for image processing was employed for this purpose. In addition, the partitioning is done so as to separate elastic and plastic components. Each of them is further divided into opening mode and shear mode parts. The plastic flow direction was determined with the above approach on a centre cracked panel subjected to a wide range of mixed-mode loading conditions. It was found to agree well with the maximum tangential stress criterion developed by Erdogan and Sih, provided that the loading direction is corrected for residual stresses. In this approach, residual stresses are measured at the global scale through internal intensity factors.
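    In its simplest form, the Karhunen-Loeve partitioning can be sketched as a proper orthogonal decomposition of FEM displacement snapshots via the singular value decomposition: the left singular vectors act as spatial reference fields and the weighted right singular vectors as global intensity factors. The further separation into elastic/plastic and opening/shear parts described above is not reproduced in this assumed, simplified sketch.

```python
import numpy as np

def karhunen_loeve_partition(snapshots, n_modes=2):
    """Split displacement snapshots into spatial fields and intensity factors.

    snapshots : (n_dof, n_steps) array, one FEM displacement field per column
    returns   : mean field, spatial modes (n_dof, n_modes),
                intensity factors (n_modes, n_steps)
    """
    U = np.asarray(snapshots, dtype=float)
    mean = U.mean(axis=1, keepdims=True)
    phi, s, vt = np.linalg.svd(U - mean, full_matrices=False)   # KL / POD via SVD
    modes = phi[:, :n_modes]                                    # spatial reference fields
    intensities = s[:n_modes, None] * vt[:n_modes, :]           # per-step intensity factors
    return mean, modes, intensities
```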

  12. Association of Reference Pricing with Drug Selection and Spending

    PubMed Central

    Robinson, James C.; Whaley, Christopher M.; Brown, Timothy T.

    2017-01-01

    BACKGROUND In the United States, prices for therapeutically similar drugs vary widely, which has prompted efforts by public and private insurers to steer patients toward the lower-priced options. Under reference pricing, the insurer or employer establishes a maximum contribution it will make toward the price of a drug or procedure, and the patient pays the remainder. METHODS We used difference-in-differences multivariable regression methods to analyze changes in prescriptions and pricing for 1302 drugs in 78 therapeutic classes in the United States, before and after implementation of reference pricing by an alliance of private employers. We assessed trends for the study group relative to those for an employee group that was not subject to reference pricing. The study included 1,122,741 prescriptions that were reimbursed during the period from 2010 through 2014. RESULTS Implementation of reference pricing was associated with a higher percentage of prescriptions that were filled for the lowest-priced reference drug within its therapeutic class (difference in probability, 7.0 percentage points; 95% confidence interval [CI], 4.0 to 9.9), a lower average price paid per prescription (−13.9%; 95% CI, −23.8 to −2.7), and a higher rate of copayment by patients (5.2%; 95% CI, 0.2 to 10.4) than in the comparison group. During the first 18 months after implementation, spending for employers was $1.34 million lower and the amount of copayments for employees was $0.12 million higher than in the comparison group. CONCLUSIONS Implementation of reference pricing was associated with significant changes in drug selection and spending for a population of patients covered by employment-based insurance in the United States. (Funded by the Agency for Healthcare Research and Quality and the Genentech Foundation.) PMID:28813219
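    The difference-in-differences estimate can be illustrated with a toy regression in which the coefficient on the treatment-by-period interaction captures the change attributable to reference pricing. The data frame below is entirely hypothetical and merely stands in for the claims data analysed in the study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical prescription-level data: log price paid, an indicator for the
# employer alliance subject to reference pricing, and a post-implementation indicator.
df = pd.DataFrame({
    "log_price": [4.1, 4.0, 3.2, 3.3, 4.2, 4.1, 4.0, 3.9],
    "treated":   [1, 1, 1, 1, 0, 0, 0, 0],
    "post":      [0, 0, 1, 1, 0, 0, 1, 1],
})

# The coefficient on treated:post is the difference-in-differences estimate of the
# change in (log) price associated with reference pricing.
fit = smf.ols("log_price ~ treated + post + treated:post", data=df).fit()
print(fit.params["treated:post"])
```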

  13. Docking-based classification models for exploratory toxicology ...

    EPA Pesticide Factsheets

    Background: Exploratory toxicology is a new emerging research area whose ultimate mission is that of protecting human health and environment from risks posed by chemicals. In this regard, the ethical and practical limitation of animal testing has encouraged the promotion of computational methods for the fast screening of huge collections of chemicals available on the market. Results: We derived 24 reliable docking-based classification models able to predict the estrogenic potential of a large collection of chemicals having high quality experimental data, kindly provided by the U.S. Environmental Protection Agency (EPA). The predictive power of our docking-based models was supported by values of AUC, EF1% (EFmax = 7.1), -LR (at SE = 0.75) and +LR (at SE = 0.25) ranging from 0.63 to 0.72, from 2.5 to 6.2, from 0.35 to 0.67 and from 2.05 to 9.84, respectively. In addition, external predictions were successfully made on some representative known estrogenic chemicals. Conclusion: We show how structure-based methods, widely applied to drug discovery programs, can be adapted to meet the conditions of the regulatory context. Importantly, these methods enable one to employ the physicochemical information contained in the X-ray solved biological target and to screen structurally-unrelated chemicals.

  14. Colorimetric determination of DNase I activity with a DNA-methyl green substrate.

    PubMed

    Sinicropi, D; Baker, D L; Prince, W S; Shiffer, K; Shak, S

    1994-11-01

    A simple, high throughput, and precise assay was developed for quantification of deoxyribonuclease I (DNase; IUB 3.1.21.1) activity. The method was adapted from the procedure devised by Kurnick, which employs a substrate composed of highly polymerized native DNA complexed with methyl green. Hydrolysis of the DNA produced unbound methyl green and a decrease in the absorbance of the solution at 620 nm. By adjusting the time and temperature of the reaction, the assay permits quantification of DNase activity over a wide concentration range (0.4 to 8900 ng/ml). Samples and standards were added to the substrate in microtiter plates and were incubated for 1-24 h at 25-37 degrees C to achieve the desired assay range. The DNase activity of the samples was interpolated from a standard curve generated with Pulmozyme recombinant human deoxyribonuclease I (rhDNase). Interassay precision was less than 12% CV and recovery was within 100 +/- 11%. Activity determination by the DNA-methyl green method correlated well with that determined by the widely used "hyperchromicity" method originated by Kunitz, which is based on the increase in absorbance at 260 nm upon hydrolysis of DNA. The DNA-methyl green assay was simpler and more versatile than the hyperchromicity method and was used to characterize the activity of rhDNase and DNase isolated from human urine.
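    The read-out step amounts to interpolating sample activities from a standard curve of absorbance decrease versus rhDNase concentration. The standards and responses in the sketch below are invented, and the simple log-linear interpolation is only a stand-in for whatever curve model (e.g. a four-parameter logistic) the assay actually employs.

```python
import numpy as np

def dnase_activity(sample_dA620, std_conc_ng_ml, std_dA620):
    """Interpolate DNase activity (ng/ml) from a standard curve.

    sample_dA620   : measured decrease in A620 for the unknown samples
    std_conc_ng_ml : rhDNase standard concentrations
    std_dA620      : decrease in A620 measured for each standard
    """
    log_conc = np.log10(np.asarray(std_conc_ng_ml, dtype=float))
    resp = np.asarray(std_dA620, dtype=float)
    order = np.argsort(resp)                         # np.interp needs increasing x
    return 10.0 ** np.interp(sample_dA620, resp[order], log_conc[order])

# Hypothetical standards spanning part of the reported 0.4-8900 ng/ml range.
print(dnase_activity([0.15, 0.30], [0.4, 4, 40, 400, 4000], [0.02, 0.08, 0.20, 0.38, 0.55]))
```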

  15. Multi-object model-based multi-atlas segmentation for rodent brains using dense discrete correspondences

    NASA Astrophysics Data System (ADS)

    Lee, Joohwi; Kim, Sun Hyung; Styner, Martin

    2016-03-01

    The delineation of rodent brain structures is challenging due to multiple low-contrast cortical and subcortical structures that closely interface with each other. Atlas-based segmentation has been widely employed due to its ability to delineate multiple organs at the same time via image registration. The use of multiple atlases and subsequent label fusion techniques has further improved the robustness and accuracy of atlas-based segmentation. However, the accuracy of atlas-based segmentation is still prone to registration errors; for example, the segmentation of in vivo MR images can be less accurate and robust against image artifacts than the segmentation of post mortem images. In order to improve the accuracy and robustness of atlas-based segmentation, we propose a multi-object, model-based, multi-atlas segmentation method. We first establish spatial correspondences across atlases using a set of dense pseudo-landmark particles. We build a multi-object point distribution model using those particles in order to capture inter- and intra-subject variation among brain structures. The segmentation is obtained by fitting the model into a subject image, followed by a label fusion process. Our results show that the proposed method achieved greater accuracy than comparable segmentation methods, including a widely used ANTs registration tool.
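    The simplest label-fusion rule is a per-voxel majority vote over the candidate segmentations propagated from each registered atlas. The sketch below shows only that baseline; the model-based fusion proposed above is more elaborate and is not reproduced here.

```python
import numpy as np

def majority_vote_fusion(warped_labels):
    """Fuse atlas segmentations by per-voxel majority vote.

    warped_labels : (n_atlases, ...) integer label array, one propagated
                    segmentation per atlas, already resampled to the subject grid
    """
    labels = np.asarray(warped_labels)
    # Most frequent label along the atlas axis at every voxel.
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, labels)

# Toy example: three "atlases" voting on a 2x2 label map.
atlases = np.array([[[1, 2], [0, 3]],
                    [[1, 2], [0, 0]],
                    [[1, 0], [0, 3]]])
print(majority_vote_fusion(atlases))   # -> [[1 2] [0 3]]
```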

  16. An accurate online calibration system based on combined clamp-shape coil for high voltage electronic current transformers.

    PubMed

    Li, Zhen-hua; Li, Hong-bin; Zhang, Zhi

    2013-07-01

    Electronic transformers are widely used in power systems because of their wide bandwidth and good transient performance. However, as electronic transformers are an emerging technology, their failure rate is higher than that of traditional transformers. As a result, the calibration period needs to be shortened. Traditional calibration methods require that the transmission line be de-energized, which results in complicated operation and power-off losses. This paper proposes an online calibration system which can calibrate electronic current transformers without cutting off the power. In this work, the high accuracy standard current transformer and the online operation method are the key techniques. Based on the clamp-shape iron-core coil and the clamp-shape air-core coil, a combined clamp-shape coil is designed as the standard current transformer. By analyzing the output characteristics of the two coils, the combined clamp-shape coil can verify its own accuracy, so the accuracy of the online calibration system can be guaranteed. Moreover, by employing the earth potential working method and using two insulating rods to connect the combined clamp-shape coil to the high voltage bus, the operation becomes simple and safe. Tests in the China National Center for High Voltage Measurement and field experiments show that the proposed system has a high accuracy of up to 0.05 class.

  17. Training the max-margin sequence model with the relaxed slack variables.

    PubMed

    Niu, Lingfeng; Wu, Jianmin; Shi, Yong

    2012-09-01

    Sequence models are widely used in many applications such as natural language processing, information extraction and optical character recognition. In this paper we propose a new approach to train the max-margin based sequence model by relaxing the slack variables. With the canonical feature mapping definition, the relaxed problem is solved by training a multiclass Support Vector Machine (SVM). Compared with the state-of-the-art solutions for sequence learning, the new method has the following advantages: firstly, the sequence training problem is transformed into a multiclassification problem, which is more widely studied and already has quite a few off-the-shelf training packages; secondly, this new approach reduces the complexity of training significantly and achieves prediction performance comparable with existing sequence models; thirdly, when the size of the training data is limited, by assigning different slack variables to different microlabel pairs, the new method can use the discriminative information more frugally and produces a more reliable model; last but not least, by employing kernels in the intermediate multiclass SVM, nonlinear feature spaces can be easily explored. Experimental results on the tasks of named entity recognition, information extraction and handwritten letter recognition with public datasets illustrate the efficiency and effectiveness of our method. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Study of electron impact inelastic scattering of chlorine molecule (Cl2)

    NASA Astrophysics Data System (ADS)

    Yadav, Hitesh; Vinodkumar, Minaxi; Limbachiya, Chetan; Vinodkumar, P. C.

    2018-02-01

    A theoretical study is carried out for electron interactions with the chlorine molecule (Cl2) for incident energies ranging from 0.01 to 5000 eV. This wide range of energy has allowed us to investigate a variety of processes and report data on symmetric excitation energies, dissociative electron attachment (DEA), total excitation cross sections, and ionization cross sections (Q_ion) along with total inelastic cross sections (Q_inel). The present study is important since Cl2 is a prominent gas for plasma etching and its anionic atoms are important in the etching of semiconductor wafers. In order to compute the total inelastic cross sections, we have employed the ab initio R-matrix method (0.01 to 15 eV) together with the spherical complex optical potential method (∼15 to 5000 eV). The R-matrix calculations are performed using a close coupling method, and we have used the DEA estimator in Quantemol-N to calculate the DEA fragmentation and cross sections. The present study finds overall good agreement with the available experimental data. Total excitation and inelastic cross sections of e-Cl2 scattering for a wide energy range (0.01 eV to 5 keV) are reported for the first time, to the best of our knowledge.

  19. An Autonomous Satellite Time Synchronization System Using Remotely Disciplined VC-OCXOs

    PubMed Central

    Gu, Xiaobo; Chang, Qing; Glennon, Eamonn P.; Xu, Baoda; Dempseter, Andrew G.; Wang, Dun; Wu, Jiapeng

    2015-01-01

    An autonomous remote clock control system is proposed to provide time synchronization and frequency syntonization for satellite to satellite or ground to satellite time transfer, with the system comprising on-board voltage controlled oven controlled crystal oscillators (VC-OCXOs) that are disciplined to a remote master atomic clock or oscillator. The synchronization loop aims to provide autonomous operation over extended periods, to be widely applicable to a variety of scenarios, and to be robust. A new architecture comprising frequency division duplex (FDD), synchronous time division duplex (STDD) and code division multiple access (CDMA) with a centralized topology is employed. This new design utilizes dual one-way ranging methods to precisely measure the clock error, adopts least square (LS) methods to predict the clock error and employs a third-order phase lock loop (PLL) to generate the voltage control signal. A general functional model for this system is proposed and the error sources and delays that affect the time synchronization are discussed. Related algorithms for estimating and correcting these errors are also proposed. The performance of the proposed system is simulated and guidance for selecting the clock is provided. PMID:26213929

  20. Theory and in vivo application of electroporative gene delivery.

    PubMed

    Somiari, S; Glasspool-Malone, J; Drabick, J J; Gilbert, R A; Heller, R; Jaroszeski, M J; Malone, R W

    2000-09-01

    Efficient and safe methods for delivering exogenous genetic material into tissues must be developed before the clinical potential of gene therapy can be realized. Recently, in vivo electroporation has emerged as a leading technology for developing nonviral gene therapies and nucleic acid vaccines (NAV). Electroporation (EP) involves the application of pulsed electric fields to cells to enhance cell permeability, resulting in exogenous polynucleotide transit across the cytoplasmic membrane. Similar pulsed electrical field treatments are employed in a wide range of biotechnological processes including in vitro EP, hybridoma production, development of transgenic animals, and clinical electrochemotherapy. Electroporative gene delivery studies benefit from a well-developed literature that may be used to guide experimental design and interpretation. Both theory and experimental analysis predict that the critical parameters governing EP efficacy include cell size and field strength, duration, frequency, and total number of applied pulses. These parameters must be optimized for each tissue in order to maximize gene delivery while minimizing irreversible cell damage. By providing an overview of the theory and practice of electroporative gene transfer, this review intends to aid researchers who wish to employ the method for preclinical and translational gene therapy, NAV, and functional genomic research.

  1. An overview of the model integration process: From pre ...

    EPA Pesticide Factsheets

    Integration of models requires linking models which can be developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process, and also presenting better strategies for building integrated modeling systems. We identified five different phases to characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare integration practices of different groups. We highlight key strategies, features, standards, and practices that can be employed by developers to increase reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect the variation has on inter-framework reuse and interoperability. A series of recommendation

  2. Brain Mapping in a Patient with Congenital Blindness – A Case for Multimodal Approaches

    PubMed Central

    Roland, Jarod L.; Hacker, Carl D.; Breshears, Jonathan D.; Gaona, Charles M.; Hogan, R. Edward; Burton, Harold; Corbetta, Maurizio; Leuthardt, Eric C.

    2013-01-01

    Recent advances in basic neuroscience research across a wide range of methodologies have contributed significantly to our understanding of human cortical electrophysiology and functional brain imaging. Translation of this research into clinical neurosurgery has opened doors for advanced mapping of functionality that previously was prohibitively difficult, if not impossible. Here we present the case of a unique individual with congenital blindness and medically refractory epilepsy who underwent neurosurgical treatment of her seizures. Pre-operative evaluation presented the challenge of accurately and robustly mapping the cerebral cortex for an individual with a high probability of significant cortical re-organization. Additionally, a blind individual has unique priorities compared with a sighted person, such as the ability to read Braille by touch and to sense the environment primarily by sound. For these reasons we employed additional measures to map sensory, motor, speech, language, and auditory perception using a number of cortical electrophysiologic mapping and functional magnetic resonance imaging methods. Our data show promising results in the application of these adjunctive methods in the pre-operative mapping of otherwise difficult to localize, and highly variable, functional cortical areas. PMID:23914170

  3. Utilising monitoring and modelling of estuarine environments to investigate catchment conditions responsible for stratification events in a typically well-mixed urbanised estuary

    NASA Astrophysics Data System (ADS)

    Lee, Serena B.; Birch, Gavin F.

    2012-10-01

    Estuarine health is affected by contamination from stormwater, particularly in highly-urbanised environments. For systems where catchment monitoring is insufficient, novel techniques must be employed to determine the impact of urban runoff on receiving water bodies. In the present work, estuarine monitoring and modelling were successfully employed to determine stormwater runoff volumes and establish an appropriate rainfall/runoff relationship capable of replicating fresh-water discharge due to the full range of precipitation conditions in the Sydney Estuary, Australia. Using estuary response to determine relationships between catchment rainfall and runoff is a widely applicable method and may be of assistance in the study of waterways where monitoring fluvial discharges is not practical or is beyond the capacity of management authorities. For the Sydney Estuary, the SCS-CN method replicated rainfall/runoff and was applied in numerical modelling experiments investigating the hydrodynamic characteristics affecting stratification and estuary recovery following high precipitation. Numerical modelling showed stratification in the Sydney Estuary was dominated by fresh-water discharge. Spring tides and up-estuary winds contributed to mixing and neap tides and down-estuary winds enhanced stratification.
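    For reference, the SCS-CN relationship estimates direct runoff depth from event rainfall and a single curve number. The sketch below uses the conventional default initial abstraction of 0.2S and an arbitrary curve number; it is a generic statement of the method, not the calibration obtained for the Sydney Estuary catchments.

```python
def scs_cn_runoff(rainfall_mm, curve_number):
    """Direct runoff depth (mm) from event rainfall via the SCS Curve Number method."""
    s = 25400.0 / curve_number - 254.0      # potential maximum retention (mm)
    ia = 0.2 * s                            # initial abstraction (conventional default)
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

print(scs_cn_runoff(50.0, curve_number=85))   # ~20 mm of runoff from a 50 mm event
```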

  4. Method for the evaluation 3D noncontact inspection systems

    NASA Astrophysics Data System (ADS)

    Harding, Kevin

    2011-08-01

    Three-dimensional optical measurement systems are becoming more widely used in applications ranging from aerospace to automotive. These systems offer the potential for high speed, good accuracy, and more complete information than older contact-based technology. However, the primary standards employed by many to evaluate these systems were specifically designed around touch-probe-based coordinate measurement machines (CMMs). These standards were designed to work with the limitations of touch probes, and in many cases cannot measure the types of features and errors associated with non-contact systems. This paper discusses the deficiencies of applying contact-based characterization tests to non-contact systems, and suggests a new set of tests specifically designed to cover the many aspects pertinent to non-contact, optical 3D measurement systems. Some of the performance aspects addressed in this characterization method include: sensitivity to surface reflectivity and roughness, the effect of angle of incidence of measurements, means to characterize volumetric variations that may fit complex functions, and considerations of both spatial and depth resolutions. Specific application areas will be discussed as well as the use of artifacts to provide practical functional data that can predict system performance on real world parts.

  5. A wideband magnetoresistive sensor for monitoring dynamic fault slip in laboratory fault friction experiments

    USGS Publications Warehouse

    Kilgore, Brian D.

    2017-01-01

    A non-contact, wideband method of sensing dynamic fault slip in laboratory geophysical experiments employs an inexpensive magnetoresistive sensor, a small neodymium rare earth magnet, and user-built application-specific wideband signal conditioning. The magnetoresistive sensor generates a voltage proportional to the changing angles of magnetic flux lines, generated by differential motion or rotation of the near-by magnet, through the sensor. The performance of an array of these sensors compares favorably to other conventional position sensing methods employed at multiple locations along a 2 m long × 0.4 m deep laboratory strike-slip fault. For these magnetoresistive sensors, the lack of resonance signals commonly encountered with cantilever-type position sensor mounting, the wide band response (DC to ≈ 100 kHz) that exceeds the capabilities of many traditional position sensors, and the small space required on the sample, make them attractive options for capturing high speed fault slip measurements in these laboratory experiments. An unanticipated observation of this study is the apparent sensitivity of this sensor to high frequency electromagnetic signals associated with fault rupture and (or) rupture propagation, which may offer new insights into the physics of earthquake faulting.

  6. Fading-free transmission of 124-Gb/s PDM-DMT signal over 100-km SSMF using digital carrier regeneration.

    PubMed

    Li, Cai; Hu, Rong; Yang, Qi; Luo, Ming; Li, Wei; Yu, Shaohua

    2016-01-25

    The coherent reception of intensity-modulated signals has recently been widely investigated, in which the signal is recovered by envelope detection. High linewidth tolerance is achieved with such a scheme. However, a strong optical carrier exists during the transmission, which degrades the optical power efficiency. In this paper, an efficient modulation scheme for discrete multi-tone (DMT) signals is proposed based on the Mach-Zehnder modulator (MZM). Different from traditional intensity modulation, the proposed method employs both the intensity and phase domains. Thus, the optical carrier power can be greatly reduced by adjusting the bias of the MZM around the null point. By employing coherent detection and digital carrier regeneration (DCR), the carrier-suppressed DMT signal can be recovered using envelope detection. No carrier frequency or phase estimation is required. Numerical investigations are made to demonstrate the feasibility, in which significant improvements are found for the proposed DCR method, showing great tolerance against laser linewidth and carrier power reduction. Finally, a 124-Gb/s transmission of polarization-division multiplexed DMT (PDM-DMT) signal is demonstrated over 100-km SSMF, with only -8 dB optical carrier to signal power ratio (CSPR).

  7. Plant Identification Based on Leaf Midrib Cross-Section Images Using Fractal Descriptors.

    PubMed

    da Silva, Núbia Rosa; Florindo, João Batista; Gómez, María Cecilia; Rossatto, Davi Rodrigo; Kolb, Rosana Marta; Bruno, Odemir Martinez

    2015-01-01

    The correct identification of plants is a common necessity not only to researchers but also to the lay public. Recently, computational methods have been employed to facilitate this task; however, few studies address the wide diversity of plants occurring in the world. This study proposes to analyse images obtained from cross-sections of the leaf midrib using fractal descriptors. These descriptors are obtained from the fractal dimension of the object computed at a range of scales. In this way, they provide rich information regarding the spatial distribution of the analysed structure and, as a consequence, they measure the multiscale morphology of the object of interest. In Biology, such morphology is of great importance because it is related to evolutionary aspects and is successfully employed to characterize and discriminate among different biological structures. Here, the fractal descriptors are used to identify the species of plants based on the image of their leaves. A large number of samples is examined, comprising 606 leaf samples from 50 species of the Brazilian flora. The results are compared to other imaging methods in the literature and demonstrate that fractal descriptors are precise and reliable in the taxonomic process of plant species identification.
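    A simple multiscale descriptor in this spirit is the box-counting vector: the number of occupied boxes at several box sizes on a binarised cross-section image, taken on a log scale. The sketch below is a generic box-counting routine and not the specific descriptors (e.g. Bouligand-Minkowski based) used in the paper.

```python
import numpy as np

def box_counting_descriptors(binary_image, scales=(2, 4, 8, 16, 32)):
    """Log box counts of a binary image at several box sizes (a simple fractal descriptor)."""
    img = np.asarray(binary_image, dtype=bool)
    counts = []
    for s in scales:
        h = (img.shape[0] // s) * s           # crop so the image tiles exactly
        w = (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())   # boxes containing any foreground pixel
    return np.log(np.array(counts, dtype=float) + 1.0)

# The slope of log(count) versus log(1/scale) estimates the box-counting dimension.
```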

  8. Organotin Polyethers as Biomaterials

    PubMed Central

    Carraher, Charles E.; Roner, Michael R.

    2009-01-01

    Organotin polyethers are easily synthesized employing interfacial polymerization systems involving the reaction of hydroxyl-containing Lewis bases and organotin halides. A wide variety of organotin-containing polymeric products have been synthesized including those derived from natural and synthetic polymers such as lignin, xylan, cellulose, dextran, and poly(vinyl alcohol). Others have been synthesized employing known drug diols such as dicumarol, DES, and dienestrol and a wide variety of synthetic diols. Included in these materials are the first water soluble organotin polymers. The organotin polyethers exhibit a wide range of biological activities. Some selectively inhibit a number of unwanted bacteria, including Staph. MRSA, and unwanted yeasts such as Candida albicans. Some also inhibit a variety of viruses including those responsible for herpes infections and smallpox. Others show good inhibition of a wide variety of cancer cell lines including cell lines associated with ovarian, colon, lung, prostate, pancreatic and breast cancer. The synthesis, structural characterization, and biological characterization of these materials are described in this review.

  9. An Update on Teaching the Employment Search.

    ERIC Educational Resources Information Center

    Andrews, Deborah, Ed.; Dyrud, Marilyn A., Ed.

    1997-01-01

    Presents five articles dealing with teaching job search strategies: (1) "Preparing a Scannable Resume" (Carol Roever); (2) "Preparing an Online Resume" (Tim Krause); (3) "Using the World Wide Web to Teach Employment Communication" (K. Virginia Hemby); (4) "A Visual Heuristic for Promoting a Rhetorically Based Job Search" (Helen Foster); and (5)…

  10. Determiners, Feline Marsupials, and the Category-Function Distinction: A Critique of ELT Grammars

    ERIC Educational Resources Information Center

    Reynolds, Brett

    2013-01-01

    The concept of determiners is widely employed in linguistics, but mostly absent from English Language Teaching (ELT) materials (dictionaries, teacher-reference books, and student-oriented texts). Among those employing the concept, there is near-universal confusion between determiners and pronouns, arising mainly from an analytical and…

  11. Targeted Employment Subsidies: Issues of Structure and Design.

    ERIC Educational Resources Information Center

    Bishop, John; Haveman, Robert

    Effects of variations in the structure of targeted employment subsidy programs on the attainment of program objectives are explored in this paper. First, the objectives that underlie targeted subsidy programs are outlined in relation to individual program characteristics and the economics of such programs are discussed. Then the wide range of…

  12. Exploitation or Opportunity? Student Perceptions of Internships in Enhancing Employability Skills

    ERIC Educational Resources Information Center

    O'Connor, Henrietta; Bodicoat, Maxine

    2017-01-01

    Internships are now widely promoted as a valuable means of enhancing graduate employability. However, little is known about student perceptions of internships. Drawing on data from a pre-1992 university, two types of graduate are identified: engagers and disengagers. The engagers valued internship opportunities while the disengagers perceived…

  13. 22 CFR 506.4 - Annual goals and timetables.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Annual goals and timetables. 506.4 Section 506.4 Foreign Relations BROADCASTING BOARD OF GOVERNORS PART-TIME CAREER EMPLOYMENT PROGRAM § 506.4 Annual goals and timetables. A Board-wide plan for promoting part-time employment opportunities will be developed...

  14. What Can a Micronucleus Teach? Learning about Environmental Mutagenesis

    ERIC Educational Resources Information Center

    Linde, Ana R.; Garcia-Vazquez, Eva

    2009-01-01

    The micronucleus test is widely employed in environmental health research. It can also be an excellent tool for learning important concepts in environmental health. In this article we present an inquiry-based laboratory exercise where students explore several theoretical and practical aspects of environmental mutagenesis employing the micronucleus…

  15. Improved measurements of RNA structure conservation with generalized centroid estimators.

    PubMed

    Okada, Yohei; Saito, Yutaka; Sato, Kengo; Sakakibara, Yasubumi

    2011-01-01

    Identification of non-protein-coding RNAs (ncRNAs) in genomes is a crucial task not only for molecular cell biology but also for bioinformatics. Secondary structures of ncRNAs are employed as a key feature in ncRNA analysis since the biological functions of ncRNAs are deeply related to their secondary structures. Although the minimum free energy (MFE) structure of an RNA sequence is regarded as the most stable structure, MFE alone is not an appropriate measure for identifying ncRNAs since the free energy is heavily biased by the nucleotide composition. Therefore, instead of MFE itself, several alternative measures for identifying ncRNAs have been proposed, such as the structure conservation index (SCI) and the base pair distance (BPD), both of which employ MFE structures. However, these measures are not suitable for identifying ncRNAs in some cases, including genome-wide searches, and incur a high false discovery rate. In this study, we propose improved measures based on SCI and BPD, applying generalized centroid estimators to incorporate robustness against low-quality multiple alignments. Our experiments show that our proposed methods achieve higher accuracy than the original SCI and BPD not only for human-curated structural alignments but also for low-quality alignments produced by CLUSTAL W. Furthermore, the centroid-based SCI on CLUSTAL W alignments is more accurate than, or comparable with, the original SCI on structural alignments generated with RAF, a high-quality structural aligner, which requires roughly twice the computational time on average. We conclude that our methods are more suitable than the original SCI and BPD for genome-wide alignments, which are of low quality from the point of view of secondary structure.
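
    As a worked illustration of the base pair distance (BPD) ingredient, the sketch below compares two secondary structures in dot-bracket notation by counting base pairs present in only one of them; the structure prediction itself (MFE or centroid, e.g. from an RNA folding package) is assumed to be done elsewhere, and the example structures are made up.

```python
def base_pairs(dot_bracket):
    """Return the set of base pairs (i, j) encoded in a dot-bracket string."""
    stack, pairs = [], set()
    for i, ch in enumerate(dot_bracket):
        if ch == '(':
            stack.append(i)
        elif ch == ')':
            pairs.add((stack.pop(), i))
    return pairs

def base_pair_distance(s1, s2):
    """Number of base pairs present in exactly one of the two structures."""
    return len(base_pairs(s1) ^ base_pairs(s2))

# toy example with two 12-nt structures differing by one base pair
print(base_pair_distance("((((....))))", "(((......)))"))  # -> 1
```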

  16. A comparative study of ChIP-seq sequencing library preparation methods.

    PubMed

    Sundaram, Arvind Y M; Hughes, Timothy; Biondi, Shea; Bolduc, Nathalie; Bowman, Sarah K; Camilli, Andrew; Chew, Yap C; Couture, Catherine; Farmer, Andrew; Jerome, John P; Lazinski, David W; McUsic, Andrew; Peng, Xu; Shazand, Kamran; Xu, Feng; Lyle, Robert; Gilfillan, Gregor D

    2016-10-21

    ChIP-seq is the primary technique used to investigate genome-wide protein-DNA interactions. As part of this procedure, immunoprecipitated DNA must undergo "library preparation" to enable subsequent high-throughput sequencing. To facilitate the analysis of biopsy samples and rare cell populations, there has been a recent proliferation of methods allowing sequencing library preparation from low-input DNA amounts. However, little information exists on the relative merits, performance, comparability and biases inherent to these procedures. Notably, recently developed single-cell ChIP procedures employing microfluidics must also employ library preparation reagents to allow downstream sequencing. In this study, seven methods designed for low-input DNA/ChIP-seq sample preparation (Accel-NGS® 2S, Bowman-method, HTML-PCR, SeqPlex™, DNA SMART™, TELP and ThruPLEX®) were performed on five replicates of 1 ng and 0.1 ng input H3K4me3 ChIP material, and compared to a "gold standard" reference PCR-free dataset. The performance of each method was examined for the prevalence of unmappable reads, amplification-derived duplicate reads, reproducibility, and for the sensitivity and specificity of peak calling. We identified consistent high performance in a subset of the tested reagents, which should aid researchers in choosing the most appropriate reagents for their studies. Furthermore, we expect this work to drive future advances by identifying and encouraging use of the most promising methods and reagents. The results may also aid judgements on how comparable are existing datasets that have been prepared with different sample library preparation reagents.

  17. Gene features selection for three-class disease classification via multiple orthogonal partial least square discriminant analysis and S-plot using microarray data.

    PubMed

    Yang, Mingxing; Li, Xiumin; Li, Zhibin; Ou, Zhimin; Liu, Ming; Liu, Suhuan; Li, Xuejun; Yang, Shuyu

    2013-01-01

    DNA microarray analysis is characterized by obtaining a large number of gene variables from a small number of observations. Cluster analysis is widely used to analyze DNA microarray data to classify and diagnose disease. Because there are so many irrelevant and insignificant genes in a dataset, a feature selection approach must be employed in the data analysis. The performance of cluster analysis of this high-throughput data depends on whether the feature selection approach chooses the most relevant genes associated with the disease classes. Here we propose a new method using multiple Orthogonal Partial Least Squares-Discriminant Analysis (mOPLS-DA) models and S-plots to select the most relevant genes for three-class disease classification and prediction. We tested our method using Golub's leukemia microarray data. For three classes with subtypes, we proposed hierarchical orthogonal partial least squares-discriminant analysis (OPLS-DA) models and S-plots to select features for the two main classes and their subtypes. For three classes in parallel, we employed three OPLS-DA models and S-plots to choose marker genes for each class. The power of the feature selection to classify and predict three-class disease was evaluated using cluster analysis. Further, the general performance of our method was tested using four public datasets and compared with that of four other feature selection methods. The results revealed that our method effectively selected the most relevant features for disease classification and prediction, and its performance was better than that of the other methods.
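
    The sketch below gives a simplified two-class stand-in for this procedure: a PLS model is fitted with scikit-learn and each gene is ranked by the covariance and correlation between its (centered) values and the first predictive score, the two axes of an S-plot. It is not the authors' hierarchical mOPLS-DA implementation; the data and parameters are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def s_plot_ranking(X, y, n_components=2):
    """Rank features by |p(corr)|, the correlation between each centered
    variable and the first PLS score (S-plot y-axis); p(cov) is the x-axis."""
    pls = PLSRegression(n_components=n_components).fit(X, y)
    t1 = pls.transform(X)[:, 0]                       # first predictive score
    Xc = X - X.mean(axis=0)
    p_cov = Xc.T @ t1 / (len(t1) - 1)
    p_corr = p_cov / (Xc.std(axis=0, ddof=1) * t1.std(ddof=1))
    return np.argsort(-np.abs(p_corr)), p_cov, p_corr

# toy data: 40 samples x 200 "genes", only the first 5 genes are informative
rng = np.random.default_rng(1)
y = np.repeat([0.0, 1.0], 20)
X = rng.normal(size=(40, 200))
X[:, :5] += y[:, None] * 2.0
order, p_cov, p_corr = s_plot_ranking(X, y)
print("top-ranked genes:", order[:5])
```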

  18. M-Finder: Uncovering functionally associated proteins from interactome data integrated with GO annotations

    PubMed Central

    2013-01-01

    Background Protein-protein interactions (PPIs) play a key role in understanding the mechanisms of cellular processes. The availability of interactome data has catalyzed the development of computational approaches to elucidate the functional behaviors of proteins at a system level. Gene Ontology (GO) and its annotations are a significant resource for the functional characterization of proteins. Because of their wide coverage, GO data have often been adopted as a benchmark for protein function prediction on the genomic scale. Results We propose a computational approach, called M-Finder, for functional association pattern mining. This method employs semantic analytics to integrate genome-wide PPIs with GO data. We also introduce an interactive web application tool that visualizes a functional association network linked to a protein specified by the user. The proposed approach comprises two major components. First, the PPIs that have been generated by high-throughput methods are weighted in terms of their functional consistency using GO and its annotations. We assess two advanced semantic similarity metrics which quantify the functional association level of each interacting protein pair. We demonstrate that these measures outperform other existing methods by evaluating their agreement with other biological features, such as sequence similarity, the presence of common Pfam domains, and core PPIs. Second, an information flow-based algorithm is employed to efficiently discover the set of proteins functionally associated with the query protein, together with their links. This algorithm reconstructs a functional association network of the query protein. The output network size can be flexibly determined by parameters. Conclusions M-Finder provides a useful framework to investigate functional association patterns for any protein. This software will also allow users to perform further systematic analysis of a set of proteins for any specific function. It is available online at http://bionet.ecs.baylor.edu/mfinder PMID:24565382

  19. Medicinal Plants Used by Traditional Healers in Sangurur, Elgeyo Marakwet County, Kenya.

    PubMed

    Kigen, Gabriel; Kipkore, Wilson; Wanjohi, Bernard; Haruki, Boniface; Kemboi, Jemutai

    2017-01-01

    Although herbal medical products are still widely used in Kenya, many of the medicinal plants used by traditional medical practitioners (TMPs) have not been documented, despite several challenges that are now threatening the sustainability of the practice. The aim was to document the medicinal plants and healing methods used by TMPs in a region of Kenya with several recognized herbalists, for potential research. Semi-structured interviews, group discussions, and direct observations were used to collect ethnopharmacological information. The participants' bio-data, clinical conditions treated, methods of treatment, medicinal plants used, methods of preparation and administration, and dosage forms were recorded. A total of 99 medicinal plants and 12 complementary preparations employed in the treatment of 64 medical conditions were identified. The most widely used plant was Rotala tenella, which was used to treat nine medical conditions; seven each for Aloe tweediae and Dovyalis abyssinica; and six each for Basella alba and Euclea divinorum. The plants belonged to 55 families, with the Fabaceae family being the most frequently used (10), followed by Apocynaceae and Solanaceae, each with six species. We identified plants used to determine the sex of an unborn baby, plants used to treat several conditions including anthrax and cerebral malaria, and herbs used to detoxify meat from an animal that has died from anthrax. Of special interest was R. tenella, which is used to prevent muscle injury. We have documented several plants with potential therapeutic effects; further research may be conducted to determine their efficacy. Abbreviations used: Fic: informant consensus factor; Nur: number of use reports in each category; Ns: number of reported species; TMPs: traditional medical practitioners.
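
    For the informant consensus factor listed in the abbreviations, a common definition is Fic = (Nur - Ns) / (Nur - 1); the small sketch below assumes that standard formula, and the numbers in the example are hypothetical rather than taken from the study.

```python
def informant_consensus_factor(n_use_reports, n_species):
    """Fic = (Nur - Ns) / (Nur - 1).

    Nur: number of use reports in a use category; Ns: number of species
    cited for it. Values close to 1 indicate strong agreement among
    informants on which species treat that category.
    """
    if n_use_reports <= 1:
        return 0.0
    return (n_use_reports - n_species) / (n_use_reports - 1)

# hypothetical category: 30 use reports citing 8 different species
print(round(informant_consensus_factor(30, 8), 2))  # 0.76
```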

  20. [The commonest therapeutic methods for laser irradiation of blood].

    PubMed

    Moskvin, S V; Konchugova, T V; Khadartsev, A А

    2017-12-05

    One of the most widely employed methods of laser therapy is laser irradiation of blood (LIB). There are two modifications of this technique, one being intravenous low-intensity laser irradiation of blood (ILIB), the other non-invasive laser irradiation of blood (NLIB). The two methods have developed independently, since each has its advantages and disadvantages. The present article was designed to review the main currently available techniques for laser irradiation of blood, which are presented in the form of tables (charts). Replacing the UV irradiation of blood with UV lamps by laser ultraviolet irradiation of blood (LUVIB®) has made it possible to significantly simplify the technique and to enhance its efficiency. The most effective options for ILIB are the combined techniques ILIB-635 + LUVIB® and ILIB-525 + LUVIB®. The most effective technique for NLIB is believed to be the use of low-intensity pulsed laser light with a wavelength of 635 nm and an output power of up to 40 W.

  1. Similarity from multi-dimensional scaling: solving the accuracy and diversity dilemma in information filtering.

    PubMed

    Zeng, Wei; Zeng, An; Liu, Hao; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2014-01-01

    Recommender systems are designed to assist individual users in navigating through the rapidly growing amount of information. One of the most successful recommendation techniques is collaborative filtering, which has been extensively investigated and has already found wide application in e-commerce. One of the challenges in this algorithm is how to accurately quantify the similarities of user pairs and item pairs. In this paper, we employ the multidimensional scaling (MDS) method to measure the similarities between nodes in user-item bipartite networks. The MDS method can extract the essential similarity information from the networks by smoothing out noise, which provides a graphical display of the structure of the networks. With the similarity measured from MDS, we find that the item-based collaborative filtering algorithm can outperform the diffusion-based recommendation algorithms. Moreover, we show that this method tends to recommend unpopular items and increases the global diversification of the networks in the long term.
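
    A minimal sketch of the idea: items from a small user-item bipartite network are embedded with metric MDS from a precomputed distance matrix, and distances in the embedding are turned into similarities for item-based scoring. The cosine-based distance, embedding dimension, and interaction matrix below are illustrative assumptions rather than the authors' exact construction.

```python
import numpy as np
from sklearn.manifold import MDS

# toy user-item interaction matrix (1 = the user collected the item)
A = np.array([[1, 1, 0, 0, 1],
              [1, 1, 1, 0, 0],
              [0, 0, 1, 1, 0],
              [0, 1, 1, 1, 1]], dtype=float)

# cosine similarity between item columns, converted to a distance matrix
norms = np.linalg.norm(A, axis=0)
cosine = (A.T @ A) / np.outer(norms, norms)
dist = 1.0 - cosine
np.fill_diagonal(dist, 0.0)

# 2-D MDS embedding of the items; distances here smooth out noise
emb = MDS(n_components=2, dissimilarity="precomputed",
          random_state=0).fit_transform(dist)

# item-based scoring for user 0: weight items by similarity (inverse
# embedding distance) to the items the user already has
user = A[0]
pairwise = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
similarity = 1.0 / (1.0 + pairwise)
scores = similarity @ user
scores[user > 0] = -np.inf        # do not re-recommend known items
print("recommend item:", int(np.argmax(scores)))
```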

  2. Sensitive method for characterizing liquid helium cooled preamplifier feedback resistors

    NASA Technical Reports Server (NTRS)

    Smeins, L. G.; Arentz, R. F.

    1983-01-01

    It is pointed out that the simple and traditional method of measuring resistance using an electrometer is ineffective since it is limited to a narrow and nonrepresentative range of terminal voltages. The present investigation is concerned with a resistor measurement technique which was developed to select and calibrate the Transimpedance Mode Amplifier (TIA) load resistors on the Infrared Astronomical Satellite (IRAS) for the wide variety of time- and voltage-varying signals which will be processed during the flight. The developed method has great versatility and power, and makes it possible to measure the varied and complex responses of nonideal feedback resistors to IR photo-detector currents. When employed with a stable input coupling capacitor and a narrow-band RMS voltmeter, the five input waveforms thoroughly test and calibrate all the features of interest in a load resistor and its associated TIA circuitry.

  3. Integrating instance selection, instance weighting, and feature weighting for nearest neighbor classifiers by coevolutionary algorithms.

    PubMed

    Derrac, Joaquín; Triguero, Isaac; Garcia, Salvador; Herrera, Francisco

    2012-10-01

    Cooperative coevolution is a successful trend in evolutionary computation which allows us to define partitions of the domain of a given problem, or to integrate several related techniques into one, by the use of evolutionary algorithms. It is possible to apply it to the development of advanced classification methods, which integrate several machine learning techniques into a single proposal. A novel approach integrating instance selection, instance weighting, and feature weighting into the framework of a coevolutionary model is presented in this paper. We compare it with a wide range of evolutionary and nonevolutionary related methods, in order to show the benefits of employing coevolution to apply the considered techniques simultaneously. The results obtained, contrasted through nonparametric statistical tests, show that our proposal outperforms other methods in the comparison, thus becoming a suitable tool for enhancing the nearest neighbor classifier.
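
    A minimal sketch of the classifier being tuned, assuming the weights come from elsewhere: a 1-NN rule whose distance applies per-feature weights and whose training set can be reduced by an instance-selection mask. The coevolutionary search that actually produces these weights and masks in the paper is not shown, and the toy data and weights are illustrative.

```python
import numpy as np

def weighted_1nn_predict(X_train, y_train, X_test, feature_weights,
                         instance_mask=None):
    """1-NN with a feature-weighted Euclidean distance.

    feature_weights and instance_mask would normally be produced by the
    coevolutionary search (feature weighting + instance selection); here
    they are plain inputs.
    """
    if instance_mask is not None:
        X_train, y_train = X_train[instance_mask], y_train[instance_mask]
    w = np.asarray(feature_weights, dtype=float)
    preds = []
    for x in X_test:
        d = np.sqrt((w * (X_train - x) ** 2).sum(axis=1))
        preds.append(y_train[np.argmin(d)])
    return np.array(preds)

# toy data: feature 0 separates the classes, feature 1 is pure noise
rng = np.random.default_rng(2)
X = np.vstack([rng.normal([0, 0], 1, (30, 2)), rng.normal([3, 0], 1, (30, 2))])
y = np.repeat([0, 1], 30)
print(weighted_1nn_predict(X, y, np.array([[2.8, 5.0]]), [1.0, 0.1]))
```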

  4. Simulation of two-dimensional turbulent flows in a rotating annulus

    NASA Astrophysics Data System (ADS)

    Storey, Brian D.

    2004-05-01

    Rotating water tank experiments have been used to study fundamental processes of atmospheric and geophysical turbulence in a controlled laboratory setting. When these tanks are undergoing strong rotation the forced turbulent flow becomes highly two dimensional along the axis of rotation. An efficient numerical method has been developed for simulating the forced quasi-geostrophic equations in an annular geometry to model current laboratory experiments. The algorithm employs a spectral method with Fourier series and Chebyshev polynomials as basis functions. The algorithm has been implemented on a parallel architecture to allow modelling of a wide range of spatial scales over long integration times. This paper describes the derivation of the model equations, numerical method, testing and performance of the algorithm. Results provide reasonable agreement with the experimental data, indicating that such computations can be used as a predictive tool to design future experiments.

  5. Application of 13C NMR cross-polarization inversion recovery experiments for the analysis of solid dosage forms.

    PubMed

    Pisklak, Dariusz Maciej; Zielińska-Pisklak, Monika; Szeleszczuk, Łukasz

    2016-11-20

    Solid-state nuclear magnetic resonance (ssNMR) is a powerful and unique method for analyzing solid forms of active pharmaceutical ingredients (APIs) directly in their original formulations. Unfortunately, despite their wide range of applications, ssNMR experiments often suffer from low sensitivity and peak overlap between the API and excipients. To overcome these limitations, the cross-polarization inversion recovery method was successfully used. The differences in the spin-lattice relaxation time constants for hydrogen atoms, T1(H), between the API and excipients were employed in order to separate and discriminate their peaks in ssNMR spectra as well as to increase the intensity of API signals in low-dose formulations. The versatility of this method was demonstrated with different examples, including an excipient mixture and commercial solid dosage forms (e.g. granules and tablets). Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Two-Phase and Graph-Based Clustering Methods for Accurate and Efficient Segmentation of Large Mass Spectrometry Images.

    PubMed

    Dexter, Alex; Race, Alan M; Steven, Rory T; Barnes, Jennifer R; Hulme, Heather; Goodwin, Richard J A; Styles, Iain B; Bunch, Josephine

    2017-11-07

    Clustering is widely used in MSI to segment anatomical features and differentiate tissue types, but existing approaches are both CPU and memory-intensive, limiting their application to small, single data sets. We propose a new approach that uses a graph-based algorithm with a two-phase sampling method that overcomes this limitation. We demonstrate the algorithm on a range of sample types and show that it can segment anatomical features that are not identified using commonly employed algorithms in MSI, and we validate our results on synthetic MSI data. We show that the algorithm is robust to fluctuations in data quality by successfully clustering data with a designed-in variance using data acquired with varying laser fluence. Finally, we show that this method is capable of generating accurate segmentations of large MSI data sets acquired on the newest generation of MSI instruments and evaluate these results by comparison with histopathology.

  7. Multi-classification of cell deformation based on object alignment and run length statistic.

    PubMed

    Li, Heng; Liu, Zhiwen; An, Xing; Shi, Yonggang

    2014-01-01

    Cellular morphology is widely applied in digital pathology and is essential for improving our understanding of the basic physiological processes of organisms. One of the main issues in application is to develop efficient methods for cell deformation measurement. We propose an innovative indirect approach to analyze dynamic cell morphology in image sequences. The proposed approach considers both the cellular shape change and cytoplasm variation, and takes each frame in the image sequence into account. The cell deformation is measured by the minimum energy function of object alignment, which is invariant to object pose. Then an indirect analysis strategy based on run-length statistics is employed to overcome the limitation of gradual deformation. We demonstrate the power of the proposed approach with one application: multi-classification of cell deformation. Experimental results show that the proposed method is sensitive to morphology variation and performs better than standard shape representation methods.

  8. Isotropic stochastic rotation dynamics

    NASA Astrophysics Data System (ADS)

    Mühlbauer, Sebastian; Strobl, Severin; Pöschel, Thorsten

    2017-12-01

    Stochastic rotation dynamics (SRD) is a widely used method for the mesoscopic modeling of complex fluids, such as colloidal suspensions or multiphase flows. In this method, however, the underlying Cartesian grid defining the coarse-grained interaction volumes induces anisotropy. We propose an isotropic, lattice-free variant of stochastic rotation dynamics, termed iSRD. Instead of Cartesian grid cells, we employ randomly distributed spherical interaction volumes. This eliminates the requirement of a grid shift, which is essential in standard SRD to maintain Galilean invariance. We derive analytical expressions for the viscosity and the diffusion coefficient in relation to the model parameters, which show excellent agreement with the results obtained in iSRD simulations. The proposed algorithm is particularly suitable to model systems bound by walls of complex shape, where the domain cannot be meshed uniformly. The presented approach is not limited to SRD but is applicable to any other mesoscopic method, where particles interact within certain coarse-grained volumes.

  9. Rat erythrocyte glycophorins can be isolated by the lithium diiodosalicylate method used for other glycophorins.

    PubMed

    Herráez, A; Díez, J C; Luque, J

    1992-11-01

    1. The lithium diiodosalicylate/phenol method, widely employed for the isolation of membrane sialoglycoproteins (glycophorins) from mammalian erythrocytes, was applied for the first time to the purification of the homologous glycoproteins from rat erythrocyte membranes. 2. The resulting preparations were shown to be composed of four components when fractionated by SDS-PAGE. All four were positive for periodic acid-Schiff's reagent stain, the two largest being the major components. 3. Isolated rat glycophorins accounted for 60% of the ghost sialic acid and 1.5% of their protein. The presence of O-acetyl groups was confirmed in one-third of the sialic acid residues. 4. The molecular masses of the four glycophorin components were determined by a method which takes into account the anomalous mobility of glycoproteins on SDS-electrophoresis. The estimated values thus obtained for the actual molecular masses were 74, 32, 25 and 17 kDa.

  10. A compact LWIR imaging spectrometer with a variable gap Fabry-Perot interferometer

    NASA Astrophysics Data System (ADS)

    Zhang, Fang; Gao, Jiaobo; Wang, Nan; Zhao, Yujie; Zhang, Lei; Gao, Shan

    2017-02-01

    Fourier transform spectroscopy is a widely employed method for obtaining spectra, with applications ranging from the desktop to remote sensing. Conventional long-wave infrared (LWIR) interferometric spectral imaging systems, however, tend to be bulky and heavy. In order to miniaturize and lighten the instrument, a new LWIR spectral imaging system based on a variable-gap Fabry-Perot (FP) interferometer is investigated. After analysis of the system's working principle, the primary parameters are determined theoretically, such as the reflectivity of the two interferometric cavity surfaces, the field of view (FOV), and the f-number of the imaging lens. A prototype is developed and a good experimental result with a CO2 laser is obtained. The research shows that, besides high throughput and high spectral resolution, the advantage of miniaturization is also simultaneously achieved with this method.

  11. Spectroscopically Enhanced Method and System for Multi-Factor Biometric Authentication

    NASA Astrophysics Data System (ADS)

    Pishva, Davar

    This paper proposes a spectroscopic method and system for preventing spoofing of biometric authentication. One of its focuses is to enhance biometric authentication with a spectroscopic method in a multifactor manner such that a person's unique ‘spectral signatures’ or ‘spectral factors’ are recorded and compared in addition to a non-spectroscopic biometric signature, to reduce the likelihood of an imposter being authenticated. By using the ‘spectral factors’ extracted from reflectance spectra of real fingers and employing cluster analysis, it shows how an authentic fingerprint image presented by a real finger can be distinguished from an authentic fingerprint image embossed on an artificial finger, or molded on a fingertip cover worn by an imposter. This paper also shows how to augment two widely used biometric systems (fingerprint and iris recognition devices) with spectral biometrics capabilities in a practical manner and without creating much overhead or inconveniencing their users.

  12. Edge-illumination x-ray phase contrast imaging with Pt-based metallic glass masks

    NASA Astrophysics Data System (ADS)

    Saghamanesh, Somayeh; Aghamiri, Seyed Mahmoud-Reza; Olivo, Alessandro; Sadeghilarijani, Maryam; Kato, Hidemi; Kamali-Asl, Alireza; Yashiro, Wataru

    2017-06-01

    Edge-illumination x-ray phase contrast imaging (EI XPCI) is a non-interferometric phase-sensitive method in which two absorption masks are employed. These masks are fabricated through a photolithography process followed by electroplating, which is challenging in terms of yield as well as time- and cost-effectiveness. We report on the first implementation of EI XPCI with Pt-based metallic glass masks fabricated by an imprinting method. The newly tested alloy exhibits good characteristics, including high workability besides high x-ray attenuation. The fabrication process is easy and cheap, and can produce large-size masks for high x-ray energies within minutes. Imaging experiments show a good-quality phase image, which confirms the potential of these masks to make the EI XPCI technique widely available and affordable.

  13. The elephant graveyard - A planet-wide Mars sample return

    NASA Astrophysics Data System (ADS)

    Heinsheimer, T. F.; Corn, Barbara

    1991-10-01

    A method is presented for collecting documented Martian samples from the surface of the entire planet, based partly on research done for a 1994 Mars balloon mission. Smart balloons are employed to collect samples from difficult terrains, fly 100-200 km with the sample to more manageable terrains, and are retrieved by a rover mission for return to earth. Elements of the sample-return method are described in detail, with attention given to the projected rates of success for each portion of the technology. The SNAKE, Canniballoon, and 'Brilliant Ants' concepts are described in terms of level of development, function within the mission, and technological requirements. Substantial research presently exists in the areas of deployment, on-site sample assessment, pick-up, and designs for the balloons and ground-traversing guideropes.

  14. Determination of selected azaarenes in water by bonded-phase extraction and liquid chromatography

    USGS Publications Warehouse

    Steinheimer, T.R.; Ondrus, M.G.

    1986-01-01

    A method for the rapid and simple quantitative determination of quinoline, isoquinoline, and five selected three-ring azaarenes in water has been developed. The azaarene fraction is separated from its carbon analogues on n-octadecyl packing material by elution with acidified water/acetonitrile. Concentration factors as great as 1000-fold are readily achieved. Instrumental analysis involves high-speed liquid chromatography on flexible-walled, wide-bore columns with fluorescence and ultraviolet detection at several wavelengths, employing filter photometers in series. Method-validation data are provided as azaarene recovery efficiencies from fortified samples. Distilled water, river water, contaminated ground water, and secondary-treatment effluent have been tested. Recoveries at part-per-billion levels are nearly quantitative for the three-ring compounds, but they decrease for quinoline and isoquinoline. © 1986 American Chemical Society.

  15. Introduction and expression of genes for metabolic engineering applications in Saccharomyces cerevisiae.

    PubMed

    Da Silva, Nancy A; Srikrishnan, Sneha

    2012-03-01

    Metabolic pathway engineering in the yeast Saccharomyces cerevisiae leads to improved production of a wide range of compounds, ranging from ethanol (from biomass) to natural products such as sesquiterpenes. The introduction of multienzyme pathways requires precise control over the level and timing of expression of the associated genes. Gene number and promoter strength/regulation are two critical control points, and multiple studies have focused on modulating these in yeast. This MiniReview focuses on methods for introducing genes and controlling their copy number and on the many promoters (both constitutive and inducible) that have been successfully employed. The advantages and disadvantages of the methods will be presented, and applications to pathway engineering will be highlighted. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  16. Sensitivity of Simulated Warm Rain Formation to Collision and Coalescence Efficiencies, Breakup, and Turbulence: Comparison of Two Bin-Resolved Numerical Models

    NASA Technical Reports Server (NTRS)

    Fridlind, Ann; Seifert, Axel; Ackerman, Andrew; Jensen, Eric

    2004-01-01

    Numerical models that resolve cloud particles into discrete mass size distributions on an Eulerian grid provide a uniquely powerful means of studying the closely coupled interaction of aerosols, cloud microphysics, and transport that determines cloud properties and evolution. However, such models require many experimentally derived parameterizations in order to properly represent the complex interactions of droplets within turbulent flow. Many of these parameterizations remain poorly quantified, and the numerical methods for solving the equations for the temporal evolution of the mass size distribution can also vary considerably in terms of efficiency and accuracy. In this work, we compare results from two size-resolved microphysics models that employ various widely used parameterizations and numerical solution methods for several aspects of stochastic collection.

  17. Laser jetting of femto-liter metal droplets for high resolution 3D printed structures

    NASA Astrophysics Data System (ADS)

    Zenou, M.; Sa'Ar, A.; Kotler, Z.

    2015-11-01

    Laser induced forward transfer (LIFT) is employed in a special, high-accuracy jetting regime, by adequately matching the sub-nanosecond pulse duration to the metal donor layer thickness. Under such conditions, an effective solid nozzle is formed, providing stability and directionality to the femto-liter droplets, which are printed across a large gap in excess of 400 μm. We illustrate the wide applicability of this method by printing several 3D metal objects: first, very high aspect ratio (A/R > 20), micron-scale copper pillars in various configurations, upright and arbitrarily bent, and then a micron-scale 3D object composed of gold and copper. Such a digital printing method could serve for the generation of complex, multi-material, micron-scale 3D materials and novel structures.

  18. Effect of fringe-artifact correction on sub-tomogram averaging from Zernike phase-plate cryo-TEM

    PubMed Central

    Kishchenko, Gregory P.; Danev, Radostin; Fisher, Rebecca; He, Jie; Hsieh, Chyongere; Marko, Michael; Sui, Haixin

    2015-01-01

    Zernike phase-plate (ZPP) imaging greatly increases contrast in cryo-electron microscopy; however, fringe artifacts appear in the images. A computational de-fringing method has been proposed, but it has not been widely employed, perhaps because the importance of de-fringing has not been clearly demonstrated. For testing purposes, we employed Zernike phase-plate imaging in a cryo-electron tomographic study of radial-spoke complexes attached to microtubule doublets. We found that the contrast enhancement by ZPP imaging made nonlinear denoising insensitive to the filtering parameters, such that simple low-frequency band-pass filtering made the same improvement in map quality. We employed sub-tomogram averaging, which compensates for the effect of the “missing wedge” and considerably improves map quality. We found that fringes (caused by the abrupt cut-on of the central hole in the phase plate) can lead to incorrect representation of a structure that is well known from the literature. The expected structure was restored by amplitude scaling, as proposed in the literature. Our results show that de-fringing is an important part of image processing for cryo-electron tomography of macromolecular complexes with ZPP imaging. PMID:26210582

  19. Fault Diagnosis for Rotating Machinery: A Method based on Image Processing

    PubMed Central

    Lu, Chen; Wang, Yang; Ragulskis, Minvydas; Cheng, Yujie

    2016-01-01

    Rotating machinery is one of the most typical types of mechanical equipment and plays a significant role in industrial applications. Condition monitoring and fault diagnosis of rotating machinery have gained wide attention for their significance in preventing catastrophic accidents and guaranteeing sufficient maintenance. With the development of science and technology, fault diagnosis methods based on multiple disciplines are becoming the focus in the field of fault diagnosis of rotating machinery. This paper presents a multi-discipline method based on image processing for fault diagnosis of rotating machinery. Different from traditional analysis methods in one-dimensional space, this study employs computing methods from the field of image processing to realize automatic feature extraction and fault diagnosis in a two-dimensional space. The proposed method mainly includes the following steps. First, the vibration signal is transformed into a bi-spectrum contour map utilizing bi-spectrum technology, which provides a basis for the following image-based feature extraction. Then, an emerging approach in the field of image processing for feature extraction, speeded-up robust features (SURF), is employed to automatically extract fault features from the transformed bi-spectrum contour map and finally form a high-dimensional feature vector. To reduce the dimensionality of the feature vector, thus highlighting the main fault features and reducing subsequent computing resources, t-Distributed Stochastic Neighbor Embedding (t-SNE) is adopted. Finally, a probabilistic neural network is introduced for fault identification. Two typical types of rotating machinery, an axial piston hydraulic pump and a self-priming centrifugal pump, are selected to demonstrate the effectiveness of the proposed method. Results show that the proposed method based on image processing achieves a high accuracy, thus providing a highly effective means of fault diagnosis for rotating machinery. PMID:27711246

  20. Fault Diagnosis for Rotating Machinery: A Method based on Image Processing.

    PubMed

    Lu, Chen; Wang, Yang; Ragulskis, Minvydas; Cheng, Yujie

    2016-01-01

    Rotating machinery is one of the most typical types of mechanical equipment and plays a significant role in industrial applications. Condition monitoring and fault diagnosis of rotating machinery have gained wide attention for their significance in preventing catastrophic accidents and guaranteeing sufficient maintenance. With the development of science and technology, fault diagnosis methods based on multiple disciplines are becoming the focus in the field of fault diagnosis of rotating machinery. This paper presents a multi-discipline method based on image processing for fault diagnosis of rotating machinery. Different from traditional analysis methods in one-dimensional space, this study employs computing methods from the field of image processing to realize automatic feature extraction and fault diagnosis in a two-dimensional space. The proposed method mainly includes the following steps. First, the vibration signal is transformed into a bi-spectrum contour map utilizing bi-spectrum technology, which provides a basis for the following image-based feature extraction. Then, an emerging approach in the field of image processing for feature extraction, speeded-up robust features (SURF), is employed to automatically extract fault features from the transformed bi-spectrum contour map and finally form a high-dimensional feature vector. To reduce the dimensionality of the feature vector, thus highlighting the main fault features and reducing subsequent computing resources, t-Distributed Stochastic Neighbor Embedding (t-SNE) is adopted. Finally, a probabilistic neural network is introduced for fault identification. Two typical types of rotating machinery, an axial piston hydraulic pump and a self-priming centrifugal pump, are selected to demonstrate the effectiveness of the proposed method. Results show that the proposed method based on image processing achieves a high accuracy, thus providing a highly effective means of fault diagnosis for rotating machinery.
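
    The sketch below walks through a heavily simplified version of this pipeline on synthetic vibration signals, with common stand-ins where the exact components are specific to the paper: a spectrogram instead of the bi-spectrum contour map, coarse block means instead of SURF keypoints, PCA instead of t-SNE, and a Gaussian naive Bayes classifier instead of the probabilistic neural network. All names and parameters are illustrative.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

def image_features(sig, fs=1000, grid=8):
    """Map a 1-D vibration signal to a 2-D time-frequency 'image' and
    summarize it with coarse block means (stand-in for SURF features)."""
    _, _, S = spectrogram(sig, fs=fs, nperseg=128, noverlap=64)
    S = np.log1p(S)
    S = S[: S.shape[0] - S.shape[0] % grid, : S.shape[1] - S.shape[1] % grid]
    blocks = S.reshape(grid, S.shape[0] // grid, grid, S.shape[1] // grid)
    return blocks.mean(axis=(1, 3)).ravel()

# synthetic two-class data: "healthy" vs "faulty" (extra 120 Hz component)
fs = 1000
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(3)
X, y = [], []
for k in range(40):
    sig = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size)
    if k % 2:
        sig += 0.8 * np.sin(2 * np.pi * 120 * t)      # simulated fault
    X.append(image_features(sig, fs))
    y.append(k % 2)

X = PCA(n_components=5).fit_transform(np.array(X))     # stand-in for t-SNE
Xtr, Xte, ytr, yte = train_test_split(X, np.array(y), test_size=0.3,
                                      random_state=0, stratify=y)
clf = GaussianNB().fit(Xtr, ytr)                       # stand-in for the PNN
print("test accuracy:", clf.score(Xte, yte))
```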

  1. Single-step reinitialization and extending algorithms for level-set based multi-phase flow simulations

    NASA Astrophysics Data System (ADS)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-12-01

    We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about 10-times efficiency improvement over the standard method while maintaining same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.

  2. Mechanisms and Kinetics of Amyloid Aggregation Investigated by a Phenomenological Coarse-Grained Model

    NASA Astrophysics Data System (ADS)

    Magno, Andrea; Pellarin, Riccardo; Caflisch, Amedeo

    Amyloid fibrils are ordered polypeptide aggregates that have been implicated in several neurodegenerative pathologies, such as Alzheimer's, Parkinson's, Huntington's, and prion diseases, [1, 2] and, more recently, also in biological functionalities. [3, 4, 5] These findings have paved the way for a wide range of experimental and computational studies aimed at understanding the details of the fibril-formation mechanism. Computer simulations using low-resolution models, which employ a simplified representation of protein geometry and energetics, have provided insights into the basic physical principles underlying protein aggregation in general [6, 7, 8] and ordered amyloid aggregation. [9, 10, 11, 12, 13, 14, 15] For example, Dokholyan and coworkers have used the Discrete Molecular Dynamics method [16, 17] to shed light on the mechanisms of protein oligomerization [18] and the conformational changes that take place in proteins before the aggregation onset. [19, 20] One challenging observation, which is difficult to reproduce by computer simulations, is the wide range of aggregation scenarios emerging from a variety of biophysical measurements. [21, 22] Atomistic models have been employed to study the conformational space of amyloidogenic polypeptides in the monomeric state, [23, 24, 25] the very initial steps of amyloid formation, [26, 27, 28, 29, 30, 31, 32] and the structural stability of fibril models. [33, 34, 35] However, all-atom simulations of the kinetics of fibril formation are beyond what can be done with modern computers.

  3. A New Online Calibration Method Based on Lord's Bias-Correction.

    PubMed

    He, Yinhong; Chen, Ping; Li, Yong; Zhang, Shumei

    2017-09-01

    Online calibration techniques have been widely employed to calibrate new items due to their advantages. Method A is the simplest online calibration method and has attracted much attention from researchers recently. However, a key assumption of Method A is that it treats the person-parameter estimates θ̂ (obtained by maximum likelihood estimation [MLE]) as their true values θ; thus, the deviation of the estimated θ̂ from the true values might yield inaccurate item calibration when the deviation is nonignorable. To improve the performance of Method A, a new method, MLE-LBCI-Method A, is proposed. This new method combines a modified Lord's bias-correction method (named maximum likelihood estimation-Lord's bias-correction with iteration [MLE-LBCI]) with the original Method A in an effort to correct the deviation of θ̂ which may adversely affect the item calibration precision. Two simulation studies were carried out to explore the performance of both MLE-LBCI and MLE-LBCI-Method A under several scenarios. Simulation results showed that MLE-LBCI could make a significant improvement over the ML ability estimates, and MLE-LBCI-Method A did outperform Method A in almost all experimental conditions.

  4. A step-by-step protocol for assaying protein carbonylation in biological samples.

    PubMed

    Colombo, Graziano; Clerici, Marco; Garavaglia, Maria Elisa; Giustarini, Daniela; Rossi, Ranieri; Milzani, Aldo; Dalle-Donne, Isabella

    2016-04-15

    Protein carbonylation represents the most frequent and usually irreversible oxidative modification affecting proteins. This modification is chemically stable, and this feature is particularly important for the storage and detection of carbonylated proteins. Many biochemical and analytical methods have been developed during the last thirty years to assay protein carbonylation. The most successful method consists of protein carbonyl (PCO) derivatization with 2,4-dinitrophenylhydrazine (DNPH) and subsequent spectrophotometric assay. This assay allows a global quantification of PCO content due to the ability of DNPH to react with carbonyl groups, giving rise to an adduct able to absorb at 366 nm. Similar approaches were also developed employing chromatographic separation, in particular HPLC, and parallel detection of the absorbing adducts. Subsequently, immunological techniques, such as Western immunoblot or ELISA, have been developed, leading to an increase in the sensitivity of protein carbonylation detection. Currently, they are widely employed to evaluate changes in total protein carbonylation and eventually to highlight the specific proteins undergoing selective oxidation. In the last decade, many mass spectrometry (MS) approaches have been developed for the identification of the carbonylated proteins and of the amino acid residues modified to carbonyl derivatives. Although these MS methods are much more focused and detailed due to their ability to identify the amino acid residues undergoing carbonylation, they still require expensive equipment and, therefore, are limited in distribution. In this protocol paper, we summarise and comment on the most widespread protocols that a standard laboratory can employ to assess protein carbonylation; in particular, we describe the different protocols step by step, adding suggestions coming from our on-bench experience. Copyright © 2015 Elsevier B.V. All rights reserved.
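
    As a worked illustration of the spectrophotometric quantification step, carbonyl content is commonly derived from the blank-corrected absorbance of the DNPH adduct via the Beer-Lambert law, using a molar absorption coefficient of roughly 22,000 M⁻¹ cm⁻¹, and expressed per mg of protein. The coefficient and sample values below are assumptions for illustration, not figures taken from this protocol paper.

```python
def carbonyl_nmol_per_mg(a366, protein_mg_per_ml, path_cm=1.0,
                         molar_abs=22000.0):
    """Estimate protein carbonyls (nmol per mg protein) from DNPH absorbance.

    a366: blank-corrected absorbance of the hydrazone adduct (~366 nm)
    molar_abs: assumed molar absorption coefficient in M^-1 cm^-1
    """
    carbonyl_mol_per_l = a366 / (molar_abs * path_cm)   # Beer-Lambert law
    carbonyl_nmol_per_ml = carbonyl_mol_per_l * 1e6     # mol/L -> nmol/mL
    return carbonyl_nmol_per_ml / protein_mg_per_ml

# hypothetical reading: A366 = 0.15 for a 1 mg/mL protein solution
print(round(carbonyl_nmol_per_mg(0.15, 1.0), 2))        # ~6.82 nmol/mg
```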

  5. Constraints on Water Reservoir Lifetimes From Catchment-Wide 10Be Erosion Rates—A Case Study From Western Turkey

    NASA Astrophysics Data System (ADS)

    Heineke, Caroline; Hetzel, Ralf; Akal, Cüneyt; Christl, Marcus

    2017-11-01

    The functionality and retention capacity of water reservoirs are generally impaired by upstream erosion and reservoir sedimentation, making a reliable assessment of erosion indispensable for estimating reservoir lifetimes. Widely used river gauging methods may underestimate sediment yield, because they do not record rare, high-magnitude events and may underestimate bed load transport. Hence, reservoir lifetimes calculated from short-term erosion rates should be regarded as maximum values. We propose that erosion rates from cosmogenic 10Be, which commonly integrate over hundreds to thousands of years, are useful to complement short-term sediment yield estimates and should be employed to estimate minimum reservoir lifetimes. Here we present 10Be erosion rates for the drainage basins of six water reservoirs in Western Turkey, which are located in a tectonically active region with easily erodible bedrock. Our 10Be erosion rates for these catchments are high, ranging from ~170 to ~1,040 t/km2/yr. When linked to reservoir volumes, they yield minimum reservoir lifetimes between 25 ± 5 and 1,650 ± 360 years until complete filling, with four reservoirs having minimum lifespans of ≤110 years. In a neighboring region with more resistant bedrock and less tectonic activity, we obtain much lower catchment-wide 10Be erosion rates of ~33 to ~95 t/km2/yr, illustrating that differences in lithology and tectonic boundary conditions can cause substantial variations in erosion even at a spatial scale of only ~50 km. In conclusion, we suggest that both short-term sediment yield estimates and 10Be erosion rates should be employed to predict the lifetimes of reservoirs.
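
    The conversion from a catchment-wide erosion rate to a minimum reservoir lifetime can be sketched with a simple mass balance: the annual sediment mass is the erosion rate (t/km2/yr) times the catchment area, converted to a volume with an assumed sediment bulk density and divided into the storage capacity. The bulk density, full-trapping assumption, and example numbers below are illustrative and not taken from the study.

```python
def min_reservoir_lifetime_yr(capacity_m3, erosion_t_km2_yr, area_km2,
                              bulk_density_t_m3=1.5):
    """Minimum lifetime (years) until complete filling, assuming every
    tonne eroded in the catchment is trapped and deposited in the
    reservoir at the given bulk density."""
    sediment_m3_per_yr = erosion_t_km2_yr * area_km2 / bulk_density_t_m3
    return capacity_m3 / sediment_m3_per_yr

# hypothetical reservoir: 5x10^6 m^3 capacity, 200 km^2 catchment,
# catchment-wide 10Be erosion rate of 500 t/km^2/yr
print(round(min_reservoir_lifetime_yr(5e6, 500, 200)))   # ~75 years
```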

  6. GalaxyRefineComplex: Refinement of protein-protein complex model structures driven by interface repacking.

    PubMed

    Heo, Lim; Lee, Hasup; Seok, Chaok

    2016-08-18

    Protein-protein docking methods have been widely used to gain an atomic-level understanding of protein interactions. However, docking methods that employ low-resolution energy functions are popular because of computational efficiency. Low-resolution docking tends to generate protein complex structures that are not fully optimized. GalaxyRefineComplex takes such low-resolution docking structures and refines them to improve model accuracy in terms of both interface contact and inter-protein orientation. This refinement method allows flexibility at the protein interface and in the overall docking structure to capture conformational changes that occur upon binding. Symmetric refinement is also provided for symmetric homo-complexes. This method was validated by refining models produced by available docking programs, including ZDOCK and M-ZDOCK, and was successfully applied to CAPRI targets in a blind fashion. An example of using the refinement method with an existing docking method for ligand binding mode prediction of a drug target is also presented. A web server that implements the method is freely available at http://galaxy.seoklab.org/refinecomplex.

  7. The Relationship between Academic Dishonesty and Unethical Business Practices.

    ERIC Educational Resources Information Center

    Sims, Randi L.

    1993-01-01

    An investigation of the relationship between the range and severity of academic dishonesty during undergraduate studies and that of dishonesty engaged in during employment revealed that subjects (n=60) who admitted to a wide range of academic dishonesty also admitted a wide range of work-related dishonesty. (Author/JOW)

  8. ENGAGEMENT IN OUTPATIENT SUBSTANCE ABUSE TREATMENT AND EMPLOYMENT OUTCOMES

    PubMed Central

    Dunigan, Robert; Acevedo, Andrea; Campbell, Kevin; Garnick, Deborah W.; Horgan, Constance M.; Huber, Alice; Lee, Margaret T.; Panas, Lee; Ritter, Grant A.

    2013-01-01

    This study, a collaboration between an academic research center and Washington State’s health, employment, and correction departments, investigates the extent to which treatment engagement, a widely adopted performance measure, is associated with employment, an important outcome for individuals receiving treatment for substance use disorders. Two-stage Heckman probit regressions were conducted using 2008 administrative data for 7,570 adults receiving publicly funded treatment. The first stage predicted employment in the year following the first treatment visit, and three separate second-stage models predicted the number of quarters employed, wages, and hours worked. Engagement as a main effect was not significant for any of the employment outcomes. However, for clients with prior criminal justice involvement, engagement was associated with both employment and higher wages following treatment. Clients with criminal justice involvement face greater challenges regarding employment, so the identification of any actionable step which increases the likelihood of employment or wages is an important result. PMID:23686216

  9. Leveraging Work-Integrated Learning through On-Campus Employment: A University-Wide Approach

    ERIC Educational Resources Information Center

    Mitchell, Gaon; Kay, Judie

    2013-01-01

    At Victoria University, Melbourne, Australia, the majority of students engage in paid employment alongside their studies; and, every student has the opportunity to engage with work-integrated learning as a key component of their academic course. This paper explores an innovative structured approach the university has initiated to align these two…

  10. Improving Graduates' Employment Competitiveness: A Practice in Peking University

    ERIC Educational Resources Information Center

    Qi, Yanli

    2011-01-01

    The paper introduces data on the employment of postgraduates in the Department of Information Management of Peking University in 2000-2009. Master's graduates in LIS in Peking University have a wide job choice. In China, the job market for postgraduates in LIS is composed of enterprises and business organizations, rather than libraries and…

  11. 45 CFR 73.735-1401 - Prohibitions against post-employment conflicts of interest.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Prohibitions against post-employment conflicts of interest. 73.735-1401 Section 73.735-1401 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL.... (b) The Office of Government Ethics, Office of Personnel Management, has issued Government-wide...

  12. 45 CFR 73.735-1401 - Prohibitions against post-employment conflicts of interest.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Prohibitions against post-employment conflicts of interest. 73.735-1401 Section 73.735-1401 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL.... (b) The Office of Government Ethics, Office of Personnel Management, has issued Government-wide...

  13. 45 CFR 73.735-1401 - Prohibitions against post-employment conflicts of interest.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Prohibitions against post-employment conflicts of interest. 73.735-1401 Section 73.735-1401 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL.... (b) The Office of Government Ethics, Office of Personnel Management, has issued Government-wide...

  14. 45 CFR 73.735-1401 - Prohibitions against post-employment conflicts of interest.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Prohibitions against post-employment conflicts of interest. 73.735-1401 Section 73.735-1401 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL.... (b) The Office of Government Ethics, Office of Personnel Management, has issued Government-wide...

  15. 45 CFR 73.735-1401 - Prohibitions against post-employment conflicts of interest.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Prohibitions against post-employment conflicts of interest. 73.735-1401 Section 73.735-1401 Public Welfare Department of Health and Human Services GENERAL.... (b) The Office of Government Ethics, Office of Personnel Management, has issued Government-wide...

  16. Employability Skill Development in Work-Integrated Learning: Barriers and Best Practice

    ERIC Educational Resources Information Center

    Jackson, Denise

    2015-01-01

    Work-integrated learning (WIL) is widely considered instrumental in equipping new graduates with the required employability skills to function effectively in the work environment. Evaluation of WIL programs in enhancing skill development remains predominantly outcomes-focused with little attention to the process of what, how and from whom students…

  17. Legitimate Peripheral Participation by Sandwich Year Interns in the National Health Service

    ERIC Educational Resources Information Center

    Davies, Helen Maria; Sandiford, Peter John

    2014-01-01

    Student internships are widely seen as a valuable part of education provision and there is a growing body of research into internship programmes from student, employer and educator perspectives. This paper explores the experiences of a group of information technology interns employed in a small organisation involved in health care business…

  18. Creativity as a Desirable Graduate Attribute: Implications for Curriculum Design and Employability

    ERIC Educational Resources Information Center

    Rampersad, Giselle; Patel, Fay

    2014-01-01

    A wide range of graduate attributes are listed, categorized and prioritized by different higher education institutions. However, one attribute that is less visible in the literature is creativity. In the current study, creativity has emerged as a desirable graduate attribute among students and employers. This paper presents an exploratory…

  19. Optimal application of Morrison's iterative noise removal for deconvolution. Appendices

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1987-01-01

    Morrison's iterative method of noise removal, or Morrison's smoothing, is applied in a simulation to noise-added data sets of various noise levels to determine its optimum use. Morrison's smoothing is applied for noise removal alone, and for noise removal prior to deconvolution. For the latter, an accurate method is analyzed to provide confidence in the optimization. The method consists of convolving the data with an inverse filter calculated by taking the inverse discrete Fourier transform of the reciprocal of the transform of the response of the system. Filters of various lengths are calculated for the narrow and wide Gaussian response functions used. Deconvolution of non-noisy data is performed, and the error in each deconvolution is calculated. Plots of error versus filter length are produced, and from these plots the most accurate filter lengths are determined. The statistical methodologies employed in the optimizations of Morrison's method are similar. A typical peak-type input is selected and convolved with the two response functions to produce the data sets to be analyzed. Both constant and ordinate-dependent Gaussian distributed noise are added to the data, where the noise levels of the data are characterized by their signal-to-noise ratios. The error measures employed in the optimizations are the L1 and L2 norms. Results of the optimizations for both Gaussians, both noise types, and both norms include figures of optimum iteration number and error improvement versus signal-to-noise ratio, and tables of results. The statistical variation of all quantities considered is also given.
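
    The inverse-filter step described above can be sketched in a few lines of NumPy. This is illustrative only: the function and variable names are not from the paper, and the filter-length truncation whose optimum the study determines from error-versus-length plots is handled naively here.

```python
import numpy as np

def inverse_filter_deconvolve(data, response, filter_length=None):
    """Deconvolve `data` by circular convolution with a truncated inverse filter.

    The inverse filter is the inverse DFT of the reciprocal of the DFT of the
    system response, as described in the abstract.  Names are illustrative.
    """
    n = len(data)
    H = np.fft.fft(response, n)
    H = np.where(np.abs(H) < 1e-12, 1e-12, H)      # guard against division by zero
    inv_filter = np.fft.ifft(1.0 / H).real
    if filter_length is not None:                   # truncate to the chosen length
        inv_filter = inv_filter[:filter_length]
    return np.fft.ifft(np.fft.fft(data) * np.fft.fft(inv_filter, n)).real

# Toy example: a narrow peak blurred by a Gaussian response, then deconvolved.
x = np.linspace(-10, 10, 256)
response = np.exp(-x**2 / 2.0)
response /= response.sum()
peak = np.exp(-x**2 / 0.1)
blurred = np.fft.ifft(np.fft.fft(peak) * np.fft.fft(np.fft.ifftshift(response))).real
recovered = inverse_filter_deconvolve(blurred, np.fft.ifftshift(response))
```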

  20. Development of electrochemical sensors for trace detection of explosives and for the detection of chemical warfare agents

    NASA Astrophysics Data System (ADS)

    Berger, T.; Ziegler, H.; Krausa, Michael

    2000-08-01

    A large number of chemical sensors are based on electrochemical measurement methods. Amperometric sensor systems in particular are employed for the fast detection of pollutants in industry and the environment, as well as for analytical systems in medical diagnosis. The large number of different applications of electrochemical sensors rests on the high sensitivity of electrochemical methods and on the wide range of possibilities for enhancing selectivity by varying electrochemical and chemical parameters. Besides this, electrochemical sensor systems are frequently simple to operate, transportable and cheap. Up to now, the electrochemical method of cyclic voltammetry has seldom been used for sensors. The efficiency of cyclic voltammetry is clearly demonstrated by the sensor system for the detection of nitro- and aminotoluenes in solids and waters presented here. The potentiodynamic sensor system can be employed for fast and easy risk estimation of contaminated areas. Because of the high sensitivity of electrochemical methods, the detection of chemical substances with a low vapor pressure is also possible. The vapor pressure of TNT at room temperature, for instance, is about 7 ppb. With a special electrochemical set-up we were able to measure TNT approximately 10 cm above a TNT sample. In addition, we were able to detect TNT in the gaseous phase approximately 10 cm above a real plastic mine. It therefore seems possible to develop an electrochemical mine detector. Moreover, we show that the electrochemical detection of RDX, HMX and chemical warfare agents is also possible.

  1. Matched Interface and Boundary Method for Elasticity Interface Problems

    PubMed Central

    Wang, Bao; Xia, Kelin; Wei, Guo-Wei

    2015-01-01

    Elasticity theory is an important component of continuum mechanics and has found widespread applications in science and engineering. Material interfaces are ubiquitous in nature and man-made devices, and often give rise to discontinuous coefficients in the governing elasticity equations. In this work, the matched interface and boundary (MIB) method is developed to address elasticity interface problems. Linear elasticity theory for both isotropic homogeneous and inhomogeneous media is employed. In our approach, Lamé’s parameters can have jumps across the interface and are allowed to be position dependent in modeling isotropic inhomogeneous material. Both strong discontinuity, i.e., a discontinuous solution, and weak discontinuity, namely, discontinuous derivatives of the solution, are considered in the present study. In the proposed method, fictitious values are utilized so that standard central finite difference schemes can be employed regardless of the interface. Interface jump conditions are enforced on the interface, which, in turn, accurately determine the fictitious values. We design new MIB schemes to account for complex interface geometries. In particular, the cross derivatives in the elasticity equations are difficult to handle for complex interface geometries. We propose secondary fictitious values and construct geometry-based interpolation schemes to overcome this difficulty. Numerous analytical examples are used to validate the accuracy, convergence and robustness of the present MIB method for elasticity interface problems with both small and large curvatures, strong and weak discontinuities, and constant and variable coefficients. Numerical tests indicate second order accuracy in both L∞ and L2 norms. PMID:25914439

  2. RSRE: RNA structural robustness evaluator

    PubMed Central

    Shu, Wenjie; Zheng, Zhiqiang; Wang, Shengqi

    2007-01-01

    Biological robustness, defined as the ability to maintain stable functioning in the face of various perturbations, is an important and fundamental topic in current biology, and has become a focus of numerous studies in recent years. Although structural robustness has been explored in several types of RNA molecules, the origins of robustness are still controversial. Computational analysis results are needed to make up for the lack of evidence of robustness in natural biological systems. The RNA structural robustness evaluator (RSRE) web server presented here provides a freely available online tool to quantitatively evaluate the structural robustness of RNA based on the widely accepted definition of neutrality. Several classical structure comparison methods are employed; five randomization methods are implemented to generate control sequences; sub-optimal predicted structures can be optionally utilized to mitigate the uncertainty of secondary structure prediction. With a user-friendly interface, the web application is easy to use. Intuitive illustrations are provided along with the original computational results to facilitate analysis. The RSRE will be helpful in the wide exploration of RNA structural robustness and will catalyze our understanding of RNA evolution. The RSRE web server is freely available at http://biosrv1.bmi.ac.cn/RSRE/ or http://biotech.bmi.ac.cn/RSRE/. PMID:17567615

  3. Portable lensless wide-field microscopy imaging platform based on digital inline holography and multi-frame pixel super-resolution

    PubMed Central

    Sobieranski, Antonio C; Inci, Fatih; Tekin, H Cumhur; Yuksekkaya, Mehmet; Comunello, Eros; Cobra, Daniel; von Wangenheim, Aldo; Demirci, Utkan

    2017-01-01

    In this paper, an irregular displacement-based lensless wide-field microscopy imaging platform is presented, combining digital in-line holography and computational pixel super-resolution using multi-frame processing. The samples are illuminated by a nearly coherent illumination system, and the hologram shadows are projected onto a complementary metal-oxide semiconductor-based imaging sensor. To increase the resolution, a multi-frame pixel super-resolution approach is employed to produce a single holographic image from multiple frame observations of the scene with small planar displacements. Displacements are resolved by a hybrid approach: (i) alignment of the LR images by a fast feature-based registration method, and (ii) fine adjustment of the sub-pixel information using a continuous optimization approach designed to find the global optimum solution. A numerical phase-retrieval method is applied to decode the signal and reconstruct the morphological details of the analyzed sample. The presented approach was evaluated with various biological samples, including sperm and platelets, whose dimensions are on the order of a few microns. The obtained results demonstrate a spatial resolution of 1.55 µm over a field-of-view of ≈30 mm2. PMID:29657866
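
    As a rough illustration of the multi-frame pixel super-resolution combination step only (the paper's feature-based registration, continuous sub-pixel refinement and holographic phase retrieval are not reproduced here), a generic shift-and-add sketch with hypothetical names might look like:

```python
import numpy as np

def shift_and_add_super_resolution(frames, shifts, factor):
    """Very small shift-and-add pixel super-resolution sketch (illustrative only).

    frames : list of low-resolution 2-D arrays.
    shifts : per-frame (dy, dx) sub-pixel displacements in low-res pixels,
             assumed already estimated by a registration step.
    factor : integer super-resolution factor.
    """
    h, w = frames[0].shape
    accum = np.zeros((h * factor, w * factor))
    count = np.zeros_like(accum)
    for frame, (dy, dx) in zip(frames, shifts):
        # Map each low-res sample onto the nearest high-res grid position
        # (modulo simply wraps samples that fall outside the grid).
        ys = np.round((np.arange(h) + dy) * factor).astype(int) % (h * factor)
        xs = np.round((np.arange(w) + dx) * factor).astype(int) % (w * factor)
        accum[np.ix_(ys, xs)] += frame
        count[np.ix_(ys, xs)] += 1.0
    count[count == 0] = 1.0
    return accum / count
```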

  4. A nonrecursive order N preconditioned conjugate gradient: Range space formulation of MDOF dynamics

    NASA Technical Reports Server (NTRS)

    Kurdila, Andrew J.

    1990-01-01

    While excellent progress has been made in deriving algorithms that are efficient for certain combinations of system topologies and concurrent multiprocessing hardware, several issues must be resolved to incorporate transient simulation in the control design process for large space structures. Specifically, strategies must be developed that are applicable to systems with numerous degrees of freedom. In addition, the algorithms must have a growth potential in that they must also be amenable to implementation on forthcoming parallel system architectures. For mechanical system simulation, this fact implies that algorithms are required that induce parallelism on a fine scale, suitable for the emerging class of highly parallel processors; and transient simulation methods must be automatically load balancing for a wider collection of system topologies and hardware configurations. These problems are addressed by employing a combination range space/preconditioned conjugate gradient formulation of multi-degree-of-freedom dynamics. The method described has several advantages. In a sequential computing environment, the method has the features that: by employing regular ordering of the system connectivity graph, an extremely efficient preconditioner can be derived from the 'range space metric', as opposed to the system coefficient matrix; because of the effectiveness of the preconditioner, preliminary studies indicate that the method can achieve performance rates that depend linearly upon the number of substructures, hence the title 'Order N'; and the method is non-assembling. Furthermore, the approach is promising as a potential parallel processing algorithm in that the method exhibits a fine parallel granularity suitable for a wide collection of combinations of physical system topologies/computer architectures; and the method is easily load balanced among processors, and does not rely upon system topology to induce parallelism.

  5. SU-D-206-01: Employing a Novel Consensus Optimization Strategy to Achieve Iterative Cone Beam CT Reconstruction On a Multi-GPU Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, B; Southern Medical University, Guangzhou, Guangdong; Tian, Z

    Purpose: While compressed sensing-based cone-beam CT (CBCT) iterative reconstruction techniques have demonstrated a tremendous capability of reconstructing high-quality images from undersampled noisy data, their long computation time still hinders wide application in routine clinical use. The purpose of this study is to develop a reconstruction framework that employs modern consensus optimization techniques to achieve CBCT reconstruction on a multi-GPU platform for improved computational efficiency. Methods: Total projection data were evenly distributed to multiple GPUs. Each GPU performed reconstruction using its own projection data with a conventional total variation regularization approach to ensure image quality. In addition, the solutions from the GPUs were subject to a consistency constraint that they should be identical. We solved the optimization problem with all the constraints considered rigorously using an alternating direction method of multipliers (ADMM) algorithm. The reconstruction framework was implemented using OpenCL on a platform with two Nvidia GTX590 GPU cards, each with two GPUs. We studied the performance of our method and demonstrated its advantages through a simulation case with an NCAT phantom and an experimental case with a Catphan phantom. Results: Compared with the CBCT images reconstructed using the conventional FDK method with full projection datasets, our proposed method achieved comparable image quality with about one third of the projections. The computation time on the multi-GPU platform was ∼55 s and ∼35 s in the two cases respectively, achieving a speedup factor of ∼3.0 compared with single-GPU reconstruction. Conclusion: We have developed a consensus ADMM-based CBCT reconstruction method which enables performing reconstruction on a multi-GPU platform. The achieved efficiency makes this method clinically attractive.
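
    The consensus structure described in the Methods section follows the standard ADMM pattern of local update, averaging, and dual update. The toy below replaces the TV-regularized CBCT objective with a plain least-squares data term and uses NumPy arrays in place of GPUs, so it only illustrates that pattern; all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in: recover x_true from data split across 4 "GPUs".  A least-squares
# data term replaces the TV-regularized CBCT objective for brevity.
n, m, workers = 50, 30, 4
x_true = rng.standard_normal(n)
A = [rng.standard_normal((m, n)) for _ in range(workers)]
b = [Ai @ x_true + 0.01 * rng.standard_normal(m) for Ai in A]

rho = 1.0
x = [np.zeros(n) for _ in range(workers)]   # local solutions (one per GPU)
u = [np.zeros(n) for _ in range(workers)]   # scaled dual variables
z = np.zeros(n)                             # consensus variable

for _ in range(100):
    # Local update: each worker fits its own data plus the consensus penalty.
    for i in range(workers):
        x[i] = np.linalg.solve(A[i].T @ A[i] + rho * np.eye(n),
                               A[i].T @ b[i] + rho * (z - u[i]))
    # Consensus update: average the local solutions (plus duals).
    z = np.mean([x[i] + u[i] for i in range(workers)], axis=0)
    # Dual update: accumulate the disagreement with the consensus.
    for i in range(workers):
        u[i] += x[i] - z

print("relative error:", np.linalg.norm(z - x_true) / np.linalg.norm(x_true))
```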

  6. Structural reliability calculation method based on the dual neural network and direct integration method.

    PubMed

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty receives wide attention from engineers and scholars because it reflects structural characteristics and actual loading conditions. The direct integration method, which starts from the definition of reliability, is easy to understand, but the evaluation of the required multiple integrals remains mathematically difficult. Therefore, a dual neural network method is proposed in this paper for calculating these multiple integrals. The dual neural network consists of two neural networks: network A is used to learn the integrand function, and network B is used to simulate the original (integrated) function. According to the derivative relationship between the network output and the network input, network B is derived from network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean-value first-order second-moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.
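
    A one-dimensional toy version of the "network B derived from network A" idea is sketched below. The paper addresses multiple integrals and trains both networks; here the hidden parameters are fixed at random and only the output weights are fitted, so this is only a conceptual sketch with hypothetical names.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def softplus(t):
    # numerically stable log(1 + exp(t)); the antiderivative of the sigmoid
    return np.log1p(np.exp(-np.abs(t))) + np.maximum(t, 0.0)

# Toy task: evaluate the integral of f(x) = exp(-x**2) over [0, 2].
f = lambda x: np.exp(-x**2)
xs = np.linspace(0.0, 2.0, 200)

# Network A: one hidden sigmoid layer with fixed random hidden parameters;
# only the output weights w are fitted by least squares.
hidden = 40
a = rng.uniform(0.5, 4.0, hidden)          # hidden slopes (kept away from zero)
b = rng.uniform(-4.0, 4.0, hidden)         # hidden biases
Phi = sigmoid(np.outer(xs, a) + b)         # hidden outputs at the sample points
w, *_ = np.linalg.lstsq(Phi, f(xs), rcond=None)   # A(x) ≈ f(x)

# Network B follows analytically from A: each sigmoid unit integrates to a
# softplus unit, so B'(x) = A(x) by construction and B yields the integral.
def B(x):
    return np.sum((w / a) * softplus(a * x + b))

print(B(2.0) - B(0.0))   # ≈ 0.882, close to the exact value of the integral
```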

  7. Hydrophobic Interaction Chromatography for Bottom-Up Proteomics Analysis of Single Proteins and Protein Complexes.

    PubMed

    Rackiewicz, Michal; Große-Hovest, Ludger; Alpert, Andrew J; Zarei, Mostafa; Dengjel, Jörn

    2017-06-02

    Hydrophobic interaction chromatography (HIC) is a robust standard analytical method to purify proteins while preserving their biological activity. It is widely used to study post-translational modifications of proteins and drug-protein interactions. In the current manuscript we employed HIC to separate proteins, followed by bottom-up LC-MS/MS experiments. We used this approach to fractionate antibody species followed by comprehensive peptide mapping as well as to study protein complexes in human cells. HIC-reversed-phase chromatography (RPC)-mass spectrometry (MS) is a powerful alternative to fractionate proteins for bottom-up proteomics experiments making use of their distinct hydrophobic properties.

  8. A combined EPR and MD simulation study of a nitroxyl spin label with restricted internal mobility sensitive to protein dynamics.

    PubMed

    Oganesyan, Vasily S; Chami, Fatima; White, Gaye F; Thomson, Andrew J

    2017-01-01

    EPR studies combined with fully atomistic molecular dynamics (MD) simulations and an MD-EPR simulation method provide evidence for the intrinsically low rotameric mobility of a nitroxyl spin label, Rn, compared to the more widely employed label MTSL (R1). Both experimental and modelling results, using two structurally different sites of attachment to myoglobin, show that the EPR spectra of Rn are more sensitive to the local protein environment than those of MTSL. This study reveals the potential of using the Rn spin label as a reporter of protein motions. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  10. Near-field three-dimensional radar imaging techniques and applications.

    PubMed

    Sheen, David; McMakin, Douglas; Hall, Thomas

    2010-07-01

    Three-dimensional radio frequency imaging techniques have been developed for a variety of near-field applications, including radar cross-section imaging, concealed weapon detection, ground penetrating radar imaging, through-barrier imaging, and nondestructive evaluation. These methods employ active radar transceivers that operate at various frequency ranges covering a wide range, from less than 100 MHz to in excess of 350 GHz, with the frequency range customized for each application. Computational wavefront reconstruction imaging techniques have been developed that optimize the resolution and illumination quality of the images. In this paper, rectilinear and cylindrical three-dimensional imaging techniques are described along with several application results.

  11. Three-dimensional stress intensity factor analysis of a surface crack in a high-speed bearing

    NASA Technical Reports Server (NTRS)

    Ballarini, Roberto; Hsu, Yingchun

    1990-01-01

    The boundary element method is applied to calculate the stress intensity factors of a surface crack in the rotating inner raceway of a high-speed roller bearing. The three-dimensional model consists of an axially stressed surface cracked plate subjected to a moving Hertzian contact loading. A multidomain formulation and singular crack-tip elements were employed to calculate the stress intensity factors accurately and efficiently for a wide range of configuration parameters. The results can provide the basis for crack growth calculations and fatigue life predictions of high-performance rolling element bearings that are used in aircraft engines.

  12. Three dimensional nozzle-exhaust flow field analysis by a reference plane technique.

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Del Guidice, P. D.

    1972-01-01

    A numerical method based on reference plane characteristics has been developed for the calculation of highly complex supersonic nozzle-exhaust flow fields. The difference equations have been developed for three coordinate systems. Local reference plane orientations are employed using the three coordinate systems concurrently thus catering to a wide class of flow geometries. Discontinuities such as the underexpansion shock and contact surfaces are computed explicitly for nonuniform vehicle external flows. The nozzles considered may have irregular cross-sections with swept throats and may be stacked in modules using the vehicle undersurface for additional expansion. Results are presented for several nozzle configurations.

  13. Plasmonic slow light waveguide with hyperbolic metamaterials claddings

    NASA Astrophysics Data System (ADS)

    Liang, Shuhai; Jiang, Chuhao; Yang, Zhiqiang; Li, Dacheng; Zhang, Wending; Mei, Ting; Zhang, Dawei

    2018-06-01

    Plasmonic waveguides with an insulator core sandwiched between hyperbolic metamaterial (HMM) claddings, i.e., HIH waveguides, are investigated for achieving a wide slow-light band with an adjustable working wavelength. The transfer matrix method and finite-difference time-domain simulation are employed to study the waveguide dispersion characteristics and pulse propagation. By selecting proper silver filling ratios for the HMMs, the hetero-HIH waveguide presents a slow-light band with a zero group-velocity-dispersion wavelength of 1.55 μm and is capable of buffering pulses with pulse widths as short as ∼20 fs. This type of waveguide might be applicable for ultrafast slow-light applications.

  14. Solution of some types of differential equations: operational calculus and inverse differential operators.

    PubMed

    Zhukovsky, K

    2014-01-01

    We present a general method of operational nature to analyze and obtain solutions for a variety of equations of mathematical physics and related mathematical problems. We construct inverse differential operators and produce operational identities involving inverse derivatives and families of generalised orthogonal polynomials, such as the Hermite and Laguerre polynomial families. We develop the methodology of inverse and exponential operators, employing them for the study of partial differential equations. Advantages of the operational technique, combined with the use of integral transforms, generating functions with exponentials and their integrals, for solving a wide class of partial differential equations related to heat, wave, and transport problems, are demonstrated.

  15. Revisiting the diffusion mechanism of helium in UO2: A DFT+U study

    NASA Astrophysics Data System (ADS)

    Liu, X.-Y.; Andersson, D. A.

    2018-01-01

    The understanding of migration properties of helium atoms after their generation through α-decay of actinides in spent nuclear fuels is important for the safety of nuclear fuel storage and disposal. The diffusion of helium in UO2 is revisited by using the DFT+U simulation methodology employing the "U-ramping" method to address the issue of metastable energy states. A novel diffusion mechanism by helium interstitials, the "asymmetric hop" mechanism, is reported and compared to other diffusion mechanisms including an oxygen vacancy mediated mechanism and available experimental diffusion data. The new mechanism is shown to be the dominant one over a wide temperature range.

  16. Systematic reviews: Separating fact from fiction.

    PubMed

    Haddaway, Neal R; Bilotta, Gary S

    2016-01-01

    The volume of scientific literature continues to expand and decision-makers are faced with increasingly unmanageable volumes of evidence to assess. Systematic reviews (SRs) are powerful tools that aim to provide comprehensive, transparent, reproducible and updateable summaries of evidence. SR methods were developed, and have been employed, in healthcare for more than two decades, and they are now widely used across a broad range of topics, including environmental management and social interventions in crime and justice, education, international development, and social welfare. Despite these successes and the increasing acceptance of SR methods as a 'gold standard' in evidence-informed policy and practice, misconceptions still remain regarding their applicability. The aim of this article is to separate fact from fiction, addressing twelve common misconceptions that can influence the decision as to whether a SR is the most appropriate method for evidence synthesis for a given topic. Through examples, we illustrate the flexibility of SR methods and demonstrate their suitability for addressing issues on environmental health and chemical risk assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. On state-of-charge determination for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Li, Zhe; Huang, Jun; Liaw, Bor Yann; Zhang, Jianbo

    2017-04-01

    Accurate estimation of the state-of-charge (SOC) of a battery through its life remains challenging in battery research. Although improved precision continues to be reported, almost all estimates are based on empirical regression methods, while accuracy is often not properly addressed. Here, a comprehensive review is set out to address such issues, from the fundamental principles that are supposed to define SOC to methodologies for estimating SOC in practical use. It covers topics from calibration and regression (including modeling methods) to validation in terms of precision and accuracy. At the end, we intend to answer the following questions: 1) Can SOC estimation be self-adaptive without bias? 2) Why is Ah-counting a necessity in almost all battery-model-assisted regression methods? 3) How can a consistent framework of coupling in multi-physics battery models be established? 4) How should statistical methods be employed to analyze the factors that contribute to the uncertainty when assessing the accuracy of SOC estimation? We hope that, through this proper discussion of the principles, accurate SOC estimation can be widely achieved.
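
    Question 2 above concerns Ah-counting (Coulomb counting). A minimal sketch of that baseline, with hypothetical names, makes the drift problem concrete: any current-sensor bias or capacity error is integrated over time, which is why the review stresses calibration and uncertainty analysis.

```python
from dataclasses import dataclass

@dataclass
class CoulombCounter:
    """Minimal Ah-counting SOC estimator (illustrative; not from the paper)."""
    capacity_ah: float                    # reference capacity used for normalization
    soc: float                            # state of charge in [0, 1]
    coulombic_efficiency: float = 1.0     # applied on charge only

    def update(self, current_a: float, dt_s: float) -> float:
        # Positive current charges the cell; charge in ampere-seconds is
        # normalized by the capacity in ampere-hours.
        eta = self.coulombic_efficiency if current_a > 0 else 1.0
        self.soc += eta * current_a * dt_s / (3600.0 * self.capacity_ah)
        self.soc = min(max(self.soc, 0.0), 1.0)
        return self.soc

# Example: a 1 C discharge (negative current) for 30 minutes from a full 2 Ah cell.
counter = CoulombCounter(capacity_ah=2.0, soc=1.0)
for _ in range(1800):
    counter.update(current_a=-2.0, dt_s=1.0)
print(counter.soc)   # ≈ 0.5
```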

  18. Method for high-precision multi-layered thin film deposition for deep and extreme ultraviolet mirrors

    DOEpatents

    Ruffner, Judith Alison

    1999-01-01

    A method for coating (flat or non-flat) optical substrates with high-reflectivity multi-layer coatings for use at Deep Ultra-Violet ("DUV") and Extreme Ultra-Violet ("EUV") wavelengths. The method results in a product with minimum feature sizes of less than 0.10 µm for the shortest wavelength (13.4 nm). The present invention employs a computer-based modeling and deposition method to enable lateral and vertical thickness control by scanning the position of the substrate with respect to the sputter target during deposition. The thickness profile of the sputter targets is modeled before deposition and then an appropriate scanning algorithm is implemented to produce any desired, radially-symmetric thickness profile. The present invention offers the ability to predict and achieve a wide range of thickness profiles on flat or figured substrates, i.e., account for the 1/R² factor in a model, and the ability to predict and accommodate changes in deposition rate as a result of plasma geometry, i.e., over figured substrates.

  19. Dechlorination by ultraviolet radiation: a suitable alternative to activated carbon in dialysis water systems?

    PubMed

    James, Ray

    2009-12-01

    Chlorine-based products are widely used in the water supply industry, and the potential for adverse effects in the haemodialysis setting is well documented. To date, the most commonly used method of chlorine removal has been granular activated carbon filters. An increasingly popular method of dechlorination is the use of high intensity, broad-spectrum UV systems to reduce both free chlorine and combined chlorine compounds (chloramines) into easily removed by-products. UV radiation has been successfully used in the pharmaceutical and food industries to destroy free chlorine and/or chloramines present in water, and kill all known spoilage microorganisms including bacteria, viruses, yeasts and moulds (and their spores). This nonchemical method can offer significant advantages and benefits compared to conventional dechlorination technologies currently employed in dialysis water systems. Whilst UV treatment at 254 nm wavelength has been routinely used for disinfection purposes in dialysis water systems, this paper considers whether UV radiation can be used as an alternative to more traditional methods of chlorine removal.

  20. Analytical Methods for Biomass Characterization during Pretreatment and Bioconversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pu, Yunqiao; Meng, Xianzhi; Yoo, Chang Geun

    2016-01-01

    Lignocellulosic biomass has been introduced as a promising resource for alternative fuels and chemicals because of its abundance and its ability to complement petroleum resources. Biomass is a complex biopolymer, and its compositional and structural characteristics vary widely depending on its species as well as its growth environment. Because of the complexity and variety of biomass, understanding its physicochemical characteristics is key to effective biomass utilization. Characterization of biomass not only provides critical information about biomass during pretreatment and bioconversion, but also gives valuable insights on how to utilize the biomass. For a better understanding of biomass characteristics, a good grasp and proper selection of analytical methods are necessary. This chapter introduces existing analytical approaches that are widely employed for biomass characterization during biomass pretreatment and conversion processes. Diverse analytical methods using Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC), and nuclear magnetic resonance (NMR) spectroscopy for biomass characterization are reviewed. In addition, methods for assessing biomass accessibility by analyzing the surface properties of biomass are also summarized in this chapter.

  1. Coherent diffraction imaging of non-isolated object with apodized illumination.

    PubMed

    Khakurel, Krishna P; Kimura, Takashi; Joti, Yasumasa; Matsuyama, Satoshi; Yamauchi, Kazuto; Nishino, Yoshinori

    2015-11-02

    Coherent diffraction imaging (CDI) is an established lensless imaging method, widely used in the x-ray regime, that is applicable to the imaging of non-periodic materials. Conventional CDI can practically image only isolated objects, which hinders the broader application of the method. We present the imaging of non-isolated objects by employing the recently proposed "non-scanning" apodized-illumination CDI at an optical wavelength. We realized isolated apodized illumination with a specially designed optical configuration and succeeded in imaging phase objects as well as amplitude objects. The non-scanning nature of the method is important particularly in imaging live cells and tissues, where fast imaging of non-isolated objects is required, and is an advantage over ptychography. We believe that our result of phase-contrast imaging at an optical wavelength can be extended to the quantitative phase imaging of cells and tissues. The method also provides the feasibility of lensless single-shot imaging of extended objects with x-ray free-electron lasers.
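
    For readers unfamiliar with CDI reconstruction, the sketch below shows the classic error-reduction iteration (alternating Fourier-magnitude and support constraints). It is a generic textbook scheme, not the paper's apodized-illumination algorithm, and all names are hypothetical.

```python
import numpy as np

def error_reduction(measured_magnitude, support, n_iter=200, seed=0):
    """Minimal error-reduction phase retrieval sketch (conceptual illustration only)."""
    rng = np.random.default_rng(seed)
    phase = np.exp(1j * rng.uniform(0, 2 * np.pi, measured_magnitude.shape))
    field = np.fft.ifft2(measured_magnitude * phase)   # random starting guess
    for _ in range(n_iter):
        # Fourier-domain constraint: keep the phase, replace the magnitude.
        F = np.fft.fft2(field)
        F = measured_magnitude * np.exp(1j * np.angle(F))
        field = np.fft.ifft2(F)
        # Object-domain constraint: zero everything outside the known support.
        field = np.where(support, field, 0.0)
    return field
```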

  2. Detailed classification of swimming paths in the Morris Water Maze: multiple strategies within one trial

    PubMed Central

    Gehring, Tiago V.; Luksys, Gediminas; Sandi, Carmen; Vasilaki, Eleni

    2015-01-01

    The Morris Water Maze is a widely used task in studies of spatial learning with rodents. Classical performance measures of animals in the Morris Water Maze include the escape latency and the cumulative distance to the platform. Other methods focus on classifying trajectory patterns into stereotypical classes representing different animal strategies. However, these approaches typically consider trajectories as a whole, and as a consequence they assign one full trajectory to one class, whereas animals often switch between these strategies, and their corresponding classes, within a single trial. We therefore take a different approach: we look for segments of diverse animal behaviour within one trial and employ a semi-automated classification method for identifying the various strategies exhibited by the animals within a trial. Our method allows us to reveal significant and systematic differences in the exploration strategies of two animal groups (stressed, non-stressed) that would go unobserved by earlier methods. PMID:26423140

  3. Microwave Enhancement of Autocatalytic Growth of Nanometals.

    PubMed

    Ashley, Bridgett; Vakil, Parth N; Lynch, Brian B; Dyer, Christopher M; Tracy, Joseph B; Owens, Jeffery; Strouse, Geoffrey F

    2017-10-24

    The desire to design efficient synthetic methods that lead to industrially important nanomaterials has driven efforts to more fully understand the mechanism of growth and how modern synthetic techniques can be employed. Microwave (MW) synthesis is one such technique that has attracted attention as a green, sustainable method. The reports of enhanced formation rates and improved quality for MW-driven reactions are intriguing, but the lack of understanding of the reaction mechanism and of how coupling to the MW field leads to these observations is concerning. In this manuscript, the growth of metal nanoparticles (NPs) in a microwave cavity is spectroscopically analyzed and compared with the classical autocatalytic method of NP growth to elucidate the underpinnings of the observed enhanced growth behavior for metal NPs prepared in a MW field. The study illustrates that microwave synthesis of nickel and gold NPs below saturation conditions follows the Finke-Watzky mechanism of nucleation and growth. The enhancement of the reaction arises from the size-dependent increase in the MW absorption cross section of the metal NPs. For Ni, the presence of oxides is considered via theoretical computations and compared to dielectric measurements of isolated nickel NPs. The study definitively shows that MW growth can be modeled by an autocatalytic mechanism that directly leads to the enhanced rates and improved quality widely reported in the nanomaterial community when MW irradiation is employed.

  4. Determination of tylosins A, B, C and D in bee larvae by liquid chromatography coupled to ion trap-tandem mass spectrometry.

    PubMed

    Bernal, J; Martín, Ma T; Toribio, L; Martín-Hernández, R; Higes, M; Bernal, J L; Nozal, M J

    2011-06-01

    An LC-MS/MS method has been developed to simultaneously quantify tylosins A, B, C and D in bee larvae, compounds currently used to treat one of the most lethal diseases affecting honey bees around the world, American Foulbrood (AFB). The influence of different aqueous media, temperature and light exposure on the stability of these four compounds was studied. The analytes were extracted from bee larvae with methanol, and chromatographic separation was achieved on a Luna C(18) column (150 × 4.6 mm i.d.) using a ternary gradient composed of a dilute formic acid, methanol and acetonitrile mobile phase. To facilitate sampling, bee larvae were initially dried at 60 °C for 4 h and afterwards diluted to avoid pressure problems. MSD-Ion Trap detection was employed with electrospray ionization (ESI). The calibration curves were linear over a wide range of concentrations and the method was validated as sensitive, precise and accurate down to the limits of quantification (LOQ, 1.4-4.0 ng/g). The validated method was successfully employed to study bee larvae in field tests of bee hives treated with two formulations containing tylosin. In both cases it was evident that the minimal inhibitory concentration (MIC) had been reached. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Recent advances in integrated photonic sensors.

    PubMed

    Passaro, Vittorio M N; de Tullio, Corrado; Troia, Benedetto; La Notte, Mario; Giannoccaro, Giovanni; De Leonardis, Francesco

    2012-11-09

    Nowadays, optical devices and circuits are becoming fundamental components in several application fields such as medicine, biotechnology, automotive, aerospace, food quality control, chemistry, to name a few. In this context, we propose a complete review on integrated photonic sensors, with specific attention to materials, technologies, architectures and optical sensing principles. To this aim, sensing principles commonly used in optical detection are presented, focusing on sensor performance features such as sensitivity, selectivity and rangeability. Since photonic sensors provide substantial benefits regarding compatibility with CMOS technology and integration on chips characterized by micrometric footprints, design and optimization strategies of photonic devices are widely discussed for sensing applications. In addition, several numerical methods employed in photonic circuits and devices, simulations and design are presented, focusing on their advantages and drawbacks. Finally, recent developments in the field of photonic sensing are reviewed, considering advanced photonic sensor architectures based on linear and non-linear optical effects and to be employed in chemical/biochemical sensing, angular velocity and electric field detection.

  6. Detrimental Effect Elimination of Laser Frequency Instability in Brillouin Optical Time Domain Reflectometer by Using Self-Heterodyne Detection

    PubMed Central

    Li, Yongqian; Li, Xiaojuan; An, Qi; Zhang, Lixin

    2017-01-01

    A useful method for eliminating the detrimental effect of laser frequency instability on Brillouin signals by employing the self-heterodyne detection of Rayleigh and Brillouin scattering is presented. From the analysis of Brillouin scattering spectra from fibers of different lengths measured by heterodyne detection, the maximum usable pulse width immune to laser frequency instability is found to be about 4 µs in a self-heterodyne detection Brillouin optical time domain reflectometer (BOTDR) system using a broad-band laser with low frequency stability. Applying the self-heterodyne detection of Rayleigh and Brillouin scattering in the BOTDR system, we successfully demonstrate that the detrimental effect of laser frequency instability on Brillouin signals can be eliminated effectively. Employing the broad-band laser modulated by an electro-optic modulator driven with 130-ns-wide pulses, the observed maximum errors in the temperatures measured by the local heterodyne and self-heterodyne detection BOTDR systems are 7.9 °C and 1.2 °C, respectively. PMID:28335508

  7. Plasmonic Gold Nanorod Dispersions with Electrical and Optical Tunability

    NASA Astrophysics Data System (ADS)

    Grabowski, Christopher; Mahoney, Clare; Park, Kyoungweon; Jawaid, Ali; White, Timothy; Vaia, Richard

    The transmissive, absorptive, electrical, and thermal properties of plasmonic gold nanorods (NRs) have led to their employment in a broad range of applications. These electro-optical properties - governed by their size, shape, and composition - are widely and precisely tunable during synthesis. Gold NRs show promise for large scale optical elements as they have been demonstrated to align faster than liquid crystal films (μs) at low fields (1 V/ μm). Successfully dispersing a high volume fraction of gold NRs requires a strategy to control particle-particle separation and thus avoid aggregation. Herein, we discuss the role of theta temperature and the ability to swell or collapse the chains of polymer-grafted gold NRs to alter the interaction potential between particles. UV-Vis spectroscopy, scattering, and electrical susceptibility characterization methods were employed to determine nanoparticle dispersion along with the degree of gold NR alignment. The development of new agile photonic materials, controllable with both light and electric fields, will help address emerging needs in laser hardening (agile filters) and variable transmission visors.

  8. Recent Advances in Integrated Photonic Sensors

    PubMed Central

    Passaro, Vittorio M. N.; de Tullio, Corrado; Troia, Benedetto; La Notte, Mario; Giannoccaro, Giovanni; De Leonardis, Francesco

    2012-01-01

    Nowadays, optical devices and circuits are becoming fundamental components in several application fields such as medicine, biotechnology, automotive, aerospace, food quality control, chemistry, to name a few. In this context, we propose a complete review on integrated photonic sensors, with specific attention to materials, technologies, architectures and optical sensing principles. To this aim, sensing principles commonly used in optical detection are presented, focusing on sensor performance features such as sensitivity, selectivity and rangeability. Since photonic sensors provide substantial benefits regarding compatibility with CMOS technology and integration on chips characterized by micrometric footprints, design and optimization strategies of photonic devices are widely discussed for sensing applications. In addition, several numerical methods employed in photonic circuits and devices, simulations and design are presented, focusing on their advantages and drawbacks. Finally, recent developments in the field of photonic sensing are reviewed, considering advanced photonic sensor architectures based on linear and non-linear optical effects and to be employed in chemical/biochemical sensing, angular velocity and electric field detection. PMID:23202223

  9. Interaction of transient radiation in nongray gaseous systems

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Singh, D. J.

    1987-01-01

    A general formulation is presented to investigate the transient radiative interaction in nongray absorbing-emitting species between two parallel plates. Depending on the desired sophistication and accuracy, any nongray absorption model from line-by-line models to the wide band model correlations can be employed in the formulation to investigate the radiative interaction. Special attention is directed to investigate the radiative interaction in a system initially at a uniform reference temperature and suddenly the temperature of the bottom plate is reduced to a lower but constant temperature. The interaction is considered for the case of radiative equilibrium as well as for combined radiation and conduction. General as well as limiting forms of the governing equations are presented and solutions are obtained numerically by employing the method of variation of parameters. Specific results are obtained for CO, CO2, H2O, and OH. The information on species H2O and OH is of special interest for the proposed scramjet engine application. The results demonstrate the relative ability of different species for radiative interactions.

  10. DynaMIT: the dynamic motif integration toolkit

    PubMed Central

    Dassi, Erik; Quattrone, Alessandro

    2016-01-01

    De-novo motif search is a frequently applied bioinformatics procedure to identify and prioritize recurrent elements in sequences sets for biological investigation, such as the ones derived from high-throughput differential expression experiments. Several algorithms have been developed to perform motif search, employing widely different approaches and often giving divergent results. In order to maximize the power of these investigations and ultimately be able to draft solid biological hypotheses, there is the need for applying multiple tools on the same sequences and merge the obtained results. However, motif reporting formats and statistical evaluation methods currently make such an integration task difficult to perform and mostly restricted to specific scenarios. We thus introduce here the Dynamic Motif Integration Toolkit (DynaMIT), an extremely flexible platform allowing to identify motifs employing multiple algorithms, integrate them by means of a user-selected strategy and visualize results in several ways; furthermore, the platform is user-extendible in all its aspects. DynaMIT is freely available at http://cibioltg.bitbucket.org. PMID:26253738

  11. A New Local Debonding Model with Application to the Transverse Tensile and Creep Behavior of Continuously Reinforced Titanium Composites

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2000-01-01

    A new, widely applicable model for local interfacial debonding in composite materials is presented. Unlike its direct predecessors, the new model allows debonding to progress via unloading of interfacial stresses even as global loading of the composite continues. Previous debonding models employed for analysis of titanium matrix composites are surpassed by the accuracy, simplicity, and efficiency demonstrated by the new model. The new model was designed to operate seamlessly within NASA Glenn's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC), which was employed to simulate the time- and rate-dependent (viscoplastic) transverse tensile and creep behavior of SiC/Ti composites. MAC/GMC's ability to simulate the transverse behavior of titanium matrix composites has been significantly improved by the new debonding model. Further, results indicate the need for a more accurate constitutive representation of the titanium matrix behavior in order to enable predictions of the composite transverse response, without resorting to recalibration of the debonding model parameters.

  12. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    NASA Astrophysics Data System (ADS)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors; results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and costs more measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time saving. While overlay measurement error induced by mark profile asymmetry is reduced, this technology is expected to be as accurate and precise as scatterometry technologies.

  13. On the value of incorporating spatial statistics in large-scale geophysical inversions: the SABRe case

    NASA Astrophysics Data System (ADS)

    Kokkinaki, A.; Sleep, B. E.; Chambers, J. E.; Cirpka, O. A.; Nowak, W.

    2010-12-01

    Electrical Resistance Tomography (ERT) is a popular method for investigating subsurface heterogeneity. The method relies on measuring electrical potential differences and obtaining, through inverse modeling, the underlying electrical conductivity field, which can be related to hydraulic conductivities. The quality of site characterization strongly depends on the utilized inversion technique. Standard ERT inversion methods, though highly computationally efficient, do not consider spatial correlation of soil properties; as a result, they often underestimate the spatial variability observed in earth materials, thereby producing unrealistic subsurface models. Also, these methods do not quantify the uncertainty of the estimated properties, thus limiting their use in subsequent investigations. Geostatistical inverse methods can be used to overcome both these limitations; however, they are computationally expensive, which has hindered their wide use in practice. In this work, we compare a standard Gauss-Newton smoothness constrained least squares inversion method against the quasi-linear geostatistical approach using the three-dimensional ERT dataset of the SABRe (Source Area Bioremediation) project. The two methods are evaluated for their ability to: a) produce physically realistic electrical conductivity fields that agree with the wide range of data available for the SABRe site while being computationally efficient, and b) provide information on the spatial statistics of other parameters of interest, such as hydraulic conductivity. To explore the trade-off between inversion quality and computational efficiency, we also employ a 2.5-D forward model with corrections for boundary conditions and source singularities. The 2.5-D model accelerates the 3-D geostatistical inversion method. New adjoint equations are developed for the 2.5-D forward model for the efficient calculation of sensitivities. Our work shows that spatial statistics can be incorporated in large-scale ERT inversions to improve the inversion results without making them computationally prohibitive.

  14. On-line moisture determination of ore concentrates 'a review of traditional methods and introduction of a novel solution'.

    PubMed

    Cancilla, P A; Barrette, P; Rosenblum, F

    2002-12-01

    The manual gravimetric drying moisture determination methods currently employed by most mineral processing plants fail to provide the timely and accurate information required for automatic control. The costs associated with transporting and handling concentrates still represent a major portion of the overall treatment price. When considering the cash flow of a mining operation that is governed by both the smelter contract, with moisture penalties, and the quantity and quality of the concentrates shipped, an efficient method of on-line moisture determination would be a welcome tool. A novel on-line determination system for ore concentrate moisture content would replace the tedious manual procedure. Since the introduction of microelectronic-based control systems, operators have strived to reduce treatment costs to the minimum. Therefore, a representative and timely determination of on-line moisture content becomes vital for control set points and timely feedback. Reliable sensors have long been on the 'wish list' of mineral processors, since the problem has always been that you can only control what you can measure. Today, the task of moisture determination is still done by the classical technique of loss in weight utilizing uncontrolled procedures. These same methods were introduced in the earliest base metal concentrators. Generally, it is acceptable to have ore concentrate moisture content vary within a range of 7-9%, but controlling the moisture content below 8% is a difficult task with a manually controlled system. Many times, delays in manually achieving reliable feedback of the moisture content result in the moisture varying from 5-12% before corrective actions can be made. This paper first reviews the traditional and widely available methods for determining moisture content in granular materials by applying physical principles and properties to measure moisture content. All methods are in some form affected when employed on mineral ore concentrates. This paper then introduces and describes a novel on-line moisture sensor employed for mineral processing de-watering applications, which not only automates the tedious tasks but also results in reliable moisture feedback that can be used in the optimization of de-watering process equipment such as pressure or vacuum filters and fuel-fired driers. Finally, two measurement applications are presented which indicate the usefulness of, and summarize the measurement requirements for, the proposed method of employing drag force and mechanical properties of the material itself to determine the moisture content. Copyright 2002 Elsevier Science Ltd.

  15. In vivo time-harmonic multifrequency elastography of the human liver

    NASA Astrophysics Data System (ADS)

    Tzschätzsch, Heiko; Ipek-Ugay, Selcan; Guo, Jing; Streitberger, Kaspar-Josche; Gentz, Enno; Fischer, Thomas; Klaua, Robert; Schultz, Michael; Braun, Jürgen; Sack, Ingolf

    2014-04-01

    Elastography is capable of noninvasively detecting hepatic fibrosis by imposing mechanical stress and measuring the viscoelastic response in the liver. Magnetic resonance elastography (MRE) relies on time-harmonic vibrations, while most dynamic ultrasound elastography methods employ transient stimulation. This study attempts to benefit from the advantages of time-harmonic tissue stimulation, i.e. relative insensitivity to obesity and ascites and mechanical approachability of the entire liver, and the advantages of ultrasound, i.e. time efficiency, low costs, and wide availability, by introducing in vivo time-harmonic elastography (THE) of the human liver using ultrasound and a broad range of harmonic stimulation frequencies. THE employs continuous harmonic shear vibrations at 7 frequencies from 30 to 60 Hz in a single examination and determines the elasticity and the viscosity of the liver from the dispersion of the shear wave speed within the applied frequency range. The feasibility of the method is demonstrated in the livers of eight healthy volunteers and a patient with cirrhosis. Multifrequency MRE at the same drive frequencies was used as the elastographic reference method. Similar values of shear modulus and shear viscosity according to the Kelvin-Voigt model were obtained by MRE and THE, indicating that the new method is suitable for in vivo quantification of the shear viscoelastic properties of the liver, however in real time and at a fraction of the costs of MRE. In conclusion, THE may provide a useful tool for fast assessment of the viscoelastic properties of the liver at low cost and without limitations in obesity, ascites or hemochromatosis.
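
    For orientation, the Kelvin-Voigt model cited above links shear elasticity μ and shear viscosity η to the frequency-dependent shear wave speed through the complex shear modulus. A commonly used expression from standard viscoelastic wave theory (stated here for context, not quoted from the paper) is

```latex
G^{*}(\omega) = \mu + i\,\omega\eta, \qquad
c_{s}(\omega) = \sqrt{\frac{2\left(\mu^{2} + \omega^{2}\eta^{2}\right)}
{\rho\left(\mu + \sqrt{\mu^{2} + \omega^{2}\eta^{2}}\right)}},
```

    where ρ is the tissue density; fitting the measured c_s(ω) at the applied 30-60 Hz drive frequencies then yields μ and η.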

  16. Biological Fuel Cells and Membranes.

    PubMed

    Ghassemi, Zahra; Slaughter, Gymama

    2017-01-17

    Biofuel cells have been widely used to generate bioelectricity. Early biofuel cells employ a semi-permeable membrane to separate the anodic and cathodic compartments. The impact of different membrane materials and compositions has also been explored. Some membrane materials are employed strictly as membrane separators, while some have gained significant attention in the immobilization of enzymes or microorganisms within or behind the membrane at the electrode surface. The membrane material affects the transfer rate of the chemical species (e.g., fuel, oxygen molecules, and products) involved in the chemical reaction, which in turn has an impact on the performance of the biofuel cell. For enzymatic biofuel cells, Nafion, modified Nafion, and chitosan membranes have been used widely and continue to hold great promise in the long-term stability of enzymes and microorganisms encapsulated within them. This article provides a review of the most widely used membrane materials in the development of enzymatic and microbial biofuel cells.

  17. Biological Fuel Cells and Membranes

    PubMed Central

    Ghassemi, Zahra; Slaughter, Gymama

    2017-01-01

    Biofuel cells have been widely used to generate bioelectricity. Early biofuel cells employ a semi-permeable membrane to separate the anodic and cathodic compartments. The impact of different membrane materials and compositions has also been explored. Some membrane materials are employed strictly as membrane separators, while some have gained significant attention in the immobilization of enzymes or microorganisms within or behind the membrane at the electrode surface. The membrane material affects the transfer rate of the chemical species (e.g., fuel, oxygen molecules, and products) involved in the chemical reaction, which in turn has an impact on the performance of the biofuel cell. For enzymatic biofuel cells, Nafion, modified Nafion, and chitosan membranes have been used widely and continue to hold great promise in the long-term stability of enzymes and microorganisms encapsulated within them. This article provides a review of the most widely used membrane materials in the development of enzymatic and microbial biofuel cells. PMID:28106711

  18. An Improved Image Matching Method Based on Surf Algorithm

    NASA Astrophysics Data System (ADS)

    Chen, S. J.; Zheng, S. Z.; Xu, Z. G.; Guo, C. C.; Ma, X. L.

    2018-04-01

    Many state-of-the-art image matching methods based on feature matching have been widely studied in the remote sensing field. These feature matching methods achieve high operating efficiency but suffer from low accuracy and robustness. This paper proposes an improved image matching method based on the SURF algorithm. The proposed method introduces a color invariant transformation, information entropy theory and a series of constraint conditions to increase feature point detection and matching accuracy. First, the color invariant transformation model is introduced for the two matching images, aiming at obtaining more color information during the matching process, and information entropy theory is used to obtain the maximum amount of information from the two matching images. Then the SURF algorithm is applied to detect and describe points from the images. Finally, constraint conditions, including Delaunay triangulation construction, a similarity function and a projective invariant, are employed to eliminate mismatches and thereby improve matching precision. The proposed method has been validated on remote sensing images, and the results demonstrate its high precision and robustness.
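
    A stripped-down version of such a feature matching pipeline (detector, ratio test, geometric consistency) can be written with OpenCV. Note that this is only a generic stand-in: it uses ORB instead of SURF (SURF requires an opencv-contrib build), and a RANSAC homography replaces the paper's color-invariant transform, entropy selection and Delaunay/projective constraints.

```python
import cv2
import numpy as np

def match_images(path_a, path_b):
    """Feature matching sketch: detector + ratio test + RANSAC (illustrative only)."""
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=4000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des_a, des_b, k=2)
    # Lowe ratio test to discard ambiguous matches.
    good = [m for m, n in (p for p in knn if len(p) == 2)
            if m.distance < 0.75 * n.distance]

    src = np.float32([kp_a[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # Geometric consistency: keep only matches agreeing with a single homography.
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    inliers = [m for m, keep in zip(good, mask.ravel()) if keep]
    return H, inliers
```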

  19. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the wide variety of body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination changes and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on RGB-D superpixels to reduce the noise of the depth data. Since it is hard to discriminate the full 360° of orientation using static cues or motion cues alone, we propose a dynamic Bayesian network system (DBNS) to effectively exploit the complementary nature of both static and motion cues. To verify the proposed method, we built an RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method.

  20. Breast ultrasound computed tomography using waveform inversion with source encoding

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A.

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the speed-of-sound distribution by solving a stochastic optimization problem by use of a stochastic gradient descent algorithm. Computer-simulation studies are conducted to demonstrate the use of the WISE method. Using a single graphics processing unit card, each iteration can be completed within 25 seconds for a 128 × 128 mm² reconstruction region. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.
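
    The source-encoding idea can be illustrated with the toy Python sketch below: each iteration draws a random ±1 encoding vector, forms an encoded residual, and takes a stochastic-gradient step on the model. The linear `forward` operator is a stand-in for the acoustic wave solver, and the step size, array shapes and variable names are assumptions made purely for illustration rather than the WISE implementation.

        # Toy sketch of waveform inversion with source encoding (not the WISE code).
        import numpy as np

        rng = np.random.default_rng(0)
        n_params, n_sources, n_receivers = 64, 16, 32

        # Toy linear "wave solver": data_j = A @ (c * source_j); A is a random stand-in.
        A = rng.standard_normal((n_receivers, n_params))
        sources = rng.standard_normal((n_params, n_sources))

        def forward(c):
            """Placeholder forward model mapping a model c to per-source data."""
            return (A * c) @ sources            # shape (n_receivers, n_sources)

        c_true = 1.0 + 0.1 * rng.standard_normal(n_params)
        d_obs = forward(c_true)

        c = np.ones(n_params)                   # initial speed-of-sound-like model
        step = 1e-4                             # small fixed step for stability
        for it in range(500):
            w = rng.choice([-1.0, 1.0], size=n_sources)   # random encoding vector
            r = forward(c) @ w - d_obs @ w                # encoded residual
            grad = (A.T @ r) * (sources @ w)              # gradient of 0.5*||r||^2 w.r.t. c
            c -= step * grad                              # stochastic gradient step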

  1. Contractual obligations and the sharing of confidential health information in sport.

    PubMed

    Anderson, L

    2008-09-01

    As an employee, a sports doctor has obligations to their employer, but also the professional and widely accepted obligations of a doctor to the patient (in this case the individual team member). The conflict is evident when sports doctors are asked by an athlete to keep personal health information confidential from the coach and team management, and yet both doctor and athlete have employment contracts specifying that such information shall be shared. Recent research in New Zealand shows that, despite the presence of an employment contract, there appears to be a wide range of behaviours among sports doctors when an athlete requests that information about them be kept from team management. Many seem willing to honour requests to keep health information about the athlete confidential, thereby being in breach of the employment contract, while others insist on informing team management against the wishes of the athlete. There are a number of potential solutions to this dilemma, from forcing doctors to meet their contractual obligations to limiting the expectations of the employment contract. This paper suggests that at times it may be appropriate to do both, making the position of the doctor clearer and supporting the ability of this group to resist pressure from coaches and management through having a robust code of ethics.

  2. SU-E-I-07: An Improved Technique for Scatter Correction in PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, S; Wang, Y; Lue, K

    2014-06-01

    Purpose: In positron emission tomography (PET), the single scatter simulation (SSS) algorithm is widely used for scatter estimation in clinical scans. However, bias usually occurs at the essential step of scaling the computed SSS distribution to the real scatter amount by employing the scatter-only projection tail. The bias can be amplified when the scatter-only projection tail is too small, resulting in incorrect scatter correction. To this end, we propose a novel scatter calibration technique that accurately estimates the amount of scatter using a pre-determined scatter fraction (SF) function instead of the scatter-only tail information. Methods: As the SF depends on the radioactivity distribution and the attenuating material of the patient, an accurate theoretical relation cannot be devised. Instead, we constructed an empirical transformation function between SFs and average attenuation coefficients based on a series of phantom studies with different sizes and materials. From the average attenuation coefficient, the predicted SF was calculated using the empirical transformation function. Hence, the real scatter amount can be obtained by scaling the SSS distribution with the predicted SF. The simulation was conducted using SimSET, and the Siemens Biograph™ 6 PET scanner was modeled in this study. The Software for Tomographic Image Reconstruction (STIR) was employed to estimate the scatter and reconstruct images. The EEC phantom was adopted to evaluate the performance of our proposed technique. Results: The scatter-corrected image of our method demonstrated improved image contrast over that of SSS. For the images reconstructed with our technique and with SSS, the normalized standard deviations were 0.053 and 0.182, respectively, and the root mean squared errors were 11.852 and 13.767, respectively. Conclusion: We have proposed an alternative method to calibrate SSS (C-SSS) to the absolute scatter amounts using the SF. This method can avoid the bias caused by insufficient tail information and therefore improves the accuracy of scatter estimation.
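
    The scaling step can be pictured with the short Python sketch below, in which an empirical SF-versus-average-attenuation curve is fitted from phantom measurements and then used to rescale a computed SSS sinogram; the linear fit, the phantom numbers and all names are illustrative assumptions, not the calibration reported in the abstract.

        # Hedged sketch: scaling an SSS estimate with a predicted scatter fraction (SF).
        import numpy as np

        # Illustrative phantom calibration data: average linear attenuation
        # coefficient (1/cm) versus measured scatter fraction.
        mu_avg_phantoms = np.array([0.085, 0.090, 0.095, 0.100, 0.105])
        sf_phantoms     = np.array([0.28,  0.31,  0.34,  0.37,  0.40])

        # Empirical transformation between average attenuation and SF (here linear).
        coeffs = np.polyfit(mu_avg_phantoms, sf_phantoms, deg=1)

        def predicted_sf(mu_avg):
            return np.polyval(coeffs, mu_avg)

        def scale_sss(sss_sinogram, prompt_sinogram, mu_avg):
            """Rescale the SSS shape so scatter equals the predicted fraction of prompts."""
            target_scatter = predicted_sf(mu_avg) * prompt_sinogram.sum()
            return sss_sinogram * (target_scatter / sss_sinogram.sum())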

  3. Occupational health of home care aides: results of the safe home care survey

    PubMed Central

    Quinn, Margaret M; Markkanen, Pia K; Galligan, Catherine J; Sama, Susan R; Kriebel, David; Gore, Rebecca J; Brouillette, Natalie M; Okyere, Daniel; Sun, Chuan; Punnett, Laura; Laramie, Angela K; Davis, Letitia

    2016-01-01

    Objectives In countries with ageing populations, home care (HC) aides are among the fastest growing jobs. There are few quantitative studies of HC occupational safety and health (OSH) conditions. The objectives of this study were to: (1) assess quantitatively the OSH hazards and benefits for a wide range of HC working conditions, and (2) compare the OSH experiences of HC aides who are employed via different medical and social services systems in Massachusetts, USA. Methods HC aides were recruited for a survey via agencies that employ aides and schedule their visits with clients, and through a labour union of aides employed directly by clients or their families. The questionnaire included detailed questions about the most recent HC visits, as well as about individual aides’ OSH experiences. Results The study population included 1249 HC aides (634 agency-employed, 615 client-employed) contributing information on 3484 HC visits. The most frequently occurring hazards related to musculoskeletal strain, exposure to potentially infectious agents and to cleaning chemicals used for infection prevention, and experience of violence. Client-hired and agency-hired aides had similar OSH experiences with a few exceptions, including use of sharps and experience of verbal violence. Conclusions The OSH experience of HC aides is similar to that of aides in institutional healthcare settings. Despite the OSH challenges, HC aides enjoy caring for others, and the benefits of HC work should be enhanced. Quantification of HC hazards and benefits is useful to prioritise resources for the development of preventive interventions and to provide an evidence base for policy-setting. PMID:26209318

  4. Structured illumination for wide-field Raman imaging of cell membranes

    NASA Astrophysics Data System (ADS)

    Chen, Houkai; Wang, Siqi; Zhang, Yuquan; Yang, Yong; Fang, Hui; Zhu, Siwei; Yuan, Xiaocong

    2017-11-01

    Although the diffraction limit still restricts their lateral resolution, conventional wide-field Raman imaging techniques offer fast imaging speeds compared with scanning schemes. To extend the lateral resolution of filter-based wide-field Raman microscopy, a standing-wave illumination technique is used, and an improvement in lateral resolution by a factor of more than two is achieved. Specifically, functionalized surface-enhanced Raman scattering nanoparticles are employed to strengthen the desired scattering signals and to label cell membranes. This wide-field Raman imaging technique opens up significant opportunities for biological applications.

  5. 78 FR 37532 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... Annual Burden Hours: 212. Abstract: The Vocational Rehabilitation (VR) Program provides a wide range of... employment, who can benefit from VR services for employment, and who require VR services. If a State is... Services Administration (RSA) to State VR agencies receive funding from the basic Title I formula grant...

  6. Impacts of Social Economic Status on Higher Education Opportunity and Graduate Employment in China

    ERIC Educational Resources Information Center

    Wen, Dong-mao

    2006-01-01

    Based on a nation-wide survey of higher education graduates, this paper analyzes the impact of family background, using paternal occupation and education as indicators, on their scores in the National College Entrance Examination, the level and type of higher education institutions they attend, their employment after graduation, and the income…

  7. Job Search Strategies of Recent University Graduates in Poland: Plans and Effectiveness

    ERIC Educational Resources Information Center

    Piróg, Danuta

    2016-01-01

    The objective of this article was to highlight plans versus actual actions of university graduates in Poland aimed at finding employment. The paper also empirically verifies the impact of chosen job-seeking strategies on the success or failure of their transition to employment. The study was Polish-wide and included graduates of geography. It…

  8. Undergraduate Perceptions of Value: Degree Skills and Career Skills

    ERIC Educational Resources Information Center

    Galloway, Kyle W.

    2017-01-01

    Recent data suggests that of the UK students graduating with a degree in chemistry in 2015, only 18.9% continued to employment as "Science Professionals". While this shows the wide range of employment that is available for chemistry graduates, it also highlights the need for them to have relevant transferable skills, rather than just the…

  9. 76 FR 47241 - Training and Employment Guidance (TEGL) Letter No. 33-10: Special Procedures: Labor Certification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... an industry-wide standard exists among commercial beekeeping employers to transport honey bee... their honey bee colonies north in the summer and south in the winter, stopping as needed to pollinate crops in bloom. For both commercial beekeepers and farmers, the need to move bees from one State to...

  10. The Need for Competencies Due to the Increasing Use of Information and Communication Technologies. CEDEFOP Panorama.

    ERIC Educational Resources Information Center

    Boreham, N. C.; Lammont, Norma

    The widespread introduction of information and communication technologies (ICT) is affecting the skills needed in employment. Programs of vocational education and training (VET) should ensure people have the wide range of skills needed to be employable in these environments. ICT is spreading into more workplaces as a tool for reorganizing…

  11. Outlook by Denver Area Occupations. Occupations in Colorado, Part II.

    ERIC Educational Resources Information Center

    Colorado State Univ., Ft. Collins.

    Employment statistics for 1960, estimated employment for 1965 and 1970, estimates of additional workers needed by 1970, and salary information are provided for a wide range of occupations in the Denver area. Data were obtained from a Denver study, "Jobs and the Future," by Robert Vaughan of the Mountain States Telephone Co., 1962, and…

  12. Pension Plans at Risk: A Potential Hazard of Deficit Reduction and Tax Reform.

    ERIC Educational Resources Information Center

    Logue, Dennis E.

    The most widely used pension plans in the United States are defined-benefit plans under which employers pay workers a fixed pension, usually a percentage of their final salaries. Defined-contribution pension plans, under which employers and employees set aside funds that are invested for the employees, are growing in popularity and are…

  13. Six Degrees of Separation and Employment: Disability Services Reconsidered

    ERIC Educational Resources Information Center

    Stensrud, Robert; Sover-Wright, Ehren; Gilbride, Dennis

    2009-01-01

    If six degrees of separation is all that is needed for anyone to find anyone else, and if three clicks can find almost anything on the World Wide Web, perhaps analyzing how social networks form and connect people to each other offers a useful way to reconsider how the rehabilitation profession pursues job placement and employer development. Social…

  14. Undergraduate Work Placements: An Analysis of the Effects on Career Progression

    ERIC Educational Resources Information Center

    Brooks, Ruth; Youngson, Paul L.

    2016-01-01

    Combining work experience with degree-level study is seen as a key differentiator for securing employment upon graduation in a competitive employment market. The positive benefits of sandwich courses, where up to 12 months is spent working in industry, are widely acknowledged in academic literature though data analysis tends to focus on cohorts in…

  15. Labor's Key Role in Workplace Training.

    ERIC Educational Resources Information Center

    Roberts, Markley; Wozniak, Robert

    AFL-CIO unions representing a wide range of workers in virtually every sector of the economy have teamed with employers to develop and sustain successful programs resulting in better trained, more productive workers. Joint training and education programs come in various forms and offer a wide range of services depending on the industry and worker…

  16. Wake Forest University: Building a Campus-Wide Mentoring Culture

    ERIC Educational Resources Information Center

    McWilliams, Allison E.

    2017-01-01

    This article describes recent efforts by Wake Forest University to develop a campus-wide mentoring culture to support holistic student development, to assist with the critical transition from high school to college to life after college, and to develop skills and practices that will be valued by employers and graduate schools. The article…

  17. Comparison of genome-wide selection strategies to identify furfural tolerance genes in Escherichia coli.

    PubMed

    Glebes, Tirzah Y; Sandoval, Nicholas R; Gillis, Jacob H; Gill, Ryan T

    2015-01-01

    Engineering both feedstock and product tolerance is important for transitioning towards next-generation biofuels derived from renewable sources. Tolerance to chemical inhibitors typically results in complex phenotypes, for which multiple genetic changes must often be made to confer tolerance. Here, we performed a genome-wide search for furfural-tolerant alleles using the TRackable Multiplex Recombineering (TRMR) method (Warner et al. (2010), Nature Biotechnology), which uses chromosomally integrated mutations directed towards increased or decreased expression of virtually every gene in Escherichia coli. We employed various growth selection strategies to assess the role of selection design towards growth enrichments. We also compared genes with increased fitness from our TRMR selection to those from a previously reported genome-wide identification study of furfural tolerance genes using a plasmid-based genomic library approach (Glebes et al. (2014) PLOS ONE). In several cases, growth improvements were observed for the chromosomally integrated promoter/RBS mutations but not for the plasmid-based overexpression constructs. Through this assessment, four novel tolerance genes, ahpC, yhjH, rna, and dicA, were identified and confirmed for their effect on improving growth in the presence of furfural. © 2014 Wiley Periodicals, Inc.

  18. Availability and assessment of fixing additives for the in situ remediation of heavy metal contaminated soils: a review.

    PubMed

    Guo, Guanlin; Zhou, Qixing; Ma, Lene Q

    2006-05-01

    The use of low-cost and environmentally safe amendments for the in situ immobilization of heavy metals has been investigated as a promising method for contaminated soil remediation. Natural materials and waste products from certain industries with a high capacity to bind heavy metals can be obtained and employed. The feasibility of various amendments for immobilization can be evaluated and demonstrated through the reduction of extractable metal concentrations and phytotoxicity. In this review, an extensive list of references has been compiled to provide a summary of information on a wide range of potential amendment resources, including organic, inorganic and combined organic-inorganic materials. An assessment based on economic efficiency and environmental risks highlights the potential applications and future development directions of this method for remediating contaminated soils.

  19. Poly(vinyl alcohol) membranes for reverse osmosis

    NASA Technical Reports Server (NTRS)

    Katz, M. G.; Wydeven, T., Jr.

    1981-01-01

    A description is presented of the results of studies of the water and salt transport properties of PVA membranes, taking into account radiation-crosslinked PVA membranes, diffusive salt permeability through PVA membranes, and heat-treated PVA membranes. The experimental findings support the occurrence of independent water and salt permeation processes. It is suggested that the salt permeation is governed by a solution-diffusion transport mechanism. The preparation of thin-skinned, asymmetric PVA membranes is also discussed. The method employed has a certain similarity to the classical phase inversion method, which is widely applied in the casting of asymmetric reverse osmosis membranes. Instead of using a gelling bath composed of a nonsolvent for the membrane material and miscible with the solvent from which the membrane is cast, a 'complexing' bath is used, which is a solution of a complexing agent in water.

  20. Benzyl Alcohol-Mediated Versatile Method to Fabricate Nonstoichiometric Metal Oxide Nanostructures.

    PubMed

    Qamar, Mohammad; Adam, Alaaldin; Azad, Abdul-Majeed; Kim, Yong-Wah

    2017-11-22

    Nanostructured metal oxides with cationic or anionic deficiency find applications in a wide range of technological areas, including the energy sector and the environment. However, a facile route to prepare such materials in bulk with acceptable reproducibility is still lacking; many synthesis techniques remain bench-top and cannot be easily scaled up. Here, we report that the benzyl alcohol (BA)-mediated method is capable of producing a host of nanostructured metal oxides (MOx, where M = Ti, Zn, Ce, Sn, In, Ga, or Fe) with inherent nonstoichiometry. It employs multifunctional BA as a solvent, a reducing agent, and a structure-directing agent. Depending on the oxidation state of the metal, elemental or nonstoichiometric oxide forms are obtained. Augmented photoelectrochemical oxidation of water under visible light by some of these nonstoichiometric oxides highlights the versatility of the BA-mediated synthesis protocol.

  1. Transonic Flow Field Analysis for Wing-Fuselage Configurations

    NASA Technical Reports Server (NTRS)

    Boppe, C. W.

    1980-01-01

    A computational method for simulating the aerodynamics of wing-fuselage configurations at transonic speeds is developed. The finite difference scheme is characterized by a multiple embedded mesh system coupled with a modified or extended small disturbance flow equation. This approach permits a high degree of computational resolution in addition to coordinate system flexibility for treating complex realistic aircraft shapes. To augment the analysis method and permit applications to a wide range of practical engineering design problems, an arbitrary fuselage geometry modeling system is incorporated as well as methodology for computing wing viscous effects. Configuration drag is broken down into its friction, wave, and lift induced components. Typical computed results for isolated bodies, isolated wings, and wing-body combinations are presented. The results are correlated with experimental data. A computer code which employs this methodology is described.

  2. Analysis of 2-ethylhexyl-p-methoxycinnamate in sunscreen products by HPLC and Raman spectroscopy.

    PubMed

    Cheng, J; Li, Y S; L Roberts, R; Walker, G

    1997-10-01

    The analyses of 2-ethylhexyl-p-methoxycinnamate (EHMC) using HPLC and Raman spectroscopy have been undertaken and compared. EHMC, which is one of the most widely used sunscreen agents in suncare products in the US, exhibits a strong Raman signal. This signal clearly appears both in ethanol solutions of EHMC and in commercial sunscreen lotions containing this sunscreen agent. A method for the direct detection and analysis of EHMC has been developed using Raman spectroscopy. This was accomplished by correlating the Raman intensities with the HPLC assays for a series of prototype suncare formulations. Based upon this information, it would be possible to employ Raman spectroscopy as an in-process control method in the commercial production of suncare products containing EHMC. The possibility of applying surface-enhanced Raman scattering for trace analysis is also discussed.
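
    The correlation step lends itself to a simple calibration; the Python sketch below fits a linear relation between Raman peak intensity and HPLC-assayed EHMC content and predicts the concentration of an in-process sample, with all numerical values invented for the example rather than taken from the paper.

        # Hedged sketch: linear calibration of Raman intensity against HPLC assay values.
        import numpy as np

        # Illustrative calibration set: HPLC-assayed EHMC content (% w/w) and the
        # corresponding Raman peak intensities (arbitrary units).
        hplc_conc = np.array([2.0, 4.0, 6.0, 7.5, 10.0])
        raman_intensity = np.array([1850.0, 3720.0, 5590.0, 6980.0, 9300.0])

        slope, intercept = np.polyfit(raman_intensity, hplc_conc, deg=1)

        def ehmc_from_raman(intensity):
            """Predict EHMC concentration (% w/w) from a measured Raman intensity."""
            return slope * intensity + intercept

        print(ehmc_from_raman(5000.0))   # e.g. an in-process control sample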

  3. Power System Reliability Assessment by Analysing Voltage Dips on the Blue Horizon Bay 22KV Overhead Line in the Nelson Mandela Bay Municipality

    NASA Astrophysics Data System (ADS)

    Lamour, B. G.; Harris, R. T.; Roberts, A. G.

    2010-06-01

    Power system reliability problems are very difficult to solve because power systems are complex, geographically widely distributed and influenced by numerous unexpected events. It is therefore imperative to employ the most efficient optimization methods in solving problems relating to the reliability of the power system. This paper presents a reliability analysis and study of the power interruptions resulting from severe power outages in the Nelson Mandela Bay Municipality (NMBM), South Africa, and includes an overview of the important factors influencing reliability and methods to improve it. The Blue Horizon Bay 22 kV overhead line, supplying a 6.6 kV residential sector, has been selected. It has been established that 70% of the outages recorded at the source originate on this feeder.

  4. Driving towards ecotechnologies.

    PubMed

    Najjar, Devora A; Normandin, Avery M; Strait, Elizabeth A; Esvelt, Kevin M

    2017-12-01

    The prospect of using genetic methods to target vector, parasite, and reservoir species offers tremendous potential benefits to public health, but the use of genome editing to alter the shared environment will require special attention to public perception and community governance in order to benefit the world. Public skepticism combined with the media scrutiny of gene drive systems could easily derail unpopular projects entirely, especially given the potential for trade barriers to be raised against countries that employ self-propagating gene drives. Hence, open and community-guided development of thoughtfully chosen applications is not only the most ethical approach, but also the most likely to overcome the economic, social, and diplomatic barriers. Here we review current and past attempts to alter ecosystems using biological methods, identify key determinants of social acceptance, and chart a stepwise path for developers towards safe and widely supported use.

  5. Effectiveness of Winkler Litter Extraction and Pitfall Traps in Sampling Ant Communities and Functional Groups in a Temperate Forest.

    PubMed

    Mahon, Michael B; Campbell, Kaitlin U; Crist, Thomas O

    2017-06-01

    Selection of proper sampling methods for measuring a community of interest is essential whether the study goals are to conduct a species inventory, environmental monitoring, or a manipulative experiment. Insect diversity studies often employ multiple collection methods at the expense of researcher time and funding. Ants (Formicidae) are widely used in environmental monitoring owing to their sensitivity to ecosystem changes. When sampling ant communities, two passive techniques are recommended in combination: pitfall traps and Winkler litter extraction. These recommendations are often based on studies from highly diverse tropical regions or when a species inventory is the goal. Studies in temperate regions often focus on measuring consistent community response along gradients of disturbance or among management regimes; therefore, multiple sampling methods may be unnecessary. We compared the effectiveness of pitfalls and Winkler litter extraction in an eastern temperate forest for measuring ant species richness, composition, and occurrence of ant functional groups in response to experimental manipulations of two key forest ecosystem drivers, white-tailed deer and an invasive shrub (Amur honeysuckle). We found no significant effect of sampling method on the outcome of the ecological experiment; however, we found differences between the two sampling methods in the resulting ant species richness and functional group occurrence. Litter samples approximated the overall combined species richness and composition, but pitfalls were better at sampling large-bodied (Camponotus) species. We conclude that employing both methods is essential only for species inventories or monitoring ants in the Cold-climate Specialists functional group. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Sodium hydroxide permethylation of heparin disaccharides.

    PubMed

    Heiss, Christian; Wang, Zhirui; Azadi, Parastoo

    2011-03-30

    Permethylation is a valuable and widely used tool for the mass spectrometry of carbohydrates, improving sensitivity and fragmentation and increasing the amount of information that can be obtained from tandem mass spectrometric experiments. Permethylation of most glycans is easily performed with sodium hydroxide and iodomethane in dimethyl sulfoxide (DMSO). However, permethylation has not been widely used in the mass spectrometry of glycosaminoglycan (GAG) oligosaccharides, partly because it has required the use of the difficult Hakomori method employing the methylsulfinylmethanide ('dimsyl') base, which has to be made in a tedious process. Additionally, the Hakomori method is not as effective as the sodium hydroxide method in making fully methylated derivatives. A further problem in the permethylation of highly sulfated oligosaccharides is their limited solubility in DMSO. This paper describes the use of the triethylammonium counterion to overcome this problem, as well as the application of the sodium hydroxide method to make permethylated heparin disaccharides and their workup to yield fully methylated disaccharides for electrospray ionization mass spectrometry. The ease, speed, and effectiveness of the described methodology should open up permethylation of GAG oligosaccharides to a wider circle of mass spectrometrists and enable them to develop further derivatization schemes in the effort to rapidly elucidate the structure of these important molecules. Permethylation may also provide new ways of separating GAG oligosaccharides in LC/MS, their increased hydrophobicity making them amenable for reversed-phase chromatography without the need for ion pairing reagents. Copyright © 2011 John Wiley & Sons, Ltd.

  7. Combining Structural Modeling with Ensemble Machine Learning to Accurately Predict Protein Fold Stability and Binding Affinity Effects upon Mutation

    PubMed Central

    Garcia Lopez, Sebastian; Kim, Philip M.

    2014-01-01

    Advances in sequencing have led to a rapid accumulation of mutations, some of which are associated with diseases. However, to draw mechanistic conclusions, a biochemical understanding of these mutations is necessary. For coding mutations, accurate prediction of significant changes in either the stability of proteins or their affinity to their binding partners is required. Traditional methods have used semi-empirical force fields, while newer methods employ machine learning of sequence and structural features. Here, we show how combining both of these approaches leads to a marked boost in accuracy. We introduce ELASPIC, a novel ensemble machine learning approach that is able to predict stability effects upon mutation in both domain cores and domain-domain interfaces. We combine semi-empirical energy terms, sequence conservation, and a wide variety of molecular details with a Stochastic Gradient Boosting of Decision Trees (SGB-DT) algorithm. The accuracy of our predictions surpasses existing methods by a considerable margin, achieving correlation coefficients of 0.77 for stability, and 0.75 for affinity predictions. Notably, we integrated homology modeling to enable proteome-wide prediction and show that accurate prediction on modeled structures is possible. Lastly, ELASPIC showed significant differences between various types of disease-associated mutations, as well as between disease and common neutral mutations. Unlike pure sequence-based prediction methods that try to predict phenotypic effects of mutations, our predictions unravel the molecular details governing the protein instability, and help us better understand the molecular causes of diseases. PMID:25243403
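
    The boosting component can be pictured with the generic scikit-learn sketch below, which trains a gradient-boosted tree regressor on a feature matrix standing in for energy terms, conservation scores and structural details; the random data and the specific estimator settings are assumptions for illustration only and do not reproduce ELASPIC's actual features or pipeline.

        # Hedged sketch: stochastic gradient boosting of decision trees on mixed features.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_mutations, n_features = 500, 12     # placeholder feature count
        X = rng.standard_normal((n_mutations, n_features))
        ddG = X @ rng.standard_normal(n_features) + 0.3 * rng.standard_normal(n_mutations)

        X_tr, X_te, y_tr, y_te = train_test_split(X, ddG, test_size=0.2, random_state=0)

        model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                          max_depth=3, subsample=0.8, random_state=0)
        model.fit(X_tr, y_tr)
        print("held-out correlation:", np.corrcoef(model.predict(X_te), y_te)[0, 1])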

  8. Multiple-energy Techniques in Industrial Computerized Tomography

    DOE R&D Accomplishments Database

    Schneberk, D.; Martz, H.; Azevedo, S.

    1990-08-01

    Considerable effort is being applied to develop multiple-energy industrial CT techniques for materials characterization. Multiple-energy CT can provide reliable estimates of effective Z (Z_eff), weight fraction, and rigorous calculations of absolute density, all at the spatial resolution of the scanner. Currently, a wide variety of techniques exist for CT scanners, but each has certain problems and limitations. Ultimately, the best multi-energy CT technique would combine the qualities of accuracy, reliability, and wide range of application, and would require the smallest number of additional measurements. We have developed techniques for calculating material properties of industrial objects that differ somewhat from currently used methods. In this paper, we present our methods for calculating Z_eff, weight fraction, and density. We begin with the simplest case -- methods for multiple-energy CT using isotopic sources -- and proceed to multiple-energy work with x-ray machine sources. The methods discussed here are illustrated on CT scans of PBX-9502 high explosives, a lexan-aluminum phantom, and a cylinder of glass beads used in a preliminary study to determine if CT can resolve three phases: air, water, and a high-Z oil. In the CT project at LLNL, we have constructed several CT scanners of varying scanning geometries using gamma- and x-ray sources. In our research, we employed two of these scanners: pencil-beam CAT for CT data using isotopic sources and video-CAT equipped with an IRT micro-focal x-ray machine source.
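
    For orientation, one textbook way to express an effective atomic number (not necessarily the relation used at LLNL) is the electron-fraction-weighted power law Z_eff = (sum_i a_i Z_i^m)^(1/m) with m close to 2.94; the short Python sketch below applies it to water as a sanity check, and the exponent and function names are assumptions made for illustration.

        # Hedged sketch: classical power-law estimate of effective atomic number (Z_eff).
        def z_eff(weight_fractions, Z, A, m=2.94):
            """weight_fractions, Z, A: per-element lists; m: empirical exponent."""
            # Electron fraction contributed by each element: w_i * Z_i / A_i, normalized.
            electrons = [w * z / a for w, z, a in zip(weight_fractions, Z, A)]
            total = sum(electrons)
            alphas = [e / total for e in electrons]
            return sum(a * z ** m for a, z in zip(alphas, Z)) ** (1.0 / m)

        # Water (H2O): about 11.2% H and 88.8% O by weight; expected Z_eff is roughly 7.4.
        print(z_eff([0.112, 0.888], Z=[1, 8], A=[1.008, 15.999]))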

  9. Visual homing with a pan-tilt based stereo camera

    NASA Astrophysics Data System (ADS)

    Nirmal, Paramesh; Lyons, Damian M.

    2013-01-01

    Visual homing is a navigation method based on comparing a stored image of the goal location and the current image (current view) to determine how to navigate to the goal location. It is theorized that insects, such as ants and bees, employ visual homing methods to return to their nests. Visual homing has been applied to autonomous robot platforms using two main approaches: holistic and feature-based. Both methods aim at determining the distance and direction to the goal location. Navigational algorithms using Scale Invariant Feature Transforms (SIFT) have gained great popularity in recent years due to the robustness of the feature operator. Churchill and Vardy have developed a visual homing method using scale change information from SIFT (Homing in Scale Space, HiSS). HiSS uses SIFT feature scale change information to determine the distance between the robot and the goal location. Since the scale component is discrete with a small range of values, the result is a rough measurement with limited accuracy. We have developed a method that uses stereo data, resulting in better homing performance. Our approach utilizes a pan-tilt based stereo camera, which is used to build composite wide-field images. We use the wide-field images combined with stereo data obtained from the stereo camera to extend the keypoint vector to include a new parameter, depth (z). Using this information, our algorithm determines the distance and orientation from the robot to the goal location. We compare our method with HiSS in a set of indoor trials using a Pioneer 3-AT robot equipped with a BumbleBee2 stereo camera. We evaluate the performance of both methods using a set of performance measures described in this paper.
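
    A greatly simplified Python sketch of depth-augmented homing is given below: keypoints matched between the goal snapshot and the current view are back-projected with per-pixel depth, and the average 3D offset gives a crude distance and bearing to the goal. The camera intrinsics, the ORB substitution for SIFT and the simple averaging rule are all assumptions for illustration and do not reproduce the algorithm of the paper.

        # Hedged sketch: depth-augmented visual homing from matched keypoints.
        import cv2
        import numpy as np

        FX = FY = 500.0        # assumed focal lengths (pixels)
        CX, CY = 320.0, 240.0  # assumed principal point

        def backproject(u, v, z):
            """Pixel (u, v) with depth z (metres) -> 3D point in the camera frame."""
            return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])

        def homing_vector(img_goal, depth_goal, img_cur, depth_cur):
            orb = cv2.ORB_create(1000)
            kp_g, des_g = orb.detectAndCompute(img_goal, None)
            kp_c, des_c = orb.detectAndCompute(img_cur, None)
            matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_g, des_c)

            offsets = []
            for m in matches:
                ug, vg = kp_g[m.queryIdx].pt
                uc, vc = kp_c[m.trainIdx].pt
                zg = depth_goal[int(vg), int(ug)]
                zc = depth_cur[int(vc), int(uc)]
                if zg > 0 and zc > 0:
                    offsets.append(backproject(ug, vg, zg) - backproject(uc, vc, zc))

            mean = np.mean(offsets, axis=0)                 # crude goal-relative displacement
            distance = float(np.linalg.norm(mean))
            heading = float(np.arctan2(mean[0], mean[2]))   # bearing in the horizontal plane
            return distance, heading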

  10. Use of Ultrasound Elastography in the Assessment of the Musculoskeletal System.

    PubMed

    Paluch, Łukasz; Nawrocka-Laskus, Ewa; Wieczorek, Janusz; Mruk, Bartosz; Frel, Małgorzata; Walecki, Jerzy

    2016-01-01

    This article presents possible applications of ultrasound elastography in musculoskeletal imaging based on the available literature, as well as the possibility of extending the indications for elastography in the future. Ultrasound elastography (EUS) is a new method that shows structural changes in tissues following the application of physical stress. Elastography techniques have been widely used to assess muscles and tendons in vitro since the early parts of the twentieth century. Only recently, with the advent of new technology and the creation of highly specialized ultrasound devices, has elastography gained widespread use in numerous applications. The authors performed a search of the Medline/PubMed databases for original research and review publications on the application of ultrasound elastography for musculoskeletal imaging. All of these publications demonstrate possible uses of ultrasound elastography in examinations of the musculoskeletal system. The most widely studied areas include the muscles, tendons and rheumatic diseases. There are also reports on its employment in vessel imaging. The main limitation of elastography as a technique is, above all, the variability of the pressure applied during imaging, which is operator-dependent. It would therefore be reasonable to provide clear guidelines on the technique applied, as well as clear indications for performing the test. It is also important to develop methods for creating artifact-free, closed-loop compression-decompression cycles. The main advantages include cost-effectiveness, the short duration of the study, the non-invasive nature of the procedure, and potentially broader clinical availability. There are as yet no clear guidelines with regard to indications or examination technique. Ultrasound elastography is a new and still poorly researched method; we conclude, however, that it can be widely used in examinations of the musculoskeletal system. It is therefore necessary to conduct large, multi-center studies to determine the methodology, indications and technique of examination.

  11. An accurate online calibration system based on combined clamp-shape coil for high voltage electronic current transformers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhen-hua; Li, Hong-bin; Zhang, Zhi

    Electronic transformers are widely used in power systems because of their wide bandwidth and good transient performance. However, as an emerging technology, the failure rate of electronic transformers is higher than that of traditional transformers, so the calibration period needs to be shortened. Traditional calibration methods require that the power of the transmission line be cut off, which results in complicated operation and losses from the power outage. This paper proposes an online calibration system that can calibrate electronic current transformers without a power outage. In this work, the high-accuracy standard current transformer and the online operation method are the key techniques. Based on the clamp-shape iron-core coil and the clamp-shape air-core coil, a combined clamp-shape coil is designed as the standard current transformer. By analyzing the output characteristics of the two coils, the combined clamp-shape coil can achieve verification of the accuracy, so the accuracy of the online calibration system can be guaranteed. Moreover, by employing the earth-potential working method and using two insulating rods to connect the combined clamp-shape coil to the high-voltage bus, the operation becomes simple and safe. Tests at the China National Center for High Voltage Measurement and field experiments show that the proposed system has a high accuracy of up to the 0.05 class.

  12. Prism-coupled Cherenkov phase-matched terahertz wave generation using a DAST crystal.

    PubMed

    Suizu, Koji; Shibuya, Takayuki; Uchida, Hirohisa; Kawase, Kodo

    2010-02-15

    Terahertz (THz) wave generation based on nonlinear frequency conversion is a promising method for realizing a tunable monochromatic high-power THz-wave source. Unfortunately, many nonlinear crystals have strong absorption in the THz frequency region, which limits efficient and widely tunable THz-wave generation. The Cherenkov phase-matching method is one of the most promising techniques for overcoming these problems. Here, we propose a prism-coupled Cherenkov phase-matching (PCC-PM) method, in which a prism with a suitable refractive index at THz frequencies is coupled to a nonlinear crystal. This has the following advantages: many crystals can be used as THz-wave emitters; the phase-matching condition inside the crystal does not have to be satisfied; the absorption of the crystal does not prevent efficient generation of radiation; and pump sources with arbitrary wavelengths can be employed. Here we demonstrate PCC-PM THz-wave generation using the organic crystal 4-dimethylamino-N-methyl-4-stilbazolium tosylate (DAST) and a Si prism coupler. We obtain THz-wave radiation with tunability of approximately 0.1 to 10 THz and with no deep absorption features resulting from the absorption spectrum of the crystal. The obtained spectra did not depend on the pump wavelength in the range 1300 to 1450 nm. This simple technique shows promise for generating THz radiation using a wide variety of nonlinear crystals.

  13. An Algorithm Based Wavelet Entropy for Shadowing Effect of Human Detection Using Ultra-Wideband Bio-Radar

    PubMed Central

    Liu, Miao; Zhang, Yang; Liang, Fulai; Qi, Fugui; Lv, Hao; Wang, Jianqi; Zhang, Yang

    2017-01-01

    Ultra-wide band (UWB) radar for short-range human target detection is widely used to find and locate survivors in some rescue missions after a disaster. The results of the application of bistatic UWB radar for detecting multi-stationary human targets have shown that human targets close to the radar antennas are very often visible, while those farther from radar antennas are detected with less reliability. In this paper, on account of the significant difference of frequency content between the echo signal of the human target and that of noise in the shadowing region, an algorithm based on wavelet entropy is proposed to detect multiple targets. Our findings indicate that the entropy value of human targets was much lower than that of noise. Compared with the method of adaptive filtering and the energy spectrum, wavelet entropy can accurately detect the person farther from the radar antennas, and it can be employed as a useful tool in detecting multiple targets by bistatic UWB radar. PMID:28973988

  14. An Algorithm Based Wavelet Entropy for Shadowing Effect of Human Detection Using Ultra-Wideband Bio-Radar.

    PubMed

    Xue, Huijun; Liu, Miao; Zhang, Yang; Liang, Fulai; Qi, Fugui; Chen, Fuming; Lv, Hao; Wang, Jianqi; Zhang, Yang

    2017-09-30

    Ultra-wide band (UWB) radar for short-range human target detection is widely used to find and locate survivors in some rescue missions after a disaster. The results of the application of bistatic UWB radar for detecting multi-stationary human targets have shown that human targets close to the radar antennas are very often visible, while those farther from radar antennas are detected with less reliability. In this paper, on account of the significant difference of frequency content between the echo signal of the human target and that of noise in the shadowing region, an algorithm based on wavelet entropy is proposed to detect multiple targets. Our findings indicate that the entropy value of human targets was much lower than that of noise. Compared with the method of adaptive filtering and the energy spectrum, wavelet entropy can accurately detect the person farther from the radar antennas, and it can be employed as a useful tool in detecting multiple targets by bistatic UWB radar.
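
    To make the entropy criterion concrete, the Python sketch below computes a wavelet-based Shannon entropy for each range bin of a slow-time radar matrix using PyWavelets and flags low-entropy bins as candidate human targets; the wavelet family, decomposition level and threshold handling are illustrative assumptions rather than the parameters used in the paper.

        # Hedged sketch: wavelet entropy of per-range-bin slow-time signals (PyWavelets).
        import numpy as np
        import pywt

        def wavelet_entropy(signal, wavelet="db4", level=4):
            """Shannon entropy of the relative energies of the wavelet sub-bands."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            p = energies / energies.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)))

        def detect_targets(radar_matrix, threshold):
            """radar_matrix: (n_range_bins, n_slow_time_samples); returns candidate bins."""
            entropies = np.array([wavelet_entropy(row) for row in radar_matrix])
            # Respiration-modulated bins show lower entropy than noise-only bins.
            return np.where(entropies < threshold)[0], entropies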

  15. Thermal properties of nuclear matter in a variational framework with relativistic corrections

    NASA Astrophysics Data System (ADS)

    Zaryouni, S.; Hassani, M.; Moshfegh, H. R.

    2014-01-01

    The properties of hot symmetric nuclear matter for a wide range of densities and temperatures are investigated by employing the AV14 potential within the lowest order constrained variational (LOCV) method, with the inclusion of a phenomenological three-body force as well as relativistic corrections. The relativistic corrections to the many-body kinetic energies, as well as the boost interaction corrections, are presented for a wide range of densities and temperatures. The free energy, pressure, incompressibility, and other thermodynamic quantities of symmetric nuclear matter are obtained and discussed. The critical temperature is found, and the liquid-gas phase transition is analyzed both with and without the inclusion of three-body forces and relativistic corrections in the LOCV approach. It is shown that the critical temperature is strongly affected by the three-body forces but does not depend on the relativistic corrections. Finally, the results obtained in the present study are compared with other many-body calculations and experimental predictions.

  16. A strategy for evaluating pathway analysis methods.

    PubMed

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity, that is, the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth, either established or assumed, of the pathways perturbed by a specific clinical or experimental condition. As such, our strategy allows researchers to systematically and objectively evaluate pathway analysis methods by employing any number of datasets for a variety of conditions.
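
    The two metrics can be illustrated with the set-based Python sketch below, where recall is the overlap between the pathways reported on a full dataset and on a sub-dataset, and discrimination is one minus the overlap between the pathways reported for two unrelated experiments; treating both as simple overlap fractions is an assumption made for illustration, since the abstract does not give the exact formulas.

        # Hedged sketch: recall and discrimination as set overlaps between pathway lists.
        def recall(pathways_full, pathways_sub):
            """Fraction of pathways found on the full dataset that reappear on a sub-dataset."""
            full, sub = set(pathways_full), set(pathways_sub)
            return len(full & sub) / len(full) if full else 0.0

        def discrimination(pathways_cond_a, pathways_cond_b):
            """1 - overlap between pathways reported for two unrelated experiments."""
            a, b = set(pathways_cond_a), set(pathways_cond_b)
            union = a | b
            return 1.0 - len(a & b) / len(union) if union else 0.0

        # Illustrative usage with made-up pathway identifiers:
        print(recall(["p53", "MAPK", "WNT"], ["p53", "WNT"]))            # ~0.667
        print(discrimination(["p53", "MAPK"], ["insulin", "MAPK"]))      # ~0.667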

  17. Predicting absorption and dispersion in acoustics by direct simulation Monte Carlo: Quantum and classical models for molecular relaxation.

    PubMed

    Hanford, Amanda D; O'Connor, Patrick D; Anderson, James B; Long, Lyle N

    2008-06-01

    In the current study, real gas effects in the propagation of sound waves are simulated using the direct simulation Monte Carlo method for a wide range of frequencies. This particle method allows for treatment of acoustic phenomena at high Knudsen numbers, corresponding to low densities and a high ratio of the molecular mean free path to wavelength. Different methods to model the internal degrees of freedom of diatomic molecules and the exchange of translational, rotational and vibrational energies in collisions are employed in the current simulations of a diatomic gas. One of these methods is the fully classical rigid-rotor/harmonic-oscillator model for rotation and vibration. A second method takes into account the discrete quantum energy levels for vibration with the closely spaced rotational levels classically treated. This method gives a more realistic representation of the internal structure of diatomic and polyatomic molecules. Applications of these methods are investigated in diatomic nitrogen gas in order to study the propagation of sound and its attenuation and dispersion along with their dependence on temperature. With the direct simulation method, significant deviations from continuum predictions are also observed for high Knudsen number flows.

  18. Research on facial expression simulation based on depth image

    NASA Astrophysics Data System (ADS)

    Ding, Sha-sha; Duan, Jin; Zhao, Yi-wu; Xiao, Bo; Wang, Hao

    2017-11-01

    Nowadays, facial expression simulation is widely used in film and television special effects, human-computer interaction and many other fields. In this work, facial expressions are captured with a Kinect camera. The AAM algorithm, which is based on statistical information, is employed to detect and track faces, and a 2D regression algorithm is applied to align the feature points; the facial feature points are detected automatically, while the feature points of the 3D cartoon model are marked manually. The aligned feature points are mapped using keyframe techniques. To improve the animation effect, non-feature points are interpolated based on empirical models, and the mapping and interpolation are performed under the constraint of Bézier curves. In this way, the feature points on the cartoon face model can be driven as the facial expression varies, achieving real-time simulation of cartoon facial expressions. The experimental results show that the proposed method can accurately simulate facial expressions. Finally, our method is compared with a previous method, and the measured data show that our method greatly improves the implementation efficiency.

  19. Application of the dual-kinetic-balance sets in the relativistic many-body problem of atomic structure

    NASA Astrophysics Data System (ADS)

    Beloy, Kyle; Derevianko, Andrei

    2008-05-01

    The dual-kinetic-balance (DKB) finite basis set method for solving the Dirac equation for hydrogen-like ions [V. M. Shabaev et al., Phys. Rev. Lett. 93, 130405 (2004)] is extended to problems with a non-local spherically symmetric Dirac-Hartree-Fock potential. We implement the DKB method using B-spline basis sets and compare its performance with the widely employed approach of the Notre Dame (ND) group [W. R. Johnson, S. A. Blundell, J. Sapirstein, Phys. Rev. A 37, 307-15 (1988)]. We compare the performance of the ND and DKB methods by computing various properties of the Cs atom: energies, hyperfine integrals, the parity-non-conserving amplitude of the 6s1/2-7s1/2 transition, and the second-order many-body correction to the removal energy of the valence electrons. We find that for a comparable size of the basis set the accuracy of both methods is similar for matrix elements accumulated far from the nuclear region. However, for atomic properties determined by small distances, the DKB method outperforms the ND approach.

  20. Differential Characteristics Based Iterative Multiuser Detection for Wireless Sensor Networks

    PubMed Central

    Chen, Xiaoguang; Jiang, Xu; Wu, Zhilu; Zhuang, Shufeng

    2017-01-01

    High-throughput, low-latency and reliable communication has always been a hot topic for wireless sensor networks (WSNs) in various applications. Multiuser detection is widely used to suppress the detrimental effect of multiple access interference in WSNs. In this paper, a novel multiuser detection method based on differential characteristics is proposed to suppress multiple access interference. The proposed iterative receive method consists of three stages. First, a differential characteristics function is presented based on the optimal multiuser detection decision function; then, on the basis of the differential characteristics, a preliminary threshold detection is utilized to find the potentially wrongly received bits; after that, an error bit corrector is employed to correct the wrong bits. In order to further lower the bit error ratio (BER), the differential characteristics calculation, threshold detection and error bit correction process described above are iteratively executed. Simulation results show that after only a few iterations the proposed multiuser detection method can achieve satisfactory BER performance. In addition, the BER and near-far resistance performance are much better than those of traditional suboptimal multiuser detection methods. Furthermore, the proposed iterative multiuser detection method also has a large system capacity. PMID:28212328
