Sample records for current processing methods

  1. On-line tool breakage monitoring of vibration tapping using spindle motor current

    NASA Astrophysics Data System (ADS)

    Li, Guangjun; Lu, Huimin; Liu, Gang

    2008-10-01

    The input current of the drive motor has been employed successfully to monitor the cutting state in manufacturing processes for more than a decade. In vibration tapping, however, on-line monitoring of the motor current has not been reported. In this paper, a tap failure prediction method is proposed to monitor the vibration tapping process using the electrical current signal of the spindle motor. The vibration tapping process is first described. Then the relationship between the vibration tapping torque and the motor current is investigated through theoretical derivation and experimental measurement. Based on those results, a tool breakage monitoring method is proposed that tracks the ratio of current amplitudes in adjacent vibration tapping periods. Finally, a low-frequency vibration tapping system with motor current monitoring is built using a B-106B servo motor and its CR06 driver. The proposed method has been demonstrated with experimental data from vibration tapping in titanium alloys. The experiments show that the method is feasible for tool breakage monitoring when tapping small threaded holes: with an amplitude-ratio threshold of 1.2 and at least two overruns among 50 adjacent periods, it avoids tool breakage while raising few false alarms.
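    The breakage criterion described in this abstract (amplitude-ratio threshold of 1.2, at least two overruns among 50 adjacent periods) can be sketched as follows; the function name and the representation of the signal as a list of per-period amplitudes are illustrative, not taken from the paper:

```python
from itertools import islice

def breakage_detected(amplitudes, threshold=1.2, overruns_required=2, window=50):
    """Flag tool breakage from spindle-current amplitudes of successive
    vibration-tapping periods (illustrative sketch of the paper's criterion)."""
    # Ratio of current amplitudes in adjacent vibration-tapping periods
    ratios = (curr / prev for prev, curr in zip(amplitudes, amplitudes[1:]))
    # Count how many of the first `window` ratios exceed the threshold
    overruns = sum(1 for r in islice(ratios, window) if r > threshold)
    return overruns >= overruns_required
```

    A flat amplitude sequence raises no alarm; two sharp rises within the window trigger one.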

  2. An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*

    PubMed Central

    Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.

    2014-01-01

    Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. The idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. Automation reduced several weeks of manual labor to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such automated processing will become an indispensable tool for individualizing transcranial direct current stimulation (tDCS) therapy. PMID:23367144

  3. Method for controlling gas metal arc welding

    DOEpatents

    Smartt, Herschel B.; Einerson, Carolyn J.; Watkins, Arthur D.

    1989-01-01

    The heat input and mass input in a Gas Metal Arc welding process are controlled by a method that comprises calculating appropriate values for weld speed, filler wire feed rate and an expected value for the welding current by algorithmic function means, applying such values for weld speed and filler wire feed rate to the welding process, measuring the welding current, comparing the measured current to the calculated current, using said comparison to calculate corrections for the weld speed and filler wire feed rate, and applying corrections.
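    A minimal sketch of the feedback loop in this claim, assuming a simple proportional correction; the gains and sign conventions are illustrative, since the patent leaves the "algorithmic function means" unspecified:

```python
def control_step(weld_speed, wire_feed, expected_current, measured_current, gain=0.05):
    """One iteration of the claimed loop (sketch): compare the measured welding
    current with the calculated expectation, then correct weld speed and filler
    wire feed rate. Gain and signs are hypothetical, not from the patent."""
    # Compare measured current to the calculated (expected) current
    error = measured_current - expected_current
    # Apply proportional corrections to weld speed and filler wire feed rate
    return weld_speed - gain * error, wire_feed - gain * error
```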

  4. Relationships between Lexical Processing Speed, Language Skills, and Autistic Traits in Children

    ERIC Educational Resources Information Center

    Abrigo, Erin

    2012-01-01

    According to current models of spoken word recognition, listeners understand speech as it unfolds over time. Eye tracking provides a non-invasive, on-line method to monitor attention, providing insight into the processing of spoken language. In the current project a spoken lexical processing assessment (LPA) confirmed current theories of spoken…

  5. Surveillance system and method having parameter estimation and operating mode partitioning

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor)

    2003-01-01

    A system and method for monitoring an apparatus or process asset including partitioning an unpartitioned training data set into a plurality of training data subsets each having an operating mode associated thereto; creating a process model comprised of a plurality of process submodels each trained as a function of at least one of the training data subsets; acquiring a current set of observed signal data values from the asset; determining an operating mode of the asset for the current set of observed signal data values; selecting a process submodel from the process model as a function of the determined operating mode of the asset; calculating a current set of estimated signal data values from the selected process submodel for the determined operating mode; and outputting the calculated current set of estimated signal data values for providing asset surveillance and/or control.
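    The partition/train/select/estimate cycle described above can be sketched as below, with each "submodel" reduced to a per-mode mean of the training signals; this stand-in model, and all names, are illustrative rather than the patented system:

```python
import numpy as np

def train_submodels(training_data, modes):
    """Partition the unpartitioned training set by operating mode and fit one
    submodel per mode (here simply a mean vector; illustrative)."""
    return {m: training_data[modes == m].mean(axis=0) for m in np.unique(modes)}

def surveil(submodels, observed, mode):
    """Select the submodel for the determined operating mode and output the
    estimated signal values, plus residuals for surveillance."""
    estimated = submodels[mode]
    return estimated, observed - estimated
```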

  6. Literature Review on Processing and Analytical Methods for ...

    EPA Pesticide Factsheets

    The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  7. Method for controlling gas metal arc welding

    DOEpatents

    Smartt, H.B.; Einerson, C.J.; Watkins, A.D.

    1987-08-10

    The heat input and mass input in a Gas Metal Arc welding process are controlled by a method that comprises calculating appropriate values for weld speed, filler wire feed rate and an expected value for the welding current by algorithmic function means, applying such values for weld speed and filler wire feed rate to the welding process, measuring the welding current, comparing the measured current to the calculated current, using said comparison to calculate corrections for the weld speed and filler wire feed rate, and applying corrections. 3 figs., 1 tab.

  8. Possibilities of Processing Archival Photogrammetric Images Captured by Rollei 6006 Metric Camera Using Current Method

    NASA Astrophysics Data System (ADS)

    Dlesk, A.; Raeva, P.; Vach, K.

    2018-05-01

    Processing analog photogrammetric negatives with current methods brings new challenges and possibilities, for example the creation of a 3D model from archival images, which enables comparison of the historical and current states of cultural heritage objects. The main purpose of this paper is to present possibilities of processing archival analog images captured by the Rollei 6006 metric photogrammetric camera. In 1994, the Czech company EuroGV s.r.o. carried out photogrammetric measurements of the former limestone quarry Great America, located in the Central Bohemian Region of the Czech Republic. All the negatives of the photogrammetric images, complete documentation, coordinates of geodetically measured ground control points, calibration reports, and the exterior orientation of the images calculated in the Combined Adjustment Program are preserved and were available for the current processing. The negatives were scanned and processed using the structure-from-motion (SfM) method. The result of the research is a statement of the accuracy that can be expected when the proposed methodology is applied to Rollei metric images originally obtained for terrestrial intersection photogrammetry.

  9. Atomic memory access hardware implementations

    DOEpatents

    Ahn, Jung Ho; Erez, Mattan; Dally, William J

    2015-02-17

    Atomic memory access requests are handled using a variety of systems and methods. According to one example method, a data-processing circuit having an address-request generator that issues requests to a common memory implements a method of processing the requests using a memory-access intervention circuit coupled between the generator and the common memory. The method identifies a current atomic-memory access request from a plurality of memory access requests. A data set is stored that corresponds to the current atomic-memory access request in a data storage circuit within the intervention circuit. It is determined whether the current atomic-memory access request corresponds to at least one previously-stored atomic-memory access request. In response to determining correspondence, the current request is implemented by retrieving data from the common memory. The data is modified in response to the current request and at least one other access request in the memory-access intervention circuit.
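    A toy software model of the coalescing behaviour described above; the real subject is hardware, so the class, the dictionary-based buffer, and the atomic-add operation are purely illustrative:

```python
class InterventionBuffer:
    """Sketch of the memory-access intervention circuit: atomic read-modify-write
    requests to the same address are merged so the common memory is accessed
    once per address (illustrative, not the patented hardware)."""

    def __init__(self, memory):
        self.memory = memory   # the 'common memory' (address -> value)
        self.pending = {}      # stored data sets: address -> accumulated delta

    def atomic_add(self, addr, value):
        if addr in self.pending:
            # Corresponds to a previously-stored atomic request: merge in place
            self.pending[addr] += value
        else:
            # First atomic request for this address
            self.pending[addr] = value

    def flush(self):
        # Retrieve data from the common memory and apply merged modifications
        for addr, delta in self.pending.items():
            self.memory[addr] = self.memory.get(addr, 0) + delta
        self.pending.clear()
```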

  10. The Effect of an Enrichment Reading Program on the Cognitive Processes and Neural Structures of Children Having Reading Difficulties

    ERIC Educational Resources Information Center

    Kuruyer, Hayriye Gül; Akyol, Hayati; Karli Oguz, Kader; Has, Arzu Ceylan

    2017-01-01

    The main purpose of the current study is to explain the effect of an enrichment reading program on the cognitive processes and neural structures of children experiencing reading difficulties. The current study was carried out in line with a single-subject research method and the between-subjects multiple probe design belonging to this method. This…

  11. Eddy current measurement of the thickness of top Cu film of the multilayer interconnects in the integrated circuit (IC) manufacturing process

    NASA Astrophysics Data System (ADS)

    Qu, Zilian; Meng, Yonggang; Zhao, Qian

    2015-03-01

    This paper proposes a new eddy current method, named the equivalent unit method (EUM), for measuring the thickness of the top copper film of multilayer interconnects in the chemical mechanical polishing (CMP) process, an important step in integrated circuit (IC) manufacturing. The influence of the underlying circuit layers on the eddy current is modeled and treated as an equivalent film thickness. By subtracting this equivalent film component, the accuracy of the thickness measurement of the top copper layer with an eddy current sensor is improved, with an absolute error of 3 nm for sample measurements.
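    Once the underlayer contribution has been modeled, the correction step of the equivalent unit method reduces to a subtraction; in this sketch, `to_thickness` is a hypothetical calibration from eddy-current signal to thickness, not the paper's actual model:

```python
def top_cu_thickness(signal, equivalent_underlayer_nm, to_thickness):
    """Equivalent-unit-method sketch: convert the raw eddy-current reading to an
    apparent total thickness, then subtract the modeled equivalent thickness of
    the circuit layers underneath (all names illustrative)."""
    return to_thickness(signal) - equivalent_underlayer_nm
```

    With a linear calibration of 100 nm per signal unit, a reading of 1.5 over a 20 nm equivalent underlayer yields a 130 nm top film.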

  12. Analytical evaluation of current starch methods used in the international sugar industry: Part I

    USDA-ARS?s Scientific Manuscript database

    Several analytical starch methods currently exist in the international sugar industry that are used to prevent or mitigate starch-related processing challenges as well as assess the quality of traded end-products. These methods use simple iodometric chemistry, mostly potato starch standards, and uti...

  13. Studies related to ocean dynamics. Task 3.2: Aircraft Field Test Program to investigate the ability of remote sensing methods to measure current/wind-wave interactions

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Flood, W. A.; Brown, G. S.

    1975-01-01

    The feasibility of remote sensing of ocean current flows, including by backscattering cross-section techniques, was studied. It was established that for capillary waves, small-scale currents could be accurately measured through observation of wave kinematics. Drastic modifications of waves by changing currents were noted. The development of new methods for the measurement of capillary waves is discussed, and improved methods to resolve data processing problems are suggested.

  14. Improved Imaging With Laser-Induced Eddy Currents

    NASA Technical Reports Server (NTRS)

    Chern, Engmin J.

    1993-01-01

    System tests specimen of material nondestructively by laser-induced eddy-current imaging improved by changing method of processing of eddy-current signal. Changes in impedance of eddy-current coil measured in absolute instead of relative units.

  15. Method for measuring and controlling beam current in ion beam processing

    DOEpatents

    Kearney, Patrick A.; Burkhart, Scott C.

    2003-04-29

    A method for producing film thickness control of ion beam sputter deposition films. Large improvements in film thickness control are achieved by keeping the total current supplied to both the beam and suppressor grids of a radio-frequency (RF) ion beam source constant, rather than just the current supplied to the beam grid. By controlling both currents, deposition rates are more stable, allowing the deposition of layers with thicknesses controlled to about 0.1%. The method calculates deposition rates from the total of the suppressor and beam currents and holds that total constant by adjusting the RF power, which gives more consistent values.
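    The control idea, holding the sum of beam and suppressor currents constant by adjusting RF power, can be sketched as a proportional controller; the gain, units, and function name are illustrative, not from the patent:

```python
def adjust_rf_power(rf_power, beam_current, suppressor_current, target_total, gain=0.01):
    """One control step (sketch): regulate on the *total* grid current, not the
    beam current alone, by nudging RF power toward the target total."""
    total = beam_current + suppressor_current
    return rf_power + gain * (target_total - total)
```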

  16. Recovery and purification process development for monoclonal antibody production

    PubMed Central

    Ma, Junfen; Winter, Charles; Bayer, Robert

    2010-01-01

    Hundreds of therapeutic monoclonal antibodies (mAbs) are currently in development, and many companies have multiple antibodies in their pipelines. The current methodology used in recovery processes for these molecules is reviewed here. Basic unit operations such as harvest, Protein A affinity chromatography and additional polishing steps are surveyed. Alternative processes such as flocculation, precipitation and membrane chromatography are discussed. We also cover platform approaches to purification methods development, use of high throughput screening methods, and offer a view on future developments in purification methodology as applied to mAbs. PMID:20647768

  17. Automated Chromium Plating Line for Gun Barrels

    DTIC Science & Technology

    1979-09-01

    consistent pretreatments and bath dwell times. Some of the advantages of automated processing include increased productivity (average of 20%) due to...when automated processing procedures are used. The current method of applying chromium electrodeposits to gun tubes is a manual, batch operation...currently practiced with rotary swaged gun tubes would substantially reduce the difficulties in automated processing. RECOMMENDATIONS

  18. Sterilization processes. Meeting the demands of today's health care technology.

    PubMed

    Crow, S

    1993-09-01

    Universal Precautions dictate sterilization for all invasive equipment that breaks the blood barrier; however, current methods of sterilization, such as steam and ethylene oxide gas (ETO), are not compatible with many of the delicate, heat-sensitive surgical instruments used in modern health care. In addition, traditional sterilization methods are often too time consuming for practical use in the operating room. Clearly, new sterilization processes need to be developed. In this article, the criteria modern sterilization processes must meet and how some manufacturers plan to meet this challenge are discussed. In addition, the pros and cons of using peracetic acid (the newest sterilization process currently available) are examined.

  19. Method for isolating nucleic acids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, Jr., Richard Ashley; Elias, Dwayne A.

    The current disclosure provides methods and kits for isolating nucleic acid from an environmental sample. The current methods and compositions further provide methods for isolating nucleic acids by reducing adsorption of nucleic acids by charged ions and particles within an environmental sample. The methods of the current disclosure provide methods for isolating nucleic acids by releasing adsorbed nucleic acids from charged particles during the nucleic acid isolation process. The current disclosure facilitates the isolation of nucleic acids of sufficient quality and quantity to enable one of ordinary skill in the art to utilize or analyze the isolated nucleic acids for a wide variety of applications, including sequencing or species population analysis.

  20. Plastics processing: statistics, current practices, and evaluation.

    PubMed

    Cooke, F

    1993-11-01

    The health care industry uses a huge quantity of plastic materials each year. Much of the machinery currently used, or supplied, for plastics processing is unsuitable for use in a clean environment. In this article, the author outlines the reasons for the current situation and urges companies to re-examine their plastic-processing methods, whether performed in-house or subcontracted out. Some of the factors that should be considered when evaluating plastics-processing equipment are outlined to assist companies in remaining competitive and complying with impending EC regulations on clean room standards for manufacturing areas.

  1. Business Process Improvement Applied to Written Temporary Duty Travel Orders within the United States Air Force

    DTIC Science & Technology

    1993-12-01

    Generally Accepted Process While neither DoD Directives nor USAF Regulations specify exact mandatory TDY order processing methods, most USAF units...functional input. Finally, TDY order processing functional experts at Hanscom, Los Angeles and McClellan AFBs provided inputs based on their experiences...current electronic auditing capabilities. DTPS Initiative. This DFAS-initiated action to standardize TDY order processing throughout DoD is currently

  2. How current ginning processes affect fiber length uniformity index

    USDA-ARS?s Scientific Manuscript database

    There is a need to develop cotton ginning methods that improve fiber characteristics that are compatible with the newer and more efficient spinning technologies. A literature search produced recent studies that described how current ginning processes affect HVI fiber length uniformity index. Resul...

  3. A novel eco-friendly technique for efficient control of lime water softening process.

    PubMed

    Ostovar, Mohamad; Amiri, Mohamad

    2013-12-01

    Lime softening is an established type of water treatment used for water softening. The performance of this process is highly dependent on lime dosage. Currently, lime dosage is adjusted manually based on chemical tests, aimed at maintaining the phenolphthalein (P) and total (M) alkalinities within a certain range (2P - M ≥ 5). In this paper, a critical study of the softening process is presented, showing that the current method is frequently incorrect. Furthermore, electrical conductivity (EC) is introduced as a novel indicator for effectively characterizing the lime softening process. This novel technique has several advantages over the current alkalinities method. Because the EC measurement is a simple test requiring no chemical reagents for titration, test costs are considerably reduced. There is also a reduction in the treated water hardness and in the sludge generated during the lime softening process. The technique is therefore highly eco-friendly and a very cost-effective alternative for efficient control of the lime softening process.
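    The conventional alkalinity criterion mentioned in this abstract can be written directly; units follow whatever the plant's chemical tests report, and the function name is illustrative:

```python
def lime_dose_in_range(p_alkalinity, m_alkalinity):
    """Conventional lime-dosage check: phenolphthalein (P) and total (M)
    alkalinities must satisfy 2P - M >= 5 (as stated in the abstract)."""
    return 2 * p_alkalinity - m_alkalinity >= 5
```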

  4. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained by BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML, and the implementation is carried out with UML techniques, we can expect improved efficiency in information system implementation. In this paper, we describe a system development method that converts the process model obtained by BPM into UML; the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, it is compared with the case where the system is implemented by the conventional UML technique without going via BPM.

  5. Experimental Study on Rebar Corrosion Using the Galvanic Sensor Combined with the Electronic Resistance Technique

    PubMed Central

    Xu, Yunze; Li, Kaiqiang; Liu, Liang; Yang, Lujia; Wang, Xiaona; Huang, Yi

    2016-01-01

    In this paper, a new kind of carbon steel (CS) and stainless steel (SS) galvanic sensor system was developed for the study of rebar corrosion in different pore solution conditions. Through the special design of the CS and SS electronic coupons, the electronic resistance (ER) method and zero resistance ammeter (ZRA) technique were used simultaneously for the measurement of both the galvanic current and the corrosion depth. The corrosion processes in different solution conditions were also studied by linear polarization resistance (LPR) and measurements of polarization curves. The test results show that the galvanic current noise can provide detailed information about the corrosion processes. When localized corrosion occurs, the corrosion rate measured by the ER method is lower than the real corrosion rate, whereas the value measured by the LPR method is higher. The galvanic current and the corrosion current measured by the LPR method show a linear correlation in chloride-containing saturated Ca(OH)2 solution. The relationship between the corrosion current differences measured by the CS electronic coupons and the galvanic current between the CS and SS electronic coupons can also be used to evaluate localized corrosion in reinforced concrete. PMID:27618054

  6. Experimental Study on Rebar Corrosion Using the Galvanic Sensor Combined with the Electronic Resistance Technique.

    PubMed

    Xu, Yunze; Li, Kaiqiang; Liu, Liang; Yang, Lujia; Wang, Xiaona; Huang, Yi

    2016-09-08

    In this paper, a new kind of carbon steel (CS) and stainless steel (SS) galvanic sensor system was developed for the study of rebar corrosion in different pore solution conditions. Through the special design of the CS and SS electronic coupons, the electronic resistance (ER) method and zero resistance ammeter (ZRA) technique were used simultaneously for the measurement of both the galvanic current and the corrosion depth. The corrosion processes in different solution conditions were also studied by linear polarization resistance (LPR) and measurements of polarization curves. The test results show that the galvanic current noise can provide detailed information about the corrosion processes. When localized corrosion occurs, the corrosion rate measured by the ER method is lower than the real corrosion rate, whereas the value measured by the LPR method is higher. The galvanic current and the corrosion current measured by the LPR method show a linear correlation in chloride-containing saturated Ca(OH)₂ solution. The relationship between the corrosion current differences measured by the CS electronic coupons and the galvanic current between the CS and SS electronic coupons can also be used to evaluate localized corrosion in reinforced concrete.

  7. GREENSCOPE: A Method for Modeling Chemical Process Sustainability

    EPA Science Inventory

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...

  8. Curing conditions to inactivate Trichinella spiralis muscle larvae in ready-to-eat pork sausage

    USDA-ARS?s Scientific Manuscript database

    Curing processes for ready to eat (RTE) pork products currently require individual validation of methods to demonstrate inactivation of Trichinella spiralis. This is a major undertaking for each process; currently no model of meat chemistry exists that can be correlated with inactivation of Trichin...

  9. A review of traditional and current methods used to potentially reduce toxicity of Aconitum roots in Traditional Chinese Medicine.

    PubMed

    Liu, Shuai; Li, Fei; Li, Yan; Li, Weifei; Xu, Jinkai; Du, Hong

    2017-07-31

    Aconitum species are well-known for their medicinal value and high lethal toxicity in many Asian countries, notably China, India and Japan. In Traditional Chinese Medicine (TCM), the tubers are used only after processing. They can be used safely and effectively with the methods of decoction, rational compatibility, and correct processing based on traditional experience and new technologies. However, high toxicological risks remain due to improper preparation and usage in China and other countries. Therefore, there is a need to clarify the methods of processing and compatibility to ensure their effectiveness and minimize the potential risks. The aim of this paper is to provide a review of traditional and current methods used to potentially reduce the toxicity of Aconitum roots in TCM. The use of Aconitum has been investigated, and the methods of processing and compatibility throughout history, including recent research, have been reviewed. Using the methods of rational preparation, reasonable compatibility, and proper processing based on traditional experience and new technologies can enable Aconitum to be used safely and effectively.

  10. Towards Real Time Diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Mcjunkin; Dennis C. Kunerth; Corrie Nichol

    2013-07-01

    Methods are currently being developed toward more robust real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of proposed real-time techniques on weld samples are presented along with the concepts to apply the techniques concurrently to the weld process. Consideration for the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  11. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.

    2014-02-18

    Methods are currently being developed toward more robust real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of proposed real-time techniques on weld samples are presented along with the concepts to apply the techniques concurrently to the weld process. Consideration for the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  12. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    NASA Astrophysics Data System (ADS)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.; Todorov, E.; Levesque, S.

    2014-02-01

    Methods are currently being developed toward more robust real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of proposed real-time techniques on weld samples are presented along with the concepts to apply the techniques concurrently to the weld process. Consideration for the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  13. Survey of NASA V and V Processes/Methods

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy

    2002-01-01

    The purpose of this report is to describe current NASA Verification and Validation (V&V) techniques and to explain how these techniques are applicable to 2nd Generation RLV Integrated Vehicle Health Management (IVHM) software. It also contains recommendations for special V&V requirements for IVHM. This report is divided into the following three sections: 1) Survey - Current NASA V&V Processes/Methods; 2) Applicability of NASA V&V to 2nd Generation RLV IVHM; and 3) Special 2nd Generation RLV IVHM V&V Requirements.

  14. Acute and impaired wound healing: pathophysiology and current methods for drug delivery, part 2: role of growth factors in normal and pathological wound healing: therapeutic potential and methods of delivery.

    PubMed

    Demidova-Rice, Tatiana N; Hamblin, Michael R; Herman, Ira M

    2012-08-01

    This is the second of 2 articles that discuss the biology and pathophysiology of wound healing, reviewing the role that growth factors play in this process and describing the current methods for growth factor delivery into the wound bed.

  15. Acute and Impaired Wound Healing: Pathophysiology and Current Methods for Drug Delivery, Part 2: Role of Growth Factors in Normal and Pathological Wound Healing: Therapeutic Potential and Methods of Delivery

    PubMed Central

    Demidova-Rice, Tatiana N.; Hamblin, Michael R.; Herman, Ira M.

    2012-01-01

    This is the second of 2 articles that discuss the biology and pathophysiology of wound healing, reviewing the role that growth factors play in this process and describing the current methods for growth factor delivery into the wound bed. PMID:22820962

  16. A practical method of predicting the loudness of complex electrical stimuli

    NASA Astrophysics Data System (ADS)

    McKay, Colette M.; Henshall, Katherine R.; Farrell, Rebecca J.; McDermott, Hugh J.

    2003-04-01

    The output of speech processors for multiple-electrode cochlear implants consists of current waveforms with complex temporal and spatial patterns. The majority of existing processors output sequential biphasic current pulses. This paper describes a practical method of calculating loudness estimates for such stimuli, in addition to the relative loudness contributions from different cochlear regions. The method can be used either to manipulate the loudness or levels in existing processing strategies, or to control intensity cues in novel sound processing strategies. The method is based on a loudness model described by McKay et al. [J. Acoust. Soc. Am. 110, 1514-1524 (2001)] with the addition of the simplifying approximation that current pulses falling within a temporal integration window of several milliseconds' duration contribute independently to the overall loudness of the stimulus. Three experiments were carried out with six implantees who use the CI24M device manufactured by Cochlear Ltd. The first experiment validated the simplifying assumption, and allowed loudness growth functions to be calculated for use in the loudness prediction method. The following experiments confirmed the accuracy of the method using multiple-electrode stimuli with various patterns of electrode locations and current levels.
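    The simplifying approximation described above, independent loudness contributions from current pulses inside one temporal integration window, can be sketched as a plain sum; the growth function, pulse representation, and window length here are placeholders, not the fitted functions from the study:

```python
def window_loudness(pulses, growth, window_ms=6.0):
    """Sketch of the loudness-prediction approximation: pulses within a
    temporal integration window of several milliseconds contribute
    independently, so the window's loudness estimate is the sum of each
    pulse's contribution from a per-electrode loudness growth function.
    `pulses` is a list of (time_ms, electrode, current_level) tuples;
    `growth(electrode, level)` maps a pulse to its loudness contribution."""
    return sum(growth(e, level) for t, e, level in pulses if t < window_ms)
```

    This also exposes the relative loudness contribution of each cochlear region: group the sum by electrode instead of collapsing it.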

  17. Digital Signal Processing and Generation for a DC Current Transformer for Particle Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zorzetti, Silvia

    2013-01-01

    The thesis topic, digital signal processing and generation for a DC current transformer, focuses on the most fundamental beam diagnostic in the field of particle accelerators: the measurement of the beam intensity, or beam current. The technology of a DC current transformer (DCCT) is well known and used in many areas, including particle accelerator beam instrumentation, as a non-invasive (shunt-free) method to monitor the DC current in a conducting wire, or in our case, the current of charged particles travelling inside an evacuated metal pipe. So far, custom and commercial DCCTs are entirely based on analog technologies and signal processing, which makes them inflexible, sensitive to component aging, and difficult to maintain and calibrate.

  18. Organizational Use of a Framework for Innovation Adoption

    DTIC Science & Technology

    2011-09-01

    a framework for identifying gaps in current processes: the eight practices identified by Denning and Dunham's The Innovator's Way, Essential Practices For Successful Innovation (2010)... Methods to Use within the Eight Practice Framework... Marine Corps Planning Process (MCPP) for Executing

  19. Review of GDOT's organization evaluation process by examining current employee survey and other state DOT's methods - phase I.

    DOT National Transportation Integrated Search

    2014-03-01

    During the past decade, the Georgia Department of Transportation (GDOT) has conducted surveys of its over 5,000 employees. Currently, the survey is being used to compare results year-to-year and formulate initiatives and methods of organizational imp...

  20. Large Aircraft Robotic Paint Stripping (LARPS) system and the high pressure water process

    NASA Astrophysics Data System (ADS)

    See, David W.; Hofacker, Scott A.; Stone, M. Anthony; Harbaugh, Darcy

    1993-03-01

    The aircraft maintenance industry is beset by new Environmental Protection Agency (EPA) guidelines on air emissions, Occupational Safety and Health Administration (OSHA) standards, dwindling labor markets, Federal Aviation Administration (FAA) safety guidelines, and increased operating costs. In light of these factors, the USAF's Wright Laboratory Manufacturing Technology Directorate and the Aircraft Division of the Oklahoma City Air Logistics Center initiated a MANTECH/REPTECH effort to automate an alternate paint removal method and eliminate the current manual methylene chloride chemical stripping methods. This paper presents some of the background and history of the LARPS program, describes the LARPS system, documents the projected operational flow, quantifies some of the projected system benefits and describes the High Pressure Water Stripping Process. Certification of an alternative paint removal method to replace the current chemical process is being performed in two phases: Process Optimization and Process Validation. This paper also presents the results of the Process Optimization for metal substrates. Data on the coating removal rate, residual stresses, surface roughness, preliminary process envelopes, and technical plans for process Validation Testing will be discussed.

  1. Method for making high-critical-current-density YBa₂Cu₃O₇ superconducting layers on metallic substrates

    DOEpatents

    Feenstra, Roeland; Christen, David; Paranthaman, Mariappan

    1999-01-01

    A method is disclosed for fabricating YBa₂Cu₃O₇ superconductor layers with the capability of carrying large superconducting currents on a metallic tape (substrate) supplied with a biaxially textured oxide buffer layer. The method represents a simplification of previously established techniques and provides processing requirements compatible with scale-up to long wire (tape) lengths and high processing speeds. This simplification has been realized by employing the BaF₂ method to grow a YBa₂Cu₃O₇ film on a metallic substrate having a biaxially textured oxide buffer layer.

  2. A method to evaluate process performance by integrating time and resources

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve the existing processes of an enterprise, so measuring process performance is particularly important. However, current research on performance evaluation methods remains insufficient: the main approaches rely on simple time or resource statistics, which cannot evaluate process performance well. In this paper, a method for evaluating process performance based on both the time dimension and the resource dimension is proposed. The method can be used to measure the utilization and redundancy of resources in a process. This paper introduces the design principle and formula of the evaluation algorithm, then the design and implementation of the evaluation method. Finally, we use the evaluation method to analyse the event log from a telephone maintenance process and propose an optimization plan.
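    The resource-utilisation half of such a metric can be sketched from a minimal event log. The formula below (busy time divided by each resource's observed span, with the remainder read as redundancy) is an illustrative stand-in, since the paper's exact formula is not reproduced in the abstract.

```python
# Sketch: per-resource utilisation from an event log of
# (resource, start, end) records; 1 - utilisation is read here as
# redundancy. Illustrative formula, not the paper's own metric.

def resource_utilisation(events):
    by_res = {}
    for res, start, end in events:
        busy, lo, hi = by_res.get(res, (0.0, float("inf"), float("-inf")))
        by_res[res] = (busy + (end - start), min(lo, start), max(hi, end))
    return {res: busy / (hi - lo) for res, (busy, lo, hi) in by_res.items()}

log = [("tech_a", 0, 4), ("tech_a", 6, 10), ("tech_b", 0, 10)]
util = resource_utilisation(log)  # tech_a busy 8 of 10, tech_b 10 of 10
```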

  3. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  4. Transition probabilities for general birth-death processes with applications in ecology, genetics, and evolution

    PubMed Central

    Crawford, Forrest W.; Suchard, Marc A.

    2011-01-01

    A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. In the general process with n current particles, a new particle is born with instantaneous rate λn and a particle dies with instantaneous rate μn. Currently no robust and efficient method exists to evaluate the finite-time transition probabilities in a general birth-death process with arbitrary birth and death rates. In this paper, we first revisit the theory of continued fractions to obtain expressions for the Laplace transforms of these transition probabilities and make explicit an important derivation connecting transition probabilities and continued fractions. We then develop an efficient algorithm for computing these probabilities that analyzes the error associated with approximations in the method. We demonstrate that this error-controlled method agrees with known solutions and outperforms previous approaches to computing these probabilities. Finally, we apply our novel method to several important problems in ecology, evolution, and genetics. PMID:21984359
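    As a concrete baseline for what such transition probabilities look like, the finite-time matrix P(t) of a birth-death chain can be computed by uniformisation on a truncated state space — a standard numerical approach, not the continued-fraction algorithm of this paper. The rates and truncation bound below are illustrative.

```python
import numpy as np

# Sketch: finite-time transition probabilities of a birth-death
# process via uniformisation on states 0..n_max (a standard
# numerical baseline, not Crawford & Suchard's method).

def transition_matrix(birth, death, t, n_max, terms=200):
    """P[i, j] = Pr(X_t = j | X_0 = i), birth(n)/death(n) instantaneous rates."""
    Q = np.zeros((n_max + 1, n_max + 1))
    for n in range(n_max + 1):
        if n < n_max:
            Q[n, n + 1] = birth(n)
        if n > 0:
            Q[n, n - 1] = death(n)
        Q[n, n] = -Q[n].sum()
    lam = float(max(-Q.diagonal())) or 1.0      # uniformisation rate
    P = np.eye(n_max + 1) + Q / lam             # jump-chain matrix
    out = np.zeros_like(Q)
    term = np.eye(n_max + 1) * np.exp(-lam * t) # k = 0 Poisson term
    for k in range(terms):
        out += term
        term = term @ P * (lam * t) / (k + 1)
    return out

# Linear death process (no births), mu = 0.7 per particle.
Pt = transition_matrix(lambda n: 0.0, lambda n: 0.7 * n, t=1.0, n_max=5)
```

    For a pure death process with rate mu*n this reproduces the known result P_{1,1}(t) = exp(-mu*t), which makes the sketch easy to check.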

  5. Analysis and evaluation in the production process and equipment area of the low-cost solar array project. [including modifying gaseous diffusion and using ion implantation

    NASA Technical Reports Server (NTRS)

    Goldman, H.; Wolf, M.

    1979-01-01

    The manufacturing methods for photovoltaic solar energy utilization are assessed. Economic and technical data on the current front junction formation processes of gaseous diffusion and ion implantation are presented. Future proposals, including modifying gaseous diffusion and using ion implantation, to decrease the cost of junction formation are studied. Technology developments in current processes and an economic evaluation of the processes are included.

  6. Reduction of aerobic and lactic acid bacteria in dairy desludge using an integrated compressed CO2 and ultrasonic process.

    PubMed

    Overton, Tim W; Lu, Tiejun; Bains, Narinder; Leeke, Gary A

    Current treatment routes are not suitable to reduce and stabilise bacterial content in some dairy process streams, such as separator and bactofuge desludges, which currently present a major emission problem faced by dairy producers. In this study, a novel method for the processing of desludge was developed. The new method, elevated pressure sonication (EPS), uses a combination of low frequency ultrasound (20 kHz) and elevated CO2 pressure (50 to 100 bar). Process conditions (pressure, sonicator power, processing time) were optimised for batch and continuous EPS processes to reduce viable numbers of aerobic and lactic acid bacteria in bactofuge desludge by ≥3-log fold. Coagulation of proteins present in the desludge also occurred, causing separation of solid (curd) and liquid (whey) fractions. The proposed process offers a 10-fold reduction in energy compared to high temperature short time (HTST) treatment of milk.

  7. U.S. Climate Change Technology Program: Strategic Plan

    DTIC Science & Technology

    2006-09-01

    and Long Term, provides details on the 85 technologies in the R&D portfolio. (Figure 2-1) Continuing Process The United States, in partnership with...locations may be centered near or in residential locations, and work processes and products may be more commonly communicated or delivered via digital... chemical properties, along with advanced methods to simulate processes, will stem from advances in computational technology. Current Portfolio The current

  8. Process Guide for Deburring Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frey, David L.

    This report is an updated and consolidated view of the current deburring processes at the Kansas City Plant (KCP). It includes specific examples of current burr problems and the methods used for their detection. Also included is a pictorial review of the large variety of available deburr tools, along with a complete numerical listing of existing tools and their descriptions. The process for deburring all the major part feature categories is discussed.

  9. Analysis and Evaluation of Processes and Equipment in Tasks 2 and 4 of the Low-cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Goldman, H.; Wolf, M.

    1978-01-01

    The significant economic data for the current production multiblade wafering and inner diameter slicing processes were tabulated and compared to data on the experimental and projected multiblade slurry, STC ID diamond coated blade, multiwire slurry and crystal systems fixed abrasive multiwire slicing methods. Cost calculations were performed for current production processes and for 1982 and 1986 projected wafering techniques.

  10. Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective

    PubMed Central

    Shterenshis, Michael

    2017-01-01

    Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes. PMID:29138741

  11. Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective.

    PubMed

    Shterenshis, Michael

    2017-01-01

    Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes.

  12. Currently available methodologies for the processing of intravascular ultrasound and optical coherence tomography images.

    PubMed

    Athanasiou, Lambros; Sakellarios, Antonis I; Bourantas, Christos V; Tsirka, Georgia; Siogkas, Panagiotis; Exarchos, Themis P; Naka, Katerina K; Michalis, Lampros K; Fotiadis, Dimitrios I

    2014-07-01

    Optical coherence tomography and intravascular ultrasound are the most widely used methodologies in clinical practice as they provide high resolution cross-sectional images that allow comprehensive visualization of the lumen and plaque morphology. Several methods have been developed in recent years to process the output of these imaging modalities, which allow fast, reliable and reproducible detection of the luminal borders and characterization of plaque composition. These methods have proven useful in the study of the atherosclerotic process as they have facilitated analysis of a vast amount of data. This review presents currently available intravascular ultrasound and optical coherence tomography processing methodologies for segmenting and characterizing the plaque area, highlighting their advantages and disadvantages, and discusses the future trends in intravascular imaging.

  13. Current Knowledge and Projection on Assessing the Effectiveness of Training.

    ERIC Educational Resources Information Center

    Orlansky, Jesse

    This discussion of methods used to assess the effectiveness of training for U.S. Army personnel identifies various types of training, describes methods currently used, and suggests ways of improving the assessment process. The methodology and results of assessments of effectiveness, including the costs associated with the level of performance, are…

  14. 42 CFR 82.30 - How will NIOSH inform the public of any plans to change scientific elements underlying the dose...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... change scientific elements underlying the dose reconstruction process to maintain methods reasonably current with scientific progress? 82.30 Section 82.30 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF... methods reasonably current with scientific progress? Periodically, NIOSH will publish a notice in the...

  15. 42 CFR 82.30 - How will NIOSH inform the public of any plans to change scientific elements underlying the dose...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... change scientific elements underlying the dose reconstruction process to maintain methods reasonably current with scientific progress? 82.30 Section 82.30 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF... methods reasonably current with scientific progress? Periodically, NIOSH will publish a notice in the...

  16. 42 CFR 82.30 - How will NIOSH inform the public of any plans to change scientific elements underlying the dose...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... change scientific elements underlying the dose reconstruction process to maintain methods reasonably current with scientific progress? 82.30 Section 82.30 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF... methods reasonably current with scientific progress? Periodically, NIOSH will publish a notice in the...

  17. 42 CFR 82.30 - How will NIOSH inform the public of any plans to change scientific elements underlying the dose...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... change scientific elements underlying the dose reconstruction process to maintain methods reasonably current with scientific progress? 82.30 Section 82.30 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF... methods reasonably current with scientific progress? Periodically, NIOSH will publish a notice in the...

  18. Digital signal processing methods for biosequence comparison.

    PubMed Central

    Benson, D C

    1990-01-01

    A method is discussed for DNA or protein sequence comparison using a finite field fast Fourier transform, a digital signal processing technique, and statistical methods are discussed for analyzing the output of this algorithm. This method compares two sequences of length N in computing time proportional to N log N, compared to N² for methods currently used. This method makes it feasible to compare very long sequences. An example is given to show that the method correctly identifies sites of known homology. PMID:2349096
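    The flavor of the O(N log N) comparison can be sketched with an ordinary floating-point FFT: one correlation per alphabet symbol yields the number of matching bases at every relative offset of the two sequences. The paper's finite-field transform avoids the rounding that this float version papers over with rounding-to-integer; the sequences below are illustrative.

```python
import numpy as np

# Sketch: FFT-based DNA comparison. For each symbol, correlate the
# indicator vectors of the two sequences; summing over symbols
# gives the match count at every relative offset in O(N log N).

def match_counts(a, b, alphabet="ACGT"):
    n = len(a) + len(b) - 1
    size = 1 << (n - 1).bit_length()           # next power of two >= n
    total = np.zeros(size)
    for sym in alphabet:
        fa = np.fft.rfft([c == sym for c in a], size)
        fb = np.fft.rfft([c == sym for c in b][::-1], size)
        total += np.fft.irfft(fa * fb, size)   # correlation via convolution
    return np.rint(total[:n]).astype(int)      # index len(b)-1+s = shift s

counts = match_counts("ACGTACGT", "ACGT")      # peaks at shifts 0 and 4
```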

  19. A two dimensional power spectral estimate for some nonstationary processes. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Smith, Gregory L.

    1989-01-01

    A two dimensional estimate for the power spectral density of a nonstationary process is being developed. The estimate will be applied to helicopter noise data, which is clearly nonstationary. The acoustic pressure from the isolated main rotor and isolated tail rotor is known to be periodically correlated (PC), and the combined noise from the main and tail rotors is assumed to be correlation autoregressive (CAR). The results of this nonstationary analysis will be compared with the current method of assuming that the data is stationary and analyzing it as such. Another method of analysis is to introduce a random phase shift into the data, as shown by Papoulis, to produce a time history which can then be accurately modeled as stationary. This method will also be investigated for the helicopter data. A method used to determine the period of a PC process when the period is not known is discussed. The period of a PC process must be known in order to produce an accurate spectral representation for the process. The spectral estimate is developed. The bias and variability of the estimate are also discussed. Finally, the current method for analyzing nonstationary data is compared to that of using a two dimensional spectral representation. In addition, the method of phase shifting the data is examined.
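    The Papoulis phase-shift device mentioned above can be illustrated in a few lines: averaging a periodically correlated variance profile over a time shift uniform on one period flattens it to a constant, which is why the shifted process can be modeled as stationary. The cosine-squared profile and period below are illustrative, not the helicopter data.

```python
import numpy as np

# Sketch: phase randomisation of a periodically correlated (PC)
# process. Averaging the periodic variance profile exactly over
# all discrete shifts of one period yields a constant (stationary)
# variance equal to the time average.

period = 8
t = np.arange(period)
var_profile = np.cos(2 * np.pi * t / period) ** 2   # PC variance, period 8
shifted = np.array([np.roll(var_profile, s) for s in range(period)])
stationary_var = shifted.mean(axis=0)               # flat profile
```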

  20. Design of diversity and focused combinatorial libraries in drug discovery.

    PubMed

    Young, S Stanley; Ge, Nanxiang

    2004-05-01

    Using well-characterized chemical reactions and readily available monomers, chemists are able to create sets of compounds, termed libraries, which are useful in drug discovery processes. The design of combinatorial chemical libraries can be complex and there has been much information recently published offering suggestions on how the design process can be carried out. This review focuses on literature with the goal of organizing current thinking. At this point in time, it is clear that benchmarking of current suggested methods is required as opposed to further new methods.

  1. Management system to a photovoltaic panel based on the measurement of short-circuit currents

    NASA Astrophysics Data System (ADS)

    Dordescu, M.

    2016-12-01

    This article is devoted to fundamental issues arising from operating a photovoltaic (PV) panel at increased energy efficiency. By measuring the short-circuit current during operation, the method determines a prescribed current value corresponding to the maximum power point; loading the panel by this method extracts the maximum energy possible, justifying the usefulness of a process that is very simple and inexpensive to implement in practice. The proposed adjustment method is much simpler and more economical than conventional methods that rely on measuring the output power.
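    Reading between the lines of the abstract, the controller resembles the classic fractional short-circuit-current method, in which the prescribed operating current is a fixed fraction of the periodically measured short-circuit current. The constant k below (a typical literature value near 0.9) is an assumption, not a figure from the article.

```python
# Sketch of a fractional short-circuit-current setpoint:
# I_mp ~= k * I_sc. The constant k = 0.9 is a typical literature
# value and an assumption here, not taken from this article.

def prescribed_current(i_sc_amps, k=0.9):
    """Current setpoint near the maximum power point."""
    return k * i_sc_amps

setpoint = prescribed_current(i_sc_amps=5.2)
```

    Each new short-circuit measurement simply refreshes the setpoint, which is what makes the scheme cheap compared to power-measuring trackers.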

  2. Detection of wood failure by image processing method: influence of algorithm, adhesive and wood species

    Treesearch

    Lanying Lin; Sheng He; Feng Fu; Xiping Wang

    2015-01-01

    Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
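    A minimal version of the K-means step can be sketched on raw intensities: two clusters separate (assumed) dark failure pixels from light sound wood, and the wood failure percentage is the dark-cluster share. Real inputs are images; the tiny intensity array and the dark-equals-failure assumption here are illustrative.

```python
import numpy as np

# Sketch: two-cluster 1-D K-means on grayscale intensities.
# WFP is taken as the share of pixels in the darker cluster
# (an illustrative assumption about which cluster is failure).

def wfp_kmeans(pixels, iters=20):
    x = np.asarray(pixels, dtype=float)
    c = np.array([x.min(), x.max()])               # initial centroids
    for _ in range(iters):
        labels = np.abs(x[:, None] - c[None, :]).argmin(axis=1)
        for j in (0, 1):
            if np.any(labels == j):
                c[j] = x[labels == j].mean()       # update centroid
    return float(np.mean(labels == np.argmin(c)))  # darker-cluster share

pixels = [20, 25, 22, 30, 200, 210, 205, 190, 25, 215]
wfp = wfp_kmeans(pixels)
```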

  3. A Process for Reviewing and Evaluating Generated Test Items

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis

    2016-01-01

    Testing organizations need large numbers of high-quality items due to the proliferation of alternative test administration methods and modern test designs. But the current demand for items far exceeds the supply. Test items, as they are currently written, evoke a process that is both time-consuming and expensive because each item is written,…

  4. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  5. Towards a Better Corrosion Resistance and Biocompatibility Improvement of Nitinol Medical Devices

    NASA Astrophysics Data System (ADS)

    Rokicki, Ryszard; Hryniewicz, Tadeusz; Pulletikurthi, Chandan; Rokosz, Krzysztof; Munroe, Norman

    2015-04-01

    Haemocompatibility of Nitinol implantable devices and their corrosion resistance, as well as resistance to fracture, are very important features of advanced medical implants. The authors of the paper present novel methods capable of improving Nitinol implantable devices markedly beyond currently used electropolishing (EP) processes; instead, a magnetoelectropolishing process is advocated. The polarization study shows that a magnetoelectropolished Nitinol surface is more corrosion resistant than that obtained after a standard EP and has a unique ability to repassivate the surface. Currently used sterilization processes for Nitinol implantable devices can dramatically change the physicochemical properties of a medical device and thereby influence its biocompatibility. The authors' experimental results clearly show a way to improve the biocompatibility of the NiTi alloy surface. A final sodium hypochlorite treatment should replace the sterilization methods currently used for Nitinol implantable devices; the rationale was also given in the authors' previous study.

  6. An Investigation into Semantic and Phonological Processing in Individuals with Williams Syndrome

    ERIC Educational Resources Information Center

    Lee, Cheryl S.; Binder, Katherine S.

    2014-01-01

    Purpose: The current study examined semantic and phonological processing in individuals with Williams syndrome (WS). Previous research in language processing in individuals with WS suggests a complex linguistic system characterized by "deviant" semantic organization and differential phonological processing. Method: Two experiments…

  7. Evaluation of Aqueous and Powder Processing Techniques for Production of Pu-238-Fueled General Purpose Heat Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2008-06-01

    This report evaluates alternative processes that could be used to produce Pu-238 fueled General Purpose Heat Sources (GPHS) for radioisotope thermoelectric generators (RTG). The current GPHS fabrication process has remained essentially unchanged since its development in the 1970s. Meanwhile, 30 years of technological advancements have been made in the fields of chemistry, manufacturing, ceramics, and control systems. At the Department of Energy's request, alternate manufacturing methods were compared to current methods to determine if alternative fabrication processes could reduce the hazards, especially the production of respirable fines, while producing an equivalent GPHS product. An expert committee performed the evaluation with input from four national laboratories experienced in Pu-238 handling.

  8. Cell-Specific Multifunctional Processing of Heterogeneous Cell Systems in a Single Laser Pulse Treatment

    PubMed Central

    Lukianova-Hleb, Ekaterina Y.; Mutonga, Martin B. G.; Lapotko, Dmitri O.

    2012-01-01

    Current methods of cell processing for gene and cell therapies use several separate procedures for gene transfer and cell separation or elimination, because no current technology can offer simultaneous multi-functional processing of specific cell sub-sets in highly heterogeneous cell systems. Using the cell-specific generation of plasmonic nanobubbles of different sizes around cell-targeted gold nanoshells and nanospheres, we achieved simultaneous multifunctional cell-specific processing in a rapid single 70 ps laser pulse bulk treatment of heterogeneous cell suspension. This method supported the detection of cells, delivery of external molecular cargo to one type of cells and the concomitant destruction of another type of cells without damaging other cells in suspension, and real-time guidance of the two above cellular effects. PMID:23167546

  9. Laser processing for manufacturing nanocarbon materials

    NASA Astrophysics Data System (ADS)

    Van, Hai Hoang

    CNTs have been considered an excellent candidate to revolutionize a broad range of applications. Many methods have been developed to manipulate the chemistry and the structure of CNTs. Lasers, with their non-contact treatment capability, exhibit many processing advantages, including solid-state treatment, an extremely fast processing rate, and high processing resolution. In addition, the highly monochromatic, coherent, and directional beam generates powerful energy absorption and the resulting extreme processing conditions. In my research, a unique laser scanning method was developed to process CNTs, controlling the oxidation and the graphitization. The achieved controllability of this method was applied to address important issues of current CNT processing methods for three applications. The controllable oxidation of CNTs by the laser scanning method was applied to cut CNT films to produce high-performance cathodes for FE devices. The production method includes two important self-developed techniques to produce the cold cathodes: the production of highly oriented and uniformly distributed CNT sheets and the precise laser trimming process. Laser cutting is a unique method to produce the cathodes with remarkable features, including an ultrathin freestanding structure (~200 nm), a very high aspect ratio, hybrid CNT-GNR emitter arrays, even emitter separation, and directional emitter alignment. This unique cathode structure was unachievable by other methods. The developed FE devices successfully solved the screening effect issue encountered by current FE devices. The laser-controlled oxidation method was further developed to sequentially remove graphitic walls of CNTs. The laser oxidation process was directed to occur along the CNT axes by the laser scanning direction. Additionally, the oxidation was further assisted by the curvature stress and the thermal expansion of the graphitic nanotubes, ultimately opening (namely unzipping) the tubular structure to produce GNRs. Therefore, the developed laser scanning method optimally exploited the thermal laser-CNT interaction, successfully transforming CNTs into 2D GNRs. The solid-state laser unzipping process effectively addressed the issues of contamination and scalability encountered by current unzipping methods. Additionally, the produced GNRs were uniquely featured with a freestanding structure and smooth surfaces. If the scanning process were performed in an inert environment without oxygen, the oxidation of CNTs would not happen. Instead, the highly mobile carbon atoms of the heated CNTs would reorganize the crystal structure, inducing the graphitization process to improve the crystallinity. Many observations showing the structural improvement of CNTs under laser irradiation have been reported, confirming the capability of lasers to heal graphitic defects. Laser methods were more time-efficient and energy-efficient than other annealing methods because lasers can quickly heat CNTs to generate graphitization in less than one second. This subsecond heating process of laser irradiation was more effective than other heating methods because it avoided the undesired coalescence of CNTs. In my research, the laser scanning method was applied to generate the graphitization, healing the structural defects of CNTs. Different from the reported laser methods, the laser scanning directed the locally annealed areas to move along the CNT axes, migrating and coalescing the graphitic defects to achieve better healing results. The critical information describing the CNT structural transformation caused by the moving laser irradiation was explored from the successful applications of the developed laser method. This knowledge inspires an important method to modify the general graphitic structure for important applications, such as carbon fiber production, CNT self-assembly, and CNT welding. This method will be effective, facile, versatile, and adaptable for laboratory and industrial facilities.

  10. Non-destructive control of graphite electrodes with use of current displacement effect

    NASA Astrophysics Data System (ADS)

    Myatezh, A. V.; Malozyomov, B. V.; Smirnov, M. A.

    2017-10-01

    The work is devoted to methods of nondestructive diagnostics and their use for diagnosing various defects in the solid surface of graphite electrodes used in steelmaking furnaces. Various non-destructive testing methods for materials are analyzed. In the article, methods of eddy-current defectoscopy of graphite electrodes are considered, and ways of increasing the sensitivity of the method and localizing damage are described. Simulation modeling of electromagnetic processes was carried out, and results were obtained and conclusions drawn.

  11. Electrokinetic remediation prefield test methods

    NASA Technical Reports Server (NTRS)

    Hodko, Dalibor (Inventor)

    2000-01-01

    Methods for determining the parameters critical in designing an electrokinetic soil remediation process, including electrode well spacing, operating current/voltage, electroosmotic flow rate, electrode well wall design, and the amount of buffering or neutralizing solution needed in the electrode wells at operating conditions, are disclosed. These methods are preferably performed prior to initiating a full-scale electrokinetic remediation process in order to obtain efficient remediation of the contaminants.

  12. System for evaluating weld quality using eddy currents

    DOEpatents

    Todorov, Evgueni I.; Hay, Jacob

    2017-12-12

    Electromagnetic and eddy current techniques for fast automated real-time and near real-time inspection and monitoring systems for high-production-rate joining processes. An eddy current system, array and method for the fast examination of welds to detect anomalies such as missed seam (MS) and lack of penetration (LOP); the system, array and methods are capable of detecting and sizing surface and slightly subsurface flaws at various orientations in connection with at least the first and second weld passes.

  13. DEVELOPMENT OF INFRARED METHODS FOR CHARACTERIZATION OF INORGANIC SULFUR SPECIES RELATED TO INJECTION DESULFURIZATION PROCESSES

    EPA Science Inventory

    Current methods designed to control and reduce the amount of sulfur dioxide emitted into the atmosphere from coal-fired power plants and factories rely upon the reaction between SO2 and alkaline earth compounds and are called flue gas desulfurization (FGD) processes. Of these met...

  14. Human Factors Engineering as a System in the Vision for Exploration

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Smith, Danielle; Holden, Kritina

    2006-01-01

    In order to accomplish NASA's Vision for Exploration, while assuring crew safety and productivity, human performance issues must be well integrated into system design from mission conception. To that end, a two-year Technology Development Project (TDP) was funded by NASA Headquarters to develop a systematic method for including the human as a system in NASA's Vision for Exploration. The specific goals of this project are to review current Human Systems Integration (HSI) standards (i.e., industry, military, NASA) and tailor them to selected NASA Exploration activities. Once the methods are proven in the selected domains, a plan will be developed to expand the effort to a wider scope of Exploration activities. The methods will be documented for inclusion in NASA-specific documents (such as the Human Systems Integration Standards, NASA-STD-3000) to be used in future space systems. The current project builds on a previous TDP dealing with Human Factors Engineering processes. That project identified the key phases of the current NASA design lifecycle, and outlined the recommended HFE activities that should be incorporated at each phase. The project also resulted in a prototype of a web-based HFE process tool that could be used to support an ideal HFE development process at NASA. This will help to augment the limited human factors resources available by providing a web-based tool that explains the importance of human factors, teaches a recommended process, and then provides the instructions, templates and examples to carry out the process steps. The HFE activities identified by the previous TDP are being tested in situ for the current effort through support to a specific NASA Exploration activity. Currently, HFE personnel are working with systems engineering personnel to identify HSI impacts for lunar exploration by facilitating the generation of system-level Concepts of Operations (ConOps). 
For example, medical operations scenarios have been generated for lunar habitation in order to identify HSI requirements for the lunar communications architecture. Throughout these ConOps exercises, HFE personnel are testing various tools and methodologies that have been identified in the literature. A key part of the effort is the identification of optimal processes, methods, and tools for these early development phase activities, such as ConOps, requirements development, and early conceptual design. An overview of the activities completed thus far, as well as the tools and methods investigated will be presented.

  15. 21 CFR 225.1 - Current good manufacturing practice.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Current good manufacturing practice. (a) Section 501(a)(2)(B) of the Federal Food, Drug, and Cosmetic Act... the methods used in, or the facilities or controls used for, its manufacture, processing, packing, or...

  16. 21 CFR 225.1 - Current good manufacturing practice.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Current good manufacturing practice. (a) Section 501(a)(2)(B) of the Federal Food, Drug, and Cosmetic Act... the methods used in, or the facilities or controls used for, its manufacture, processing, packing, or...

  17. 21 CFR 225.1 - Current good manufacturing practice.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Current good manufacturing practice. (a) Section 501(a)(2)(B) of the Federal Food, Drug, and Cosmetic Act... the methods used in, or the facilities or controls used for, its manufacture, processing, packing, or...

  18. 21 CFR 225.1 - Current good manufacturing practice.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Current good manufacturing practice. (a) Section 501(a)(2)(B) of the Federal Food, Drug, and Cosmetic Act... the methods used in, or the facilities or controls used for, its manufacture, processing, packing, or...

  19. Method of manufacturing carbon nanotubes

    NASA Technical Reports Server (NTRS)

    Benavides, Jeanette M. (Inventor); Leidecker, Henning W. (Inventor); Frazier, Jeffrey (Inventor)

    2004-01-01

    A process for manufacturing carbon nanotubes, including a step of inducing electrical current through a carbon anode and a carbon cathode under conditions effective to produce the carbon nanotubes, wherein the carbon cathode is larger than the carbon anode. Preferably, a welder is used to induce the electrical current via an arc welding process. Preferably, an exhaust hood is placed on the anode, and the process does not require a closed or pressurized chamber. The process provides high-quality, single-walled carbon nanotubes, while eliminating the need for a metal catalyst.

  20. 21 CFR 129.1 - Current good manufacturing practice.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Current good manufacturing practice. 129.1 Section... Current good manufacturing practice. The applicable criteria in part 110 of this chapter, as well as the..., methods, practices, and controls used in the processing, bottling, holding, and shipping of bottled...

  1. Industrial process surveillance system

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Singer, Ralph M.; Mott, Jack E.

    1998-01-01

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.
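The surveillance loop described above (learn normal states, find the closest one for each new observation, alarm on deviation) can be sketched in a few lines. This is an illustrative reconstruction under simple assumptions (Euclidean distance, a naive k-means for the learned states, a fixed alarm threshold), not the patented implementation.

```python
import math

def dist(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def learn_states(training, k):
    """Reduce vectors of normal operation to k learned states
    (naive k-means here; the patent does not specify this clustering)."""
    centroids = [list(training[i * len(training) // k]) for i in range(k)]
    for _ in range(20):
        groups = [[] for _ in range(k)]
        for x in training:
            groups[min(range(k), key=lambda c: dist(x, centroids[c]))].append(x)
        for c, g in enumerate(groups):
            if g:  # keep the old centroid if a cluster empties out
                centroids[c] = [sum(col) / len(g) for col in zip(*g)]
    return centroids

def surveil(observation, states, threshold):
    """Return (expected, alarm): the expected value is the closest learned
    normal state; the alarm fires when the deviation exceeds the threshold."""
    expected = min(states, key=lambda s: dist(observation, s))
    return expected, dist(observation, expected) > threshold
```

Training on archived normal-operation data and then streaming current sensor vectors through `surveil` reproduces the compare-then-alarm loop of the abstract.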

  2. Industrial process surveillance system

    DOEpatents

    Gross, K.C.; Wegerich, S.W.; Singer, R.M.; Mott, J.E.

    1998-06-09

    A system and method are disclosed for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy. 96 figs.

  3. Industrial Process Surveillance System

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W; Singer, Ralph M.; Mott, Jack E.

    2001-01-30

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.

  4. Signal processing system for electrotherapy applications

    NASA Astrophysics Data System (ADS)

    Płaza, Mirosław; Szcześniak, Zbigniew

    2017-08-01

    The system of signal processing for electrotherapeutic applications is proposed in the paper. The system makes it possible to model the curve of threshold human sensitivity to current (Dalziel's curve) in the full medium-frequency range (1 kHz-100 kHz). Tests based on the proposed solution were conducted and their results were compared with those obtained according to the assumptions of the High Tone Power Therapy method and referred to optimum values. The proposed system has high dynamics and precision in mapping the curve of threshold human sensitivity to current and can be used in all methods where threshold curves are modelled.
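A frequency-dependent sensitivity-threshold curve of this kind can be modelled, for example, by interpolating anchor points in log-log coordinates across the 1 kHz-100 kHz band. The anchor values below are hypothetical placeholders, not Dalziel's published data, and the interpolation scheme is an assumption for illustration, not the paper's method.

```python
import math

# Hypothetical anchor points (frequency in Hz -> threshold current in mA).
# Illustrative placeholder values only, NOT Dalziel's published data.
ANCHORS = [(1_000, 1.0), (10_000, 10.0), (100_000, 100.0)]

def threshold_current(freq_hz, anchors=ANCHORS):
    """Piecewise-linear interpolation of a sensitivity-threshold curve in
    log-log coordinates over the modelled 1 kHz - 100 kHz band."""
    if not anchors[0][0] <= freq_hz <= anchors[-1][0]:
        raise ValueError("frequency outside modelled band")
    for (f0, i0), (f1, i1) in zip(anchors, anchors[1:]):
        if f0 <= freq_hz <= f1:
            a = (math.log10(freq_hz) - math.log10(f0)) / \
                (math.log10(f1) - math.log10(f0))
            return 10 ** (math.log10(i0) + a * (math.log10(i1) - math.log10(i0)))
```

A hardware realization would evaluate such a curve to set the stimulation amplitude at each working frequency.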

  5. Optical Profilometers Using Adaptive Signal Processing

    NASA Technical Reports Server (NTRS)

    Hall, Gregory A.; Youngquist, Robert; Mikhael, Wasfy

    2006-01-01

    A method of adaptive signal processing has been proposed as the basis of a new generation of interferometric optical profilometers for measuring surfaces. The proposed profilometers would be portable, hand-held units. Sizes could thus be reduced because the adaptive-signal-processing method would make it possible to substitute lower-power coherent light sources (e.g., laser diodes) for white light sources and would eliminate the need for most of the optical components of current white-light profilometers. The adaptive-signal-processing method would make it possible to attain scanning ranges of the order of decimeters in the proposed profilometers.

  6. The method of educational assessment affects children's neural processing and performance: behavioural and fMRI Evidence

    NASA Astrophysics Data System (ADS)

    Howard, Steven J.; Burianová, Hana; Calleia, Alysha; Fynes-Clinton, Samuel; Kervin, Lisa; Bokosmaty, Sahar

    2017-08-01

    Standardised educational assessments are now widespread, yet their development has given comparatively more consideration to what to assess than how to optimally assess students' competencies. Existing evidence from behavioural studies with children and neuroscience studies with adults suggests that the method of assessment may affect neural processing and performance, but current evidence remains limited. To investigate the impact of assessment methods on neural processing and performance in young children, we used functional magnetic resonance imaging to identify and quantify the neural correlates during performance across a range of current approaches to standardised spelling assessment. Results indicated that children's test performance declined as the cognitive load of assessment method increased. Activation of neural nodes associated with working memory further suggests that this performance decline may be a consequence of a higher cognitive load, rather than the complexity of the content. These findings provide insights into principles of assessment (re)design, to ensure assessment results are an accurate reflection of students' true levels of competency.

  7. Temperature and voltage stress dependent dielectric relaxation process of the doped Ba0.67Sr0.33TiO3 ceramics

    NASA Astrophysics Data System (ADS)

    Yan, Shiguang; Mao, Chaoliang; Wang, Genshui; Yao, Chunhua; Cao, Fei; Dong, Xianlin

    2013-09-01

    The current decay characteristic in the time domain is studied in Y3+ and Mn2+ modified Ba0.67Sr0.33TiO3 ceramics under different temperatures (25 °C-213 °C) and voltage stresses (0 V-800 V). The decay of the current is correlated with the overlapping of the relaxation process and leakage current. With respect to the inherent remarkable dielectric nonlinearity, a simple method through curve fitting is derived to differentiate these two currents. Two mechanisms of the relaxation process are proposed: a distribution of the potential barriers mode around room temperature and an electron injection mode at the elevated temperature of 110 °C.
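One common way to realize the curve-fitting separation described above is to assume a power-law (Curie-von Schweidler) relaxation on top of a steady leakage floor, and fit that form by least squares. The model form, the tail-based leakage estimate and the cutoff parameter below are assumptions for illustration and may differ from the paper's actual fitting procedure.

```python
import math

def separate_currents(t, i, tail=5, cutoff=0.05):
    """Split a measured decay current i(t) into a power-law relaxation
    component A * t**(-n) plus a steady leakage floor.

    The leakage is estimated from the long-time tail of the decay (so the
    samples must extend well past the relaxation); n and A come from a
    least-squares line fit to ln(i - leak) versus ln(t), keeping only
    points well above the leakage floor."""
    leak = sum(i[-tail:]) / tail          # long-time asymptote ~ leakage
    r0 = i[0] - leak                      # initial relaxation amplitude
    xs, ys = [], []
    for tk, ik in zip(t, i):
        r = ik - leak
        if r > cutoff * r0:               # ignore points near the floor
            xs.append(math.log(tk))
            ys.append(math.log(r))
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    # ordinary least squares for ln(i - leak) = ln(A) - n * ln(t)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return math.exp(my - slope * mx), -slope, leak   # A, n, leakage
```

Applied to a decay trace, this returns the relaxation amplitude and exponent alongside the leakage current, i.e. the two overlapping contributions the abstract differentiates.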

  8. Low-cost manufacturing of the point focus concentrating module and its key component, the Fresnel lens. Final subcontract report, 31 January 1991--6 May 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saifee, T.; Konnerth, A. III

    1991-11-01

    Solar Kinetics, Inc. (SKI) has been developing point-focus concentrating PV modules since 1986. SKI is currently in a position to manufacture between 200 and 600 kilowatts annually of the current design by a combination of manual and semi-automated methods. This report reviews the current status of module manufacture and specifies the required approach to achieve a high-volume manufacturing capability and low cost. The approach taken will include process development concurrent with module design for automated manufacturing. The current effort reviews the major manufacturing costs and identifies components and processes whose improvements would produce the greatest effect on manufacturability and cost reduction. The Fresnel lens is one such key component. Investigating specific alternative manufacturing methods and sources has substantially reduced the lens costs and has exceeded the DOE cost-reduction goals. 15 refs.

  9. Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.

    1991-01-01

    An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method. This method is currently in beta test. Ongoing efforts in the formulation of formal semantics models for descriptions captured in the outlined form and in the actual application of this method can be expected to cause an evolution in the method language. A language is described for the representation of process and object state centered system description. IDEF3 is a scenario driven process flow modeling methodology created specifically for these types of descriptive activities.

  10. A unified method to process biosolids samples for the recovery of bacterial, viral, and helminths pathogens.

    PubMed

    Alum, Absar; Rock, Channah; Abbaszadegan, Morteza

    2014-01-01

    For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in residual biosolids. The current EPA methods for the detection of these groups of pathogens in biosolids include discrete steps. Therefore, a separate sample is processed independently to quantify the number of each group of the pathogens in biosolids. The aim of the study was to develop a unified method for simultaneous processing of a single biosolids sample to recover bacterial, viral, and helminth pathogens. At the first stage of developing a simultaneous method, nine eluents were compared for their efficiency in recovering viruses from a 100 g spiked biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments were performed to optimize performance of the glycine-based eluent under various procedural factors such as solids-to-eluent ratio, stir time, and centrifugation conditions. Lastly, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked in duplicate samples of biosolids collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method compared to 34% and 68% by the EPA method, respectively. The unified sample processing method significantly reduces the time required for processing biosolids samples for different groups of pathogens; it is less impacted by the intrinsic variability of samples, while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.

  11. Polymeric Packaging for Fully Implantable Wireless Neural Microsensors

    PubMed Central

    Aceros, Juan; Yin, Ming; Borton, David A.; Patterson, William R.; Bull, Christopher; Nurmikko, Arto V.

    2014-01-01

    We present polymeric packaging methods used for subcutaneous, fully implantable, broadband, and wireless neurosensors. A new tool for accelerated testing and characterization of biocompatible polymeric packaging materials and processes is described along with specialized test units to simulate our fully implantable neurosensor components, materials and fabrication processes. A brief description of the implantable systems is presented along with their current encapsulation methods based on polydimethylsiloxane (PDMS). Results from in-vivo testing of multiple implanted neurosensors in swine and non-human primates are presented. Finally, a novel augmenting polymer thin film material to complement the currently employed PDMS is introduced. This thin layer coating material is based on the Plasma Enhanced Chemical Vapor Deposition (PECVD) process of Hexamethyldisiloxane (HMDSO) and Oxygen (O2). PMID:23365999

  12. CO2 Capture Using Electric Fields: Low-Cost Electrochromic Film on Plastic for Net-Zero Energy Building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-01-01

    Broad Funding Opportunity Announcement Project: Two faculty members at Lehigh University created a new technique called supercapacitive swing adsorption (SSA) that uses electrical charges to encourage materials to capture and release CO2. Current CO2 capture methods include expensive processes that involve changes in temperature or pressure. Lehigh University’s approach uses electric fields to improve the ability of inexpensive carbon sorbents to trap CO2. Because this process uses electric fields and not electric current, the overall energy consumption is projected to be much lower than conventional methods. Lehigh University is now optimizing the materials to maximize CO2 capture and minimize the energy needed for the process.

  13. Computational problems and signal processing in SETI

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. K.; Stauduhar, Richard

    1991-01-01

    The Search for Extraterrestrial Intelligence (SETI), currently being planned at NASA, will require that an enormous amount of data (on the order of 10 exp 11 distinct signal paths for a typical observation) be analyzed in real time by special-purpose hardware. Even though the SETI system design is not based on maximum entropy and Bayesian methods (partly due to the real-time processing constraint), it is expected that enough data will be saved to be able to apply these and other methods off line where computational complexity is not an overriding issue. Interesting computational problems that relate directly to the system design for processing such an enormous amount of data have emerged. Some of these problems are discussed, along with the current status on their solution.

  14. Holographic Methods Of Dynamic Particulate Measurements - Current Status

    NASA Astrophysics Data System (ADS)

    Thompson, Brian J.

    1983-03-01

    The field of holographic particulate measurements continues to be very active with many new applications in such diverse fields as bubble chamber recording and contaminant measurements in small vials. The methods have also been extended to measure velocity distributions of particles within a volume, particularly by the application of subsequent image processing methods. These techniques could be coupled with hybrid systems to become near real time. The current status of these more recent developments is reviewed.

  15. Ion beam activation for materials analysis: Methods and application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conlon, T.W.

    1981-04-01

    A number of ion beam methods for materials analysis have been developed using Harwell's high voltage accelerators and these are currently being exploited for applications 'in house' and in industry. Ion beam activation is a relatively new area which has exhibited exceptional growth over the last few years. Activation by ion beams to produce a single dominant radioisotope as a surface label (thin layer activation or TLA) is becoming a mature technology offering ever increasing sensitivity for surface loss measurement (currently better than 0.1 μm or 10⁻⁷ cm³ depending on the method of measurement) and remote monitoring of inaccessible components during studies of wear/erosion/corrosion/sputtering and the like. With the increasingly established credibility of the method has come the realisation that: (i) more complex and even multiple activation profiles can be used to extract more information on the characteristics of the surface loss process, (ii) that an analogous method can be used even on radiation sensitive materials through the newly established indirect recoil implantation process, (iii) that there is scope for treatment of truly immovable objects through the implantation of fission fragments, (iv) there is vast potential in the area of activation analysis. The current state of development of these methods which greatly extend the scope of conventional TLA will be briefly reviewed. Current applications of these and TLA in industry are discussed.

  16. 21 CFR 113.5 - Current good manufacturing practice.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Current good manufacturing practice. 113.5 Section... CONTAINERS General Provisions § 113.5 Current good manufacturing practice. The criteria in §§ 113.10, 113.40..., methods, practices, and controls used by the commercial processor in the manufacture, processing, or...

  17. Orthorectification by Using Gpgpu Method

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kulur, S.

    2012-07-01

    Thanks to the nature of graphics processing, newly released products offer highly parallel processing units with high memory bandwidth and computational power exceeding a teraflop. Modern GPUs are not only powerful graphics engines but also highly parallel programmable processors with much faster computation and higher memory bandwidth than central processing units (CPUs). Data-parallel computation can be briefly described as mapping data elements to parallel processing threads. The rapid development of GPU programmability and capability has attracted the attention of researchers dealing with complex problems that require intensive calculation, giving rise to the concepts of "General-Purpose Computation on Graphics Processing Units (GPGPU)" and "stream processing". Graphics processors are powerful yet inexpensive hardware, and have therefore become an alternative to conventional processors. Graphics chips, once fixed-function application hardware, have been transformed into modern, powerful and programmable processors. The biggest obstacle is that GPUs use programming models that differ from current programming methods: efficient GPU programming requires re-coding the existing algorithm to account for the limitations and structure of the graphics hardware, and these many-core processors cannot be programmed with traditional, event-driven programming methods. GPUs are especially effective at repeating the same computing steps over many data elements when high accuracy is needed, 
carrying out the computation quickly and accurately, whereas CPUs, which execute one computation at a time under flow control, are slower. This study covers how general-purpose parallel programming and the computational power of GPUs can be used in photogrammetric applications, especially direct georeferencing. The direct georeferencing algorithm was coded using the GPGPU method and the CUDA (Compute Unified Device Architecture) programming language, and its results were compared with a traditional CPU implementation. In a second application, projective rectification was coded using the same GPGPU/CUDA approach and evaluated on sample images of various sizes. The GPGPU method is especially useful for repeating the same computations on highly dense data, yielding solutions quickly.
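The data-parallel mapping described here (one independent computation per output pixel) is what makes projective rectification a natural GPU workload. The sketch below shows the per-pixel homography mapping in plain Python for clarity; the authors' actual implementation is a CUDA kernel, and the flat row-major 9-element homography layout is an assumption for illustration.

```python
def rectify(src, h, out_w, out_h):
    """Projective rectification by inverse mapping: each output pixel (u, v)
    is sent through the 3x3 homography h (given row-major as 9 floats, an
    assumed layout) back into the source image and sampled nearest-neighbour.
    Every iteration is independent, which is exactly the data-parallel
    structure a CUDA kernel exploits: one GPU thread per output pixel."""
    src_h, src_w = len(src), len(src[0])
    out = [[0] * out_w for _ in range(out_h)]
    for v in range(out_h):          # on a GPU these two loops disappear:
        for u in range(out_w):      # (u, v) comes from the thread/block index
            d = h[6] * u + h[7] * v + h[8]            # homogeneous divisor
            x = (h[0] * u + h[1] * v + h[2]) / d
            y = (h[3] * u + h[4] * v + h[5]) / d
            xi, yi = int(round(x)), int(round(y))
            if 0 <= xi < src_w and 0 <= yi < src_h:   # outside source -> 0
                out[v][u] = src[yi][xi]
    return out
```

Because no pixel depends on any other, the speedup over a sequential CPU loop grows with image size, which matches the paper's motivation for the GPGPU approach.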

  18. Inspection of cup-shaped steel parts from the I.D. side using eddy current

    NASA Astrophysics Data System (ADS)

    Griffiths, Erick W.; Pearson, Lee H.

    2018-04-01

    An eddy current method was developed to inspect cup-shaped steel parts from the I.D. side. During the manufacturing process of these parts, a thin Al tape foil is applied to the I.D. side of the part. One of the critical process parameters is that only one foil layer can be applied. An eddy current inspection system was developed to reject parts with more than one foil layer. The Al tape foil is cut to length to fit the inner diameter, however, after application of the foil there is a gap created between the beginning and end of the foil. It was found that this gap interfered with the eddy current inspection causing a false positive indication. To solve this problem a sensor design and data analysis process were developed to overcome the effects of these gaps. The developed system incorporates simultaneous measurements from multiple eddy current sensors and signal processing to achieve a reliable inspection.

  19. Manufacturing Methods & Technology Project Execution Report. First CY 83.

    DTIC Science & Technology

    1983-11-01

    OCCURRENCE. H 83 5180 MMT FOR METAL DEWAR AND UNBONDED LEADS THE GOLD WIRE BONDED CONNECTIONS ARE MADE BY HAND WHICH IS A TEDIOUS AND EXPENSIVE PROCESS. THE...ATTACHMENTS CURRENT FILAMENT WOUND COMPOSITE ROCKET MOTOR CASES REQUIRE FORGED METAL POLE PIECES, NOZZLE CLOSURE ATTACHMENT RINGS, AND OTHER ATTACHMENT RINGS... ELASTOMER INSULATOR PROCESS LARGE TACTICAL ROCKET MOTOR INSULATORS ARE COSTLY, LACK DESIGN CHANGE FLEXIBILITY AND SUFFER LONG LEAD TIMES. CURRENT

  20. Exploring selection and recruitment processes for newly qualified nurses: a sequential-explanatory mixed-method study.

    PubMed

    Newton, Paul; Chandler, Val; Morris-Thomson, Trish; Sayer, Jane; Burke, Linda

    2015-01-01

    To map current selection and recruitment processes for newly qualified nurses and to explore the advantages and limitations of current selection and recruitment processes. The need to improve current selection and recruitment practices for newly qualified nurses is highlighted in health policy internationally. A cross-sectional, sequential-explanatory mixed-method design with 4 components: (1) a literature review of selection and recruitment of newly qualified nurses; (2) a literature review of a public sector profession's selection and recruitment processes; (3) a survey mapping existing selection and recruitment processes for newly qualified nurses; and (4) a qualitative study of recruiters' selection and recruitment processes. Literature searches on the selection and recruitment of newly qualified candidates in teaching and nursing (2005-2013) were conducted. Cross-sectional, mixed-method data were collected using a survey instrument from thirty-one (n = 31) individuals in health providers in London who had responsibility for the selection and recruitment of newly qualified nurses. Of the providers who took part, six (n = 6) were purposively selected for qualitative interviews. Issues of supply and demand in the workforce, rather than selection and recruitment tools, predominated in the literature reviews. Examples of tools to measure values, attitudes and skills were found in the nursing literature. The mapping exercise found that providers used many selection and recruitment tools; some providers combined tools to streamline the process and assure the quality of candidates. Most providers had processes which addressed the issue of quality in the selection and recruitment of newly qualified nurses. The 'assessment centre model', which providers were adopting, allowed for multiple levels of assessment and streamlined recruitment. There is a need to validate the efficacy of the selection tools. © 2014 John Wiley & Sons Ltd.

  1. [Male contraception - the current state of knowledge].

    PubMed

    Zdrojewicz, Zygmynt; Kasperska, Karolina; Lewandowska, Marta

    2016-08-01

    Contraception is important from a health, psychological and socioeconomic point of view. Because male-based contraceptive methods are mostly limited to condoms and vasectomy, researchers are working on new solutions that could let men be more involved in conscious family planning. In this review we present the current state of knowledge on this subject. There is much activity in the field of hormonal contraception: studies of testosterone, progestins, synthetic androgens and other derivatives are at different stages of clinical trials and mostly demonstrate high efficacy rates. Recent discoveries of the Izumo and Juno proteins, essential for the fertilization process, give hope for an easily reversible, non-hormonal method. Researchers are also trying to interfere with the process of spermatogenesis using the BRDT inhibitor JQ1, or to neutralize sperm by injecting styrene maleic anhydride (SMA) into the lumen of the vas deferens. Other studies explore processes involved in proper sperm motility. A vaccine that induces an immune response against the reproductive system is also an interesting method, and the latest research uses ultrasound waves and a mechanical device that blocks the patency of the vas deferens. The aim of this study is to present the current state of knowledge on male contraception. © 2016 MEDPRESS.

  2. Analytical technologies for influenza virus-like particle candidate vaccines: challenges and emerging approaches

    PubMed Central

    2013-01-01

    Influenza virus-like particle (VLP) vaccines are one of the most promising ways to respond to the threat of future influenza pandemics. VLPs are composed of viral antigens but lack nucleic acids, making them non-infectious, which limits the risk of recombination with wild-type strains. By taking advantage of the advancements in cell culture technologies, the process from strain identification to manufacturing has the potential to be completed rapidly and easily at large scales. After closely reviewing the current research done on influenza VLPs, it is evident that the development of quantification methods has been consistently overlooked. VLP quantification at all stages of the production process has been left to rely on current influenza quantification methods (i.e. hemagglutination assay (HA), single radial immunodiffusion assay (SRID), NA enzymatic activity assays, Western blot, electron microscopy). These are analytical methods developed decades ago for influenza virions and final bulk influenza vaccines. Although these methods are time-consuming and cumbersome, they have been sufficient for the characterization of final purified material. Nevertheless, these analytical methods are impractical for in-line process monitoring because VLP concentration in crude samples generally falls outside the range of detection of these methods. This consequently impedes the development of robust influenza-VLP production and purification processes. Thus, the development of functional process analytical techniques, applicable at every stage of production and compatible with different production platforms, is greatly needed to assess, optimize and exploit the full potential of novel manufacturing platforms. PMID:23642219

  3. Enzymatic cell disruption of microalgae biomass in biorefinery processes.

    PubMed

    Demuez, Marie; Mahdy, Ahmed; Tomás-Pejó, Elia; González-Fernández, Cristina; Ballesteros, Mercedes

    2015-10-01

    When employing biotechnological processes to obtain biofuels and bio-products from microalgae, one of the most critical steps affecting economics and yields is the cell disruption stage. Currently, enzymatic cell disruption delivers effective and cost-competitive results compared to mechanical and chemical cell disruption methods. However, the introduction of enzymes implies an additional cost within the overall process. To reduce this cost, autolysis of microalgae is proposed as an alternative enzymatic cell disruption method. This review provides the state of the art of enzymatic cell disruption treatments employed in biorefinery processes and highlights the use of endopeptidases. Among the enzymatic processes of the microalgal life cycle, some lytic enzymes involved in cell division and programmed cell death have been proven useful in performing cell lysis; in this context, the role of endopeptidases is emphasized. Mirroring these natural events, an alternative cell disruption approach with the potential to induce autolysis using intrinsic cell enzymes is proposed and described. Integrating induced autolysis within biofuel production processes offers a promising approach to reduce the overall costs and energy input associated with current cell disruption methods. A number of options for further inquiry are also discussed. © 2015 Wiley Periodicals, Inc.

  4. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine whether this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive) for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. The conclusions of this study follow. (1) The course is instructionally effective. (2) The simulation has a positive effect on students' confidence in their ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods, over the way code inspection training is currently conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full-motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates that interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option that involves creating new source code and data files but reuses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes it will produce the most effective results.

  5. Process for using surface strain measurements to obtain operational loads for complex structures

    NASA Technical Reports Server (NTRS)

    Ko, William L. (Inventor); Richards, William Lance (Inventor)

    2010-01-01

    The invention is an improved process for using surface strain data to obtain real-time, operational loads data for complex structures that significantly reduces the time and cost versus current methods.

  6. Flexible bottom-gate graphene transistors on Parylene C substrate and the effect of current annealing

    PubMed Central

    Kim, Hyungsoo; Bong, Jihye; Mikael, Solomon; Kim, Tong June; Williams, Justin C.; Ma, Zhenqiang

    2016-01-01

    Flexible graphene transistors built on a biocompatible Parylene C substrate would enable active circuitry to be integrated into flexible implantable biomedical devices. An annealing method that improves the performance of a flexible transistor without damaging the flexible substrate is also desirable. Here, we present a fabrication method for a flexible graphene transistor with a bottom-gate coplanar structure on a Parylene C substrate, and we study a current annealing method and its effect on device performance. The localized heat generated by current annealing improves the drain current, which is attributed to a decreased contact resistance between the graphene and the source/drain electrodes. A maximum current annealing power for the Parylene C-based graphene transistor has been extracted to provide a guideline for appropriate current annealing. The fabricated flexible graphene transistor shows a field-effect mobility, maximum transconductance, and Ion/Ioff ratio of 533.5 cm2/V s, 58.1 μS, and 1.76, respectively. The low-temperature process and the current annealing method presented here should be useful for fabricating flexible electronics based on two-dimensional materials. PMID:27795570

  7. A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.

    2015-01-01

    A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density-gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to two current methods: a manual point-and-click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location compared to other methods presented in the literature.

  8. Study of flow over object problems by a nodal discontinuous Galerkin-lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Shen, Meng; Liu, Chen

    2018-04-01

    Flow-over-object problems are studied by a nodal discontinuous Galerkin-lattice Boltzmann method (NDG-LBM) in this work. Unlike the standard lattice Boltzmann method, the current method applies the nodal discontinuous Galerkin method to the streaming step of the LBM to solve the resulting pure convection equation, in which the spatial discretization is performed on unstructured grids and a low-storage explicit Runge-Kutta scheme is used for time marching. The present method thus removes the standard LBM's dependence on uniform meshes. Moreover, the collision step of the LBM uses the multiple-relaxation-time scheme. After validating the NDG-LBM by simulating lid-driven cavity flow, simulations of flow over a fixed circular cylinder, a stationary airfoil and rotating-stationary cylinders are performed. The present results agree well with previous results, indicating that the current NDG-LBM is accurate and effective for flow-over-object problems.

  9. Locating and targeting moving tumors with radiation beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dieterich, Sonja; Cleary, Kevin; D'Souza, Warren

    2008-12-15

    The current climate of rapid technological evolution is reflected in newer and better methods to modulate and direct radiation beams for cancer therapy. This Vision 20/20 paper focuses on part of this evolution, locating and targeting moving tumors. The two processes are somewhat independent and in principle different implementations of the locating and targeting processes can be interchanged. Advanced localization and targeting methods have an impact on treatment planning and also present new challenges for quality assurance (QA), that of verifying real-time delivery. Some methods to locate and target moving tumors with radiation beams are currently FDA approved for clinical use, and this availability and implementation will increase with time. Extensions of current capabilities will be the integration of higher order dimensionality, such as rotation and deformation in addition to translation, into the estimate of the patient pose and real-time reoptimization and adaption of delivery to the dynamically changing anatomy of cancer patients.

  10. Apparatus and method for measuring critical current properties of a coated conductor

    DOEpatents

    Mueller, Fred M [Los Alamos, NM]; Haenisch, Jens [Dresden, DE]

    2012-07-24

    The transverse critical-current uniformity in a superconducting tape was determined using a magnetic knife apparatus. A critical current (Ic) distribution and transverse critical current density (Jc) distribution in YBCO coated conductors were measured nondestructively with high resolution using a magnetic knife apparatus. The method utilizes the strong depression of Jc in applied magnetic fields. A narrow region of low, including zero, magnetic field in a surrounding higher field is moved transversely across a sample of coated conductor. This reveals the critical current density distribution. A Fourier series inversion process was used to determine the transverse Jc distribution in the sample.
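
The Fourier-inversion step can be illustrated with a toy one-dimensional model: if the measured critical current versus knife position is approximated as the local Jc profile convolved with the knife's low-field window, the transverse Jc distribution can be recovered by regularized division in Fourier space. The window shape, Jc profile, and regularizer below are assumptions for illustration, not the patent's exact procedure.

```python
import numpy as np

# Toy model: measured Ic(y) ~ Jc(y) convolved with the knife's
# low-field window; recover Jc by regularized Fourier inversion.
n = 256
y = np.linspace(-6, 6, n)
jc_true = np.exp(-y ** 2)                    # hypothetical transverse Jc profile
window = (np.abs(y) < 0.5).astype(float)     # assumed low-field window of the knife
window /= window.sum()

K = np.fft.fft(np.fft.ifftshift(window))     # near-zero-phase window spectrum
ic_meas = np.real(np.fft.ifft(np.fft.fft(jc_true) * K))   # simulated scan

eps = 1e-3                                   # Tikhonov-style regularizer
jc_rec = np.real(np.fft.ifft(np.fft.fft(ic_meas) * np.conj(K) /
                             (np.abs(K) ** 2 + eps)))
```

In practice the window is set by the magnet geometry, and the inversion must also cope with measurement noise, which is what the regularizer stands in for here.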

  11. Fault Detection and Diagnosis In Hall-Héroult Cells Based on Individual Anode Current Measurements Using Dynamic Kernel PCA

    NASA Astrophysics Data System (ADS)

    Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey

    2018-04-01

    Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
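
As a much-reduced sketch of the monitoring idea, the snippet below fits a kernel PCA model (RBF kernel) on a window of simulated normal data and scores new samples with a squared-prediction-error-like statistic; the dynamic lagging, moving-window updating, and control-limit calculation of the paper are omitted, and all data are made up.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Pairwise RBF kernel between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_spe(train, test, gamma=0.5, n_comp=2):
    """SPE-like monitoring statistic of `test` samples against a
    kernel PCA model fitted on `train` (rows are observations)."""
    n = len(train)
    K = rbf_kernel(train, train, gamma)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one   # center in feature space
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_comp]           # leading components
    alpha = V[:, idx] / np.sqrt(np.maximum(w[idx], 1e-12))
    Kt = rbf_kernel(test, train, gamma)
    ot = np.ones((len(test), n)) / n
    Ktc = Kt - ot @ K - Kt @ one + ot @ K @ one  # center test kernel rows
    scores = Ktc @ alpha
    ktt = 1.0 - 2.0 * Kt.mean(axis=1) + K.mean() # centered k(x, x) for RBF
    return ktt - (scores ** 2).sum(axis=1)

rng = np.random.default_rng(0)
train = rng.normal(0.0, 0.1, size=(50, 3))       # normal operating window
normal = rng.normal(0.0, 0.1, size=(5, 3))
fault = normal + np.array([2.0, -2.0, 2.0])      # simulated local abnormality
spe_normal = kpca_spe(train, normal)
spe_fault = kpca_spe(train, fault)
```

A fault would be declared when the statistic exceeds a control limit estimated from the training window.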

  12. Elaboration and formalization of current scientific knowledge of risks and preventive measures illustrated by colorectal cancer.

    PubMed

    Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M

    2001-01-01

    We present the method used to elaborate and formalize current scientific knowledge in order to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized preventive recommendations or early screening measures. The approach suggested in this article is in line with medical procedures based on levels of evidence (evidence-based medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. Incorporating the most current scientific evidence in a cyclical process includes several steps: critical analysis according to the evidence-based medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.

  13. Kinetics of electrolysis current reversal boriding of tool steels in a boron-containing oxychloride melt based on CaCl2

    NASA Astrophysics Data System (ADS)

    Chernov, Ya. B.; Filatov, E. S.

    2017-08-01

    The kinetics of thermal diffusion boriding in a melt based on calcium chloride with a boron oxide additive is studied using reversed current. The main temperature, concentration, and current parameters of the process are determined. The phase composition of the coating is determined by a metallographic method.

  14. A Study on Improving Information Processing Abilities Based on PBL

    ERIC Educational Resources Information Center

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  15. Early strength prediction of concrete based on accelerated curing methods : final report.

    DOT National Transportation Integrated Search

    1995-12-01

    Concrete mix designs and components may currently be changed during the course of a project. The possible negative effects of such changes on concrete strength are not determined under the current plant control/project control process. Also, the cur...

  16. Characterization of plasma processing induced charging damage to MOS devices

    NASA Astrophysics Data System (ADS)

    Ma, Shawming

    1997-12-01

    Plasma processing has become an integral part of the fabrication of integrated circuits, accounting for at least 30% of all process steps, since it offers advantages in directionality, low temperature and process convenience. However, wafer charging during plasma processes is a significant concern for both thin-oxide damage and profile distortion. In this work, the factors affecting this damage are explained in terms of plasma issues, device structure and oxide quality. The SPORT (Stanford Plasma On-wafer Real Time) charging probe was developed to investigate the charging mechanism of different plasma processes, including poly-Si etching, resist ashing and PECVD. The basic idea of this probe is that it mimics a real device structure in the plasma environment and allows plasma-induced charging voltages and currents to be measured directly in real time. This measurement is fully consistent with other charging-voltage measurements, but it is the only one performed in real time. The effect of magnetic-field-induced plasma nonuniformity on spatially dependent charging is well explained by this measurement. In addition, plasma parameters including the ion current density and electron temperature can be extracted from the probe's plasma I-V characteristics using a dc-Langmuir-probe-like theory. It is shown that the MOS device tunneling current from charging, its dependence on antenna ratio, and the etch uniformity can all be predicted using this measurement. Moreover, the real-time measurement reveals transient and electrode-edge effects during processing. Furthermore, electron shading effects induced by high-aspect-ratio patterns can also be characterized with the probe. On the oxide-quality side, wafer temperature during plasma processing has been shown experimentally to be critical to charging damage. Finally, different MOS capacitor testing methods, including breakdown voltage, charge-to-breakdown, gate leakage current and voltage-time at constant current bias, were compared to find the optimum method for charging-damage reliability testing.
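
The dc-Langmuir-probe-like extraction of electron temperature mentioned above can be sketched on synthetic data: in the electron-retardation region the electron current grows as exp(V/Te), so Te (in eV) is the inverse slope of ln(Ie) versus V. All numbers below are invented for illustration; this is not the SPORT probe's actual analysis code.

```python
import numpy as np

# Synthetic retardation-region I-V trace (all values assumed)
te_true = 3.0                          # electron temperature, eV
v = np.linspace(-20.0, -5.0, 60)       # probe bias, volts
i_ion = -0.1e-3                        # ion saturation current, A
i_e0 = 1.0e-3                          # electron current prefactor, A
current = i_ion + i_e0 * np.exp(v / te_true)

# Te extraction: subtract the ion contribution, fit ln(Ie) versus V
i_electron = current - i_ion
slope, intercept = np.polyfit(v, np.log(i_electron), 1)
te_ev = 1.0 / slope                    # Te in eV is the inverse slope
```

With noisy data the fit would be restricted to the clearly exponential region below the plasma potential.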

  17. Optimization of Gas Metal Arc Welding Process Parameters

    NASA Astrophysics Data System (ADS)

    Kumar, Amit; Khurana, M. K.; Yadav, Pradeep K.

    2016-09-01

    This study presents the application of the Taguchi method combined with grey relational analysis to optimize the process parameters of gas metal arc welding (GMAW) of AISI 1020 carbon steel for multiple quality characteristics (bead width, bead height, weld penetration and heat-affected zone). An L9 orthogonal array was employed for fabrication of the joints. The experiments were conducted according to combinations of voltage (V), current (A) and welding speed (Ws). The results revealed that welding speed is the most significant process parameter. By analyzing the grey relational grades, optimal parameters were obtained, and significant factors were identified using ANOVA. The welding parameters speed, current and voltage were optimized for AISI 1020 using the GMAW process. To fortify the robustness of the experimental design, a confirmation test was performed at the selected optimal process parameter settings. Observations from this method may be useful for automotive sub-assemblies, shipbuilding and vessel fabricators and operators seeking optimal welding conditions.
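
The grey relational grade computation works roughly as follows: each response is normalized (larger-the-better or smaller-the-better), deviations from the ideal sequence are converted to grey relational coefficients, and the grade is their mean per experiment. The sketch below uses invented responses (bead width, penetration), not the paper's L9 data.

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5):
    """Grey relational grade per experiment.
    responses: (n_experiments, n_characteristics); zeta: distinguishing coefficient."""
    X = np.asarray(responses, dtype=float)
    norm = np.empty_like(X)
    for j in range(X.shape[1]):
        lo, hi = X[:, j].min(), X[:, j].max()
        if larger_better[j]:
            norm[:, j] = (X[:, j] - lo) / (hi - lo)
        else:
            norm[:, j] = (hi - X[:, j]) / (hi - lo)
    delta = 1.0 - norm                          # deviation from the ideal sequence
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + zeta * dmax) / (delta + zeta * dmax)
    return coeff.mean(axis=1)

# Hypothetical responses: bead width (mm, smaller better), penetration (mm, larger better)
resp = [[6.1, 3.2], [5.4, 4.0], [5.9, 3.5]]
grades = grey_relational_grade(resp, larger_better=[False, True])
best = int(np.argmax(grades))                   # experiment with the highest grade
```

The run with the highest grade is taken as the best compromise across all quality characteristics.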

  18. The California general plan process and sustainable transportation planning

    DOT National Transportation Integrated Search

    2001-11-01

    This study reviewed the current and potential utility of California's General Plan process as a tool for promoting more sustainable local transportation systems. The study used multiple methods to investigate this issue, including: 1. An extensive lit...

  19. Using transcranial direct-current stimulation (tDCS) to understand cognitive processing.

    PubMed

    Reinhart, Robert M G; Cosman, Josh D; Fukuda, Keisuke; Woodman, Geoffrey F

    2017-01-01

    Noninvasive brain stimulation methods are becoming increasingly common tools in the kit of the cognitive scientist. In particular, transcranial direct-current stimulation (tDCS) is showing great promise as a tool to causally manipulate the brain and understand how information is processed. The popularity of this method of brain stimulation is based on the fact that it is safe, inexpensive, its effects are long lasting, and you can increase the likelihood that neurons will fire near one electrode and decrease the likelihood that neurons will fire near another. However, this method of manipulating the brain to draw causal inferences is not without complication. Because tDCS methods continue to be refined and are not yet standardized, there are reports in the literature that show some striking inconsistencies. Primary among the complications of the technique is that the tDCS method uses two or more electrodes to pass current and all of these electrodes will have effects on the tissue underneath them. In this tutorial, we will share what we have learned about using tDCS to manipulate how the brain perceives, attends, remembers, and responds to information from our environment. Our goal is to provide a starting point for new users of tDCS and spur discussion of the standardization of methods to enhance replicability.

  1. Investigation of optical current transformer signal processing method based on an improved Kalman algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Ge, Jin-ming; Zhang, Guo-qing; Yu, Wen-bin; Liu, Rui-tong; Fan, Wei; Yang, Ying-xuan

    2018-01-01

    This paper explores the problem of signal processing in optical current transformers (OCTs). Based on the noise characteristics of OCTs, such as overlapping signal and noise frequency bands, low signal-to-noise ratios, and difficulty in acquiring the statistical features of the noise power, an improved standard Kalman filtering algorithm is proposed for direct current (DC) signal processing. The state-space model of the OCT DC measurement system is first established, and mixed noise is then handled by incorporating it into the measurement and state parameters. According to the minimum mean squared error criterion, the state-prediction and update equations of the improved Kalman algorithm are deduced from the established model. An improved central difference Kalman filter is proposed for alternating current (AC) signal processing, which improves the sampling strategy and the treatment of colored noise. Real-time estimation and correction of noise are achieved by designing AC and DC noise recursive filters. Experimental results show that the improved signal processing algorithms have a good filtering effect on AC and DC signals with the mixed noise of an OCT. Furthermore, the proposed algorithm achieves real-time correction of noise during the OCT filtering process.
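
For orientation, a standard scalar Kalman filter applied to a noisy DC current measurement looks like the sketch below; the paper's improvements (modeling the mixed noise in the state and measurement parameters) are not reproduced here, and the noise variances are invented.

```python
import numpy as np

def kalman_dc(z, q=1e-6, r=0.25):
    """Scalar Kalman filter for a near-constant (DC) signal.
    z: measurements; q: process-noise variance; r: measurement-noise variance."""
    x, p = z[0], 1.0                   # initial state estimate and covariance
    est = []
    for zk in z:
        p = p + q                      # predict (identity state transition)
        k = p / (p + r)                # Kalman gain
        x = x + k * (zk - x)           # correct with the innovation
        p = (1.0 - k) * p
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(1)
true_dc = 5.0                          # hypothetical DC current level
z = true_dc + rng.normal(0.0, 0.5, 500)
est = kalman_dc(z)
```

The filtered trace converges toward the underlying DC level while strongly suppressing the measurement noise; the improved algorithm in the paper additionally tracks and corrects the noise statistics in real time.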

  2. Environmentally Responsible Microbiological Production of Energetic Ingredients

    DTIC Science & Technology

    2007-11-01

    effort was to develop an environmentally benign and economical microbial process for nitro-energetics production. The specific targets of this method...microbial production of nitro-based EM. As the processes and compounds of choice, RDX/HMX (nitramine) generation was selected. Microorganisms capable of...Current synthetic methods for the production of RDX and HMX utilize hexamine as the precursor. Hexamine is an industrial chemical available on a large

  3. Factors influencing tests of auditory processing: a perspective on current issues and relevant concerns.

    PubMed

    Cacace, Anthony T; McFarland, Dennis J

    2013-01-01

    Tests of auditory perception, such as those used in the assessment of central auditory processing disorders ([C]APDs), represent a domain in audiological assessment where measurement of this theoretical construct is often confounded by nonauditory abilities due to methodological shortcomings. These confounds include the effects of cognitive variables such as memory and attention and suboptimal testing paradigms, including the use of verbal reproduction as a form of response selection. We argue that these factors need to be controlled more carefully and/or modified so that their impact on tests of auditory and visual perception is only minimal. To advocate for a stronger theoretical framework than currently exists and to suggest better methodological strategies to improve assessment of auditory processing disorders (APDs). Emphasis is placed on adaptive forced-choice psychophysical methods and the use of matched tasks in multiple sensory modalities to achieve these goals. Together, this approach has potential to improve the construct validity of the diagnosis, enhance and develop theory, and evolve into a preferred method of testing. Examination of methods commonly used in studies of APDs. Where possible, currently used methodology is compared to contemporary psychophysical methods that emphasize computer-controlled forced-choice paradigms. In many cases, the procedures used in studies of APD introduce confounding factors that could be minimized if computer-controlled forced-choice psychophysical methods were utilized. Ambiguities of interpretation, indeterminate diagnoses, and unwanted confounds can be avoided by minimizing memory and attentional demands on the input end and precluding the use of response-selection strategies that use complex motor processes on the output end. 
Advocated are the use of computer-controlled forced-choice psychophysical paradigms in combination with matched tasks in multiple sensory modalities to enhance the prospect of obtaining a valid diagnosis. American Academy of Audiology.

  4. A Current View of Learning Disabilities.

    ERIC Educational Resources Information Center

    Feagans, Lynne

    1983-01-01

    The issue of defining learning disability is considered. Important recent trends in research are reviewed with regard to: intellectual skills, academic retardation, neurological and behavioral dysfunction, and cognitive and interactive processes. Current intervention methods are also briefly described. Available from: Journal of Pediatrics, C.V.…

  5. Surface Coating of Oxide Powders: A New Synthesis Method to Process Biomedical Grade Nano-Composites

    PubMed Central

    Palmero, Paola; Montanaro, Laura; Reveron, Helen; Chevalier, Jérôme

    2014-01-01

    Composite and nanocomposite ceramics have achieved special interest in recent years when used for biomedical applications. They have demonstrated, in some cases, increased performance, reliability, and stability in vivo, with respect to pure monolithic ceramics. Current research aims at developing new compositions and architectures to further increase their properties. However, the ability to tailor the microstructure requires the careful control of all steps of manufacturing, from the synthesis of composite nanopowders, to their processing and sintering. This review aims at deepening understanding of the critical issues associated with the manufacturing of nanocomposite ceramics, focusing on the key role of the synthesis methods to develop homogeneous and tailored microstructures. In this frame, the authors have developed an innovative method, named “surface-coating process”, in which matrix oxide powders are coated with inorganic precursors of the second phase. The method is illustrated into two case studies; the former, on Zirconia Toughened Alumina (ZTA) materials for orthopedic applications, and the latter, on Zirconia-based composites for dental implants, discussing the advances and the potential of the method, which can become a valuable alternative to the current synthesis process already used at a clinical and industrial scale. PMID:28788117

  6. Battery powered thought: enhancement of attention, learning, and memory in healthy adults using transcranial direct current stimulation.

    PubMed

    Coffman, Brian A; Clark, Vincent P; Parasuraman, Raja

    2014-01-15

    This article reviews studies demonstrating enhancement with transcranial direct current stimulation (tDCS) of attention, learning, and memory processes in healthy adults. Given that these are fundamental cognitive functions, they may also mediate stimulation effects on other higher-order processes such as decision-making and problem solving. Although tDCS research is still young, there have been a variety of methods used and cognitive processes tested. While these different methods have resulted in seemingly contradictory results among studies, many consistent and noteworthy effects of tDCS on attention, learning, and memory have been reported. The literature suggests that although tDCS as typically applied may not be as useful for localization of function in the brain as some other methods of brain stimulation, tDCS may be particularly well-suited for practical applications involving the enhancement of attention, learning, and memory, in both healthy subjects and in clinical populations. © 2013 Elsevier Inc. All rights reserved.

  7. Topological data analysis as a morphometric method: using persistent homology to demarcate a leaf morphospace

    USDA-ARS?s Scientific Manuscript database

    Current morphometric methods that comprehensively measure shape cannot compare the disparate leaf shapes found in flowering plants and are sensitive to processing artifacts. Here we describe a persistent homology approach to measuring shape. Persistent homology is a topological method (concerned wit...

  8. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently software engineers are the essential integrating factors tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. The engineers today empower the tools versus the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to accomplish this is to promote the evolution of software engineering from an ad hoc, labor intensive activity to a managed, technology supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment and educating the personnel.

  9. Overlap junctions for high coherence superconducting qubits

    NASA Astrophysics Data System (ADS)

    Wu, X.; Long, J. L.; Ku, H. S.; Lake, R. E.; Bal, M.; Pappas, D. P.

    2017-07-01

    Fabrication of sub-micron Josephson junctions is demonstrated using standard processing techniques for high-coherence, superconducting qubits. These junctions are made in two separate lithography steps with normal-angle evaporation. Most significantly, this work demonstrates that it is possible to achieve high coherence with junctions formed on aluminum surfaces cleaned in situ by Ar plasma before junction oxidation. This method eliminates the angle-dependent shadow masks typically used for small junctions. Therefore, this is conducive to the implementation of typical methods for improving margins and yield using conventional CMOS processing. The current method uses electron-beam lithography and an additive process to define the top and bottom electrodes. Extension of this work to optical lithography and subtractive processes is discussed.

  10. Application of Taguchi optimization on the cassava starch wastewater electrocoagulation using batch recycle method

    NASA Astrophysics Data System (ADS)

    Sudibyo, Hermida, L.; Suwardi

    2017-11-01

Tapioca wastewater is very difficult to treat; hence many tapioca factories cannot treat it well. One method able to overcome this problem is electrocoagulation. This process performs well when it is conducted as a batch recycle process with aluminum bipolar electrodes. However, the operating conditions have a significant effect on tapioca wastewater treatment using the batch recycle process. In this research, the Taguchi method was successfully applied to determine the optimum conditions and the interactions between parameters in the electrocoagulation process. The results show that current density, conductivity, electrode distance, and pH have a significant effect on the turbidity removal of cassava starch wastewater.
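    Taguchi analysis of this kind typically ranks factor settings by a signal-to-noise (S/N) ratio; for a response to be maximized, such as turbidity removal, the larger-is-better form applies. The sketch below illustrates that calculation; the replicate removal percentages are hypothetical, not data from the study.

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-is-better signal-to-noise ratio:
    S/N = -10 * log10(mean(1 / y^2)); a higher value is better."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / n)

# Hypothetical turbidity-removal percentages from replicate runs at one
# setting of current density, conductivity, electrode distance, and pH.
runs = [92.0, 94.5, 91.0]
sn = sn_larger_is_better(runs)
```

    Repeating this for every row of the orthogonal array and averaging S/N per factor level identifies the optimum condition for each parameter.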

  11. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    PubMed

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels of 250 ms in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
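    The two offloaded stages, spatial filtering (a matrix-matrix multiply) and autoregressive spectral estimation, can be sketched on the CPU with NumPy. This is a stand-in for the CUDA kernels, not the authors' implementation, and the Yule-Walker fit shown here is one common AR method.

```python
import numpy as np

def spatial_filter(weights, data):
    """Spatial filter as a matrix-matrix multiply:
    (channels_out x channels_in) @ (channels_in x samples)."""
    return weights @ data

def yule_walker(x, order):
    """Fit AR coefficients from the biased autocorrelation sequence."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])      # AR coefficients
    sigma2 = r[0] - a @ r[1:]          # innovation variance
    return a, sigma2

def ar_psd(x, order, nfreq=64):
    """Evaluate the fitted AR model's power spectrum on [0, pi]."""
    a, sigma2 = yule_walker(x, order)
    w = np.linspace(0.0, np.pi, nfreq)
    h = 1 - np.exp(-1j * np.outer(w, np.arange(1, order + 1))) @ a
    return sigma2 / np.abs(h) ** 2

# Synthetic AR(2) "channel": x[t] = 0.5 x[t-1] - 0.3 x[t-2] + e[t]
rng = np.random.default_rng(0)
e = rng.standard_normal(5000)
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + e[t]
psd = ar_psd(x, order=2)
```

    On a GPU, both stages parallelize naturally: the matrix multiply across output channels, and the AR fit independently per channel.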

  12. Current conceptual challenges in the study of rhythm processing deficits.

    PubMed

    Tranchant, Pauline; Vuvan, Dominique T

    2015-01-01

    Interest in the study of rhythm processing deficits (RPD) is currently growing in the cognitive neuroscience community, as this type of investigation constitutes a powerful tool for the understanding of normal rhythm processing. Because this field is in its infancy, it still lacks a common conceptual vocabulary to facilitate effective communication between different researchers and research groups. In this commentary, we provide a brief review of recent reports of RPD through the lens of one important empirical issue: the method by which beat perception is measured, and the consequences of method selection for the researcher's ability to specify which mechanisms are impaired in RPD. This critical reading advocates for the importance of matching measurement tools to the putative neurocognitive mechanisms under study, and reveals the need for effective and specific assessments of the different aspects of rhythm perception and synchronization.

  13. System and method for motor speed estimation of an electric motor

    DOEpatents

    Lu, Bin [Kenosha, WI]; Yan, Ting [Brookfield, WI]; Luebke, Charles John [Sussex, WI]; Sharma, Santosh Kumar [Viman Nagar, IN]

    2012-06-19

    A system and method for a motor management system includes a computer readable storage medium and a processing unit. The processing unit configured to determine a voltage value of a voltage input to an alternating current (AC) motor, determine a frequency value of at least one of a voltage input and a current input to the AC motor, determine a load value from the AC motor, and access a set of motor nameplate data, where the set of motor nameplate data includes a rated power, a rated speed, a rated frequency, and a rated voltage of the AC motor. The processing unit is also configured to estimate a motor speed based on the voltage value, the frequency value, the load value, and the set of nameplate data and also store the motor speed on the computer readable storage medium.
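    The patent abstract does not give the estimation formula; a common simplification consistent with the inputs it lists is to scale synchronous speed with supply frequency and to scale slip with load and inversely with the square of the applied voltage. The sketch below uses that simplification with hypothetical nameplate values; it is not the patented algorithm.

```python
def estimate_speed(voltage, frequency, load_fraction, nameplate):
    """Slip-scaling speed estimate (a simplification): synchronous speed
    tracks the supply frequency, while slip grows with load and with the
    inverse square of the applied voltage."""
    scale = frequency / nameplate["rated_frequency"]
    sync = (nameplate["rated_speed"] + nameplate["rated_slip_rpm"]) * scale
    slip = (nameplate["rated_slip_rpm"] * scale * load_fraction
            * (nameplate["rated_voltage"] / voltage) ** 2)
    return sync - slip

# Hypothetical nameplate for a 4-pole, 60 Hz motor (sync speed 1800 rpm)
nameplate = {"rated_speed": 1750.0, "rated_slip_rpm": 50.0,
             "rated_frequency": 60.0, "rated_voltage": 460.0}
speed = estimate_speed(460.0, 60.0, 1.0, nameplate)  # rated conditions
```

    At rated voltage, frequency, and full load this returns the rated speed; at no load it returns the synchronous speed.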

  14. The potential of novel infrared food processing technologies: case studies of those developed at the USDA-ARS

    USDA-ARS?s Scientific Manuscript database

    Infrared (IR) radiation heating has been considered as an alternative to current food and agricultural processing methods for improving product quality and safety, increasing energy and processing efficiency, and reducing water and chemical usage. As part of the electromagnetic spectrum, IR has the ...

  15. Current Sensor Fault Reconstruction for PMSM Drives

    PubMed Central

    Huang, Gang; Luo, Yi-Ping; Zhang, Chang-Fan; He, Jing; Huang, Yi-Shan

    2016-01-01

    This paper deals with a current sensor fault reconstruction algorithm for the torque closed-loop drive system of an interior PMSM. First, sensor faults are equated to actuator faults by a newly introduced state variable. Then, in αβ coordinates, based on the motor model with active flux linkage, a current observer is constructed with a specific sliding mode equivalent control methodology to eliminate the effects of unknown disturbances, and the phase current sensor faults are reconstructed by means of an adaptive method. Finally, an αβ axis current fault processing module is designed based on the reconstructed value. The feasibility and effectiveness of the proposed method are verified by simulation and experimental tests on the RT-LAB platform. PMID:26840317
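    The full observer operates on the two-axis motor model; the underlying principle of recovering an unknown input from the filtered switching injection can be shown on a scalar toy system. Everything below (plant, gains, fault) is a hypothetical illustration of the sliding-mode equivalent-control idea, not the paper's design.

```python
def reconstruct_fault(a=2.0, fault=1.0, k=3.0, dt=1e-4, steps=20000, tau=0.02):
    """Scalar sliding-mode observer sketch: once the estimation error
    slides on e = 0, the low-pass filtered switching term equals the
    unknown additive fault (the 'equivalent control')."""
    x, xhat, f_hat, u = 0.0, 0.0, 0.0, 0.5
    for _ in range(steps):
        v = -k if xhat > x else k          # switching injection, k > |fault|
        x += dt * (-a * x + u + fault)     # true plant with additive fault
        xhat += dt * (-a * xhat + u + v)   # observer
        f_hat += (dt / tau) * (v - f_hat)  # low-pass filter of v
    return f_hat

f_hat = reconstruct_fault()  # approaches the injected fault
```

    The switching gain k must dominate the fault magnitude for sliding to occur; the filter time constant tau trades reconstruction ripple against response speed.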

  16. Method and apparatus for casting conductive and semi-conductive materials

    DOEpatents

    Ciszek, T.F.

    1984-08-13

    A method and apparatus is disclosed for casting conductive and semi-conductive materials. The apparatus includes a plurality of conductive members arranged to define a container-like area having a desired cross-sectional shape. A portion or all of the conductive or semi-conductive material which is to be cast is introduced into the container-like area. A means is provided for inducing the flow of an electrical current in each of the conductive members, which currents act collectively to induce a current flow in the material. The induced current flow through the conductive members is in a direction substantially opposite to the induced current flow in the material so that the material is repelled from the conductive members during the casting process.

  17. Method and apparatus for casting conductive and semiconductive materials

    DOEpatents

    Ciszek, Theodore F.

    1986-01-01

    A method and apparatus is disclosed for casting conductive and semiconductive materials. The apparatus includes a plurality of conductive members arranged to define a container-like area having a desired cross-sectional shape. A portion or all of the conductive or semiconductive material which is to be cast is introduced into the container-like area. A means is provided for inducing the flow of an electrical current in each of the conductive members, which currents act collectively to induce a current flow in the material. The induced current flow through the conductive members is in a direction substantially opposite to the induced current flow in the material so that the material is repelled from the conductive members during the casting process.

  18. Functional relationship-based alarm processing system

    DOEpatents

    Corsberg, D.R.

    1988-04-22

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the functional relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated or deactivated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). The alarm processing system and method is sensitive to the dynamic nature of the process being monitored and is capable of changing the relative importance of each alarm as necessary. 12 figs.
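    A minimal sketch of the filtering idea: each active alarm is re-ranked against the currently active set using a relationship table. The alarm names, the two-level importance scale, and the table entries below are hypothetical; the patented method also handles required-action relationships, process signals, and finer importance grading.

```python
# Hypothetical relationship table: (relation, alarm_a, alarm_b) means that
# an active alarm_a demotes alarm_b.
RELATIONS = [
    ("level_precursor", "TEMP_HIGH_HIGH", "TEMP_HIGH"),  # two settings, one parameter
    ("direct_precursor", "PUMP_TRIP", "FLOW_LOW"),       # causal link
    ("blocking", "MAINTENANCE_MODE", "VIBRATION"),       # expected, not important
]

def rank_alarms(active):
    """Re-evaluate every active alarm against the active set: demoted
    alarms get importance 0, all others keep full importance 1. Re-run
    this whenever any alarm activates or deactivates."""
    return {alarm: 0 if any(a in active and b == alarm
                            for _rel, a, b in RELATIONS)
            else 1
            for alarm in active}

ranked = rank_alarms({"TEMP_HIGH", "TEMP_HIGH_HIGH", "VIBRATION"})
```

    Because the ranking is recomputed on every activation or deactivation, each alarm's importance tracks the changing state of the process, as the patent describes.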

  19. Functional relationship-based alarm processing

    DOEpatents

    Corsberg, Daniel R.

    1988-01-01

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the functional relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). The alarm processing system and method is sensitive to the dynamic nature of the process being monitored and is capable of changing the relative importance of each alarm as necessary.

  20. Functional relationship-based alarm processing system

    DOEpatents

    Corsberg, Daniel R.

    1989-01-01

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the functional relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated or deactivated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). The alarm processing system and method is sensitive to the dynamic nature of the process being monitored and is capable of changing the relative importance of each alarm as necessary.

  1. Metal-assisted exfoliation (MAE): green, roll-to-roll compatible method for transferring graphene to flexible substrates

    NASA Astrophysics Data System (ADS)

    Zaretski, Aliaksandr V.; Moetazedi, Herad; Kong, Casey; Sawyer, Eric J.; Savagatrup, Suchol; Valle, Eduardo; O'Connor, Timothy F.; Printz, Adam D.; Lipomi, Darren J.

    2015-01-01

    Graphene is expected to play a significant role in future technologies that span a range from consumer electronics, to devices for the conversion and storage of energy, to conformable biomedical devices for healthcare. To realize these applications, however, a low-cost method of synthesizing large areas of high-quality graphene is required. Currently, the only method to generate large-area single-layer graphene that is compatible with roll-to-roll manufacturing destroys approximately 300 kg of copper foil (thickness = 25 μm) for every 1 g of graphene produced. This paper describes a new environmentally benign and scalable process of transferring graphene to flexible substrates. The process is based on the preferential adhesion of certain thin metallic films to graphene; separation of the graphene from the catalytic copper foil is followed by lamination to a flexible target substrate in a process that is compatible with roll-to-roll manufacturing. The copper substrate is indefinitely reusable and the method is substantially greener than the current process that uses relatively large amounts of corrosive etchants to remove the copper. The sheet resistance of the graphene produced by this new process is unoptimized but should be comparable in principle to that produced by the standard method, given the defects observable by Raman spectroscopy and the presence of process-induced cracks. With further improvements, this green, inexpensive synthesis of single-layer graphene could enable applications in flexible, stretchable, and disposable electronics, low-profile and lightweight barrier materials, and in large-area displays and photovoltaic modules.

  2. Eddy current characterization of magnetic treatment of materials

    NASA Technical Reports Server (NTRS)

    Chern, E. James

    1992-01-01

    Eddy current impedance measuring methods have been applied to study the effect of magnetic treatment of materials on service life extension. Eddy current impedance measurements were performed on Nickel 200 specimens subjected to several mechanical and magnetic engineering processes: annealing, applied strain, magnetic field, shot peening, and magnetic field after peening. Experimental results demonstrated a functional relationship between coil impedance (resistance and reactance) and the engineering processes the specimens underwent. They show that magnetic treatment does induce changes in a material's electromagnetic properties and does exhibit evidence of stress relief. However, further fundamental studies are necessary for a thorough understanding of the exact mechanism of the magnetic-field processing effect on machine tool service life.

  3. Signal processing and analysis for copper layer thickness measurement within a large variation range in the CMP process.

    PubMed

    Li, Hongkai; Zhao, Qian; Lu, Xinchun; Luo, Jianbin

    2017-11-01

    In the copper (Cu) chemical mechanical planarization (CMP) process, accurate determination of a process reaching the end point is of great importance. Based on the eddy current technology, the in situ thickness measurement of the Cu layer is feasible. Previous research studies focus on the application of the eddy current method to the metal layer thickness measurement or endpoint detection. In this paper, an in situ measurement system, which is independently developed by using the eddy current method, is applied to the actual Cu CMP process. A series of experiments are done for further analyzing the dynamic response characteristic of the output signal within different thickness variation ranges. In this study, the voltage difference of the output signal is used to represent the thickness of the Cu layer, and we can extract the voltage difference variations from the output signal fast by using the proposed data processing algorithm. The results show that the voltage difference decreases as thickness decreases in the conventional measurement range and the sensitivity increases at the same time. However, it is also found that there exists a thickness threshold, and the correlation is negative, when the thickness is more than the threshold. Furthermore, it is possible that the in situ measurement system can be used within a larger Cu layer thickness variation range by creating two calibration tables.
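    The two-regime behavior described above implies that a single calibration curve cannot be inverted over the full range, which motivates the two calibration tables. A sketch of the lookup, with entirely hypothetical calibration pairs and threshold, might look like:

```python
import numpy as np

# Hypothetical calibration pairs (thickness in um, voltage difference in mV).
# Below the threshold thickness, voltage difference rises with thickness;
# above it, the correlation is negative.
BELOW = ([0.2, 0.5, 1.0, 1.5], [10.0, 25.0, 48.0, 60.0])
ABOVE = ([1.5, 2.0, 3.0], [60.0, 52.0, 40.0])

def thickness_from_voltage(v_mV, regime):
    """Invert the calibration curve for the regime the process is in."""
    if regime == "below":
        thick, volt = BELOW
        return np.interp(v_mV, volt, thick)          # volt already increasing
    thick, volt = ABOVE
    return np.interp(v_mV, volt[::-1], thick[::-1])  # re-sort so xp increases
```

    In practice the regime is known from the process direction: CMP only removes material, so the thickness crosses the threshold once, from the "above" table to the "below" table.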

  4. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to insure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross section frame members having design details characteristic of the baseline ATCAS crown design.

  5. Improved Environmental Life Cycle Assessment of Crop Production at the Catchment Scale via a Process-Based Nitrogen Simulation Model.

    PubMed

    Liao, Wenjie; van der Werf, Hayo M G; Salmon-Monviola, Jordy

    2015-09-15

    One of the major challenges in environmental life cycle assessment (LCA) of crop production is the nonlinearity between nitrogen (N) fertilizer inputs and on-site N emissions resulting from complex biogeochemical processes. A few studies have addressed this nonlinearity by combining process-based N simulation models with LCA, but none accounted for nitrate (NO3(-)) flows across fields. In this study, we present a new method, TNT2-LCA, that couples the topography-based simulation of nitrogen transfer and transformation (TNT2) model with LCA, and compare the new method with a current LCA method based on a French life cycle inventory database. Application of the two methods to a case study of crop production in a catchment in France showed that, compared to the current method, TNT2-LCA allows delineation of more appropriate temporal limits when developing data for on-site N emissions associated with specific crops in this catchment. It also improves estimates of NO3(-) emissions by better consideration of agricultural practices, soil-climatic conditions, and spatial interactions of NO3(-) flows across fields, and by providing predicted crop yield. The new method presented in this study provides improved LCA of crop production at the catchment scale.

  6. Ethnographic methods for process evaluations of complex health behaviour interventions.

    PubMed

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  7. Method of Calculating the Correction Factors for Cable Dimensioning in Smart Grids

    NASA Astrophysics Data System (ADS)

    Simutkin, M.; Tuzikova, V.; Tlusty, J.; Tulsky, V.; Muller, Z.

    2017-04-01

    One of the main causes of overloading electrical equipment with higher-harmonic currents is the large increase in the number of non-linear electric power consumers. Non-sinusoidal voltages and currents affect the operation of electrical equipment, reducing its lifetime, increasing voltage and power losses in the network, and reducing its capacity. Existing standards that limit the emission of higher-harmonic currents cannot guarantee that interference stays at a safe level in the power grid. The article presents a method for determining a correction factor for the long-term allowable current of a cable that accounts for this influence. Using mathematical models in the software Elcut, the thermal processes in the cable under non-sinusoidal current flow were described. The theoretical principles, methods, and mathematical models developed in the article allow the correction factor to be calculated to account for the effect of higher harmonics in the current spectrum for network equipment with any type of non-linear load.
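    As a rough illustration of what such a correction factor captures (not the Elcut-based thermal model of the paper), assume the cable's AC resistance grows roughly with the square root of the harmonic order because of the skin effect; equating heating with the rated sinusoidal case then gives a derating factor below one for any distorted spectrum. The harmonic amplitudes below are hypothetical.

```python
import math

def correction_factor(spectrum):
    """spectrum: {harmonic_order: current as a fraction of the fundamental}.
    Simplified model: heating per harmonic scales as I_h^2 * sqrt(h)
    (AC resistance ~ sqrt(h) from the skin effect)."""
    heat = sum(c * c * math.sqrt(h) for h, c in spectrum.items())
    rms = sum(c * c for c in spectrum.values())
    return math.sqrt(rms / heat)

# Fundamental plus typical 5th and 7th harmonic content (hypothetical values)
k = correction_factor({1: 1.0, 5: 0.20, 7: 0.14})
```

    The allowable long-term RMS current is then k times the rated sinusoidal value; a pure fundamental gives k = 1.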

  8. A stereoscopic imaging system for laser back scatter based trajectory measurement in ballistics: part 2

    NASA Astrophysics Data System (ADS)

    Chalupka, Uwe; Rothe, Hendrik

    2012-03-01

    The progress on a laser- and stereo-camera-based trajectory measurement system that we proposed and described in recent publications is reported. The system design was extended from one to two more powerful, DSP-controllable laser systems. Experimental results of the extended system using different projectile/weapon combinations are shown and discussed. Automatic processing of the acquired images using common 3D image processing (3DIP) techniques was realized. The processing steps for extracting trajectory segments from the images, representative of the current application, are presented, along with the algorithms used for backward-calculation of the projectile trajectory. Produced results are verified against simulated trajectories, both in terms of detection robustness and in terms of detection accuracy. The fields of use for the current system lie within the ballistic domain. The first purpose is trajectory measurement of small- and middle-caliber projectiles on a shooting range. Extension to big-caliber projectiles, as well as an application for sniper detection, is imaginable but would require further work. Besides classical RADAR, acoustic, and optical projectile detection methods, the current system represents a further projectile location method in the new class of electro-optical methods that have evolved in recent decades and that use 3D image acquisition and processing techniques.

  9. Klystron Manufacturing Technology Program.

    DTIC Science & Technology

    1983-09-01

    processes, and methodology used on the current production tube, VKU-7735E, and the new methods and techniques used to improve and reduce the cost of...the bellows. This alignment is critical to the smooth operation of the internal tuning mechanism...The new assembly method changes...Varian, the MT contractor, that the new methodology, technologies, and process changes introduced into the MT power klystron and autotuner assembly - VKU

  10. Transitioning Human, Social, Cultural Behavior (HSCB) Models and Simulations to the Operational User1

    DTIC Science & Technology

    2009-10-01

    current M&S covering support to operations, human behaviour representation, asymmetric warfare, defence against terrorism and...methods, tools, data, intellectual capital, and processes to address these capability requirements. Fourth, there is a need to compare capability requirements to current capabilities to identify gaps that may be addressed with DoD HSCB methods, tools, data, intellectual capital, and process

  11. Control and monitoring method and system for electromagnetic forming process

    DOEpatents

    Kunerth, Dennis C.; Lassahn, Gordon D.

    1990-01-01

    A process, system, and improvement for a process for electromagnetic forming of a workpiece in which characteristics of the workpiece such as its geometry, electrical conductivity, quality, and magnetic permeability can be determined by monitoring the current and voltage in the workcoil. In an electromagnetic forming process in which a power supply provides current to a workcoil and the electromagnetic field produced by the workcoil acts to form the workpiece, the dynamic interaction of the electromagnetic fields produced by the workcoil with the geometry, electrical conductivity, and magnetic permeability of the workpiece provides information pertinent to the physical condition of the workpiece that is available for determination of quality and process control. This information can be obtained by deriving in real time the first several time derivatives of the current and voltage in the workcoil. In addition, the process can be extended by injecting test signals into the workcoil during the electromagnetic forming and monitoring the response to the test signals in the workcoil.
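    The first few time derivatives mentioned above can be obtained numerically from sampled workcoil waveforms. The damped ringdown below is a hypothetical stand-in for a measured capacitor-discharge current, not data from the patent.

```python
import numpy as np

# Simulated workcoil current: damped ringdown typical of a capacitor
# discharge into the coil (amplitude, damping, and frequency hypothetical).
t = np.linspace(0.0, 1e-3, 2001)                  # 1 ms at 0.5 us steps
i = np.exp(-3000.0 * t) * np.sin(2 * np.pi * 1e4 * t)

di_dt = np.gradient(i, t)        # first time derivative
d2i_dt2 = np.gradient(di_dt, t)  # second time derivative
```

    np.gradient uses central differences in the interior and one-sided differences at the endpoints, so the sampling rate must comfortably exceed the ringdown frequency for the higher derivatives to remain usable.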

  12. Modeling, simulation and control of pulsed DE-GMA welding process for joining of aluminum to steel

    NASA Astrophysics Data System (ADS)

    Zhang, Gang; Shi, Yu; Li, Jie; Huang, Jiankang; Fan, Ding

    2014-09-01

    Joining of aluminum to steel has attracted significant attention from the welding research community and the automotive and rail transportation industries. Many current welding methods have been developed and applied; however, they cannot precisely control the heat input to the work-piece, they have high costs and low efficiency, they require complex welding devices, and the intermetallic compound layer generated at the weld bead interface is thick. A novel pulsed double-electrode gas metal arc welding (pulsed DE-GMAW) method is developed. To achieve a stable welding process for joining aluminum to steel, a mathematical model of the coupled arc is established, and a new control scheme that uses the average feedback arc voltage of the main loop to adjust the wire feed speed, thereby controlling the coupled arc length, is proposed and developed. Then, impulse control simulation of the coupled arc length, wire feed speed, and wire extension is conducted to validate the mathematical model and predict the stability of the welding process under changes in the contact tip-to-work-piece distance (CTWD). To prove the feasibility of the proposed PSO-based PID control scheme, a rapid prototyping experimental system is set up and bead-on-plate control experiments are conducted to join aluminum to steel. The impulse control simulation shows that the established model can accurately represent the variation of the coupled arc length, wire feed speed, and average main arc voltage when the welding process is disturbed, and that the developed controller responds and settles quickly, in only about 0.1 s. The captured electrical signals show that the main arc voltage converges to the target arc voltage within 0.8 s as the wire feed speed is adjusted. The obtained typical current waveform demonstrates that the main current can be reduced by controlling the bypass current while maintaining a relatively large total current. The control experiments further confirm the accuracy of the proposed model and the feasibility of the new control scheme. Smooth, well-formed weld beads are also obtained with this method. Pulsed DE-GMAW can thus be considered an alternative method for low-cost, high-efficiency joining of aluminum to steel.
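    The voltage-feedback loop described above can be sketched with a PI controller against a toy first-order arc model in which a faster wire feed shortens the arc and lowers its voltage. All plant constants and gains here are hypothetical, and the paper's actual controller is a PSO-tuned PID.

```python
def simulate_arc_pi(v_set=20.0, kp=0.8, ki=4.0, dt=1e-3, t_end=3.0):
    """PI control sketch on a hypothetical plant: arc voltage relaxes
    toward (v0 - gain * wfs), so feeding faster lowers the voltage;
    feedback drives the voltage to the setpoint v_set."""
    v, integ = 26.0, 0.0              # initial arc voltage, integrator
    v0, gain, tau = 30.0, 1.0, 0.05   # hypothetical arc/plant constants
    wfs = 5.0                         # wire feed speed (nominal offset)
    for _ in range(int(t_end / dt)):
        err = v - v_set               # voltage too high -> arc too long -> feed faster
        integ += err * dt
        wfs = 5.0 + kp * err + ki * integ
        v += dt * ((v0 - gain * wfs) - v) / tau
    return v, wfs
```

    In steady state the integrator forces the voltage error to zero, and the wire feed speed settles at the value that yields the setpoint voltage.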

  13. Evaluating transient performance of servo mechanisms by analysing stator current of PMSM

    NASA Astrophysics Data System (ADS)

    Zhang, Qing; Tan, Luyao; Xu, Guanghua

    2018-02-01

    Smooth running and rapid response are the desired performance goals for the transient motions of servo mechanisms. Because of the uncertain and unobservable transient behaviour of servo mechanisms, it is difficult to evaluate their transient performance. Under the effects of electromechanical coupling, the stator current signals of a permanent-magnet synchronous motor (PMSM) potentially contain the performance information regarding servo mechanisms in use. In this paper, a novel method based on analysing the stator current of the PMSM is proposed for quantifying the transient performance. First, a vector control model is constructed to simulate the stator current behaviour in the transient processes of consecutive speed changes, consecutive load changes, and intermittent start-stops. It is discovered that the amplitude and frequency of the stator current are modulated by the transient load torque and motor speed, respectively. The stator currents under different performance conditions are also simulated and compared. Then, the stator current is processed using a local mean decomposition (LMD) algorithm to extract the instantaneous amplitude and instantaneous frequency. The sample entropy of the instantaneous amplitude, which reflects the complexity of the load torque variation, is calculated as a performance indicator of smooth running. The peak-to-peak value of the instantaneous frequency, which defines the range of the motor speed variation, is set as a performance indicator of rapid response. The proposed method is applied to both simulated data in an intermittent start-stop process and experimental data measured for a batch of servo turrets for turning lathes. The results show that the performance evaluations agree with the actual performance.
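    Sample entropy, used above as the smooth-running indicator, can be computed directly from its definition. The sketch below uses the common (m, r) = (2, 0.2 sigma) convention and checks intuition on synthetic signals rather than on stator-current data.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: -ln(A/B), where B counts template pairs of length m
    and A of length m+1 that match within tolerance r (Chebyshev distance),
    excluding self-matches. Higher values mean a more complex signal."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        total = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(d <= r))
        return total

    B, A = count(m), count(m + 1)
    return -np.log(A / B)

# A regular (periodic) signal should score lower than broadband noise.
en_regular = sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500)))
```

    Applied to the LMD instantaneous-amplitude envelope, a low value indicates steady load torque (smooth running) while a high value indicates erratic torque variation.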

  14. Segmentation and Image Analysis of Abnormal Lungs at CT: Current Approaches, Challenges, and Future Trends

    PubMed Central

    Mansoor, Awais; Foster, Brent; Xu, Ziyue; Papadakis, Georgios Z.; Folio, Les R.; Udupa, Jayaram K.; Mollura, Daniel J.

    2015-01-01

    The computer-based process of identifying the boundaries of lung from surrounding thoracic tissue on computed tomographic (CT) images, which is called segmentation, is a vital first step in radiologic pulmonary image analysis. Many algorithms and software platforms provide image segmentation routines for quantification of lung abnormalities; however, nearly all of the current image segmentation approaches apply well only if the lungs exhibit minimal or no pathologic conditions. When moderate to high amounts of disease or abnormalities with a challenging shape or appearance exist in the lungs, computer-aided detection systems may be highly likely to fail to depict those abnormal regions because of inaccurate segmentation methods. In particular, abnormalities such as pleural effusions, consolidations, and masses often cause inaccurate lung segmentation, which greatly limits the use of image processing methods in clinical and research contexts. In this review, a critical summary of the current methods for lung segmentation on CT images is provided, with special emphasis on the accuracy and performance of the methods in cases with abnormalities and cases with exemplary pathologic findings. The currently available segmentation methods can be divided into five major classes: (a) thresholding-based, (b) region-based, (c) shape-based, (d) neighboring anatomy–guided, and (e) machine learning–based methods. The feasibility of each class and its shortcomings are explained and illustrated with the most common lung abnormalities observed on CT images. In an overview, practical applications and evolving technologies combining the presented approaches for the practicing radiologist are detailed. ©RSNA, 2015 PMID:26172351
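    Of the five classes, the thresholding-based approach (a) is the simplest to illustrate: lung parenchyma is mostly air and sits far below soft tissue in Hounsfield units. The toy image and cutoff below are hypothetical, and, as the review stresses, exactly this approach breaks down when effusions or consolidations raise the density of lung regions.

```python
import numpy as np

# Synthetic "CT slice" in Hounsfield units: body tissue around +40 HU,
# two air-filled lung regions around -800 HU (values hypothetical).
img = np.full((64, 64), 40.0)
img[10:30, 8:28] = -800.0    # left lung
img[10:30, 36:56] = -800.0   # right lung

# Thresholding-based segmentation: a fixed HU cutoff separates the
# low-density parenchyma from surrounding thoracic tissue.
lung_mask = img < -400.0
n_lung_voxels = int(lung_mask.sum())
```

    A dense abnormality such as a consolidation would push affected voxels above the cutoff and silently drop them from the mask, which is why the review emphasizes the other four classes for abnormal lungs.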

  15. An update on pharmaceutical film coating for drug delivery.

    PubMed

    Felton, Linda A; Porter, Stuart C

    2013-04-01

    Pharmaceutical coating processes have generally been transformed from what was essentially an art form in the mid-twentieth century to a much more technology-driven process. This review article provides a basic overview of current film coating processes, including a discussion on polymer selection, coating formulation additives and processing equipment. Substrate considerations for pharmaceutical coating processes are also presented. While polymeric coating operations are commonplace in the pharmaceutical industry, film coating processes are still not fully understood, which presents serious challenges with current regulatory requirements. Novel analytical technologies and various modeling techniques that are being used to better understand film coating processes are discussed. This review article also examines the challenges of implementing process analytical technologies in coating operations, active pharmaceutical ingredients in polymer film coatings, the use of high-solids coating systems and continuous coating and other novel coating application methods.

  16. 42 CFR 82.30 - How will NIOSH inform the public of any plans to change scientific elements underlying the dose...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR... change scientific elements underlying the dose reconstruction process to maintain methods reasonably... methods reasonably current with scientific progress? Periodically, NIOSH will publish a notice in the...

  17. System and method for measuring ocean surface currents at locations remote from land masses using synthetic aperture radar

    NASA Technical Reports Server (NTRS)

    Young, Lawrence E. (Inventor)

    1991-01-01

A system for measuring ocean surface currents from an airborne platform is disclosed. A radar system having two spaced antennas, wherein one antenna is driven and return signals from the ocean surface are detected by both antennas, is employed to acquire raw ocean current data which are saved for later processing. A pair of global positioning system (GPS) receivers, with a first antenna carried by the platform at a first location and a second antenna at a second location displaced from the first, determines the positions of the antennas from signals from orbiting GPS navigational satellites. These data are also saved for later processing. The saved data are subsequently processed by a ground-based computer system to determine the position, orientation, and velocity of the platform as well as to derive measurements of currents on the ocean surface.

  18. An analytical approach to customer requirement information processing

    NASA Astrophysics Data System (ADS)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and the Kano model, have been applied in many fields by many enterprises over the past several decades, limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or the Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  19. Automatic Processing of Current Affairs Queries

    ERIC Educational Resources Information Center

    Salton, G.

    1973-01-01

The SMART system is used for the analysis, search and retrieval of news stories appearing in "Time" magazine. A comparison is made between the automatic text processing methods incorporated into the SMART system and a manual search using the classified index to "Time." (14 references) (Author)

  20. The development of control and monitoring system on marine current renewable energy Case study: strait of Toyapakeh - Nusa Penida, Bali

    NASA Astrophysics Data System (ADS)

    Arief, I. S.; Suherman, I. H.; Wardani, A. Y.; Baidowi, A.

    2017-05-01

Control and monitoring is a continuous process for securing the assets of a marine current renewable energy installation. A control and monitoring system exists for each critical component, with criticality identified through the Failure Mode Effect Analysis (FMEA) method. Accordingly, the process developed in this paper is built around a sensor matrix: the matrix correlates the critical components with the monitoring system, which is supported by sensors to enable decision-making.

  1. Deformation and crack growth response under cyclic creep conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brust, F.W. Jr.

To increase energy efficiency, new plants must operate at higher and higher temperatures. Moreover, power generation equipment continues to age and is being used far beyond its intended original design life. Some recent failures, which unfortunately occurred with serious consequences, have clearly illustrated that current methods for ensuring the safety and reliability of high temperature equipment are inadequate. Because of these concerns, an understanding of the high-temperature crack growth process is very important and has led to the following studies of the high temperature failure process. This effort summarizes the results of some recent studies which investigate the phenomenon of high-temperature creep fatigue crack growth. Experimental results which detail the process of creep fatigue, analytical studies which investigate why current methods are ineffective, and finally, a new approach which is based on the T*-integral and its ability to characterize the creep-fatigue crack growth process are discussed. The potential validity of this new predictive methodology is illustrated.

  2. From Discovery to Production: Biotechnology of Marine Fungi for the Production of New Antibiotics.

    PubMed

    Silber, Johanna; Kramer, Annemarie; Labes, Antje; Tasdemir, Deniz

    2016-07-21

Filamentous fungi are well known for their capability of producing antibiotic natural products. Recent studies have demonstrated the potential of antimicrobials with vast chemodiversity from marine fungi. Development of such natural products into lead compounds requires sustainable supply. Marine biotechnology can significantly contribute to the production of new antibiotics at various levels of the process chain including discovery, production, downstream processing, and lead development. However, the number of biotechnological processes described for large-scale production from marine fungi falls far short of the number of newly discovered natural antibiotics. Methods and technologies applied in marine fungal biotechnology largely derive from analogous terrestrial processes and rarely reflect the specific demands of marine fungi. The current developments in metabolic engineering and marine microbiology have not yet been translated into processes, but offer numerous options for improvement of production processes and establishment of new process chains. This review summarises the current state in biotechnological production of marine fungal antibiotics and points out the enormous potential of biotechnology in all stages of the discovery-to-development pipeline. At the same time, the literature survey reveals that more biotechnology transfer and method development are needed for a sustainable and innovative production of marine fungal antibiotics.

  3. The potential of novel infrared food processing technologies: case studies of those developed at the USDA-ARS WRRC and the University of California Davis

    USDA-ARS?s Scientific Manuscript database

    Infrared (IR) radiation heating has been considered as an alternative to current food and agricultural processing methods for improving product quality and safety, increasing energy and processing efficiency, and reducing water and chemical usage. As part of the electromagnetic spectrum, IR has the ...

  4. Using nocturnal cold air drainage flow to monitor ecosystem processes in complex terrain

    Treesearch

    Thomas G. Pypker; Michael H. Unsworth; Alan C. Mix; William Rugh; Troy Ocheltree; Karrin Alstad; Barbara J. Bond

    2007-01-01

    This paper presents initial investigations of a new approach to monitor ecosystem processes in complex terrain on large scales. Metabolic processes in mountainous ecosystems are poorly represented in current ecosystem monitoring campaigns because the methods used for monitoring metabolism at the ecosystem scale (e.g., eddy covariance) require flat study sites. Our goal...

  5. Globally optimal superconducting magnets part I: minimum stored energy (MSE) current density map.

    PubMed

    Tieng, Quang M; Vegh, Viktor; Brereton, Ian M

    2009-01-01

An optimal current density map is crucial in magnet design to provide the initial values within search spaces in an optimization process for determining the final coil arrangement of the magnet. A strategy for obtaining globally optimal current density maps for the purpose of designing magnets with coaxial cylindrical coils in which the stored energy is minimized within a constrained domain is outlined. The current density maps obtained utilising the proposed method suggest that peak current densities occur around the perimeter of the magnet domain, where the adjacent peaks have alternating current directions for the most compact designs. As the dimensions of the domain are increased, the current density maps yield traditional magnet designs of positive current alone. These unique current density maps are obtained by minimizing the stored magnetic energy cost function and therefore suggest magnet coil designs of minimal system energy. Current density maps are provided for a number of different domain arrangements to illustrate the flexibility of the method and the quality of the achievable designs.

  6. Thin wire pointing method

    NASA Technical Reports Server (NTRS)

    Green, G.; Mattauch, R. J. (Inventor)

    1983-01-01

A method is described for forming sharp tips on thin wires, in particular phosphor bronze wires of diameters such as one-thousandth inch used to contact micron-size Schottky barrier diodes, which enables close control of tip shape and which avoids the use of highly toxic solutions. The method includes dipping an end of a phosphor bronze wire into a dilute solution of sulfamic acid and applying a current through the wire to electrochemically etch it. The humidity in the room is controlled to a level of less than 50%, and the voltage applied between the wire and another electrode in the solution is a half-wave rectified voltage. The current through the wire is monitored, and the process is stopped when the current falls to a predetermined low level.
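The stop criterion in this record (halt the etch once the monitored current falls to a predetermined low level) can be sketched as a simple debounced cutoff. This is an illustrative sketch, not the patented controller; the sample stream, cutoff value, and settle count are hypothetical stand-ins for the instrument read-out:

```python
def etch_until_cutoff(current_samples, cutoff_ma, settle_count=3):
    """Consume current readings (mA) until the current has stayed below
    `cutoff_ma` for `settle_count` consecutive samples, debouncing brief
    dips.  Returns the number of samples consumed before stopping, or
    None if the stream ends before the cutoff condition is met."""
    below = 0
    for n, i_ma in enumerate(current_samples, start=1):
        below = below + 1 if i_ma < cutoff_ma else 0
        if below >= settle_count:
            return n
    return None
```

In a real rig, `current_samples` would be a generator polling the ammeter, and reaching the cutoff would switch off the etch voltage.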

  7. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  8. Implementation of the 3Rs (refinement, reduction, and replacement): validation and regulatory acceptance considerations for alternative toxicological test methods.

    PubMed

    Schechtman, Leonard M

    2002-01-01

Toxicological testing in the current regulatory environment is steeped in a history of using animals to answer questions about the safety of products to which humans are exposed. That history forms the basis for the testing strategies that have evolved to satisfy the needs of the regulatory bodies that render decisions that affect, for the most part, virtually all phases of premarket product development and evaluation and, to a lesser extent, postmarketing surveillance. Only relatively recently have the levels of awareness of, and responsiveness to, animal welfare issues reached current proportions. That paradigm shift, although sluggish, has nevertheless been progressive. New and alternative toxicological methods for hazard evaluation and risk assessment have now been adopted and are being viewed as a means to address those issues in a manner that considers humane treatment of animals yet maintains scientific credibility and preserves the goal of ensuring human safety. To facilitate this transition, regulatory agencies and regulated industry must work together toward improved approaches. They will need assurance that the methods will be reliable and the results comparable with, or better than, those derived from the current classical methods. That confidence will be a function of the scientific validation and resultant acceptance of any given method. In the United States, to fulfill this need, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and its operational center, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), have been constituted as prescribed in federal law. Under this mandate, ICCVAM has developed a process and established criteria for the scientific validation and regulatory acceptance of new and alternative methods. The role of ICCVAM in the validation and acceptance process and the criteria instituted toward that end are described. Also discussed are the participation of the US Food and Drug Administration (FDA) in the ICCVAM process and that agency's approach to the application and implementation of ICCVAM-recommended methods.

  9. Longitudinal gradient coil optimization in the presence of transient eddy currents.

    PubMed

    Trakic, A; Liu, F; Lopez, H Sanchez; Wang, H; Crozier, S

    2007-06-01

    The switching of magnetic field gradient coils in magnetic resonance imaging (MRI) inevitably induces transient eddy currents in conducting system components, such as the cryostat vessel. These secondary currents degrade the spatial and temporal performance of the gradient coils, and compensation methods are commonly employed to correct for these distortions. This theoretical study shows that by incorporating the eddy currents into the coil optimization process, it is possible to modify a gradient coil design so that the fields created by the coil and the eddy currents combine together to generate a spatially homogeneous gradient that follows the input pulse. Shielded and unshielded longitudinal gradient coils are used to exemplify this novel approach. To assist in the evaluation of transient eddy currents induced within a realistic cryostat vessel, a low-frequency finite-difference time-domain (FDTD) method using the total-field scattered-field (TFSF) scheme was performed. The simulations demonstrate the effectiveness of the proposed method for optimizing longitudinal gradient fields while taking into account the spatial and temporal behavior of the eddy currents.

  10. Parameter Analysis for Arc Snubber of EAST Neutral Beam Injector

    NASA Astrophysics Data System (ADS)

    Wang, Haitian; Li, Ge; Cao, Liang; Dang, Xiaoqiang; Fu, Peng

    2010-08-01

From the B-H curve and the structural dimensions of the snubber given by the Fink-Baker method, the inductive voltage and the eddy current of any core tape, together with the thickness of its saturated regions, are derived for the moment when accelerator breakdown occurs. By Ampère's law, the eddy current in each core lamination equals the arc current, and the relation between the saturated-region thicknesses of different laminations can be deduced, from which the total equivalent resistance of the snubber is obtained. A transient eddy current model based on the stray capacitance and this equivalent resistance is analyzed, and the solving process is given in detail, yielding the exponential time constant and the arc current. The maximum width of the lamination and the minimum thickness of the core tape are then determined. The experimental time constant of the eddy current, with or without the bias current, is approximately the same as that given by the analytical method, which confirms the accuracy of the adopted assumptions and the analysis method.

  11. Current approaches for the assessment of in situ biodegradation.

    PubMed

    Bombach, Petra; Richnow, Hans H; Kästner, Matthias; Fischer, Anko

    2010-04-01

    Considering the high costs and technical difficulties associated with conventional remediation strategies, in situ biodegradation has become a promising approach for cleaning up contaminated aquifers. To verify if in situ biodegradation of organic contaminants is taking place at a contaminated site and to determine if these processes are efficient enough to replace conventional cleanup technologies, a comprehensive characterization of site-specific biodegradation processes is essential. In recent years, several strategies including geochemical analyses, microbial and molecular methods, tracer tests, metabolite analysis, compound-specific isotope analysis, and in situ microcosms have been developed to investigate the relevance of biodegradation processes for cleaning up contaminated aquifers. In this review, we outline current approaches for the assessment of in situ biodegradation and discuss their potential and limitations. We also discuss the benefits of research strategies combining complementary methods to gain a more comprehensive understanding of the complex hydrogeological and microbial interactions governing contaminant biodegradation in the field.

  12. Current techniques for the real-time processing of complex radar signatures

    NASA Astrophysics Data System (ADS)

    Clay, E.

    A real-time processing technique has been developed for the microwave receiver of the Brahms radar station. The method allows such target signatures as the radar cross section (RCS) of the airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys to be characterized. The method allows optimization of experimental parameters including the analysis frequency band, the receiver gain, and the wavelength range of EM analysis.

  13. Reducing Risk of Salmonellosis through Egg Decontamination Processes.

    PubMed

    Keerthirathne, Thilini Piushani; Ross, Kirstin; Fallowfield, Howard; Whiley, Harriet

    2017-03-22

    Eggs have a high nutritional value and are an important ingredient in many food products. Worldwide foodborne illnesses, such as salmonellosis linked to the consumption of eggs and raw egg products, are a major public health concern. This review focuses on previous studies that have investigated the procedures for the production of microbiologically safe eggs. Studies exploring pasteurization and decontamination methods were investigated. Gamma irradiation, freeze drying, hot air, hot water, infra-red, atmospheric steam, microwave heating and radiofrequency heating are all different decontamination methods currently considered for the production of microbiologically safe eggs. However, each decontamination procedure has different effects on the properties and constituents of the egg. The pasteurization processes are the most widely used and best understood; however, they influence the coagulation, foaming and emulsifying properties of the egg. Future studies are needed to explore combinations of different decontamination methods to produce safe eggs without impacting the protein structure and usability. Currently, eggs which have undergone decontamination processes are primarily used in food prepared for vulnerable populations. However, the development of a decontamination method that does not affect egg properties and functionality could be used in food prepared for the general population to provide greater public health protection.

  14. Reducing Risk of Salmonellosis through Egg Decontamination Processes

    PubMed Central

    Keerthirathne, Thilini Piushani; Ross, Kirstin; Fallowfield, Howard; Whiley, Harriet

    2017-01-01

    Eggs have a high nutritional value and are an important ingredient in many food products. Worldwide foodborne illnesses, such as salmonellosis linked to the consumption of eggs and raw egg products, are a major public health concern. This review focuses on previous studies that have investigated the procedures for the production of microbiologically safe eggs. Studies exploring pasteurization and decontamination methods were investigated. Gamma irradiation, freeze drying, hot air, hot water, infra-red, atmospheric steam, microwave heating and radiofrequency heating are all different decontamination methods currently considered for the production of microbiologically safe eggs. However, each decontamination procedure has different effects on the properties and constituents of the egg. The pasteurization processes are the most widely used and best understood; however, they influence the coagulation, foaming and emulsifying properties of the egg. Future studies are needed to explore combinations of different decontamination methods to produce safe eggs without impacting the protein structure and usability. Currently, eggs which have undergone decontamination processes are primarily used in food prepared for vulnerable populations. However, the development of a decontamination method that does not affect egg properties and functionality could be used in food prepared for the general population to provide greater public health protection. PMID:28327524

  15. Renaissance of protein crystallization and precipitation in biopharmaceuticals purification.

    PubMed

    Dos Santos, Raquel; Carvalho, Ana Luísa; Roque, A Cecília A

The current chromatographic approaches used in protein purification are not keeping pace with the increasing biopharmaceutical market demand. With upstream improvements, the bottleneck has shifted towards the downstream process. New approaches rely on Anything But Chromatography methodologies and revisit former techniques from a bioprocess perspective. Protein crystallization and precipitation methods are already implemented in the downstream processing of diverse therapeutic biological macromolecules, overcoming the current chromatographic bottlenecks. Promising work is being developed to implement crystallization and precipitation in the purification pipeline of high-value therapeutic molecules. This review focuses on the role of these two methodologies in current industrial purification processes, and highlights their potential implementation in the purification pipeline of high-value therapeutic molecules, overcoming chromatographic holdups. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Empirical Force Fields for Mechanistic Studies of Chemical Reactions in Proteins.

    PubMed

    Das, A K; Meuwly, M

    2016-01-01

Following chemical reactions in atomistic detail is one of the most challenging aspects of current computational approaches to chemistry. In this chapter the application of adiabatic reactive MD (ARMD) and its multistate version (MS-ARMD) are discussed. Both methods allow the study of bond-breaking and bond-forming processes in chemical and biological systems. Particular emphasis is put on practical aspects of applying the methods to investigate the dynamics of chemical reactions. The chapter closes with an outlook on possible generalizations of the methods discussed. © 2016 Elsevier Inc. All rights reserved.

  17. EVALUATION OF BIOSOLID SAMPLE PROCESSING TECHNIQUES TO MAXIMIZE RECOVERY OF BACTERIA

    EPA Science Inventory

Current federal regulations (40 CFR 503) require enumeration of fecal coliform or Salmonella prior to land application of Class A biosolids. This regulation specifies use of enumeration methods included in "Standard Methods for the Examination of Water and Wastewater 18th Edition,...

  18. Adaptive Sequential Monte Carlo for Multiple Changepoint Analysis

    DOE PAGES

    Heard, Nicholas A.; Turcotte, Melissa J. M.

    2016-05-21

Process monitoring and control requires detection of structural changes in a data stream in real time. This paper introduces an efficient sequential Monte Carlo algorithm designed for learning unknown changepoints in continuous time. The method is intuitively simple: new changepoints for the latest window of data are proposed by conditioning only on data observed since the most recent estimated changepoint, as these observations carry most of the information about the current state of the process. The proposed method shows improved performance over the current state of the art. Another advantage of the proposed algorithm is that it can be made adaptive, varying the number of particles according to the apparent local complexity of the target changepoint probability distribution. This saves valuable computing time when changes in the changepoint distribution are negligible, and enables re-balancing of the importance weights of existing particles when a significant change in the target distribution is encountered. The plain and adaptive versions of the method are illustrated using the canonical continuous time changepoint problem of inferring the intensity of an inhomogeneous Poisson process, although the method is generally applicable to any changepoint problem. Performance is demonstrated using both conjugate and non-conjugate Bayesian models for the intensity. Lastly, appendices to the article are available online, illustrating the method on other models and applications.
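The central idea of this record (propose new changepoints for the latest data while conditioning only on observations since the most recent estimated changepoint) can be sketched with a toy particle filter for Poisson counts in discrete bins. This is an illustrative simplification, not the authors' algorithm: the hazard rate, the Gamma prior, and multinomial resampling are all assumptions here, and the adaptive particle-count mechanism is omitted.

```python
import math
import random

def nb_predictive(x, a, b):
    """Posterior-predictive probability of a Poisson count x under a
    Gamma(a, b) posterior (a negative binomial; a is an integer here)."""
    return math.comb(x + a - 1, x) * (b / (b + 1.0)) ** a * (1.0 / (b + 1.0)) ** x

def smc_changepoint(counts, n_particles=200, hazard=0.05, a0=1, b0=1.0, seed=0):
    """Toy SMC filter for a rate shift in a Poisson count stream.

    Each particle stores the index of its most recent changepoint; with
    probability `hazard` it proposes a fresh changepoint at the current
    time, and its weight is the predictive probability of the new count
    given only the data observed since that changepoint.  Returns the
    changepoint index supported by most particles at the end.
    """
    rng = random.Random(seed)
    particles = [0] * n_particles
    for t, x in enumerate(counts):
        weights = []
        for i, cp in enumerate(particles):
            if rng.random() < hazard:      # propose a new changepoint now
                particles[i] = cp = t
            seg = counts[cp:t]             # data since the last changepoint
            a, b = a0 + sum(seg), b0 + len(seg)
            weights.append(nb_predictive(x, a, b))
        # Multinomial resampling proportional to predictive weight.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return max(set(particles), key=particles.count)
```

On a stream whose rate jumps partway through, the particles concentrate on the bin where the intensity shifts, mirroring the record's inhomogeneous Poisson example in a binned, discrete-time form.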

  19. Faster methods for estimating arc centre position during VAR and results from Ti-6Al-4V and INCONEL 718 alloys

    NASA Astrophysics Data System (ADS)

    Nair, B. G.; Winter, N.; Daniel, B.; Ward, R. M.

    2016-07-01

Direct measurement of the flow of electric current during VAR is extremely difficult due to the aggressive environment, as the arc process itself controls the distribution of current. In previous studies the technique of "magnetic source tomography" was presented; this was shown to be effective but it used a computationally intensive iterative method to analyse the distribution of arc centre position. In this paper we present faster computational methods requiring less numerical optimisation to determine the centre position of a single distributed arc both numerically and experimentally. Numerical validation of the algorithms was performed on models, and experimental validation on measurements of titanium and nickel alloys (Ti-6Al-4V and INCONEL 718). The results are used to comment on the effects of process parameters on arc behaviour during VAR.

  20. Reticulated shallow etch mesa isolation for controlling surface leakage in GaSb-based infrared detectors

    NASA Astrophysics Data System (ADS)

    Nolde, J. A.; Jackson, E. M.; Bennett, M. F.; Affouda, C. A.; Cleveland, E. R.; Canedy, C. L.; Vurgaftman, I.; Jernigan, G. G.; Meyer, J. R.; Aifer, E. H.

    2017-07-01

    Longwave infrared detectors using p-type absorbers composed of InAs-rich type-II superlattices (T2SLs) nearly always suffer from high surface currents due to carrier inversion on the etched sidewalls. Here, we demonstrate reticulated shallow etch mesa isolation (RSEMI): a structural method of reducing surface currents in longwave single-band and midwave/longwave dual-band detectors with p-type T2SL absorbers. By introducing a lateral shoulder to increase the separation between the n+ cathode and the inverted absorber surface, a substantial barrier to surface electron flow is formed. We demonstrate experimentally that the RSEMI process results in lower surface current, lower net dark current, much weaker dependence of the current on bias, and higher uniformity compared to mesas processed with a single deep etch. For the structure used, a shoulder width of 2 μm is sufficient to block surface currents.

  1. Separation of Calcium Isotopes by Counter-Current Electro-Migration in Molten Salts; SEPARATION DES ISOTOPES DU CALCIUM PAR ELECTRO-MIGRATION A CONTRECOURANT EN SELS FONDUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menes, F.; Dirian, G.; Roth, E.

    1962-01-01

The method of counter-current electromigration in molten salts was applied to CaBr₂ with an alkali metal bromide added to the cathode compartment. Enrichments in ⁴⁶Ca greater than a factor of two were obtained at the anode. The mass effect was found to be about 0.06. An estimation of the cost of energy for a process based on this method was made. (auth)

  2. Methods And Devices For Characterizing Duplex Nucleic Acid Molecules

    DOEpatents

    Akeson, Mark; Vercoutere, Wenonah; Haussler, David; Winters-Hilt, Stephen

    2005-08-30

Methods and devices are provided for characterizing a duplex nucleic acid, e.g., a duplex DNA molecule. In the subject methods, a fluid conducting medium that includes a duplex nucleic acid molecule is contacted with a nanopore under the influence of an applied electric field and the resulting changes in current through the nanopore caused by the duplex nucleic acid molecule are monitored. The observed changes in current through the nanopore are then employed as a set of data values to characterize the duplex nucleic acid, where the set of data values may be employed in raw form or manipulated, e.g., into a current blockade profile. Also provided are nanopore devices for practicing the subject methods, where the subject nanopore devices are characterized by the presence of an algorithm which directs a processing means to employ monitored changes in current through a nanopore to characterize a duplex nucleic acid molecule responsible for the current changes. The subject methods and devices find use in a variety of applications, including, among other applications, the identification of an analyte duplex DNA molecule in a sample, determination of the specific base sequence at a single nucleotide polymorphism (SNP), and the sequencing of duplex DNA molecules.
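The patent's notion of turning monitored current changes into a "current blockade profile" can be sketched as simple event detection on a current trace. This is an illustrative sketch, not the patented algorithm; the threshold fraction and the (start, dwell, residual) event summary are assumptions:

```python
def blockade_events(trace, open_pore, threshold=0.8):
    """Summarize a nanopore current trace as blockade events.

    Consecutive samples below `threshold * open_pore` form one blockade;
    each event is reported as (start index, dwell in samples, mean
    residual current as a fraction of the open-pore level)."""
    events, start, acc = [], None, 0.0
    for i, cur in enumerate(trace):
        if cur < threshold * open_pore:
            if start is None:
                start, acc = i, 0.0
            acc += cur
        elif start is not None:
            dwell = i - start
            events.append((start, dwell, acc / dwell / open_pore))
            start = None
    if start is not None:                      # trace ends mid-blockade
        dwell = len(trace) - start
        events.append((start, dwell, acc / dwell / open_pore))
    return events
```

A downstream classifier, like the one the patent describes, would then compare these dwell/residual signatures against profiles of known duplex molecules.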

  3. Eddy current simulation in thick cylinders of finite length induced by coils of arbitrary geometry.

    PubMed

    Sanchez Lopez, Hector; Poole, Michael; Crozier, Stuart

    2010-12-01

Eddy currents are inevitably induced when time-varying magnetic field gradients interact with the metallic structures of a magnetic resonance imaging (MRI) scanner. The secondary magnetic field produced by this induced current degrades the spatial and temporal performance of the primary field generated by the gradient coils. Although this undesired effect can be minimized by using actively and/or passively shielded gradient coils and current pre-emphasis techniques, a residual eddy current still remains in the MRI scanner structure. Accurate simulation of these eddy currents is important in the successful design of gradient coils and magnet cryostat vessels. Efficient methods for simulating eddy currents are currently restricted to cylindrical symmetry. The approach presented in this paper divides thick conducting cylinders into thin layers (thinner than the skin depth) and expresses the current density on each as a Fourier series. The coupling between each mode of the Fourier series with every other is modeled with an inductive network method. In this way, the eddy currents induced in realistic cryostat surfaces by coils of arbitrary geometry can be simulated. The new method was validated by simulating a canonical problem and comparing the results against a commercially available software package. An accurate skin depth of 2.76 mm was calculated in 6 min with the new method. The currents induced by an actively shielded x-gradient coil were simulated assuming a finite-length cylindrical cryostat consisting of three different conducting materials. Details of the temporal-spatial induced current diffusion process were simulated through all cryostat layers, which could not be efficiently simulated with any other method. With these data, all quantities that depend on the current density, such as the secondary magnetic field, are simply evaluated. Copyright © 2010 Elsevier Inc. All rights reserved.

  4. Detailed electrochemical studies of the tetraruthenium polyoxometalate water oxidation catalyst in acidic media: identification of an extended oxidation series using Fourier transformed alternating current voltammetry.

    PubMed

    Lee, Chong-Yong; Guo, Si-Xuan; Murphy, Aidan F; McCormac, Timothy; Zhang, Jie; Bond, Alan M; Zhu, Guibo; Hill, Craig L; Geletii, Yurii V

    2012-11-05

The electrochemistry of the water oxidation catalyst Rb(8)K(2)[{Ru(4)O(4)(OH)(2)(H(2)O)(4)}(γ-SiW(10)O(36))(2)] (Rb(8)K(2)-1(0)) has been studied in the presence and absence of potassium cations in both hydrochloric and sulfuric acid solutions by transient direct current (dc) cyclic voltammetry, a steady-state dc method in the rotating disk configuration, and the kinetically sensitive technique of Fourier transformed large-amplitude alternating current (ac) voltammetry. In acidic media, the presence of potassium ions affects the kinetics (apparent rate of electron transfer) and thermodynamics (reversible potentials) of the eight processes (A'/A to H/H') that are readily detected under dc voltammetric conditions. The six most positive processes (A'/A to F/F') each involve a one-electron ruthenium-based charge-transfer step (A'/A and B'/B are Ru(IV/V) oxidations and C/C' to F/F' are Ru(IV/III) reductions). The apparent rate of electron transfer of the ruthenium centers in sulfuric acid is higher than in hydrochloric acid. The addition of potassium cations increases the apparent rates and gives rise to a small shift of the reversible potential. Simulations of the Fourier transformed ac voltammetry method show that the B'/B, E/E', and F/F' processes are quasi-reversible, while the others are close to reversible. A third Ru(IV/V) oxidation process is observed just prior to the positive potential limit via dc methods. Importantly, the ability of the higher harmonic components of the ac method to discriminate against the irreversible background solvent process allows this (process I), as well as an additional fourth reversible ruthenium-based process (J), to be readily identified. The steady-state rotating disk electrode (RDE) method confirmed that all four Ru centers in Rb(8)K(2)-1(0) are in oxidation state IV.
The dc and ac data indicate that the reversible potentials of the four ruthenium centers are evenly spaced, which may be relevant to understanding the water oxidation electrocatalysis. A profound effect of the potassium cation is observed for the one-electron transfer process (G/G') assigned to Ru(III/II) reduction and the multiple electron transfer reduction process (H/H') that arises from the tungstate polyoxometalate framework. A significant shift of E°' to a more positive potential value for process H/H' was observed on removal of K(+) (~100 mV in H(2)SO(4) and ~50 mV in HCl).

  5. Some opinions on the review process of research papers destined for publication.

    PubMed

    Roohi, Ehsan; Mahian, Omid

    2015-06-01

The current paper discusses the peer review process in journals that publish research papers presenting new science and understanding (scientific journals). Different aspects of peer review, including the selection of reviewers, the review process, and the editor's decision policy, are discussed in detail. The pros and cons of different conventional review methods are mentioned. Finally, a suggestion is presented for the review process of scientific papers.

  6. Method for removal of random noise in eddy-current testing system

    DOEpatents

    Levy, Arthur J.

    1995-01-01

Eddy-current response voltages, generated during inspection of metallic structures for anomalies, are often replete with noise. Analysis of the inspection data and results is therefore difficult or nearly impossible, resulting in inconsistent or unreliable evaluation of the structure. This invention processes the eddy-current response voltage, removing the effect of random noise, to allow proper identification of anomalies within and associated with the structure.

  7. Removing Beam Current Artifacts in Helium Ion Microscopy: A Comparison of Image Processing Techniques.

    PubMed

    Barlow, Anders J; Portoles, Jose F; Sano, Naoko; Cumpson, Peter J

    2016-10-01

The development of the helium ion microscope (HIM) enables the imaging of both hard, inorganic materials and soft, organic or biological materials. Advantages include outstanding topographical contrast, superior resolution down to <0.5 nm at high magnification, high depth of field, and no need for conductive coatings. The instrument relies on helium atom adsorption and ionization at a cryogenically cooled, atomically sharp tip. Under ideal conditions this arrangement provides a beam of ions that is stable for days to weeks, with beam currents on the order of picoamperes. Over time, however, this stability is lost as gaseous contamination builds up in the source region, leading to adsorbed atoms of species other than helium, which ultimately results in beam current fluctuations. These manifest themselves as horizontal stripe artifacts in HIM images. We investigate post-processing methods to remove these artifacts from HIM images, such as median filtering, Gaussian blurring, fast Fourier transforms, and principal component analysis. We arrive at a simple method for completely removing beam current fluctuation effects from HIM images while maintaining the full integrity of the information within the image.
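The abstract compares several filters without detailing them; as a hedged illustration of the underlying idea, beam-current drift acts roughly as a per-row gain on a scanned image, so equalizing row means removes the stripes. The per-row normalization below is an assumption for illustration, not the paper's published method:

```python
import numpy as np

def remove_row_stripes(image):
    """Suppress horizontal stripe artifacts by rescaling each scan row
    to the global mean intensity (multiplicative-gain assumption)."""
    img = image.astype(float)
    row_means = img.mean(axis=1, keepdims=True)
    global_mean = img.mean()
    row_means[row_means == 0] = global_mean  # avoid division by zero
    return img * (global_mean / row_means)

# Synthetic example: a flat scene imaged with a fluctuating per-row beam gain
rng = np.random.default_rng(0)
clean = np.full((8, 16), 100.0)
gains = rng.uniform(0.8, 1.2, size=(8, 1))
striped = clean * gains
restored = remove_row_stripes(striped)
print(np.allclose(restored, restored.mean()))  # rows equalized again
```

Real HIM frames contain genuine row-to-row signal, which is why the paper evaluates less aggressive filters (median, Gaussian, FFT, PCA) as well.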

  8. Occupancy change detection system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2009-09-01

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes instructions for producing an occupancy grid map of an environment around the robot, scanning the environment to generate a current obstacle map relative to a current robot position, and converting the current obstacle map to a current occupancy grid map. The instructions also include processing each grid cell in the occupancy grid map. Within the processing of each grid cell, the instructions include comparing each grid cell in the occupancy grid map to a corresponding grid cell in the current occupancy grid map. For grid cells with a difference, the instructions include defining a change vector for each changed grid cell, wherein the change vector includes a direction from the robot to the changed grid cell and a range from the robot to the changed grid cell.
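A minimal sketch of the change-vector step described above, assuming a 2-D list-of-lists occupancy grid and Euclidean cell coordinates; the names and grid encoding are illustrative, not taken from the patent:

```python
import math

def change_vector(robot_pos, cell_pos):
    """Direction (radians) and range from the robot to a changed grid cell."""
    dx = cell_pos[0] - robot_pos[0]
    dy = cell_pos[1] - robot_pos[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)

def detect_changes(reference, current, robot_pos):
    """Compare two occupancy grids cell by cell; for every differing cell,
    return the cell plus its change vector (direction, range)."""
    changes = []
    for y, (ref_row, cur_row) in enumerate(zip(reference, current)):
        for x, (ref_cell, cur_cell) in enumerate(zip(ref_row, cur_row)):
            if ref_cell != cur_cell:
                changes.append(((x, y), change_vector(robot_pos, (x, y))))
    return changes

# Toy example: one cell has become occupied since the reference scan
reference = [[0, 0, 0],
             [0, 0, 0]]
current   = [[0, 0, 1],
             [0, 0, 0]]
print(detect_changes(reference, current, robot_pos=(0, 0)))
```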

  9. Method development and qualification of capillary zone electrophoresis for investigation of therapeutic monoclonal antibody quality.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2016-10-01

Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are very adequate tools to monitor size- and charge-heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature, most of them based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can be a good platform for CZE method development. In every step, the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results are presented. The analytical system stability was represented by the calculated RSD% values of the area percentage and migration time of the selected peaks (<0.8% and <5%) during the intermediate precision investigation. Copyright © 2016 Elsevier B.V. All rights reserved.
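The system-stability figures quoted above follow the standard relative-standard-deviation formula, %RSD = 100·s/mean. A minimal sketch with hypothetical replicate data; the migration times are illustrative, not taken from the study:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%RSD) = 100 * sample stdev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate migration times (min) from a precision experiment
migration_times = [12.10, 12.15, 12.08, 12.12, 12.11, 12.09]
print(f"RSD: {rsd_percent(migration_times):.2f}%")
```

An intermediate-precision acceptance check would then simply compare this value against the stated limits (e.g. <5% for migration time).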

  10. Method for fabricating carbon/lithium-ion electrode for rechargeable lithium cell

    NASA Technical Reports Server (NTRS)

    Attia, Alan I. (Inventor); Halpert, Gerald (Inventor); Huang, Chen-Kuo (Inventor); Surampudi, Subbarao (Inventor)

    1995-01-01

The method includes steps for forming a carbon electrode composed of graphitic carbon particles adhered by an ethylene propylene diene monomer binder. An effective binder composition is disclosed for achieving a carbon electrode capable of subsequent intercalation by lithium ions. The method also includes steps for reacting the carbon electrode with lithium ions to incorporate lithium ions into graphitic carbon particles of the electrode. An electrical current is repeatedly applied to the carbon electrode to initially cause a surface reaction between the lithium ions and the carbon, and subsequently cause intercalation of the lithium ions into crystalline layers of the graphitic carbon particles. With repeated application of the electrical current, intercalation is achieved to near the theoretical maximum. Two differing multi-stage intercalation processes are disclosed. In the first, a fixed current is reapplied. In the second, a high current is initially applied, followed by a single subsequent lower current stage. Resulting carbon/lithium-ion electrodes are well suited for use as an anode in a reversible, ambient temperature, lithium cell.

  11. Low cost solar silicon production

    NASA Astrophysics Data System (ADS)

    Mede, Matt

    2009-08-01

The worldwide demand for solar-grade silicon reached an all-time high between 2007 and 2008. Although growth in the solar industry is slowing due to the current economic downturn, demand is expected to rebound in 2011 based on current cost models. However, demand will increase even more than currently anticipated if costs are reduced. This situation creates an opportunity for new and innovative approaches to the production of photovoltaic-grade silicon, especially methods which can demonstrate cost reductions over currently utilized processes.

  12. Improvement of radiology services based on the process management approach.

    PubMed

    Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria

    2011-06-01

    The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  13. A Simple Method for High-Performance, Solution-Processed, Amorphous ZrO2 Gate Insulator TFT with a High Concentration Precursor

    PubMed Central

    Cai, Wei; Zhu, Zhennan; Wei, Jinglin; Fang, Zhiqiang; Zheng, Zeke; Zhou, Shangxiong; Peng, Junbiao; Lu, Xubing

    2017-01-01

Solution-processed high-k dielectric TFTs attract much attention since they cost relatively little and have a simple fabrication process. However, it is still a challenge to reduce the leakage current density of solution-processed dielectric TFTs. Here, a simple solution method is presented for enhancing the performance of ZrO2 films by intentionally increasing the precursor concentration. The ZrO2 films not only exhibit a low leakage current density of 10⁻⁶ A/cm² at 10 V and a breakdown field of 2.5 MV/cm, but also demonstrate a saturation mobility of 12.6 cm²·V⁻¹·s⁻¹ and an Ion/Ioff ratio of 10⁶ in DC pulse sputtering IGZO-TFTs based on these films. Moreover, the underlying mechanism by which precursor concentration influences film formation is presented. A higher-concentration precursor results in a thicker film for the same number of coatings, with reduced ZrO2/IGZO interface defects and roughness. This shows the importance of thickness, roughness, and annealing temperature in solution-processed dielectric oxide TFTs and provides an approach to precisely control the thickness of solution-processed oxide films. PMID:28825652

  14. A Simple Method for High-Performance, Solution-Processed, Amorphous ZrO₂ Gate Insulator TFT with a High Concentration Precursor.

    PubMed

    Cai, Wei; Zhu, Zhennan; Wei, Jinglin; Fang, Zhiqiang; Ning, Honglong; Zheng, Zeke; Zhou, Shangxiong; Yao, Rihui; Peng, Junbiao; Lu, Xubing

    2017-08-21

Solution-processed high-k dielectric TFTs attract much attention since they cost relatively little and have a simple fabrication process. However, it is still a challenge to reduce the leakage current density of solution-processed dielectric TFTs. Here, a simple solution method is presented for enhancing the performance of ZrO₂ films by intentionally increasing the precursor concentration. The ZrO₂ films not only exhibit a low leakage current density of 10⁻⁶ A/cm² at 10 V and a breakdown field of 2.5 MV/cm, but also demonstrate a saturation mobility of 12.6 cm²·V⁻¹·s⁻¹ and an Ion/Ioff ratio of 10⁶ in DC pulse sputtering IGZO-TFTs based on these films. Moreover, the underlying mechanism by which precursor concentration influences film formation is presented. A higher-concentration precursor results in a thicker film for the same number of coatings, with reduced ZrO₂/IGZO interface defects and roughness. This shows the importance of thickness, roughness, and annealing temperature in solution-processed dielectric oxide TFTs and provides an approach to precisely control the thickness of solution-processed oxide films.

  15. Theoretical model and experimental investigation of current density boundary condition for welding arc study

    NASA Astrophysics Data System (ADS)

    Boutaghane, A.; Bouhadef, K.; Valensi, F.; Pellerin, S.; Benkedda, Y.

    2011-04-01

This paper presents results of a theoretical and experimental investigation of the welding arc in the Gas Tungsten Arc Welding (GTAW) and Gas Metal Arc Welding (GMAW) processes. A theoretical model consisting of the simultaneous solution of the set of conservation equations for mass, momentum, energy and current, Ohm's law, and Maxwell's equations is used to predict temperatures and current density distribution in argon welding arcs. A current density profile had to be assumed over the surface of the cathode as a boundary condition in order to make the theoretical calculations possible. In the stationary GTAW process, this assumption leads to fair agreement with experimental results reported in the literature, with maximum arc temperatures of ~21 000 K. In contrast, in the GMAW process the electrode is consumable and non-thermionic, and a realistic boundary condition for the current density is lacking. To establish this crucial boundary condition, the current density at the melting anode electrode, an original method is set up to determine the current density experimentally. A high-speed camera (3000 images/s) is used to obtain the geometrical dimensions of the welding wire used as the anode. Once the total area of the melting anode covered by the arc plasma is determined, the current density at the anode surface can be calculated. For a 330 A arc, the current density at the melting anode surface is found to be 5 × 10⁷ A m⁻² for a 1.2 mm diameter welding electrode.
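The final figure follows from the simple relation J = I/A. A short check using the abstract's reported numbers (330 A arc, J ≈ 5 × 10⁷ A m⁻²), which together imply the arc-covered anode area:

```python
def current_density(arc_current, covered_area):
    """Mean anode current density J = I / A, in A/m^2."""
    return arc_current / covered_area

# Reported figures: a 330 A arc and J ~ 5e7 A/m^2 imply the arc covers
# roughly 330 / 5e7 = 6.6e-6 m^2 (6.6 mm^2) of the melting anode.
area = 330.0 / 5e7
print(f"covered area: {area * 1e6:.1f} mm^2")
print(f"J: {current_density(330.0, area):.1e} A/m^2")
```

In the experiment this area was measured from high-speed camera images; here it is back-calculated only as a consistency check.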

  16. Current Strategies for the Detoxification of Jatropha curcas Seed Cake: A Review.

    PubMed

    Gomes, Taisa G; Hadi, Sámed I I A; Costa Alves, Gabriel S; Mendonça, Simone; De Siqueira, Felix G; Miller, Robert N G

    2018-03-21

    Jatropha curcas is an important oilseed plant, with considerable potential in the development of biodiesel. Although Jatropha seed cake, the byproduct of oil extraction, is a residue rich in nitrogen, phosphorus, potassium, and carbon, with high protein content suitable for application in animal feed, the presence of toxic phorbol esters limits its application in feed supplements and fertilizers. This review summarizes the current methods available for detoxification of this residue, based upon chemical, physical, biological, or combined processes. The advantages and disadvantages of each process are discussed, and future directions involving genomic and proteomic approaches for advancing our understanding of biodegradation processes involving microorganisms are highlighted.

  17. Solar thermal drum drying performance of prune and tomato pomaces

    USDA-ARS?s Scientific Manuscript database

    Fruit and vegetable pomaces are co-products of the food processing industry; they are underutilized in part because their high water activity (aw) renders them unstable. Drum drying is one method that can dry/stabilize pomaces, but current drum drying methods utilize conventional, high-environmental...

  18. IMPROVING PHOTOCATALYTIC PROPERTIES OF TIO2 THROUGH THIN FILM COATING AND METAL DOPING VIA FLAME AEROSOL COATING METHOD

    EPA Science Inventory

    There has been an increasing demand for efficient, economical and environmentally friendly methods for partial oxidation of hydrocarbons by molecular oxygen, to desirable industrial feedstock oxygenates. Current processes are energy intensive, have low conversion efficiencies and...

  19. Schools versus Students' Rights: Can Alternative Dispute Resolution Build a Community.

    ERIC Educational Resources Information Center

    Goldberg, Steven S.

    1995-01-01

    Schools' regulation by external forces has rendered the education process secondary to avoidance of litigation. Alternative dispute resolution (ADR) provides an answer to the adversarial process currently in place within education. ADR offers negotiation and mediation as methods to resolve conflict, avoid litigation, and increase the likelihood of…

  20. Saliency-aware food image segmentation for personal dietary assessment using a wearable computer

    USDA-ARS?s Scientific Manuscript database

    Image-based dietary assessment has recently received much attention in the community of obesity research. In this assessment, foods in digital pictures are specified, and their portion sizes (volumes) are estimated. Although manual processing is currently the most utilized method, image processing h...

  1. The application of ultrasound and enzymes in textile processing of greige cotton

    USDA-ARS?s Scientific Manuscript database

    Research progress made at the USDA’s Southern Regional Research Center to provide an ultrasound and enzymatic alternative to the current textile processing method of scouring greige cotton textile with caustic chemicals is reported. The review covers early efforts to measure pectin and wax removal ...

  2. Application of high speed machining technology in aviation

    NASA Astrophysics Data System (ADS)

    Bałon, Paweł; Szostak, Janusz; Kiełbasa, Bartłomiej; Rejman, Edward; Smusz, Robert

    2018-05-01

Aircraft structures are exposed to many loads during their working lifespan. Every particular action made during a flight is composed of a series of air movements which generate various aircraft loads. The most rigorous requirement which modern aircraft structures must fulfill is to maintain their high durability and reliability. This requirement involves taking many restrictions into account during the aircraft design process. The most important factor is the structure's overall mass, which has a crucial impact on both utility properties and cost-effectiveness. This makes aircraft one of the most complex products of modern technology. Additionally, there is currently an increasing use of high-strength aluminum alloys, which requires the implementation of new manufacturing processes. High Speed Machining (HSM) technology is currently one of the most important machining technologies used in the aviation industry, especially in the machining of aluminium alloys. The primary difference between HSM and other milling techniques is the ability to select cutting parameters - depth of cut, feed rate, and cutting speed - so as to simultaneously ensure high quality and precision of the machined surface and high machining efficiency, all of which shorten the manufacturing of integral components. In this paper, the authors explain the implementation of the HSM method in integral aircraft structures. The paper presents the airframe manufacturing method and the final results. The HSM method is compared to the previous method, in which all subcomponents were manufactured by bending and forming processes and then joined by riveting.

  3. Automatic cloud coverage assessment of Formosat-2 image

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2011-11-01

The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated on a daily-revisit mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System generally consists of two major steps. First, an unsupervised K-means method is used to automatically estimate the cloud statistic of a Formosat-2 image. Second, cloud coverage is estimated by manual examination of the image. Clearly, a more accurate Automatic Cloud Coverage Assessment (ACCA) method increases the efficiency of the second step by providing a good prediction of the cloud statistic. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that includes pre-processing and post-processing analysis. In the pre-processing analysis, the cloud statistic is determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel reexamination, and a cross-band filter method. The box-counting fractal method is used as a post-processing tool to double-check the results of the pre-processing analysis, increasing the efficiency of manual examination.
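Otsu's method, one of the pre-processing tools named above, picks the intensity threshold that maximizes the between-class variance of the image histogram. A self-contained sketch; the synthetic two-mode scene is illustrative, not Formosat-2 data:

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    hist, edges = np.histogram(image, bins=bins)
    prob = hist.astype(float) / hist.sum()
    omega = np.cumsum(prob)                 # class-0 probability mass
    mu = np.cumsum(prob * np.arange(bins))  # class-0 cumulative mean
    mu_t = mu[-1]                           # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    k = np.nanargmax(sigma_b)               # best split bin
    return edges[k + 1]                     # threshold at that bin's upper edge

# Synthetic "cloud" scene: dark ground pixels plus a bright cloud population
rng = np.random.default_rng(1)
scene = np.concatenate([rng.normal(50, 5, 500), rng.normal(200, 5, 500)])
t = otsu_threshold(scene)
print(50 < t < 200)  # threshold falls between the two populations
```

In a cloud-assessment pipeline, pixels above the threshold would be counted as cloudy when computing the cloud statistic.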

  4. Damage and annealing recovery of boron-implanted ultra-shallow junction: The correlation between beam current and surface configuration

    NASA Astrophysics Data System (ADS)

    Chang, Feng-Ming; Wu, Zong-Zhe; Lin, Yen-Fu; Kao, Li-Chi; Wu, Cheng-Ta; JangJian, Shiu-Ko; Chen, Yuan-Nian; Lo, Kuang Yao

    2018-03-01

The beam current used in the implantation process is a key factor in the damage rate and in the structural evolution during the subsequent annealing process, especially for ultra-shallow layers. In this work, we develop a complementary optical method combining UV Raman, X-ray photoelectron spectroscopy (XPS), and X-ray absorption near edge spectroscopy (XANES) to inspect the influence of the beam current in the implantation process. The optimal beam current condition is indicated by a higher effective Si-B bond portion in the UV Raman spectra and a smaller B-B bond peak, caused by B-cluster defects, in the XPS spectra. XANES results indicate that a B oxide layer is formed on the surface of the ultra-shallow junction. The defects in the ultra-shallow junction after annealing are analyzed by these novel optical analyses, which cannot be detected by traditional thermal-wave and resistance measurements. This work exhibits the structural variation of the ultra-shallow junction under varying beam current and provides a valuable metrology for examining the chemical states and the effective activation in implantation technology.

  5. Multilayer integral method for simulation of eddy currents in thin volumes of arbitrary geometry produced by MRI gradient coils.

    PubMed

    Sanchez Lopez, Hector; Freschi, Fabio; Trakic, Adnan; Smith, Elliot; Herbert, Jeremy; Fuentes, Miguel; Wilson, Stephen; Liu, Limei; Repetto, Maurizio; Crozier, Stuart

    2014-05-01

This article presents a fast, efficient and accurate multi-layer integral method (MIM) for the evaluation of complex spatiotemporal eddy currents induced by arbitrary arrangements of gradient coils in nonmagnetic, thin volumes of irregular geometry. The volume of interest is divided into a number of layers, wherein the thickness of each layer is assumed to be smaller than the skin depth and one of the linear dimensions is much smaller than the remaining two. The diffusion equation for the current density is solved in both the time-harmonic and transient domains. The experimentally measured magnetic fields produced by the coil and the induced eddy currents, as well as the corresponding time-decay constants, were in close agreement with the results produced by the MIM. Relevant parameters such as the power loss and force induced by the eddy currents in a split cryostat were simulated using the MIM. The proposed method is capable of accurately simulating the current diffusion process inside thin volumes, such as the magnet cryostat. The method permits the a priori calculation of optimal pre-emphasis parameters. The MIM enables unified design of gradient coil-magnet structures for optimal mitigation of deleterious eddy current effects. Copyright © 2013 Wiley Periodicals, Inc.

  6. Application of computational methods to analyse and investigate physical and chemical processes of high-temperature mineralizing of condensed substances in gas stream

    NASA Astrophysics Data System (ADS)

    Markelov, A. Y.; Shiryaevskii, V. L.; Kudrinskiy, A. A.; Anpilov, S. V.; Bobrakov, A. N.

    2017-11-01

A computational method for analyzing the physical and chemical processes of high-temperature mineralization of low-level radioactive waste in a gas stream during plasma treatment in shaft furnaces is introduced. It is shown that the thermodynamic simulation method describes fairly adequately the changes in the composition of the pyrogas withdrawn from the shaft furnace under different waste treatment regimes. This offers the possibility of developing environmentally and economically viable technologies and small, low-cost facilities for plasma treatment of radioactive waste to be applied at currently operating nuclear power plants.

  7. Catalysts and methods for converting carbonaceous materials to fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hensley, Jesse; Ruddy, Daniel A.; Schaidle, Joshua A.

Catalysts and processes designed to convert DME and/or methanol and hydrogen (H₂) to desirable liquid fuels are described. These catalysts produce the fuels efficiently and with a high selectivity and yield, and reduce the formation of aromatic hydrocarbons by incorporating H₂ into the products. Also described are process methods to further upgrade these fuels to higher molecular weight liquid fuel mixtures, which have physical properties comparable with current commercially used liquid fuels.

  8. Protein arginine methylation: Cellular functions and methods of analysis.

    PubMed

    Pahlich, Steffen; Zakaryan, Rouzanna P; Gehring, Heinz

    2006-12-01

    During the last few years, new members of the growing family of protein arginine methyltransferases (PRMTs) have been identified and the role of arginine methylation in manifold cellular processes like signaling, RNA processing, transcription, and subcellular transport has been extensively investigated. In this review, we describe recent methods and findings that have yielded new insights into the cellular functions of arginine-methylated proteins, and we evaluate the currently used procedures for the detection and analysis of arginine methylation.

  9. A mixed-methods study on perceptions towards use of Rapid Ethical Assessment to improve informed consent processes for health research in a low-income setting

    PubMed Central

    2014-01-01

Background: Rapid Ethical Assessment (REA) is a form of rapid ethnographic assessment conducted at the beginning of a research project to guide the consent process, with the objective of reconciling universal ethical guidance with specific research contexts. The current study was conducted to assess the perceived relevance of introducing REA as a mainstream tool in Ethiopia. Methods: Mixed-methods research using a sequential explanatory approach was conducted from July to September 2012, including 241 cross-sectional, self-administered questionnaires and 19 qualitative, in-depth interviews among health researchers and regulators, including ethics committee members, in Ethiopian health research institutions and universities. Results: In their evaluation of the consent process, only 40.2% thought that the consent process and information given were adequately understood by study participants; 84.6% claimed they were not satisfied with the current consent process and 85.5% thought the best interests of study participants were not adequately considered. Commonly mentioned consent-related problems included lack of clarity (48.1%), inadequate information (34%), language barriers (28.2%), cultural differences (27.4%), undue expectations (26.6%) and power imbalances (20.7%). About 95.4% believed that consent should be contextualized to the study setting and 39.4% thought REA would be an appropriate approach to improve the perceived problems. Qualitative findings helped to further explore the gaps identified in the quantitative findings and to map out concerns related to the current research consent process in Ethiopia. Suggestions included conducting REA during the pre-test (pilot) phase of studies when applicable. The need for clear guidance for researchers on issues such as when and how to apply the REA tools was stressed. Conclusion: The study findings clearly indicated that there are perceived to be correctable gaps in the consent process of medical research in Ethiopia.
REA is considered relevant by researchers and stakeholders to address these gaps. Exploring further the feasibility and applicability of REA is recommended. PMID:24885049

  10. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. A combination of HPLC and automated data analysis provides a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used in both upstream (USP) and downstream processing (DSP), is a valuable tool for evaluating cell disruption processes, as it can be implemented at-line, gives results within minutes of sampling and requires no manual intervention.

  11. Preparation of SmBCO layer for the surface optimization of GdYBCO film by MOCVD process based on a simple self-heating technology

    NASA Astrophysics Data System (ADS)

    Zhao, Ruipeng; Zhang, Fei; Liu, Qing; Xia, Yudong; Lu, Yuming; Cai, Chuanbing; Tao, Bowan; Li, Yanrong

    2018-07-01

    The MOCVD process was adopted to grow REBa2Cu3O7-δ (REBCO, RE = rare earth elements) films on LaMnO3 (LMO) templates. The LMO-templated tapes are heated by the Joule effect when a heating current is applied through the Hastelloy metal substrates. The surface of GdYBCO films prepared by the MOCVD method is prone to forming outgrowths, so the surface morphology of the GdYBCO film is optimized by depositing a SmBCO layer, an important process step for the preparation of high-quality multilayer REBCO films. Finally, GdYBCO/SmBCO/GdYBCO multilayer films were successfully prepared on the LMO templates using the simple self-heating method. Scanning electron microscopy confirmed that the GdYBCO surface was well improved. The Δω of the REBCO (005) reflection and the Δφ of the REBCO (103) reflection, measured by an X-ray diffraction system, are 1.3° and 3.3°, respectively. Moreover, the critical current density (Jc) exceeds 3 MA/cm2 (77 K, 0 T), and the critical current (Ic) increases approximately linearly with the number of REBCO layers.

  12. Toward the automated generation of genome-scale metabolic networks in the SEED.

    PubMed

    DeJongh, Matthew; Formsma, Kevin; Boillot, Paul; Gould, John; Rycenga, Matthew; Best, Aaron

    2007-04-26

    Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative genome annotation and analysis. 
Our method sets the stage for the automated generation of substantially complete metabolic networks for over 400 complete genome sequences currently in the SEED. With each genome that is processed using our tools, the database of common components grows to cover more of the diversity of metabolic pathways. This increases the likelihood that components of reaction networks for subsequently processed genomes can be retrieved from the database, rather than assembled and verified manually.

  13. Current Technical Approaches for the Early Detection of Foodborne Pathogens: Challenges and Opportunities.

    PubMed

    Cho, Il-Hoon; Ku, Seockmo

    2017-09-30

    The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

  14. Hybrid finite element and Brownian dynamics method for diffusion-controlled reactions.

    PubMed

    Bauler, Patricia; Huber, Gary A; McCammon, J Andrew

    2012-04-28

    Diffusion is often the rate determining step in many biological processes. Currently, the two main computational methods for studying diffusion are stochastic methods, such as Brownian dynamics, and continuum methods, such as the finite element method. This paper proposes a new hybrid diffusion method that couples the strengths of each of these two methods. The method is derived for a general multidimensional system, and is presented using a basic test case for 1D linear and radially symmetric diffusion systems.
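
    As a minimal illustration of the two modeling regimes being coupled (not the authors' hybrid scheme itself), the following Python sketch checks that a cloud of Brownian walkers reproduces the analytic continuum solution of 1D free diffusion; all parameter values are arbitrary.

```python
import numpy as np

def brownian_density(n_walkers, D, t, dt, bins, rng):
    """Estimate the 1D free-diffusion density from Brownian walkers
    started at the origin (the stochastic side of a hybrid scheme)."""
    steps = int(t / dt)
    # Each step adds a Gaussian displacement with variance 2*D*dt.
    x = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_walkers, steps)).sum(axis=1)
    hist, edges = np.histogram(x, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist

def analytic_density(x, D, t):
    """Continuum solution of dc/dt = D d2c/dx2 for a point source at x=0."""
    return np.exp(-x**2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

rng = np.random.default_rng(0)
D, t = 1.0, 1.0
centers, est = brownian_density(50_000, D, t, dt=0.01,
                                bins=np.linspace(-5, 5, 51), rng=rng)
exact = analytic_density(centers, D, t)
print(f"max abs density error: {np.max(np.abs(est - exact)):.3f}")
```

    The agreement between the histogram and the Green's function is the basic consistency condition any stochastic/continuum hybrid must satisfy on its coupling interface.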

  15. Allograft update: the current status of tissue regulation, procurement, processing, and sterilization.

    PubMed

    McAllister, David R; Joyce, Michael J; Mann, Barton J; Vangsness, C Thomas

    2007-12-01

    Allografts are commonly used during sports medicine surgical procedures in the United States, and their frequency of use is increasing. Based on surgeon reports, it is estimated that more than 60 000 allografts were used in knee surgeries by members of the American Orthopaedic Society for Sports Medicine in 2005. In the United States, there are governmental agencies and other regulatory bodies involved in the oversight of tissue banks. In 2005, the Food and Drug Administration finalized its requirements for current good tissue practice and has mandated new rules regarding the "manufacture" of allogenic tissue. In response to well-publicized infections associated with the implantation of allograft tissue, some tissue banks have developed methods to sterilize allograft tissue. Although many surgeons have significant concerns about the safety of allografts, the majority believe that sterilized allografts are safe but that the sterilization process negatively affects tissue biology and biomechanics. However, most know very little about the principles of sterilization and the proprietary processes currently used in tissue banking. This article will review the current status of allograft tissue regulation, procurement, processing, and sterilization in the United States.

  16. Identifying failure in a tree network of a parallel computer

    DOEpatents

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.
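
    The abstract does not give the exact combination formula for the test value, so the sketch below is a hypothetical Python rendering of the decision logic only: aggregate measured throughput relative to the predetermined I/O-node figure, compared against the tree performance threshold. All throughput numbers are invented.

```python
def current_test_value(io_perf, test_node_perfs, expected_io_perf):
    """Fold the measured I/O-node and test-compute-node performance and the
    predetermined I/O-node value into a single score (one plausible choice)."""
    measured = io_perf + sum(test_node_perfs)
    expected = expected_io_perf * (1 + len(test_node_perfs))
    return measured / expected

def next_action(io_perf, test_node_perfs, expected_io_perf, threshold):
    value = current_test_value(io_perf, test_node_perfs, expected_io_perf)
    if value < threshold:
        # Below threshold: continue with another set of test compute nodes.
        return "select another set of test compute nodes"
    # Not below threshold: narrow in on individual suspects and their links.
    return "test potential problem nodes and links individually"

print(next_action(90.0, [88.0, 92.0, 40.0], 100.0, 0.80))
print(next_action(95.0, [96.0, 94.0, 97.0], 100.0, 0.80))
```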

  17. Improved Linear Algebra Methods for Redshift Computation from Limited Spectrum Data - II

    NASA Technical Reports Server (NTRS)

    Foster, Leslie; Waagen, Alex; Aijaz, Nabella; Hurley, Michael; Luis, Apolo; Rinsky, Joel; Satyavolu, Chandrika; Gazis, Paul; Srivastava, Ashok; Way, Michael

    2008-01-01

    Given photometric broadband measurements of a galaxy, Gaussian processes may be used with a training set to solve the regression problem of approximating the redshift of this galaxy. However, in practice solving the traditional Gaussian processes equation is too slow and requires too much memory. We employed several methods to avoid this difficulty using algebraic manipulation and low-rank approximation, and were able to quickly approximate the redshifts in our testing data within 17 percent of the known true values using limited computational resources. The accuracy of one method, the V Formulation, is comparable to the accuracy of the best methods currently used for this problem.
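
    The V Formulation itself is not described in the abstract; as a stand-in, the sketch below shows the general low-rank idea with a subset-of-regressors Gaussian process approximation in Python, which reduces the O(n^3) solve to O(n m^2) for m inducing points, on a synthetic regression problem.

```python
import numpy as np

def rbf(X, Z, length=1.0):
    """Squared-exponential kernel matrix between row-sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def sor_gp_predict(Xtr, ytr, Xte, m=50, noise=1e-2, rng=None):
    """Subset-of-regressors low-rank GP prediction: O(n m^2) instead of O(n^3)."""
    rng = rng or np.random.default_rng(0)
    idx = rng.choice(len(Xtr), size=m, replace=False)   # inducing points
    Xm = Xtr[idx]
    Kmn = rbf(Xm, Xtr)                                  # m x n
    Kmm = rbf(Xm, Xm)
    A = noise * Kmm + Kmn @ Kmn.T                       # only an m x m system
    w = np.linalg.solve(A, Kmn @ ytr)
    return rbf(Xte, Xm) @ w

# Toy stand-in for "photometry -> redshift": a smooth synthetic function.
rng = np.random.default_rng(1)
Xtr = rng.uniform(-3, 3, size=(500, 1))
ytr = np.sin(Xtr[:, 0]) + 0.05 * rng.normal(size=500)
Xte = np.linspace(-3, 3, 100)[:, None]
pred = sor_gp_predict(Xtr, ytr, Xte, m=60, rng=rng)
print(f"max abs error vs sin(x): {np.max(np.abs(pred - np.sin(Xte[:, 0]))):.3f}")
```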

  18. Untangling Autophagy Measurements: All Fluxed Up

    PubMed Central

    Gottlieb, Roberta A.; Andres, Allen M.; Sin, Jon; Taylor, David

    2015-01-01

    Autophagy is an important physiological process in the heart, and alterations in autophagic activity can exacerbate or mitigate injury during various pathological processes. Methods to assess autophagy have changed rapidly as the field of research has expanded. As with any new field, methods and standards for data analysis and interpretation evolve as investigators acquire experience and insight. The purpose of this review is to summarize current methods to measure autophagy, selective mitochondrial autophagy (mitophagy), and autophagic flux. We will examine several published studies where confusion arose in data interpretation, in order to illustrate the challenges. Finally, we will discuss methods to assess autophagy in vivo and in patients. PMID:25634973

  19. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in integration project usually communicate by free-format document, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirement and automatically transforming it to executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirement and transform it to integration configuration. In addition, an integration case in radiology scenario was used to verify the method.

  20. Desolvation Induced Origami of Photocurable Polymers by Digit Light Processing.

    PubMed

    Zhao, Zeang; Wu, Jiangtao; Mu, Xiaoming; Chen, Haosen; Qi, H Jerry; Fang, Daining

    2017-07-01

    Self-folding origami is of great interest in current research on functional materials and structures, but there is still a challenge to develop a simple method to create freestanding, reversible, and complex origami structures. This communication provides a feasible solution to this challenge by developing a method based on the digit light processing technique and desolvation-induced self-folding. In this new method, flat polymer sheets can be cured by a light field from a commercial projector with varying intensity, and the self-folding process is triggered by desolvation in water. Folded origami structures can be recovered once immersed in the swelling medium. The self-folding process is investigated both experimentally and theoretically. Diverse 3D origami shapes are demonstrated. This method can be used for responsive actuators and the fabrication of 3D electronic devices. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Automatic tracking of labeled red blood cells in microchannels.

    PubMed

    Pinho, Diana; Lima, Rui; Pereira, Ana I; Gayubo, Fernando

    2013-09-01

    The current study proposes an automatic method for the segmentation and tracking of red blood cells flowing through a 100-μm glass capillary. The original images were obtained by means of a confocal system and then processed in MATLAB using the Image Processing Toolbox. The measurements obtained with the proposed automatic method were compared with the results determined by a manual tracking method. The comparison was performed by using both linear regressions and Bland-Altman analysis. The results have shown a good agreement between the two methods. Therefore, the proposed automatic method is a powerful way to provide rapid and accurate measurements for in vitro blood experiments in microchannels. Copyright © 2012 John Wiley & Sons, Ltd.
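
    The Bland-Altman analysis used for the comparison reduces to computing the mean difference (bias) between paired measurements and the 95% limits of agreement. A minimal sketch, with hypothetical paired values standing in for the automatic and manual tracking results:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics between two measurement methods:
    mean bias and 95% limits of agreement (bias +/- 1.96 sd of differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired measurements (automatic vs manual tracking).
auto   = [310, 295, 402, 288, 350, 330, 299, 415]
manual = [305, 300, 398, 290, 345, 333, 295, 410]
bias, lo, hi = bland_altman(auto, manual)
print(f"bias={bias:.2f}, limits of agreement=({lo:.2f}, {hi:.2f})")
```

    Agreement is judged by whether the differences scatter tightly around a near-zero bias, rather than by correlation alone.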

  2. Nursing Admission Practices to Discern "Fit": A Case Study Exemplar

    ERIC Educational Resources Information Center

    Sinutko, Jaime M.

    2014-01-01

    Admission to a baccalaureate nursing school in the United States is currently a challenging proposition for a variety of reasons. This research explored a holistic nursing school admission process at a small, private, baccalaureate college using a retrospective, mixed-method, approach. The holistic method included multiple admission criteria, both…

  3. The Context Oriented Training Method.

    ERIC Educational Resources Information Center

    Cavrini, Andrea

    The Context Oriented Training (COT) method is introduced and explored in this paper. COT is a means of improving the training process, beginning with the observation and analysis of current corporate experiences in the field. The learning context lies between the development of professional competencies in training and the operational side in the…

  4. A Selection Method That Succeeds!

    ERIC Educational Resources Information Center

    Weitman, Catheryn J.

    Provided a structural selection method is carried out, it is possible to find quality early childhood personnel. The hiring process involves five definite steps, each of which establishes a base for the next. A needs assessment formulating basic minimal qualifications is the first step. The second step involves review of current job descriptions…

  5. Improving the Bandwidth Selection in Kernel Equating

    ERIC Educational Resources Information Center

    Andersson, Björn; von Davier, Alina A.

    2014-01-01

    We investigate the current bandwidth selection methods in kernel equating and propose a method based on Silverman's rule of thumb for selecting the bandwidth parameters. In kernel equating, the bandwidth parameters have previously been obtained by minimizing a penalty function. This minimization process has been criticized by practitioners…

  6. Apparatus and method for controlling the rotary airlocks in a coal processing system by reversing the motor current rotating the air lock

    DOEpatents

    Groombridge, Clifton E.

    1996-01-01

    An improvement to a coal processing system where hard materials found in the coal may cause jamming of either inflow or outflow rotary airlocks, each driven by a reversible motor. The instantaneous current used by the motor is continually monitored and compared to a predetermined value. If an overcurrent condition occurs, indicating a jamming of the airlock, a controller means starts a "soft" reverse rotation of the motor thereby clearing the jamming. Three patterns of the motor reversal are provided.
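
    The control logic amounts to comparing the instantaneous motor current against a predetermined value and commanding a reverse rotation on overcurrent. A hedged Python sketch of that loop (the threshold, sample values and single-reversal behavior are all illustrative; the patent's three reversal patterns are not modeled here):

```python
def airlock_step(motor_current, direction, overcurrent_limit=12.0):
    """Monitor the instantaneous motor current; on overcurrent (a jam),
    command a 'soft' reversal of rotation to clear the jammed material."""
    if abs(motor_current) > overcurrent_limit:
        return -direction, "soft-reverse"   # reverse rotation to clear the jam
    return direction, "run"

samples = [8.1, 8.3, 14.7, 8.0]             # amps; third sample simulates a jam
direction = +1
log = []
for amps in samples:
    direction, state = airlock_step(amps, direction)
    log.append(state)
print(log)
```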

  7. Rehabilitation centers in change: participatory methods for managing redesign and renovation.

    PubMed

    Lahtinen, Marjaana; Nenonen, Suvi; Rasila, Heidi; Lehtelä, Jouni; Ruohomäki, Virpi; Reijula, Kari

    2014-01-01

    The aim of this article is to describe a set of participatory methods that we have either developed or modified for developing future work and service environments to better suit renewable rehabilitation processes. We discuss the methods in a larger framework of change process model and participatory design. Rehabilitation organizations are currently in transition; customer groups, financing, services, and the processes of rehabilitation centers are changing. The pressure for change challenges the centers to develop both their processes and facilities. There is a need for methods that support change management. Four participatory methods were developed: future workshop, change survey, multi-method assessment tool, and participatory design generator cards. They were tested and evaluated in three rehabilitation centers at the different phases of their change process. The developed methods were considered useful in creating a mutual understanding of the change goals between different stakeholders, providing a good picture of the work community's attitudes toward the change, forming an integrated overview of the built and perceived environment, inspiring new solutions, and supporting the management in steering the change process. The change process model described in this article serves as a practical framework that combined the viewpoints of organizational and facility development. However, participatory design continues to face challenges concerning communication between different stakeholders, and further development of the methods and processes is still needed. Intervention studies could provide data on the success factors that enhance the transformations in the rehabilitation sector. Design process, methodology, organizational transformation, planning, renovation.

  8. Method and Process Development of Advanced Atmospheric Plasma Spraying for Thermal Barrier Coatings

    NASA Astrophysics Data System (ADS)

    Mihm, Sebastian; Duda, Thomas; Gruner, Heiko; Thomas, Georg; Dzur, Birger

    2012-06-01

    Over the last few years, global economic growth has triggered a dramatic increase in the demand for resources, resulting in a steady rise in prices for energy and raw materials. In the gas turbine manufacturing sector, optimization of cost-intensive production steps offers significant savings potential and forms the basis for securing future competitive advantages in the market. In this context, the atmospheric plasma spraying (APS) process for thermal barrier coatings (TBC) has been optimized. A constraint for the optimization of the APS coating process is the use of the existing coating equipment. Furthermore, the current coating quality and characteristics must not change, so as to avoid new qualification and testing. Using experience in APS and empirically gained data, the process optimization plan included the variation of, e.g., the plasma gas composition and flow-rate, the electrical power, the arrangement and angle of the powder injectors in relation to the plasma jet, the grain size distribution of the spray powder and the plasma torch movement procedures such as spray distance, offset and iteration. In particular, plasma properties (enthalpy, velocity and temperature), powder injection conditions (injection point, injection speed, grain size and distribution) and the coating lamination (coating pattern and spraying distance) are examined. The optimized process and resulting coating were compared to the current situation using several diagnostic methods. The improved process significantly reduces costs while meeting the requirement of comparable coating quality. Furthermore, a contribution was made towards better comprehension of the APS of ceramics and the definition of a better method for future process developments.

  9. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing and visualizing statistics on shapes like curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids feature-based approaches as well as point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, whose complexity increases as the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
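
    The adapted-basis construction of the paper is not detailed in the abstract; the sketch below instead shows the generic greedy route to a sparse decomposition, matching pursuit over a random unit-norm dictionary, which conveys the flavor of representing a signal with few atoms.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy sparse decomposition of `signal` over unit-norm dictionary
    columns: repeatedly pick the atom most correlated with the residual."""
    residual = signal.copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        corr = dictionary.T @ residual
        k = np.argmax(np.abs(corr))
        coeffs[k] += corr[k]
        residual = residual - corr[k] * dictionary[:, k]
    return coeffs, residual

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 128))
D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
truth = np.zeros(128); truth[[5, 40, 99]] = [2.0, -1.5, 1.0]
signal = D @ truth                             # exactly 3-sparse signal
coeffs, residual = matching_pursuit(signal, D, n_atoms=10)
print(f"residual norm: {np.linalg.norm(residual):.3f} "
      f"(signal norm {np.linalg.norm(signal):.3f})")
```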

  10. Study of switching electric circuits with DC hybrid breaker, one stage

    NASA Astrophysics Data System (ADS)

    Niculescu, T.; Marcu, M.; Popescu, F. G.

    2016-06-01

    The paper presents a method of extinguishing the electric arc that occurs between the contacts of direct current breakers. The method consists of using an LC-type extinguishing group that must be optimally sized. From this point of view, a theoretical treatment of the phenomena that occur immediately after disconnecting the load is presented and the specific diagrams are drawn; from these, the elements of the extinguishing group can be chosen. The second part of the paper presents an analysis of the circuit switching process by decomposing it into particular time sequences. For every time interval, a numerical simulation model was conceived in the MATLAB-SIMULINK environment, which integrates the characteristic differential equation and plots the capacitor voltage variation diagram and the damped circuit current diagram.
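
    The MATLAB-SIMULINK model integrates the characteristic differential equation of the switching circuit; an analogous sketch in Python (with normalized, illustrative component values rather than the paper's) integrates the series RLC equations and tracks the decay of the stored energy:

```python
# Normalized series RLC analogue of an LC extinguishing group with residual
# circuit resistance:  L di/dt = -R*i - v,  C dv/dt = i.
# All component values are illustrative, not taken from the paper.
L_, C_, R_ = 1.0, 1.0, 0.2             # underdamped: R < 2*sqrt(L/C)
dt, steps = 1e-3, 30_000
i, v = 1.0, 0.0                        # interruption begins with 1 A flowing
i_hist = []
for _ in range(steps):
    di = (-R_ * i - v) / L_ * dt       # inductor equation (explicit Euler)
    dv = i / C_ * dt                   # capacitor charged by the current
    i, v = i + di, v + dv
    i_hist.append(i)
energy0 = 0.5 * L_ * 1.0 ** 2
energy = 0.5 * L_ * i ** 2 + 0.5 * C_ * v ** 2
print(f"stored energy decayed from {energy0:.2f} to {energy:.4f} (normalized)")
```

    The sign reversals of the computed current are what an LC extinguishing group exploits in practice: they force a current zero at which the DC arc can be extinguished.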

  11. A practical method of determining water current velocities and diffusion coefficients in coastal waters by remote sensing techniques

    NASA Technical Reports Server (NTRS)

    James, W. P.

    1971-01-01

    A simplified procedure is presented for determining water current velocities and diffusion coefficients. Dye drops which form dye patches in the receiving water are made from an aircraft. The changes in position and size of the patches are recorded from two flights over the area. The simplified data processing procedure requires only that the ground coordinates about the dye patches be determined at the time of each flight. With an automatic recording coordinatograph for measuring coordinates and a computer for processing the data, this technique provides a practical method of determining circulation patterns and mixing characteristics of large aquatic systems. This information is useful in assessing the environmental impact of waste water discharges and for industrial plant siting.
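
    The data reduction described above condenses to two formulas: a drift velocity from the displacement of the patch centroid between flights, and a diffusion coefficient from the growth of the patch variance (using the Fickian relation sigma^2 = 2*D*t per axis, an idealization). A sketch with hypothetical coordinates:

```python
import numpy as np

def drift_and_diffusion(c1, c2, var1, var2, t1, t2):
    """Estimate the current velocity from dye-patch centroid displacement and
    a per-axis diffusion coefficient from patch-variance growth between two
    overflights, assuming Fickian spreading (sigma^2 = 2*D*t per axis)."""
    dt = t2 - t1
    velocity = (np.asarray(c2) - np.asarray(c1)) / dt   # m/s
    diffusion = (var2 - var1) / (2.0 * dt)              # m^2/s per axis
    return velocity, diffusion

# Hypothetical ground coordinates (m), patch variances (m^2) and times (s).
v, D = drift_and_diffusion(c1=(100.0, 200.0), c2=(280.0, 240.0),
                           var1=400.0, var2=1840.0, t1=0.0, t2=1800.0)
print(f"velocity = {v} m/s, D = {D:.2f} m^2/s")
```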

  12. An AK-LDMeans algorithm based on image clustering

    NASA Astrophysics Data System (ADS)

    Chen, Huimin; Li, Xingwei; Zhang, Yongbin; Chen, Nan

    2018-03-01

    Clustering is an effective analytical technique for mining value from unlabeled data; its ultimate goal is to label unclassified data quickly and correctly. We use road-map images from current image processing as the experimental background. In this paper, we propose an AK-LDMeans algorithm that automatically locks the value of K by analyzing the K-cost polyline, and then selects the clustering centers by a long-distance, high-density method that replaces the traditional initial-center selection, further improving the efficiency and accuracy of the traditional K-Means algorithm. The experimental results are compared with those of current clustering algorithms. The algorithm can provide an effective reference in the fields of image processing, machine vision and data mining.
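
    The abstract's two ingredients, automatic selection of K from the bend of a cost polyline and a distance-based choice of initial centers, can be sketched generically in Python. This is an illustrative reconstruction, not the authors' AK-LDMeans implementation; the density criterion is simplified to farthest-point seeding.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd k-means with farthest-point seeding, standing in for the
    paper's long-distance, high-density center selection."""
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):                        # farthest-point init
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    sse = ((X - centers[labels]) ** 2).sum()
    return labels, sse

def elbow_k(X, kmax=8):
    """Pick K at the sharpest bend of the SSE-vs-K polyline
    (largest discrete second difference)."""
    sse = np.array([kmeans(X, k)[1] for k in range(1, kmax + 1)])
    bend = sse[:-2] - 2 * sse[1:-1] + sse[2:]
    return int(np.argmax(bend)) + 2               # bend index 0 is K = 2

rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(m, 0.3, size=(100, 2))
                    for m in ([0, 0], [5, 5], [0, 5])])
print("selected K =", elbow_k(X))
```

    On three well-separated synthetic blobs, the bend of the SSE polyline sits at the true cluster count, which is the behavior the K-cost criterion is meant to automate.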

  13. [Thinking on designation of sham acupuncture in clinical research].

    PubMed

    Pan, Li-Jia; Chen, Bo; Zhao, Xue; Guo, Yi

    2014-01-01

    Randomized controlled trials (RCTs) are the source of the raw data of evidence-based medicine, and blinding is adopted in most high-quality RCTs. Sham acupuncture is the main form of blinding in acupuncture clinical trials. In order to improve the quality of acupuncture clinical trials, and based on the necessity of sham acupuncture in clinical research, the current situation and the existing problems of sham acupuncture are examined, and suggestions are put forward regarding new approaches and new design methods that can be adopted as references, as well as factors that have to be considered during implementation. The various subjective and objective factors involved in the trial process should be taken into account, current international standards should be applied, quantification should be pursued wherever possible, and strict quality monitoring should be carried out.

  14. Listening to Early Career Teachers: How Can Elementary Mathematics Methods Courses Better Prepare Them to Utilize Standards-Based Practices in Their Classrooms?

    ERIC Educational Resources Information Center

    Coester, Lee Anne

    2010-01-01

    This study was designed to gather input from early career elementary teachers with the goal of finding ways to improve elementary mathematics methods courses. Multiple areas were explored including the degree to which respondents' elementary mathematics methods course focused on the NCTM Process Standards, the teachers' current standards-based…

  15. Transfer path analysis: Current practice, trade-offs and consideration of damping

    NASA Astrophysics Data System (ADS)

    Oktav, Akın; Yılmaz, Çetin; Anlaş, Günay

    2017-02-01

    Current practice of experimental transfer path analysis is discussed in the context of trade-offs between accuracy and time cost. An overview of methods that propose solutions for structure-borne noise is given, and the assumptions, drawbacks and advantages of each method are stated theoretically. The applicability of the methods is also investigated, with engine-induced structure-borne noise in an automobile taken as a reference problem. For this particular problem, sources of measurement error, processing operations that affect the results and physical obstacles faced in the application are analysed. While an operational measurement is common to all the stated methods, they differ in whether the source must be removed and whether an external excitation is needed. Depending on the chosen method, promised outcomes such as independent characterisation of the source or information about the mounts also differ. Although many aspects of the problem are reported in the literature, damping and its effects are not considered. The damping effect is embedded in the measured complex frequency response functions and needs to be analysed in the post-processing step. The effects of damping, the reasons for them and methods to analyse them are discussed in detail. In this regard, a new procedure, which increases the accuracy of the results, is also proposed.

  16. From Discovery to Production: Biotechnology of Marine Fungi for the Production of New Antibiotics

    PubMed Central

    Silber, Johanna; Kramer, Annemarie; Labes, Antje; Tasdemir, Deniz

    2016-01-01

    Filamentous fungi are well known for their capability of producing antibiotic natural products. Recent studies have demonstrated the potential of antimicrobials with vast chemodiversity from marine fungi. Development of such natural products into lead compounds requires sustainable supply. Marine biotechnology can significantly contribute to the production of new antibiotics at various levels of the process chain including discovery, production, downstream processing, and lead development. However, the number of biotechnological processes described for large-scale production from marine fungi lags far behind the number of newly discovered natural antibiotics. Methods and technologies applied in marine fungal biotechnology largely derive from analogous terrestrial processes and rarely reflect the specific demands of the marine fungi. The current developments in metabolic engineering and marine microbiology are not yet transferred into processes, but offer numerous options for improvement of production processes and establishment of new process chains. This review summarises the current state in biotechnological production of marine fungal antibiotics and points out the enormous potential of biotechnology in all stages of the discovery-to-development pipeline. At the same time, the literature survey reveals that more biotechnology transfer and method developments are needed for a sustainable and innovative production of marine fungal antibiotics. PMID:27455283

  17. Validation of alternative methods for toxicity testing.

    PubMed Central

    Bruner, L H; Carr, G J; Curren, R D; Chamberlain, M

    1998-01-01

    Before nonanimal toxicity tests may be officially accepted by regulatory agencies, it is generally agreed that the validity of the new methods must be demonstrated in an independent, scientifically sound validation program. Validation has been defined as the demonstration of the reliability and relevance of a test method for a particular purpose. This paper provides a brief review of the development of the theoretical aspects of the validation process and updates current thinking about objectively testing the performance of an alternative method in a validation study. Validation of alternative methods for eye irritation testing is a specific example illustrating important concepts. Although discussion focuses on the validation of alternative methods intended to replace current in vivo toxicity tests, the procedures can be used to assess the performance of alternative methods intended for other uses. PMID:9599695

  18. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA, a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation, unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820
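    The spike-event-based adaptation described above can be illustrated with a toy log-linear Poisson point-process decoder, in which parameters move a small step at every time bin rather than after a batch of trials. This is a minimal sketch under assumed names and a made-up intensity model, not the authors' algorithm:

```python
import math

def intensity(beta, v):
    """Conditional intensity of a log-linear point-process neuron model:
    lambda = exp(beta0 + beta1 * v), where v is a kinematic covariate."""
    return math.exp(beta[0] + beta[1] * v)

def spike_event_update(beta, spike_counts, velocities, dt=0.001, lr=0.05):
    """Adapt decoder parameters at every small time bin (the spike-event
    time-scale) using the Poisson log-likelihood gradient
    (n_k - lambda_k * dt) * x_k, instead of batch-based updates."""
    b0, b1 = beta
    for n_k, v_k in zip(spike_counts, velocities):
        lam = math.exp(b0 + b1 * v_k)
        err = n_k - lam * dt   # observed minus expected spikes in the bin
        b0 += lr * err
        b1 += lr * err * v_k
    return [b0, b1]
```

    Because the update happens per bin, parameters track changes much faster than a method that waits to accumulate a whole batch of trials.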

  19. Defining Support Requirements During Conceptual Design of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Morris, W. D.; White, N. H.; Davis, W. T.; Ebeling, C. E.

    1995-01-01

    Current methods for defining the operational support requirements of new systems are data intensive and require significant design information. Methods that work with the level of information available during the conceptual design phase are being developed to aid in defining support requirements for new launch vehicles. These methods will provide support assessments based on the vehicle design and the operating scenarios. The results can be used both to define expected support requirements for new launch vehicle designs and to help evaluate the benefits of using new technologies. This paper describes the models, their current status, and provides examples of their use.

  20. Polymer flammability

    DOT National Transportation Integrated Search

    2005-05-01

    This report provides an overview of polymer flammability from a material science perspective and describes currently accepted test methods to quantify burning behavior. Simplifying assumptions about the gas and condensed phase processes of flaming co...

  1. Assessment of the GHG Reduction Potential from Energy Crops Using a Combined LCA and Biogeochemical Process Models: A Review

    PubMed Central

    Jiang, Dong; Hao, Mengmeng; Wang, Qiao; Huang, Yaohuan; Fu, Xinyu

    2014-01-01

    The main purpose of developing biofuel is to reduce greenhouse gas (GHG) emissions, but the comprehensive environmental impact of such fuels is not clear. Life cycle analysis (LCA), as a comprehensive analysis method, has been widely used in bioenergy assessment studies. Great efforts have been directed toward establishing an efficient method for comprehensively estimating the GHG emission reduction potential from the large-scale cultivation of energy plants by combining LCA with ecosystem/biogeochemical process models. LCA presents a general framework for evaluating the energy consumption and GHG emissions from energy crop planting, yield acquisition, production, product use, and postprocessing. Meanwhile, ecosystem/biogeochemical process models are adopted to simulate the fluxes and storage of energy, water, carbon, and nitrogen in the soil-plant (energy crops) soil continuum. Although clear progress has been made in recent years, some problems still exist in current studies and should be addressed. This paper reviews state-of-the-art methods for estimating GHG emission reduction through developing energy crops and introduces in detail a new approach for assessing GHG emission reduction by combining LCA with biogeochemical process models. The main achievements of this study along with the problems in current studies are described and discussed. PMID:25045736

  2. Discriminative feature representation: an effective postprocessing solution to low dose CT imaging

    NASA Astrophysics Data System (ADS)

    Chen, Yang; Liu, Jin; Hu, Yining; Yang, Jian; Shi, Luyao; Shu, Huazhong; Gui, Zhiguo; Coatrieux, Gouenou; Luo, Limin

    2017-03-01

    This paper proposes a concise and effective approach termed discriminative feature representation (DFR) for low dose computerized tomography (LDCT) image processing, currently a challenging problem in the medical imaging field. The DFR method models LDCT images as the superposition of desirable high dose CT (HDCT) 3D features and undesirable noise-artifact 3D features (the combined noise and artifact features induced by low dose scan protocols), and the decomposed HDCT features are used to provide processed LDCT images of higher quality. The target HDCT features are solved for via the DFR algorithm using a featured dictionary composed of atoms representing HDCT features and noise-artifact features. In this study, the featured dictionary is efficiently built using physical phantom images collected from the same CT scanner as the target clinical LDCT images. The proposed DFR method also has good robustness in parameter setting for different CT scanner types. It can be directly applied to process DICOM formatted LDCT images and has good applicability to current CT systems. Comparative experiments with abdomen LDCT data validate the good performance of the proposed approach. This research was supported by the National Natural Science Foundation under grants (81370040, 81530060), the Fundamental Research Funds for the Central Universities, and the Qing Lan Project in Jiangsu Province.
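    The decomposition idea behind DFR, splitting an LDCT patch into a component explained by HDCT-feature atoms plus a noise-artifact residual, can be shown with a toy orthonormal dictionary. This is purely illustrative; the paper's dictionary construction and solver are not reproduced here:

```python
def project(v, atoms):
    """Sum of projections of vector v onto each (assumed orthonormal) atom."""
    out = [0.0] * len(v)
    for a in atoms:
        c = sum(x * y for x, y in zip(v, a))     # coefficient for this atom
        out = [o + c * x for o, x in zip(out, a)]
    return out

def dfr_denoise(patch, hdct_atoms):
    """Keep only the part of an LDCT patch explained by HDCT-feature atoms;
    the residual is treated as noise-artifact content."""
    kept = project(patch, hdct_atoms)
    residual = [p - k for p, k in zip(patch, kept)]
    return kept, residual
```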

  3. Video enhancement method with color-protection post-processing

    NASA Astrophysics Data System (ADS)

    Kim, Youn Jin; Kwak, Youngshin

    2015-01-01

    The current study proposes a post-processing method for video enhancement that adopts a color-protection technique. Color-protection attenuates perceptible artifacts due to over-enhancement in visually sensitive image regions such as low-chroma colors, including skin and gray objects. In addition, it reduces the loss in color texture caused by out-of-color-gamut signals. Consequently, the color reproducibility of video sequences can be remarkably enhanced while undesirable visual exaggerations are minimized.

  4. Catalysts and methods for converting carbonaceous materials to fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hensley, Jesse; Ruddy, Daniel A.; Schaidle, Joshua A.

    This disclosure relates to catalysts and processes designed to convert DME and/or methanol and hydrogen (H2) to desirable liquid fuels. These catalysts produce the fuels efficiently and with a high selectivity and yield, and reduce the formation of aromatic hydrocarbons by incorporating H2 into the products. This disclosure also describes process methods to further upgrade these fuels to higher molecular weight liquid fuel mixtures, which have physical properties comparable with current commercially used liquid fuels.

  5. Methods of measurement for semiconductor materials, process control, and devices

    NASA Technical Reports Server (NTRS)

    Bullis, W. M. (Editor)

    1971-01-01

    The development of methods of measurement for semiconductor materials, process control, and devices is discussed. The following subjects are also presented: (1) demonstration of the high sensitivity of the infrared response technique by the identification of gold in a germanium diode, (2) verification that transient thermal response is significantly more sensitive to the presence of voids in die attachment than steady-state thermal resistance, and (3) development of equipment for determining susceptibility of transistors to hot spot formation by the current-gain technique.

  6. Evaluating and Improving the Mathematics Teaching-Learning Process through Metacognition

    ERIC Educational Resources Information Center

    Desoete, Annemie

    2007-01-01

    Introduction: Despite all the emphasis on metacognition, researchers currently use different techniques to assess metacognition. The purpose of this contribution is to help to clarify some of the paradigms on the evaluation of metacognition. In addition the paper reviews studies aiming to improve the learning process through metacognition. Method:…

  7. Sensor fault-tolerant control for gear-shifting engaging process of automated manual transmission

    NASA Astrophysics Data System (ADS)

    Li, Liang; He, Kai; Wang, Xiangyu; Liu, Yahui

    2018-01-01

    The angular displacement sensor on the actuator of an automated manual transmission (AMT) is sensitive to faults, and a sensor fault disturbs normal control, affecting the entire gear-shifting process of the AMT and degrading riding comfort. To solve this problem, this paper proposes a fault-tolerant control method for the AMT gear-shifting engaging process. Using the measured current of the actuator motor and the angular displacement of the actuator, the gear-shifting engaging load torque table is built and updated before the occurrence of the sensor fault. Meanwhile, the residual between estimated and measured angular displacements is used to detect the sensor fault. Once the residual exceeds a determined fault threshold, the sensor fault is detected. Then, switch control is triggered, and the current observer and load torque table estimate the actual gear-shifting position to replace the measured one and continue controlling the gear-shifting process. Numerical and experimental tests are carried out to evaluate the reliability and feasibility of the proposed method, and the results show that the performance of estimation and control is satisfactory.
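    The residual-based fault detection and switch control described above can be sketched in a few lines. The function name and threshold are hypothetical, not from the paper:

```python
def fault_tolerant_position(measured, estimated, threshold):
    """Return the position to use for control, plus a fault flag.
    If the residual between the observer estimate and the sensor
    measurement exceeds the threshold, the sensor is declared faulty
    and the estimate replaces the measurement (switch control)."""
    residual = abs(estimated - measured)
    if residual > threshold:
        return estimated, True    # sensor fault: use observer estimate
    return measured, False        # sensor healthy: trust the sensor
```

    In the paper's scheme the estimate itself comes from the motor-current observer and the load torque table; here it is simply a function argument.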

  8. Potential Alternatives Report for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel

    NASA Technical Reports Server (NTRS)

    Lewis, Pattie

    2006-01-01

    For this project, particulates and solvents used during the depainting process of steel structures were the identified hazardous materials (HazMat) targeted for elimination or reduction. This Potential Alternatives Report (PAR) provides technical analyses of identified alternatives to the current coating removal processes, the criteria used to select alternatives for further analysis, and a list of those alternatives recommended for testing. The initial coating removal alternatives list was compiled using literature searches and center participant recommendations. The involved project participants initially considered fifteen (15) alternatives. In late 2004, stakeholders down-selected the list and identified specific processes as potential alternatives to the current depainting methods. The selected alternatives were: (1) plastic blast media; (2) hard abrasive media; (3) sponge blast media; (4) mechanical removal with vacuum attachment; (5) liquid nitrogen; and (6) laser coating removal. Available information about these processes was used to analyze the technical merits and the potential environmental, safety, and occupational health (ESOH) impacts of these methods. A preliminary cost benefit analysis will be performed to determine whether implementation of alternative technologies is economically justified.

  9. Low cost hydrogen/novel membrane technology for hydrogen separation from synthesis gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-02-01

    To make the coal-to-hydrogen route economically attractive, improvements are being sought in each step of the process: coal gasification, the water-carbon monoxide shift reaction, and hydrogen separation. This report addresses the use of membranes in the hydrogen separation step. The separation of hydrogen from synthesis gas is a major cost element in the manufacture of hydrogen from coal. Separation by membranes is an attractive, new, and still largely unexplored approach to the problem. Membrane processes are inherently simple and efficient and often have lower capital and operating costs than conventional processes. In this report, current and future trends in hydrogen production and use are first summarized. Methods of producing hydrogen from coal are then discussed, with particular emphasis on the Texaco entrained flow gasifier and on current methods of separating hydrogen from this gas stream. The potential for membrane separations in the process is then examined. In particular, the use of membranes for H2/CO2, H2/CO, and H2/N2 separations is discussed. 43 refs., 14 figs., 6 tabs.

  10. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include a computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focussed on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  11. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include a computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for the increased understanding of the physical processes governing ice accretion, ice shedding, and iced aerodynamics is examined.

  12. Conventional and Innovative Processing of Milk for Yogurt Manufacture; Development of Texture and Flavor: A Review.

    PubMed

    Sfakianakis, Panagiotis; Tzia, Constatnina

    2014-03-11

    Milk and yogurt are important elements of the human diet, due to their high nutritional value and their appealing sensory properties. During milk processing (homogenization, pasteurization) and further yogurt manufacture (fermentation) physicochemical changes occur that affect the flavor and texture of these products while the development of standardized processes contributes to the development of desirable textural and flavor characteristics. The processes that take place during milk processing and yogurt manufacture with conventional industrial methods, as well as with innovative methods currently proposed (ultra-high pressure, ultrasound, microfluidization, pulsed electric fields), and their effect on the texture and flavor of the final conventional or probiotic/prebiotic products will be presented in this review.

  13. Conventional and Innovative Processing of Milk for Yogurt Manufacture; Development of Texture and Flavor: A Review

    PubMed Central

    Sfakianakis, Panagiotis; Tzia, Constatnina

    2014-01-01

    Milk and yogurt are important elements of the human diet, due to their high nutritional value and their appealing sensory properties. During milk processing (homogenization, pasteurization) and further yogurt manufacture (fermentation) physicochemical changes occur that affect the flavor and texture of these products while the development of standardized processes contributes to the development of desirable textural and flavor characteristics. The processes that take place during milk processing and yogurt manufacture with conventional industrial methods, as well as with innovative methods currently proposed (ultra-high pressure, ultrasound, microfluidization, pulsed electric fields), and their effect on the texture and flavor of the final conventional or probiotic/prebiotic products will be presented in this review. PMID:28234312

  14. A Study of the Air Force’s Exception Management Process: Its Effect on Customer Service and Order Processing

    DTIC Science & Technology

    1991-09-01

    The purpose of this study was two-fold: the first goal was to determine what the SBSS order processing is, and the second to determine the effect the current method of ECC management has on the SBSS order processing cycle and the level of customer service rendered by base supply. The research revealed that exception management is a crucial component of a successful order processing function. Further, it was established that the level of customer

  15. Dual-Energy CT: New Horizon in Medical Imaging

    PubMed Central

    Goo, Jin Mo

    2017-01-01

    Dual-energy CT has remained underutilized over the past decade probably due to a cumbersome workflow issue and current technical limitations. Clinical radiologists should be made aware of the potential clinical benefits of dual-energy CT over single-energy CT. To accomplish this aim, the basic principle, current acquisition methods with advantages and disadvantages, and various material-specific imaging methods as clinical applications of dual-energy CT should be addressed in detail. Current dual-energy CT acquisition methods include dual tubes with or without beam filtration, rapid voltage switching, dual-layer detector, split filter technique, and sequential scanning. Dual-energy material-specific imaging methods include virtual monoenergetic or monochromatic imaging, effective atomic number map, virtual non-contrast or unenhanced imaging, virtual non-calcium imaging, iodine map, inhaled xenon map, uric acid imaging, automatic bone removal, and lung vessels analysis. In this review, we focus on dual-energy CT imaging including related issues of radiation exposure to patients, scanning and post-processing options, and potential clinical benefits mainly to improve the understanding of clinical radiologists and thus, expand the clinical use of dual-energy CT; in addition, we briefly describe the current technical limitations of dual-energy CT and the current developments of photon-counting detector. PMID:28670151
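    Several of the material-specific methods listed above (iodine maps, virtual non-contrast imaging) rest on two-material decomposition: attenuation measured at two energies yields a 2x2 linear system in the basis-material amounts. A minimal sketch with hypothetical attenuation coefficients:

```python
def material_decomposition(mu_low, mu_high, atten):
    """Solve the 2x2 linear system
        mu_low  = a11*t1 + a12*t2
        mu_high = a21*t1 + a22*t2
    for basis-material amounts t1, t2 (e.g. water and iodine).
    `atten` holds the basis-material attenuation coefficients at the
    low and high energies; the values used here are illustrative."""
    (a11, a12), (a21, a22) = atten
    det = a11 * a22 - a12 * a21          # assumed non-singular basis
    t1 = (mu_low * a22 - a12 * mu_high) / det
    t2 = (a11 * mu_high - mu_low * a21) / det
    return t1, t2
```

    Applied per voxel, t2 alone gives an iodine-like map, while setting it to zero and re-synthesizing gives a virtual non-contrast image.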

  16. H2S mediated thermal and photochemical methane activation

    PubMed Central

    Baltrusaitis, Jonas; de Graaf, Coen; Broer, Ria; Patterson, Eric

    2013-01-01

    Sustainable, low temperature methods of natural gas activation are critical in addressing current and foreseeable energy and hydrocarbon feedstock needs. Large portions of natural gas resources are still too expensive to process due to their high content of hydrogen sulfide gas (H2S) in mixture with methane, CH4, altogether deemed as sub-quality or “sour” gas. We propose a unique method for activating this “sour” gas to form a mixture of sulfur-containing hydrocarbon intermediates, CH3SH and CH3SCH3, and an energy carrier, such as H2. For this purpose, we computationally investigated H2S mediated methane activation to form a reactive CH3SH species via direct photolysis of sub-quality natural gas. Photoexcitation of hydrogen sulfide in the CH4+H2S complex results in a barrier-less relaxation via a conical intersection to form a ground state CH3SH+H2 complex. The resulting CH3SH can further be heterogeneously coupled over acidic catalysts to form higher hydrocarbons while the H2 can be used as a fuel. This process is very different from a conventional thermal or radical-based processes and can be driven photolytically at low temperatures, with enhanced controllability over the process conditions currently used in industrial oxidative natural gas activation. Finally, the proposed process is CO2 neutral, as opposed to the currently industrially used methane steam reforming (SMR). PMID:24150813

  17. Automotive Marketing Methods and Practice

    DOT National Transportation Integrated Search

    1979-09-01

    The report is a comprehensive examination of the current marketing practices, marketing methodologies, and decision-making processes utilized by the domestic automotive industry. The various marketing elements, such as products, consumer behavior, sa...

  18. A systematic review and appraisal of the quality of practice guidelines for the management of Neisseria gonorrhoeae infections.

    PubMed

    Dickson, Catherine; Arnason, Trevor; Friedman, Dara Spatz; Metz, Gila; Grimshaw, Jeremy M

    2017-11-01

    Background: Clinical guidelines help ensure consistent care informed by current evidence. As shifts in antimicrobial resistance continue to influence first-line treatment, up-to-date guidelines are important for preventing treatment failure. A guideline's development process will influence its recommendations and users' trust. Objective: To assess the quality of current gonorrhoea guidelines' development processes. Data sources: Multiple databases. Eligibility criteria: Original and current English-language guidelines targeting health professionals and containing treatment recommendations for uncomplicated gonorrhoea in the general adult population. Appraisal: Two appraisers assessed the guidelines independently using the Appraisal of Guidelines for Research and Evaluation II (AGREE II) tool. Scores were combined as per the AGREE II users' manual. Results: We identified 10 guidelines meeting the inclusion criteria. The quality of the gonorrhoea treatment guidelines varied. Most scored poorly on Rigour of Development; information on the evidence review process and methods for formulating recommendations was often missing. The WHO Guidelines for the Treatment of Neisseria gonorrhoeae and the UK National Guideline for the Management of Gonorrhoea in Adults scored the highest on Rigour of Development. Methods to address conflicts of interest were often not described in the materials reviewed. Implementation of recommendations was often not addressed. Limitations: By limiting our study to English-language guidelines, a small number of guidelines we identified were excluded. Our analysis was limited to published or online materials that were readily available to users. We could not differentiate between items addressed in the development process but not documented and items that were not addressed. Conclusions: Gonorrhoea treatment guidelines may slow antimicrobial resistance. Many current guidelines are not in line with current guideline development best practices; this might undermine the perceived trustworthiness of guidelines. By identifying current limitations, this study can help improve the quality of future guidelines.
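    The AGREE II scaled domain scores used in such appraisals follow the formula from the AGREE II users' manual: (obtained score minus minimum possible score) divided by (maximum possible minus minimum possible), where each item is rated 1 to 7 by each appraiser. A minimal sketch (returning a fraction rather than a percentage):

```python
def agree_ii_domain_score(scores):
    """Scaled AGREE II domain score from per-appraiser item ratings.
    `scores` is a list of rating lists (one per appraiser), each item
    rated 1 (strongly disagree) to 7 (strongly agree)."""
    n_appraisers = len(scores)
    n_items = len(scores[0])
    obtained = sum(sum(s) for s in scores)
    min_possible = 1 * n_items * n_appraisers
    max_possible = 7 * n_items * n_appraisers
    return (obtained - min_possible) / (max_possible - min_possible)
```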

  19. Research on Finite Element Model Generating Method of General Gear Based on Parametric Modelling

    NASA Astrophysics Data System (ADS)

    Lei, Yulong; Yan, Bo; Fu, Yao; Chen, Wei; Hou, Liguo

    2017-06-01

    The quality and efficiency of gear meshing in current mainstream finite element software are poor. By establishing a universal three-dimensional gear model and exploring the rules of element and node arrangement, this paper proposes a parameterization-based finite element model generation method for general gears. A Visual Basic program carries out the finite element meshing, assigns material properties, and sets the boundary/load conditions and other pre-processing work. A dynamic meshing analysis of the gears is carried out with the method proposed in this paper and compared with calculated values to verify the correctness of the method. The method greatly reduces the workload of gear finite element pre-processing, improves the quality of the gear mesh, and provides a new idea for FEM pre-processing.
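    The parametric element and node arrangement rules mentioned above can be illustrated with a toy structured quad-mesh generator. This is a hypothetical stand-in; the paper's rules for actual gear geometry are far more involved:

```python
def quad_grid(nx, ny):
    """Parametric generation of node ids and quad connectivity for an
    nx-by-ny structured patch: nodes are numbered row by row, and each
    element records its four corner node ids."""
    nodes = [(i, j) for j in range(ny + 1) for i in range(nx + 1)]
    elems = []
    for j in range(ny):
        for i in range(nx):
            n0 = j * (nx + 1) + i
            elems.append((n0, n0 + 1, n0 + nx + 2, n0 + nx + 1))
    return nodes, elems
```

    Each quad lists its corner nodes counter-clockwise, the convention most FEM pre-processors expect.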

  20. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research completed by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be made available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  1. A green preparation method of battery grade α-PbO based on Pb-O2 fuel cell

    NASA Astrophysics Data System (ADS)

    Wang, Pingyuan; Pan, Junqing; Gong, Shumin; Sun, Yanzhi

    2017-08-01

    To solve the problem of the high pollution and high energy consumption of current lead oxide (PbO) preparation processes, a new clean and energy-saving preparation method for high purity α-PbO via discharge of a Pb-O2 fuel cell is reported. The fuel cell, with a metallic lead anode, an oxygen cathode, and 30% NaOH electrolyte, provides a discharge voltage of 0.66-0.38 V over a discharge current range of 5-50 mA cm-2. PbO is precipitated from the NaHPbO2-containing electrolyte through a cooling crystallization process after the discharge process, and XRD patterns indicate that the structure is pure α-PbO. The mother liquid after crystallization can be recycled for the next batch. The obtained PbO mixed with 60% Shimadzu PbO is superior to pure Shimadzu PbO in discharge capacity and cycling ability.

  2. High performance MoS2 TFT using graphene contact first process

    NASA Astrophysics Data System (ADS)

    Chang Chien, Chih-Shiang; Chang, Hsun-Ming; Lee, Wei-Ta; Tang, Ming-Ru; Wu, Chao-Hsin; Lee, Si-Chen

    2017-08-01

    An ohmic contact of graphene/MoS2 heterostructure is determined by using ultraviolet photoelectron spectroscopy (UPS). Since graphene shows a great potential to replace metal contact, a direct comparison of Cr/Au contact and graphene contact on the MoS2 thin film transistor (TFT) is made. Different from metal contacts, the work function of graphene can be modulated. As a result, the subthreshold swing can be improved. And when Vg

  3. H2S-mediated thermal and photochemical methane activation.

    PubMed

    Baltrusaitis, Jonas; de Graaf, Coen; Broer, Ria; Patterson, Eric V

    2013-12-02

    Sustainable, low-temperature methods for natural gas activation are critical in addressing current and foreseeable energy and hydrocarbon feedstock needs. Large portions of natural gas resources are still too expensive to process due to their high content of hydrogen sulfide gas (H2S) mixed with methane, deemed altogether as sub-quality or "sour" gas. We propose a unique method of activation to form a mixture of sulfur-containing hydrocarbon intermediates, CH3SH and CH3SCH3, and an energy carrier such as H2. For this purpose, we investigated the H2S-mediated methane activation to form a reactive CH3SH species by means of direct photolysis of sub-quality natural gas. Photoexcitation of hydrogen sulfide in the CH4 + H2S complex resulted in a barrierless relaxation by a conical intersection to form a ground-state CH3SH + H2 complex. The resulting CH3SH could further be coupled over acidic catalysts to form higher hydrocarbons, and the resulting H2 used as a fuel. This process is very different from conventional thermal or radical-based processes and can be driven photolytically at low temperatures, with enhanced control over the conditions currently used in industrial oxidative natural gas activation. Finally, the proposed process is CO2 neutral, as opposed to the current industrial steam methane reforming (SMR).

  4. Method for conducting electroless metal-plating processes

    DOEpatents

    Petit, George S.; Wright, Ralph R.

    1978-01-01

    This invention is an improved method for conducting electroless metal-plating processes in a metal tank which is exposed to the plating bath. The invention solves a problem commonly encountered in such processes: how to determine when it is advisable to shut down the process in order to clean and/or re-passivate the tank. The new method comprises contacting the bath with a current-conducting, non-catalytic probe and, during plating operations, monitoring the gradually changing difference in electropotential between the probe and tank. It has been found that the value of this voltage is indicative of the extent to which nickel-bearing decomposition products accumulate on the tank. By utilizing the voltage to determine when shutdown for cleaning is advisable, the operator can avoid premature shutdown and at the same time avoid prolonging operations to the point that spontaneous decomposition occurs.
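    The monitoring rule in this patent abstract, watching the drifting probe-tank potential and stopping for cleaning once it crosses a limit, can be sketched as follows. The function name and limit are hypothetical; the appropriate shutdown voltage is bath-specific:

```python
def should_shutdown(probe_tank_voltages, shutdown_voltage):
    """Scan a time series of probe-tank potential differences and return
    the index of the first reading at or above the shutdown limit, or
    None if the bath can keep operating."""
    for t, v in enumerate(probe_tank_voltages):
        if v >= shutdown_voltage:
            return t      # stop here for cleaning/re-passivation
    return None           # no shutdown needed yet
```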

  5. Method for surface passivation and protection of cadmium zinc telluride crystals

    DOEpatents

    Mescher, Mark J.; James, Ralph B.; Schlesinger, Tuviah E.; Hermon, Haim

    2000-01-01

    A method for reducing the leakage current in CZT crystals, particularly Cd1-xZnxTe crystals (where x is greater than or equal to zero and less than or equal to 0.5), and preferably Cd0.9Zn0.1Te crystals, thereby enhancing the ability of these crystals to spectrally resolve radiological emissions from a wide variety of radionuclides. Two processes are disclosed. The first method provides for depositing, via reactive sputtering, a silicon nitride hard-coat overlayer which provides a significant reduction in surface leakage currents. The second method enhances the passivation by oxidizing the CZT surface with an oxygen plasma prior to silicon nitride deposition, without breaking the vacuum state.

  6. Production of Titanium Metal by an Electrochemical Molten Salt Process

    NASA Astrophysics Data System (ADS)

    Fatollahi-Fard, Farzin

    Titanium production is a long and complicated process. The standard method of primary titanium production (the Kroll process) involves many complex steps, both before and after the reduction step, to make a useful product from titanium ore. Thus new methods of titanium production, especially electrochemical processes that can utilize less-processed feedstocks, have the potential to be both cheaper and less energy intensive than current titanium production processes. This project investigates the use of lower-grade titanium ores with the electrochemical MER process for making titanium via a molten salt process. The experimental work has investigated making the MER process feedstock (titanium oxycarbide) from natural titanium ores--such as rutile and ilmenite--and new ways of using the MER electrochemical reactor to "upgrade" titanium ores or the titanium oxycarbide feedstock. It is feasible to use the existing MER electrochemical reactor both to purify the titanium oxycarbide feedstock and to produce titanium metal.

  7. A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.

    2017-03-01

    Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.
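    The full factorial baseline that RODEo is compared against grows multiplicatively with the number of factors and levels, which is why halving the experiment count matters. A small illustration with hypothetical plasma parameter levels:

```python
from itertools import product

def full_factorial(levels):
    """Enumerate a full factorial DOE: one run per combination of factor
    levels (e.g. pressure, power, and gas flow for a plasma etch)."""
    return list(product(*levels))

# Three parameters at three levels each -> 3**3 = 27 runs; a model-based
# sequential approach like the one described aims to reach comparable
# predictions with a fraction of these runs.
design = full_factorial([[10, 20, 30], [50, 100, 150], [5, 10, 15]])
```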

  8. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on the challenges faced by longitudinal quantitative analyses of school improvement processes and to offer a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed the designs and methods that are used to analyze the relation between school…

  9. e-Research and Learning Theory: What Do Sequence and Process Mining Methods Contribute?

    ERIC Educational Resources Information Center

    Reimann, Peter; Markauskaite, Lina; Bannert, Maria

    2014-01-01

    This paper discusses the fundamental question of how data-intensive e-research methods could contribute to the development of learning theories. Using methodological developments in research on self-regulated learning as an example, it argues that current applications of data-driven analytical techniques, such as educational data mining and its…

  10. An Activity Model for Scientific Inquiry

    ERIC Educational Resources Information Center

    Harwood, William

    2004-01-01

    Most people are frustrated with the current scientific method presented in textbooks. The scientific method--a simplistic model of the scientific inquiry process--fails in most cases to provide a successful guide to how science is done. This is not shocking, really. Many simple models used in science are quite useful within their limitations. When…

  11. Does Project-Based Learning Enhance Iranian EFL Learners' Vocabulary Recall and Retention?

    ERIC Educational Resources Information Center

    Shafaei, Azadeh; Rahim, Hajar Abdul

    2015-01-01

    Vocabulary knowledge is an integral part of second/foreign language learning. Thus, using teaching methods that can help learners retain and expand their vocabulary knowledge is necessary to facilitate the language learning process. The current research investigated the effectiveness of an interactive classroom method, known as Project-Based…

  12. Developing Carbon Nanotube Standards at NASA

    NASA Technical Reports Server (NTRS)

    Nikolaev, Pasha; Arepalli, Sivaram; Sosa, Edward; Gorelik, Olga; Yowell, Leonard

    2007-01-01

    Single wall carbon nanotubes (SWCNTs) are currently being produced and processed by several methods. Many researchers are continuously modifying existing methods and developing new ones to incorporate carbon nanotubes into other materials and utilize the phenomenal properties of SWCNTs. These applications require the availability of SWCNTs with known properties, and there is a need to characterize these materials in a consistent manner. In order to monitor such progress, it is critical to establish a means of defining the quality of SWCNT material and to develop characterization standards to evaluate nanotube quality across the board. Such characterization standards should be applicable to as-produced materials as well as processed SWCNT materials. To address this issue, NASA Johnson Space Center has developed a protocol for purity and dispersion characterization of SWCNTs (Ref.1). The NASA JSC group is currently working with NIST, ANSI and ISO to establish purity and dispersion standards for SWCNT material. A practice guide for nanotube characterization is being developed in cooperation with NIST (Ref.2). Furthermore, work is in progress to incorporate additional characterization methods for electrical, mechanical, thermal, optical and other properties of SWCNTs.

  14. The Conceptual Landscape of iSchools: Examining Current Research Interests of Faculty Members

    ERIC Educational Resources Information Center

    Holmberg, Kim

    2013-01-01

    Introduction: This study describes the intellectual landscape of iSchools and examines how the various iSchools map on to these research areas. Method: The primary focus of the data collection process was on faculty members' current research interests as described by the individuals themselves. A co-word analysis of all iSchool faculty…

  15. Cochrane Qualitative and Implementation Methods Group guidance series-paper 6: reporting guidelines for qualitative, implementation, and process evaluation evidence syntheses.

    PubMed

    Flemming, Kate; Booth, Andrew; Hannes, Karin; Cargo, Margaret; Noyes, Jane

    2018-05-01

    To outline contemporary and novel developments in the presentation and reporting of syntheses of qualitative, implementation, and process evaluation evidence and to provide recommendations for the use of reporting guidelines. An overview of reporting guidelines for qualitative, implementation, and process evaluation evidence syntheses drawing on current international literature and the collective expert knowledge of the Cochrane Qualitative and Implementation Methods Group. Several reporting guidelines exist that can be used or adapted to report syntheses of qualitative, implementation, and process evaluation evidence. The methods used to develop individual guidance varied. The use of a relevant reporting guideline can enhance the transparency, consistency, and quality of reporting. Existing guidelines are generic, method specific, or cover particular aspects of the reviewing process, such as searching. Caution is expressed over the potential for reporting guidelines to produce a mechanistic approach, moving the focus away from the content and toward the procedural aspects of the review. The use of a reporting guideline is recommended, and a five-step decision flowchart to guide the choice of reporting guideline is provided. Gaps remain in method-specific reporting guidelines, such as for mixed-study, implementation, and process evaluation evidence syntheses. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. A Focusing Method in the Calibration Process of Image Sensors Based on IOFBs

    PubMed Central

    Fernández, Pedro R.; Lázaro, José L.; Gardel, Alfredo; Cano, Ángel E.; Bravo, Ignacio

    2010-01-01

    A focusing procedure in the calibration process of image sensors based on Incoherent Optical Fiber Bundles (IOFBs) is described using the information extracted from fibers. The procedure differs from other currently known focusing methods because there is no spatial in-out correspondence between fibers, which produces a natural codification of the transmitted image. Measuring focus is essential prior to carrying out calibration in order to guarantee accurate processing and decoding. Four algorithms have been developed to estimate the focus measure: two based on mean grey level and two based on variance. In this paper, a few simple focus measures are defined and compared. Experimental results on the focus measure and the accuracy of the developed methods are discussed in order to demonstrate their effectiveness. PMID:22315526
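    As a toy illustration of the variance-based family of focus measures mentioned above (not the paper's exact algorithms, and using an ordinary image rather than fiber-extracted grey levels): defocus blurs the image toward its mean grey level, so a sharper image yields a larger grey-level variance.

```python
def focus_variance(image):
    """Variance of grey levels: defocus blurs the image and lowers contrast,
    so a better-focused image yields a larger variance."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

sharp   = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]            # high contrast
blurred = [[113, 141, 113], [141, 113, 141], [113, 141, 113]]  # low contrast

print(focus_variance(sharp) > focus_variance(blurred))  # True
```

    Scanning the lens position and keeping the maximum of such a measure approximates best focus.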

  17. Comparison of simple additive weighting (SAW) and composite performance index (CPI) methods in employee remuneration determination

    NASA Astrophysics Data System (ADS)

    Karlitasari, L.; Suhartini, D.; Benny

    2017-01-01

    The process of determining employee remuneration at PT Sepatu Mas Idaman currently relies on a Microsoft Excel spreadsheet containing the criterion values that must be calculated for every employee. This can introduce doubt during the assessment and makes the process take much longer. Remuneration is determined by an assessment team based on predetermined criteria, namely ability to work, human relations, job responsibility, discipline, creativity, work, achievement of targets, and absence. To make the determination more efficient and effective, the Simple Additive Weighting (SAW) method is used. SAW supports decision making for a given case by computing weighted sums of normalized criterion scores; the alternative with the greatest value is chosen as the best. A second method, the Composite Performance Index (CPI), which bases decisions on a performance index, was also applied; the SAW method was faster than CPI by 89-93%. It is therefore expected that this application can serve as evaluation material for employee training and development needs, so that employee performance becomes more optimal.
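    The SAW calculation described above can be sketched in a few lines of stdlib Python. This is a hedged illustration with invented scores and weights (the company's real criterion values are not given); benefit criteria are normalized by the column maximum, cost criteria (such as absences) by the column minimum.

```python
def saw_rank(scores, weights, benefit):
    """Simple Additive Weighting: normalize each criterion column, then take
    the weighted sum; the largest total marks the best alternative."""
    n_crit = len(weights)
    totals = []
    for row in scores:
        total = 0.0
        for j in range(n_crit):
            col = [r[j] for r in scores]
            if benefit[j]:                 # benefit criterion: higher is better
                norm = row[j] / max(col)
            else:                          # cost criterion: lower is better
                norm = min(col) / row[j]
            total += weights[j] * norm
        totals.append(total)
    return totals

# Hypothetical employees scored on: work ability, discipline, absences.
scores  = [[80, 90, 2], [95, 70, 1], [70, 85, 4]]
weights = [0.5, 0.3, 0.2]
benefit = [True, True, False]      # absences are a cost criterion
totals = saw_rank(scores, weights, benefit)
print(totals.index(max(totals)))   # 1: the second employee ranks best
```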

  18. Compilation of field methods used in geochemical prospecting by the U.S. Geological Survey

    USGS Publications Warehouse

    Lakin, Hubert William; Ward, Frederick Norville; Almond, Hy

    1952-01-01

    The field methods described in this report are those currently used in geochemical prospecting by the U. S. Geological Survey. Some have been published, others are being processed for publication, while others are still being investigated. The purpose in compiling these methods is to make them readily available in convenient form. The methods have not been thoroughly tested and none is wholly satisfactory. Research is being continued.

  19. Verbal autopsy: current practices and challenges.

    PubMed Central

    Soleman, Nadia; Chandramohan, Daniel; Shibuya, Kenji

    2006-01-01

    Cause-of-death data derived from verbal autopsy (VA) are increasingly used for health planning, priority setting, monitoring and evaluation in countries with incomplete or no vital registration systems. In some regions of the world it is the only method available to obtain estimates on the distribution of causes of death. Currently, the VA method is routinely used at over 35 sites, mainly in Africa and Asia. In this paper, we present an overview of the VA process and the results of a review of VA tools and operating procedures used at demographic surveillance sites and sample vital registration systems. We asked for information from 36 field sites about field-operating procedures and reviewed 18 verbal autopsy questionnaires and 10 cause-of-death lists used in 13 countries. The format and content of VA questionnaires, field-operating procedures, cause-of-death lists and the procedures to derive causes of death from VA process varied substantially among sites. We discuss the consequences of using varied methods and conclude that the VA tools and procedures must be standardized and reliable in order to make accurate national and international comparisons of VA data. We also highlight further steps needed in the development of a standard VA process. PMID:16583084

  20. A biologically relevant method for considering patterns of oceanic retention in the Southern Ocean

    NASA Astrophysics Data System (ADS)

    Mori, Mao; Corney, Stuart P.; Melbourne-Thomas, Jessica; Klocker, Andreas; Sumner, Michael; Constable, Andrew

    2017-12-01

    Many marine species have planktonic forms - either during a larval stage or throughout their lifecycle - that move passively or are strongly influenced by ocean currents. Understanding these patterns of movement is important for informing marine ecosystem management and for understanding ecological processes generally. Retention of biological particles in a particular area due to ocean currents has received less attention than transport pathways, particularly for the Southern Ocean. We present a method for modelling retention time, based on the half-life for particles in a particular region, that is relevant for biological processes. This method uses geostrophic velocities at the ocean surface, derived from 23 years of satellite altimetry data (1993-2016), to simulate the advection of passive particles during the Southern Hemisphere summer season (from December to March). We assess spatial patterns in the retention time of passive particles and evaluate the processes affecting these patterns for the Indian sector of the Southern Ocean. Our results indicate that the distribution of retention time is related to bathymetric features and the resulting ocean dynamics. Our analysis also reveals a moderate level of consistency between spatial patterns of retention time and observations of Antarctic krill (Euphausia superba) distribution.
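    The half-life notion used above can be illustrated with a hedged stdlib sketch (synthetic counts, not the study's altimetry-driven particle simulation): given the number of passive particles still inside a region at successive times, fit an exponential decay on log counts and report the half-life ln(2)/k.

```python
import math

def retention_half_life(times, n_remaining):
    """Fit N(t) = N0 * exp(-k t) by least squares on log counts and
    return the half-life ln(2)/k of particles remaining in the region."""
    ys = [math.log(n) for n in n_remaining]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(times, ys)) / \
            sum((x - mx) ** 2 for x in times)
    k = -slope
    return math.log(2) / k

# Synthetic counts of particles still inside a region (true half-life: 10 days).
times = [0, 5, 10, 15, 20]
counts = [1000 * 2 ** (-t / 10) for t in times]
print(round(retention_half_life(times, counts), 6))  # 10.0
```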

  1. Instrumentation in Developing Chlorophyll Fluorescence Biosensing: A Review

    PubMed Central

    Fernandez-Jaramillo, Arturo A.; Duarte-Galvan, Carlos; Contreras-Medina, Luis M.; Torres-Pacheco, Irineo; de J. Romero-Troncoso, Rene; Guevara-Gonzalez, Ramon G.; Millan-Almaraz, Jesus R.

    2012-01-01

    Chlorophyll fluorescence can be defined as the red and far-red light emitted by photosynthetic tissue when it is excited by a light source. This is an important phenomenon which permits investigators to obtain important information about the state of health of a photosynthetic sample. This article reviews the current state-of-the-art in the design of new chlorophyll fluorescence sensing systems, providing appropriate information about processes, instrumentation and electronic devices. Such systems and applications can be created to determine both the comfort conditions of and current problems within a given subject. The procedure to measure chlorophyll fluorescence is commonly split into two main parts: the first involves chlorophyll excitation, for which there are passive or active methods; the second is to closely measure the chlorophyll fluorescence response with specialized instrumentation systems. Such systems utilize several methods, each with different characteristics regarding cost, resolution, ease of processing and portability. These methods for the most part include cameras, photodiodes and satellite images. PMID:23112686

  2. Peptidomic Approach to Developing ELISAs for the Determination of Bovine and Porcine Processed Animal Proteins in Feed for Farmed Animals.

    PubMed

    Huet, Anne-Catherine; Charlier, Caroline; Deckers, Elise; Marbaix, Hélène; Raes, Martine; Mauro, Sergio; Delahaut, Philippe; Gillard, Nathalie

    2016-11-30

    The European Commission (EC) wants to reintroduce nonruminant processed animal proteins (PAPs) safely into the feed chain. This would involve replacing the current ban in feed with a species-to-species ban which, in the case of nonruminants, would only prohibit feeding them with proteins from the same species. To enforce such a provision, there is an urgent need for species-specific methods for detecting PAPs from several species in animal feed and in PAPs from other species. Currently, optical microscopy and the polymerase chain reaction are the officially accepted methods, but they have limitations, and alternative methods are needed. We have developed immunoassays using antibodies raised against targets which are not influenced by high temperature and pressure. These targets were identified in a previous study based on an experimental approach. One optimized competitive ELISA detects bovine PAPs at 2% in plant-derived feed. The detection capability demonstrated on blind samples shows a good correlation with mass spectrometry results.

  3. Improving operational anodising process performance using simulation approach

    NASA Astrophysics Data System (ADS)

    Liong, Choong-Yeun; Ghazali, Syarah Syahidah

    2015-10-01

    The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering sectors. The anodizing process is therefore important for aluminium, making it durable, attractive and weather resistant. This research is focused on anodizing operations in the manufacture and supply of aluminium extrusions. The data required for developing the model were collected from observations and interviews conducted in the study. To study the current system, the processes involved in anodizing are modeled using Arena 14.5 simulation software. They comprise five main processes, namely degreasing, etching, desmut, anodizing and sealing, together with 16 other processes. The results were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on comparisons between the improvement methods, productivity could be increased by reallocating workers and reducing loading time.

  4. Cathodic Protection Measurement Through Inline Inspection Technology Uses and Observations

    NASA Astrophysics Data System (ADS)

    Ferguson, Briana Ley

    This research supports the evaluation of an impressed current cathodic protection (CP) system of a buried coated steel pipeline through alternative technology and methods, via an inline inspection device (ILI, CP ILI tool, or tool), in order to prevent and mitigate external corrosion. This thesis investigates the ability to measure the current density of a pipeline's CP system from inside the pipeline rather than manually from outside, and then to convert that CP ILI tool reading into a pipe-to-soil potential as required by regulations and standards. This was demonstrated through a mathematical model that applies Ohm's law, circuit concepts, and attenuation principles to match the results of the ILI sample data by varying parameters of the model (i.e., values for overpotential and coating resistivity). This research had not been conducted previously to determine whether the protected potential range can be achieved with respect to the predicted current density from the CP ILI device. Kirchhoff's method was explored, but certain principles could not be used in the model because manual measurements would be required. The research was based on circuit concepts that indirectly reflect the underlying electrochemical processes. Through Ohm's law, the results show that a constant current density is possible in the protected potential range, which indicates polarization of the pipeline and leads to calcareous deposit development. Calcareous deposit is desirable in industry since it increases the resistance of the pipeline coating and lowers current, thus slowing the oxygen diffusion process. This research conveys that an alternative method for CP evaluation from inside the pipeline is possible, where the pipe-to-soil potential can be estimated (as required by regulations) from the ILI tool's current density measurement.
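    As a hedged illustration of the attenuation principle invoked above (not the thesis's actual model; `v_drain` and `alpha` are invented values), the classic first-order picture has the pipe-to-soil potential decaying exponentially with distance from the CP drain point:

```python
import math

def pipe_to_soil_potential(x_m, v_drain, alpha):
    """Classic attenuation: polarization decays exponentially with distance
    from the CP drain point; alpha (1/m) lumps coating and soil resistivity."""
    return v_drain * math.exp(-alpha * x_m)

# Hypothetical values: -1.2 V at the rectifier, alpha = 5e-5 per metre.
v = pipe_to_soil_potential(10_000, -1.2, 5e-5)
print(round(v, 3))  # potential 10 km away: -0.728
```

    In this invented example the potential 10 km out has decayed to about -0.73 V, illustrating how attenuation parameters control where along the line the protected range is maintained.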

  5. Tomographic methods in flow diagnostics

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    1993-01-01

    This report presents a viewpoint of tomography that should be well adapted to currently available optical measurement technology as well as the needs of computational and experimental fluid dynamicists. The goals are to record data with the fastest optical array sensors; to process the data with the fastest parallel processing technology available for small computers; and to generate results for both experimental and theoretical data. An in-depth example treats interferometric data as it might be recorded in an aeronautics test facility, but the results are applicable whenever fluid properties are to be measured or applied from projections of those properties. The paper discusses both computed and neural-net calibration tomography. The report also contains an overview of key definitions and computational methods, key references, computational problems such as ill-posedness, artifacts, and missing data, and some possible and current research topics.

  6. Frequency division multiplexed multi-color fluorescence microscope system

    NASA Astrophysics Data System (ADS)

    Le, Vu Nam; Yang, Huai Dong; Zhang, Si Chun; Zhang, Xin Rong; Jin, Guo Fan

    2017-10-01

    A grayscale camera can only record grey-scale images of an object, whereas multicolor imaging obtains the color information needed to distinguish sample structures that have the same shapes but different colors. In fluorescence microscopy, current multicolor imaging methods are flawed: they reduce the efficiency of fluorescence imaging, lower the effective sampling rate of the CCD, and so on. In this paper, we propose a novel multicolor fluorescence microscopy imaging method based on frequency division multiplexing (FDM), which modulates the excitation lights and demodulates the fluorescence signal in the frequency domain. The method uses periodic functions of different frequencies to modulate the amplitude of each excitation light and then combines the beams for illumination in a fluorescence microscopy imaging system. The system records a multicolor fluorescence image with a grayscale camera. During data processing, the signal obtained at each pixel of the camera is processed with a discrete Fourier transform, decomposed by color in the frequency domain, and then inverse transformed. After applying this process to the signals from all pixels, monochrome images of each color on the image plane are obtained and the multicolor image is recovered. Based on this method, we constructed a two-color fluorescence microscope with excitation wavelengths of 488 nm and 639 nm. Using this system to observe the linear movement of two kinds of fluorescent microspheres, we obtained, after data processing, a two-color fluorescence video consistent with the original image. This experiment shows that dynamic phenomena in multicolor fluorescent biological samples can be observed by this method. Compared with current methods, this method obtains the image signals of each color at the same time, and the color video's frame rate equals the camera's frame rate. The optical system is simpler and needs no extra color-separation element. In addition, the method effectively filters out ambient light or other signals that are not affected by the modulation process.
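    The per-pixel demodulation step can be sketched in stdlib Python (a toy with invented modulation frequencies, not the authors' processing chain): each excitation laser is amplitude-modulated at its own frequency, so a discrete Fourier transform of one pixel's time series recovers each color's contribution separately.

```python
import math

def dft_amplitude(signal, freq, rate):
    """Amplitude of the component at `freq` (Hz) in a real sampled signal."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / rate) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / rate) for i, s in enumerate(signal))
    return 2.0 * math.hypot(re, im) / n

rate, n = 1000, 1000           # 1 kHz sampling, 1 s record
f_green, f_red = 50, 120       # hypothetical modulation frequencies of the two lasers
# One pixel sees the sum of both modulated fluorescence channels:
pixel = [3.0 * math.sin(2 * math.pi * f_green * i / rate) +
         1.5 * math.sin(2 * math.pi * f_red * i / rate) for i in range(n)]

print(round(dft_amplitude(pixel, f_green, rate), 2))  # 3.0 (green channel)
print(round(dft_amplitude(pixel, f_red, rate), 2))    # 1.5 (red channel)
```

    Signals not modulated at either frequency (ambient light, drift) contribute nothing to these two frequency bins, which is the filtering property noted above.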

  7. Digital enhancement of X-rays for NDT

    NASA Technical Reports Server (NTRS)

    Butterfield, R. L.

    1980-01-01

    Report is a "cookbook" for digital processing of industrial X-rays. Computer techniques, previously used primarily in laboratory and developmental research, have been outlined and codified into step-by-step procedures for enhancing X-ray images. Those involved in nondestructive testing should find the report a valuable asset, particularly if visual inspection is the method currently used to process X-ray images.

  8. How to Implement an E-Learning Curriculum to Streamline Teaching Digital Image Processing

    ERIC Educational Resources Information Center

    Király, Sándor

    2016-01-01

    In the field of teaching, one of the interesting subjects is the research of the fact which didactic methods are good for learning the current curriculum for the students who show a wide range of age, interest, chosen courses, previous studies and motivation. This article introduces the facilities that support the learning process: the…

  9. Current research issues related to post-wildfire runoff and erosion processes

    Treesearch

    John A. Moody; Richard A. Shakesby; Peter R. Robichaud; Susan H. Cannon; Deborah A. Martin

    2013-01-01

    Research into post-wildfire effects began in the United States more than 70 years ago and only later extended to other parts of the world. Post-wildfire responses are typically transient, episodic, variable in space and time, dependent on thresholds, and involve multiple processes measured by different methods. These characteristics tend to hinder research progress, but...

  10. An Analysis and Allocation System for Library Collections Budgets: The Comprehensive Allocation Process (CAP)

    ERIC Educational Resources Information Center

    Lyons, Lucy Eleonore; Blosser, John

    2012-01-01

    The "Comprehensive Allocation Process" (CAP) is a reproducible decision-making structure for the allocation of new collections funds, for the reallocation of funds within stagnant budgets, and for budget cuts in the face of reduced funding levels. This system was designed to overcome common shortcomings of current methods. Its philosophical…

  11. A Level-set based framework for viscous simulation of particle-laden supersonic flows

    NASA Astrophysics Data System (ADS)

    Das, Pratik; Sen, Oishik; Jacobs, Gustaaf; Udaykumar, H. S.

    2017-06-01

    Particle-laden supersonic flows are important in natural and industrial processes such as volcanic eruptions, explosions, and pneumatic conveyance of particles in material processing. Numerical study of such high-speed particle-laden flows at the mesoscale calls for a numerical framework that allows simulation of supersonic flow around multiple moving solid objects. Only a few efforts have been made toward the development of numerical frameworks for viscous simulation of particle-fluid interaction in the supersonic flow regime. The current work presents a Cartesian-grid-based sharp-interface method for viscous simulation of the interaction between supersonic flow and moving rigid particles. The no-slip boundary condition is imposed at the solid-fluid interfaces using a modified ghost fluid method (GFM). The current method is validated against the similarity solution of the compressible boundary layer over a flat plate and benchmark numerical solutions for steady supersonic flow over a cylinder. Further validation is carried out against benchmark numerical results for shock-induced lift-off of a cylinder in a shock tube. A 3D simulation of steady supersonic flow over a sphere is performed to compare the numerically obtained drag coefficient with experimental results. A particle-resolved viscous simulation of shock interaction with a cloud of particles demonstrates that the current method is suitable for large-scale particle-resolved simulations of particle-laden supersonic flows.
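    The sharp-interface bookkeeping behind such a method can be illustrated with a hedged stdlib sketch (a toy, not the authors' solver): a signed-distance level set represents a circular particle on a Cartesian grid, and the "ghost" solid cells bordering fluid cells, where a GFM-style scheme would mirror fluid states to impose no-slip, are collected.

```python
import math

def signed_distance(x, y, cx, cy, r):
    """Level set for a circular particle: negative inside the solid, positive in fluid."""
    return math.hypot(x - cx, y - cy) - r

# 20x20 cell-centered Cartesian grid on the unit square; particle of radius 0.2.
nx = ny = 20
h = 1.0 / nx
phi = [[signed_distance((i + 0.5) * h, (j + 0.5) * h, 0.5, 0.5, 0.2)
        for j in range(ny)] for i in range(nx)]

# Ghost cells: solid cells (phi < 0) with at least one fluid neighbour (phi >= 0).
ghost = set()
for i in range(nx):
    for j in range(ny):
        if phi[i][j] < 0:
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < nx and 0 <= nj < ny and phi[ni][nj] >= 0:
                    ghost.add((i, j))

print(len(ghost) > 0)  # a ring of ghost cells hugs the interface
```

    Because the interface is carried by the level set, moving the particle only updates `phi`; no body-fitted remeshing is needed, which is what makes multiple moving particles tractable.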

  12. Method of Conjugate Radii for Solving Linear and Nonlinear Systems

    NASA Technical Reports Server (NTRS)

    Nachtsheim, Philip R.

    1999-01-01

    This paper describes a method to solve a system of N linear equations in N steps. A quadratic form is developed involving the sum of the squares of the residuals of the equations. Equating the quadratic form to a constant yields a surface which is an ellipsoid. For different constants, a family of similar ellipsoids can be generated. Starting at an arbitrary point an orthogonal basis is constructed and the center of the family of similar ellipsoids is found in this basis by a sequence of projections. The coordinates of the center in this basis are the solution of linear system of equations. A quadratic form in N variables requires N projections. That is, the current method is an exact method. It is shown that the sequence of projections is equivalent to a special case of the Gram-Schmidt orthogonalization process. The current method enjoys an advantage not shared by the classic Method of Conjugate Gradients. The current method can be extended to nonlinear systems without modification. For nonlinear equations the Method of Conjugate Gradients has to be augmented with a line-search procedure. Results for linear and nonlinear problems are presented.
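    Since the paper identifies its projection sequence with a special case of Gram-Schmidt orthogonalization, a standard stand-in can be sketched (hedged: this is ordinary QR by classical Gram-Schmidt with back-substitution, not necessarily Nachtsheim's exact formulation): a system of N equations is solved with N projections onto an orthonormal basis built from the columns of A.

```python
def solve_by_projections(A, b):
    """Solve A x = b via Gram-Schmidt QR: project b onto an orthonormal
    basis of A's columns (N projections for N unknowns), then back-substitute."""
    n = len(A)
    cols = [[A[i][j] for i in range(n)] for j in range(n)]
    Q, R = [], [[0.0] * n for _ in range(n)]
    for j, v in enumerate(cols):
        w = v[:]
        for k, q in enumerate(Q):
            R[k][j] = sum(q[i] * v[i] for i in range(n))   # projection coefficient
            w = [w[i] - R[k][j] * q[i] for i in range(n)]  # remove that component
        R[j][j] = sum(x * x for x in w) ** 0.5
        Q.append([x / R[j][j] for x in w])
    y = [sum(Q[j][i] * b[i] for i in range(n)) for j in range(n)]  # N projections of b
    x = [0.0] * n
    for j in reversed(range(n)):
        x[j] = (y[j] - sum(R[j][k] * x[k] for k in range(j + 1, n))) / R[j][j]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = solve_by_projections(A, b)
print([round(v, 6) for v in x])  # [0.090909, 0.636364], i.e. [1/11, 7/11]
```

    The exact-in-N-steps character the abstract describes is visible here: one orthogonalization-and-projection pass per unknown, with no iteration to convergence.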

  13. ICFA Beam Dynamics Newsletter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pikin, A.

    2017-11-21

    Electron beam ion source technology has made significant progress since 1968, when this method of producing highly charged ions in a potential trap within an electron beam was proposed by E. Donets. Better understanding of the physical processes in EBIS, technological advances and better simulation tools have driven significant progress in key EBIS parameters: electron beam current and current density, ion trap capacity, and attainable charge states. The scope of EBIS and EBIT applications has greatly increased. An attempt is made to compile some EBIS engineering problems and solutions and to demonstrate the present stage of understanding the processes and approaches to building a better EBIS.

  14. Consistent and efficient processing of ADCP streamflow measurements

    USGS Publications Warehouse

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements, independent of the ADCP manufacturer, are being developed in a software program that can process ADCP moving-boat discharge measurements regardless of the ADCP used to collect the data.
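    As a hedged illustration of the kind of computation such software standardizes (a toy mid-section-style sum with invented numbers, not the program's actual algorithms): the measured discharge is a sum of width x depth x velocity over ensembles, and the unmeasured bank zones get a shape-coefficient edge estimate.

```python
def measured_discharge(widths, depths, velocities):
    """Total discharge across the measured ensembles: q_i = w_i * d_i * v_i (m^3/s)."""
    return sum(w * d * v for w, d, v in zip(widths, depths, velocities))

def edge_discharge(edge_dist, edge_depth, edge_velocity, shape_coef=0.35):
    """Edge estimate for the unmeasured bank zones; about 0.35 is a commonly
    used triangular-edge coefficient."""
    return shape_coef * edge_velocity * edge_dist * edge_depth

# Five 2 m wide ensembles across a small channel (invented values).
q_mid = measured_discharge([2.0] * 5,
                           [1.0, 1.5, 2.0, 1.5, 1.0],
                           [0.4, 0.6, 0.8, 0.6, 0.4])
q_total = q_mid + edge_discharge(3.0, 0.8, 0.3) + edge_discharge(2.5, 0.6, 0.2)
print(round(q_mid, 2))    # 8.4 m^3/s measured
print(round(q_total, 3))  # 8.757 m^3/s including edges
```

    The manufacturer-to-manufacturer differences the abstract mentions live in exactly these choices: how depths are averaged, which edge coefficient is applied, and how invalid ensembles are filled in.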

  15. Improving healthcare using Lean processes.

    PubMed

    Baker, G Ross

    2014-01-01

    For more than a decade, healthcare organizations across Canada have been using Lean management tools to improve care processes, reduce preventable adverse events, increase patient satisfaction and create better work environments. The largest system-wide effort in Canada, and perhaps anywhere, is currently under way in Saskatchewan. The jury is still out on whether Lean efforts in that province, or elsewhere in Canada, are robust enough to transform current delivery systems and sustain new levels of performance. This issue of Healthcare Quarterly features several articles that provide a perspective on Lean methods in healthcare. Copyright © 2014 Longwoods Publishing.

  16. Metals processing control by counting molten metal droplets

    DOEpatents

    Schlienger, Eric; Robertson, Joanna M.; Melgaard, David; Shelmidine, Gregory J.; Van Den Avyle, James A.

    2000-01-01

    Apparatus and method for controlling metals processing (e.g., ESR) by melting a metal ingot and counting molten metal droplets during melting. An approximate amount of metal in each droplet is determined, and a melt rate is computed therefrom. The impedance of the melting circuit is monitored, such as by computing the root-mean-square voltage and current of the circuit and dividing the RMS voltage by the RMS current. The impedance signal is analyzed for a trace characteristic of the formation of a molten metal droplet, such as by examining skew rate, curvature, or a higher moment.
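    The monitoring idea can be sketched in stdlib Python (a toy: the windowing, threshold, and spike signature are invented, not the patent's actual analysis): compute a per-window RMS impedance trace, then count upward threshold crossings as droplet events.

```python
import math

def rms(samples):
    """Root mean square of a window of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def impedance_trace(v_windows, i_windows):
    """Impedance per sampling window: RMS voltage divided by RMS current."""
    return [rms(v) / rms(i) for v, i in zip(v_windows, i_windows)]

def count_droplets(trace, threshold):
    """Count upward threshold crossings: each brief impedance spike is taken
    as the signature of a molten droplet detaching from the electrode."""
    drops = 0
    for prev, cur in zip(trace, trace[1:]):
        if prev < threshold <= cur:
            drops += 1
    return drops

# Synthetic impedance trace: ~1.0 ohm baseline with three droplet spikes.
trace = [1.0, 1.01, 1.3, 1.0, 0.99, 1.25, 1.02, 1.0, 1.28, 1.0]
print(count_droplets(trace, 1.1))  # 3
```

    Multiplying the droplet count by an approximate droplet mass then gives the melt-rate estimate the abstract describes.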

  17. Current trends in guideline development: a cause for concern.

    PubMed

    Stephens, R G; Kogon, S L; Bohay, R N

    1996-02-01

    Although the development and use of practice-related guidelines as educational aids has a long history in the health professions, scientific assessment indicates that they have had limited success in changing practice patterns. This is principally due to the exclusion of practitioners from the development process, and the lack of a credible scientific basis for many guidelines. Past failures have led to new methods of guideline development based on a critical analysis of scientific data. These methods, which involve legitimate professional organizations at all stages of the development process, are clearly a step in the right direction. Unfortunately, there are signs that current guideline developers still fail to recognize the critical nature of the new methods or the need for an open and inclusive development process. It is even more disquieting that the objective of some guideline developers, such as licensing bodies, is the formulation of standards or review criteria, particularly when there are very few therapeutic practices with a sufficient scientific basis to justify such a designation. National and provincial societies, as well as dental educators, need to assume a leadership role to ensure that if guidelines are required, they will be developed as credible aids for the improvement of patient care. In this paper, the authors recount why the "traditional process" of guideline development resulted in guidelines that were mistrusted by the profession and, as a result, ineffective. They also outline the widely documented current methodology, which should be followed if guidelines are to be accepted by the profession. Finally, they discuss the critical issue of who should develop guidelines, and examine their role in dental practice and education.

  18. Mini-Ckpts: Surviving OS Failures in Persistent Memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiala, David; Mueller, Frank; Ferreira, Kurt Brian

    Concern is growing in the high-performance computing (HPC) community about the reliability of future extreme-scale systems. Current efforts have focused on application fault-tolerance rather than the operating system (OS), despite the fact that recent studies suggest failures in OS memory are more likely. The OS is critical to the correct and efficient operation of the node and the processes it governs -- and in HPC also for any other nodes a parallelized application runs on and communicates with: any single node failure generally forces all processes of the application to terminate because of tight communication in HPC. Therefore, the OS itself must be capable of tolerating failures. In this work, we introduce mini-ckpts, a framework which enables application survival despite the occurrence of a fatal OS failure or crash. Mini-ckpts achieves this tolerance by ensuring that the critical data describing a process is preserved in persistent memory prior to the failure. Following the failure, the OS is rejuvenated via a warm reboot and the application continues execution, effectively making the failure and restart transparent. The mini-ckpts rejuvenation and recovery process is measured to take between three and six seconds and has a failure-free overhead of 3-5% for a number of key HPC workloads. In contrast to current fault-tolerance methods, this work ensures that the operating and runtime system can continue in the presence of faults. This is a much finer-grained and dynamic method of fault-tolerance than the current, coarse-grained, application-centric methods. Handling faults at this level has the potential to greatly reduce overheads and enables mitigation of additional fault scenarios.

  19. Study of Ni-Mo electrodeposition in direct and pulse-reverse current

    NASA Astrophysics Data System (ADS)

    Stryuchkova, Yu M.; Rybin, N. B.; Suvorov, D. V.; Gololobov, G. P.; Tolstoguzov, A. B.; Tarabrin, D. Yu; Serpova, M. A.; Korotchenko, V. A.; Slivkin, E. V.

    2017-05-01

    The electrochemical deposition of a coating based on a binary nickel-molybdenum alloy onto a nickel substrate in pulse mode with current reversal has been investigated over a current density range of 2 to 9 A/dm2. The coating structure and its surface morphology have been studied, and energy-dispersive X-ray spectroscopy was used to determine the percentage ratio of alloy components in the coating. The mode that yields the densest and smoothest deposits under the conditions considered has been identified.

  20. Unified planar process for fabricating heterojunction bipolar transistors and buried-heterostructure lasers utilizing impurity-induced disordering

    NASA Astrophysics Data System (ADS)

    Thornton, R. L.; Mosby, W. J.; Chung, H. F.

    1988-12-01

    We describe results on a novel geometry of heterojunction bipolar transistor that has been realized by impurity-induced disordering. This structure is fabricated by a method that is compatible with techniques for the fabrication of low threshold current buried-heterostructure lasers. We have demonstrated this compatibility by fabricating a hybrid laser/transistor structure that operates as a laser with a threshold current of 6 mA at room temperature, and as a transistor with a current gain of 5.

  1. Mechanisms of Current Flow in the Diode Structure with an n+-p-Junction Formed by Thermal Diffusion of Phosphorus From Porous Silicon Film

    NASA Astrophysics Data System (ADS)

    Tregulov, V. V.; Litvinov, V. G.; Ermachikhin, A. V.

    2018-01-01

    Temperature dependences of the current-voltage characteristics of a photoelectric converter with an antireflective film of porous silicon and an n+-p-junction formed by thermal diffusion of phosphorus from a porous film are studied. The porous silicon film was saturated with phosphorus during its growth by an electrochemical method. It is shown that the current flow processes in the structure under study are significantly influenced by traps.

  2. Effects of obliquely opposing and following currents on wave propagation in a new 3D wave-current basin

    NASA Astrophysics Data System (ADS)

    Lieske, Mike; Schlurmann, Torsten

    2016-04-01

    INTRODUCTION & MOTIVATION The design of structures in coastal and offshore areas and their maintenance are key components of coastal protection. Usually, assessments of processes and loads on coastal structures are derived from experiments with flow and wave parameters in separate physical models. However, Peregrine (1976) already pointed out that in natural shallow coastal waters, flow and sea-state processes do not occur separately but influence each other nonlinearly. Kemp & Simons (1982) performed 2D laboratory tests and studied the interactions between a turbulent flow and following waves. They highlight the significance of wave-induced changes in the current properties, especially in the mean flow profiles, and draw attention to turbulent fluctuations and bottom shear stresses. Kemp & Simons (1983) also studied these processes and features with opposing waves. Studies on the wave-current interaction in three-dimensional space for a certain wave height, wave period and water depth were conducted by MacIver et al. (2006). The research focus here is set on the investigation of long-crested waves on obliquely opposing and following currents in the new 3D wave-current basin. METHODOLOGY In a first step, a flow analysis without waves was carried out, including measurements of flow profiles in the sweet spot of the basin at predefined measurement positions. Five measuring points in the water column were delineated at different water depths in order to obtain vertical flow profiles. To characterize the undisturbed flow properties in the basin, a uniformly distributed flow was generated in the wave basin. In a second step, a wave analysis without current, the unidirectional wave propagation and wave height were investigated for long-crested waves in intermediate wave conditions. In the sweet spot of the wave basin, waves with three different wave directions, three wave periods and uniform wave steepness were examined.
For evaluation, we applied a common 3D wave analysis method, the Bayesian Directional Spectrum method (BDM), presented by Hashimoto et al. (1988). Lastly, to identify the wave-current interaction, results from experiments with simultaneous waves and currents are compared with results for currents only and waves only, exemplifying the significance of nonlinear interaction processes. RESULTS The first results of the wave-current interaction show, as expected, a reduction in wave height in the direction of flow and an increase in wave height against the flow for unidirectional monochromatic waves. The superposition of current and orbital velocities cannot be treated linearly. Furthermore, the results show current domination for short wave periods and wave domination for longer wave periods. The criterion for current or wave domination will be presented. ACKNOWLEDGEMENT The support of the KFKI research project "Seegangsbelastungen (Seele)" (Contract No. 03KIS107) by the German Federal Ministry of Education and Research (BMBF) is gratefully acknowledged.

  3. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there exists no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  4. Flotation separation of waste plastics for recycling-A review.

    PubMed

    Wang, Chong-qing; Wang, Hui; Fu, Jian-gang; Liu, You-nian

    2015-07-01

    The sharp increase of plastic wastes results in great social and environmental pressures, and recycling, as an effective way currently available to reduce the negative impacts of plastic wastes, represents one of the most dynamic areas in the plastics industry today. Froth flotation is a promising method to solve the key problem of the recycling process, namely the separation of plastic mixtures. This review surveys recent literature on plastics flotation, focusing on specific features compared to ores flotation, strategies, methods and principles, flotation equipment, and current challenges. In terms of separation methods, plastics flotation is divided into gamma flotation, adsorption of reagents, surface modification and physical regulation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on the product quality; therefore, the largest possible processing window is required. Such parameters include, for example, the movement of the laser beam across the component in laser keyhole welding. It is therefore necessary to keep the formation of welding seams within specified limits. Today, the quality of laser welding processes is ensured by using post-process methods, like ultrasonic inspection, or special in-process methods. These in-process systems achieve only a simple evaluation which shows whether the weld seam is acceptable or not. Furthermore, in-process systems use no feedback for changing the control variables such as the speed of the laser or the adjustment of laser power. In this paper the research group presents current results in the research field of online monitoring, online controlling and model predictive control in laser welding processes to increase the product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state space model is ascertained, which includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for the model and integrated into an NI-Real-Time-System.
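    The control loop described above can be reduced to a toy sketch. This is a hedged, horizon-one stand-in for the paper's MPC: a scalar discrete state-space model of some weld characteristic, with an input chosen to minimize the predicted deviation from a setpoint subject to actuator limits. The model coefficients, limits, and setpoint are invented; the actual controller uses an identified multivariable model and a longer prediction horizon.

```python
# Hedged toy of predictive control on a state-space model. Horizon 1 with
# input clipping; all numbers are invented for illustration.

def predict(a, b, x, u):
    """One-step state-space prediction x[k+1] = a*x[k] + b*u[k]."""
    return a * x + b * u

def mpc_step(a, b, x, setpoint, u_min, u_max):
    """Horizon-1 MPC: unconstrained minimizer of (setpoint - x[k+1])^2,
    clipped to actuator limits (e.g., available laser power range)."""
    u = (setpoint - a * x) / b
    return max(u_min, min(u_max, u))

a, b = 0.9, 0.5          # invented plant coefficients
x, setpoint = 0.0, 2.0   # e.g., seam-width measure and its target
for _ in range(5):
    u = mpc_step(a, b, x, setpoint, -3.0, 3.0)
    x = predict(a, b, x, u)
print(round(x, 3))  # settles at the setpoint -> 2.0
```

    The actuator clipping is what makes even this degenerate horizon-1 case "predictive" rather than a plain inverse-model controller: the first step saturates at the limit and the loop recovers on subsequent steps.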

  6. Anaerobic digestion of food waste: A review focusing on process stability.

    PubMed

    Li, Lei; Peng, Xuya; Wang, Xiaoming; Wu, Di

    2018-01-01

    Food waste (FW) is rich in biomass energy, and increasing numbers of national programs are being established to recover energy from FW using anaerobic digestion (AD). However, process instability is a common operational issue for AD of FW. Process monitoring and control as well as microbial management can be used to control instability and increase the energy conversion efficiency of anaerobic digesters. Here, we review research progress related to these methods and identify existing limitations to efficient AD; recommendations for future research are also discussed. Process monitoring and control are suitable for evaluating the current operational status of digesters, whereas microbial management can facilitate early diagnosis and process optimization. Optimizing and combining these two methods are necessary to improve AD efficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Beyond Standard Molecular Dynamics: Investigating the Molecular Mechanisms of G Protein-Coupled Receptors with Enhanced Molecular Dynamics Methods

    PubMed Central

    Johnston, Jennifer M.

    2014-01-01

    The majority of biological processes mediated by G Protein-Coupled Receptors (GPCRs) take place on timescales that are not conveniently accessible to standard molecular dynamics (MD) approaches, notwithstanding the current availability of specialized parallel computer architectures, and efficient simulation algorithms. Enhanced MD-based methods have started to assume an important role in the study of the rugged energy landscape of GPCRs by providing mechanistic details of complex receptor processes such as ligand recognition, activation, and oligomerization. We provide here an overview of these methods in their most recent application to the field. PMID:24158803

  8. A Simple Method to Simultaneously Detect and Identify Spikes from Raw Extracellular Recordings.

    PubMed

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2015-01-01

    The ability to track when and which neurons fire in the vicinity of an electrode, in an efficient and reliable manner can revolutionize the neuroscience field. The current bottleneck lies in spike sorting algorithms; existing methods for detecting and discriminating the activity of multiple neurons rely on inefficient, multi-step processing of extracellular recordings. In this work, we show that a single-step processing of raw (unfiltered) extracellular signals is sufficient for both the detection and identification of active neurons, thus greatly simplifying and optimizing the spike sorting approach. The efficiency and reliability of our method is demonstrated in both real and simulated data.

  9. Multiscale modeling of localized resistive heating in nanocrystalline metals subjected to electropulsing

    NASA Astrophysics Data System (ADS)

    Zhao, Jingyi; Wang, G.-X.; Dong, Yalin; Ye, Chang

    2017-08-01

    Many electrically assisted processes have been reported to induce changes in microstructure and metal plasticity. Understanding the physics-based mechanisms behind these interesting phenomena, however, requires an understanding of the interaction between the electric current and the heterogeneous microstructure. In this work, multiscale modeling of the electric current flow in a nanocrystalline material is reported. The cellular automata method was used to track the nanoscale grain boundaries in the matrix. Maxwell's electromagnetic equations were solved to obtain the electrical potential distribution at the macro scale. Kirchhoff's circuit equation was solved to obtain the electric current flow at the micro/nano scale. The electric current distribution at two representative locations was investigated. A significant electric current concentration was observed near the grain boundaries, particularly near the triple junctions. This localized electric current leads to localized resistive heating near the grain boundaries. The electric current distribution can be used to obtain critical information such as the localized resistive heating rate and extra system free energy, which are critical for explaining many interesting phenomena, including microstructure evolution and plasticity enhancement in many electrically assisted processes.
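    The micro/nano-scale step, solving Kirchhoff's equations for current through a resistive microstructure, can be illustrated on a deliberately small network. This sketch is not the paper's model (which couples a cellular-automata grain map to macro-scale Maxwell solutions, where geometric constriction concentrates current at boundaries and triple junctions); it only shows the nodal-analysis machinery on an invented bridge network with one high-resistance "grain boundary" branch.

```python
# Hedged toy of the Kirchhoff nodal-analysis step. Network topology,
# conductances, and applied voltage are invented for illustration.

def solve_linear(a, b):
    """Gaussian elimination with partial pivoting (pure Python)."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# Bridge network: node 0 held at 1 V, node 3 grounded; unknowns v1, v2.
# Conductances (1/ohm): grain interior branches at 1.0, one grain-boundary
# branch (node 2 -> ground) 5x more resistive at 0.2.
g01, g02, g12, g13, g23 = 1.0, 1.0, 1.0, 1.0, 0.2
a = [[g01 + g12 + g13, -g12],
     [-g12, g02 + g12 + g23]]
b = [g01 * 1.0, g02 * 1.0]   # current injections from the 1 V node
v1, v2 = solve_linear(a, b)

i_interior = g13 * v1        # branch current through the interior path
i_boundary = g23 * v2        # branch current through the boundary path
print(round(i_interior, 4), round(i_boundary, 4))  # -> 0.5714 0.1429
```

    In this flattened toy, current simply redistributes toward the low-resistance path; reproducing the concentration effect reported above requires the full geometry-resolving grid.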

  10. Additive Manufacturing Technologies Used for Processing Polymers: Current Status and Potential Application in Prosthetic Dentistry.

    PubMed

    Revilla-León, Marta; Özcan, Mutlu

    2018-04-22

    There are 7 categories of additive manufacturing (AM) technologies, and a wide variety of materials can be used to build a CAD 3D object. The present article reviews the main AM processes for polymers for dental applications: stereolithography (SLA), digital light processing (DLP), material jetting (MJ), and material extrusion (ME). The manufacturing process, accuracy, and precision of these methods will be reviewed, as well as their prosthodontic applications. © 2018 by the American College of Prosthodontists.

  11. Signal processing in ultrasound. [for diagnostic medicine

    NASA Technical Reports Server (NTRS)

    Le Croissette, D. H.; Gammell, P. M.

    1978-01-01

    Signal is the term used to denote the characteristic in the time or frequency domain of the probing energy of the system. Processing of this signal in diagnostic ultrasound occurs as the signal travels through the ultrasonic and electrical sections of the apparatus. The paper discusses current signal processing methods, postreception processing, display devices, real-time imaging, and quantitative measurements in noninvasive cardiology. The possibility of using deconvolution in a single transducer system is examined, and some future developments using digital techniques are outlined.
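    The deconvolution possibility mentioned for a single-transducer system can be illustrated in the noise-free discrete case: the received signal is the convolution of the transmitted pulse with the tissue reflectivity, and can be inverted by recursive time-domain division. The pulse and reflectivity below are invented; a practical system would need regularization (e.g., Wiener filtering) to handle noise.

```python
# Hedged sketch: noise-free time-domain deconvolution of an ultrasound echo.

def convolve(pulse, refl):
    """Discrete convolution: what the transducer/tissue chain does."""
    out = [0.0] * (len(pulse) + len(refl) - 1)
    for i, p in enumerate(pulse):
        for j, r in enumerate(refl):
            out[i + j] += p * r
    return out

def deconvolve(received, pulse):
    """Recover the reflectivity by recursive division (requires pulse[0] != 0).
    Noise-free sketch only; real systems need regularization."""
    n = len(received) - len(pulse) + 1
    refl = []
    for i in range(n):
        acc = received[i]
        for k in range(1, len(pulse)):
            if 0 <= i - k < len(refl):
                acc -= pulse[k] * refl[i - k]
        refl.append(acc / pulse[0])
    return refl

pulse = [1.0, -0.5, 0.25]                      # invented transducer pulse
refl = [0.0, 0.0, 1.0, 0.0, 0.0, 0.5, 0.0]     # two tissue interfaces
received = convolve(pulse, refl)
recovered = deconvolve(received, pulse)
print(recovered)  # -> [0.0, 0.0, 1.0, 0.0, 0.0, 0.5, 0.0]
```

    Resolving the two closely spaced interfaces despite the ringing pulse is exactly the axial-resolution gain the abstract anticipates from deconvolution.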

  12. Characterization of transverse profiles

    DOT National Transportation Integrated Search

    2001-04-01

    A study of the transverse profile data currently being collected under the Long Term Pavement Performance project was undertaken. The data were collected by three processes: (1) Dipstick, (2) a photographic method, and (3) straightedge used to collec...

  13. From papers to practices: district level priority setting processes and criteria for family planning, maternal, newborn and child health interventions in Tanzania.

    PubMed

    Chitama, Dereck; Baltussen, Rob; Ketting, Evert; Kamazima, Switbert; Nswilla, Anna; Mujinja, Phares G M

    2011-10-21

    Successful priority setting is increasingly known to be an important aspect of achieving better family planning, maternal, newborn and child health (FMNCH) outcomes in developing countries. However, far too little attention has been paid to capturing and analysing the priority setting processes and criteria for FMNCH at district level. This paper seeks to capture and analyse the priority setting processes and criteria for FMNCH at district level in Tanzania. Specifically, we assess the FMNCH actors' engagement and understanding, the criteria used in decision making and the way criteria are identified, and the information or evidence and tools used to prioritize FMNCH interventions at district level in Tanzania. We conducted an exploratory study mixing both qualitative and quantitative methods to capture and analyse the priority setting for FMNCH at district level, and to identify the criteria for priority setting. We purposively sampled the participants to be included in the study. We collected the data using the nominal group technique (NGT), in-depth interviews (IDIs) with key informants and documentary review. We analysed the collected data using both content analysis for qualitative data and correlation analysis for quantitative data. We found a number of shortfalls in the districts' priority setting processes and criteria which may lead to inefficient and unfair priority setting decisions in FMNCH. In addition, participants identified the priority setting criteria and established the perceived relative importance of the identified criteria. However, differences exist in the relative importance attached to the criteria by different stakeholders in the districts. In Tanzania, FMNCH contents in both general development policies and sector policies are well articulated.
However, the current priority setting processes for FMNCH at district level are wanting in several respects, rendering priority setting for FMNCH inefficient and unfair (or unsuccessful). To improve the district level priority setting process for FMNCH interventions, we recommend a fundamental revision of the current process. The improvement strategy should employ rigorous research combining both normative and empirical methods to further analyze and correct past problems, while using good practices to improve the current priority setting process for FMNCH interventions. The suggested improvements may create room for an efficient and fair (or successful) priority setting process for FMNCH interventions.

  14. Electrical energy consumption control apparatuses and electrical energy consumption control methods

    DOEpatents

    Hammerstrom, Donald J.

    2012-09-04

    Electrical energy consumption control apparatuses and electrical energy consumption control methods are described. According to one aspect, an electrical energy consumption control apparatus includes processing circuitry configured to receive a signal which is indicative of current of electrical energy which is consumed by a plurality of loads at a site, to compare the signal which is indicative of current of electrical energy which is consumed by the plurality of loads at the site with a desired substantially sinusoidal waveform of current of electrical energy which is received at the site from an electrical power system, and to use the comparison to control an amount of the electrical energy which is consumed by at least one of the loads of the site.
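    One hedged reading of the comparison step in this claim can be sketched as follows: fit the best-amplitude sinusoid to the measured site current, measure the normalized residual, and curtail a controllable load when the distortion exceeds a threshold. The sample count, threshold, and clipping example are invented for illustration and are not the patent's actual circuitry or logic.

```python
import math

# Hedged sketch: compare measured current samples against a desired
# sinusoidal waveform and derive a curtailment decision from the mismatch.

def reference_wave(amplitude, n_samples):
    """One cycle of the desired sinusoidal current."""
    return [amplitude * math.sin(2 * math.pi * k / n_samples)
            for k in range(n_samples)]

def waveform_distortion(measured):
    """RMS deviation from a best-amplitude sinusoid, normalized by its RMS."""
    n = len(measured)
    s = [math.sin(2 * math.pi * k / n) for k in range(n)]
    # Least-squares amplitude of the fundamental (correlation with sin).
    amp = sum(m * x for m, x in zip(measured, s)) / sum(x * x for x in s)
    ref = [amp * x for x in s]
    err = math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, ref)) / n)
    ref_rms = math.sqrt(sum(r * r for r in ref) / n)
    return err / ref_rms

def should_curtail(measured, threshold=0.1):
    """Curtail a controllable load when distortion exceeds the threshold."""
    return waveform_distortion(measured) > threshold

clean = reference_wave(10.0, 64)
# Clipped waveform, e.g. a nonlinear load flattening the peaks.
clipped = [max(min(x, 7.0), -7.0) for x in clean]
print(should_curtail(clean), should_curtail(clipped))  # -> False True
```

    The single-cycle, single-frequency fit keeps the sketch short; a real implementation would track line frequency and phase before comparing.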

  15. Sequential Vapor Infiltration Treatment Enhances the Ionic Current Rectification Performance of Composite Membranes Based on Mesoporous Silica Confined in Anodic Alumina.

    PubMed

    Liang, Yanyan; Liu, Zhengping

    2016-12-20

    Ionic current rectification in nanofluidic diode membranes has been studied widely in recent years because it is analogous in principle to the functionality of biological ion channels. We report a new method to fabricate ionic current rectification membranes based on mesoporous silica confined in anodic aluminum oxide (AAO) membranes. Two types of mesostructured silica nanocomposites, a hexagonal structure and a nanoparticle-stacked structure, were used to asymmetrically fill nanochannels of AAO membranes by a vapor-phase synthesis (VPS) method with an aspiration approach and were further modified via sequential vapor infiltration (SVI) treatment. The ionic current measurements indicated that SVI treatment can modulate the asymmetric ionic transport in the prepared membranes, which exhibited a clear ionic current rectification phenomenon under optimal conditions. The ionic current rectifying behavior derives from the asymmetry of surface conformations, silica species components, and hydrophobic wettability, which are created by the asymmetrical filling type, silica depositions on the heterogeneous membranes, and the condensation of silanol groups. This article provides a viable strategy for fabricating composite membranes with obvious ionic current rectification performance via the cooperation of the VPS method and SVI treatment and opens up the potential of mesoporous silica confined in AAO membranes to mimic fluid transport in biological processes.

  16. Determination of Network Attributes from a High Resolution Terrain Data Base

    DTIC Science & Technology

    1987-09-01

    and existing models is in the method used to make decisions. All of the models reviewed when developing the ALARM strategy depended either on threshold...problems with the methods currently accepted and used to model the decision process. These methods are recognized because they have their uses...observation, detection, and lines of sight along a narrow strip of terrain relative to the overall size of the sectors of the two forces. Existing methods of

  17. Status and perspectives for the electron beam technology for flue gases treatment

    NASA Astrophysics Data System (ADS)

    Frank, Norman W.

    The electron-beam process is one of the most effective methods of removing SO2 and NOx from industrial flue gases. The treatment consists of adding a small amount of ammonia to the flue gas and irradiating the gas with an electron beam, thereby causing reactions which convert the SO2 and NOx to ammonium sulfate and ammonium sulfate-nitrate. These salts may then be collected from the flue gas by conventional collectors such as an electrostatic precipitator or baghouse. The process has numerous advantages over currently used conventional processes: (1) it simultaneously removes SO2 and NOx from flue gas at high efficiency levels; (2) it is a dry process which is easily controlled and has excellent load-following capability; (3) stack-gas reheat is not required; (4) the pollutants are converted into a saleable agricultural fertilizer; (5) the process has low capital and operating cost requirements. The history of the process is reviewed, along with a summary of the work presently underway. All of the current work is aimed at fine-tuning the process for commercial use. It is believed that with current testing and improvements, the process will be very competitive with existing processes and will find its place in an environmentally conscious world.

  18. The Development of Online Tutorial Program Design Using Problem-Based Learning in Open Distance Learning System

    ERIC Educational Resources Information Center

    Said, Asnah; Syarif, Edy

    2016-01-01

    This research aimed to evaluate the design of an online tutorial program applying problem-based learning to the Research Methods course currently implemented in the Open Distance Learning (ODL) system. The students must take a Research Methods course to prepare themselves for academic writing projects. Problem-based learning basically emphasizes the process of…

  19. Error minimization algorithm for comparative quantitative PCR analysis: Q-Anal.

    PubMed

    O'Connor, William; Runquist, Elizabeth A

    2008-07-01

    Current methods for comparative quantitative polymerase chain reaction (qPCR) analysis, the threshold and extrapolation methods, either make assumptions about PCR efficiency that require an arbitrary threshold selection process or extrapolate to estimate relative levels of messenger RNA (mRNA) transcripts. Here we describe an algorithm, Q-Anal, that blends elements from current methods to by-pass assumptions regarding PCR efficiency and improve the threshold selection process to minimize error in comparative qPCR analysis. This algorithm uses iterative linear regression to identify the exponential phase for both target and reference amplicons and then selects, by minimizing linear regression error, a fluorescence threshold where efficiencies for both amplicons have been defined. From this defined fluorescence threshold, cycle time (Ct) and the error for both amplicons are calculated and used to determine the expression ratio. Ratios in complementary DNA (cDNA) dilution assays from qPCR data were analyzed by the Q-Anal method and compared with the threshold method and an extrapolation method. Dilution ratios determined by the Q-Anal and threshold methods were 86 to 118% of the expected cDNA ratios, but relative errors for the Q-Anal method were 4 to 10% in comparison with 4 to 34% for the threshold method. In contrast, ratios determined by an extrapolation method were 32 to 242% of the expected cDNA ratios, with relative errors of 67 to 193%. Q-Anal will be a valuable and quick method for minimizing error in comparative qPCR analysis.
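    The two ingredients named above, iterative linear regression to locate the exponential phase and a threshold chosen where the fit is defined, can be sketched on synthetic data. This simplified sketch is not the published Q-Anal algorithm: the sliding-window search, window width, synthetic amplification curve, and threshold choice are invented stand-ins.

```python
import math

# Hedged sketch: locate the exponential phase of a qPCR curve by minimizing
# linear-regression error on log-fluorescence, then read off efficiency and Ct.

def linfit(xs, ys):
    """Least-squares line fit; returns slope, intercept, sum squared error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    sse = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    return slope, intercept, sse

def exponential_phase(fluor, width=6):
    """Slide a window over log-fluorescence; keep the positive-slope fit
    with the smallest residual (the exponential phase)."""
    logs = [math.log10(f) for f in fluor]
    best = None
    for i in range(len(logs) - width + 1):
        xs = list(range(i, i + width))
        slope, intercept, sse = linfit(xs, logs[i:i + width])
        if slope > 0 and (best is None or sse < best[2]):
            best = (slope, intercept, sse)
    return best

def ct_at(threshold, slope, intercept):
    """Cycle at which the fitted exponential crosses the threshold."""
    return (math.log10(threshold) - intercept) / slope

# Synthetic amplicon: doubling each cycle from 1e-3, plateauing at 10.
fluor = [min(1e-3 * 2 ** c, 10.0) for c in range(30)]
slope, intercept, _ = exponential_phase(fluor)
eff = 10 ** slope - 1          # per-cycle efficiency; 1.0 = perfect doubling
print(round(eff, 3))           # -> 1.0
print(round(ct_at(1.0, slope, intercept), 2))
```

    In a comparative analysis, the same fit would be run for target and reference amplicons and a shared threshold picked where both fits are defined, so the two Ct values and efficiencies combine into an expression ratio without extrapolation.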

  20. Nondestructive detection of an undesirable metallic phase, T1, during processing of aluminum-lithium alloys

    DOEpatents

    Buck, Otto; Bracci, David J.; Jiles, David C.; Brasche, Lisa J. H.; Shield, Jeffrey E.; Chumbley, Leonard S.

    1990-08-07

    A method is disclosed for detecting the T1 phase in aluminum-lithium alloys through simultaneous measurement of conductivity and hardness. When eddy current is employed to measure conductivity, the presence of the T1 phase may be detected when the measured conductivity decreases with aging of the alloy while the hardness of the material continues to increase.
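    The detection rule reduces to a simultaneous-trend test on the two measurement series. A hedged sketch, with invented aging-sequence values and units:

```python
# Hedged sketch of the patent's detection rule: flag the T1 phase when
# eddy-current conductivity falls with aging while hardness keeps rising.
# The series below (illustrative %IACS and HV values) are invented.

def t1_phase_detected(conductivity, hardness):
    """True if conductivity strictly decreases while hardness strictly
    increases over the same aging steps."""
    cond_falling = all(a > b for a, b in zip(conductivity, conductivity[1:]))
    hard_rising = all(a < b for a, b in zip(hardness, hardness[1:]))
    return cond_falling and hard_rising

print(t1_phase_detected([31.0, 30.2, 29.5], [140, 152, 161]))  # -> True
print(t1_phase_detected([29.5, 30.2, 31.0], [140, 152, 161]))  # -> False
```

    Requiring both trends at once is what makes the check specific: hardness alone rises during normal precipitation hardening, and conductivity alone can drift for other reasons.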

  1. Superpersistent Currents in Dirac Fermion Systems

    DTIC Science & Technology

    2017-03-06

    development of quantum mechanics, but also to quantum information processing and computing. Exploiting various physical systems to realize two-level...Here, using the QSD method, we calculated the dynamical trajectories of the system in the quantum regime. Our computations extending to the long time...currents in 2D Dirac material systems and pertinent phenomena in the emerging field of relativistic quantum nonlinear dynamics and chaos. Systematic

  2. On simulation of local fluxes in molecular junctions

    NASA Astrophysics Data System (ADS)

    Cabra, Gabriel; Jensen, Anders; Galperin, Michael

    2018-05-01

    We present a pedagogical review of the current density simulation in molecular junction models indicating its advantages and deficiencies in analysis of local junction transport characteristics. In particular, we argue that current density is a universal tool which provides more information than traditionally simulated bond currents, especially when discussing inelastic processes. However, current density simulations are sensitive to the choice of basis and electronic structure method. We note that while discussing the local current conservation in junctions, one has to account for the source term caused by the open character of the system and intra-molecular interactions. Our considerations are illustrated with numerical simulations of a benzenedithiol molecular junction.

  3. Optimization of Advanced ACTPol Transition Edge Sensor Bolometer Operation Using R(T,I) Transition Measurements

    NASA Astrophysics Data System (ADS)

    Salatino, Maria

    2017-06-01

    In current submillimeter and millimeter cosmology experiments, the focal planes are populated by kilopixel arrays of transition edge sensors (TESes). Varying incoming power loads require frequent rebiasing of the TESes through standard current-voltage (IV) acquisition. The time required to perform IVs on such large arrays, and the resulting transient heating of the bath, reduces sky observation time. We explore a bias step method that significantly reduces the time required for the rebiasing process. This exploits the detectors' responses to the injection of a small square wave signal on top of the dc bias current, together with knowledge of the shape of the detector transition R(T,I). This method has been tested on two detector arrays of the Atacama Cosmology Telescope (ACT). In this paper, we focus on the first step of the method, the estimate of the TES %Rn.

  4. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate the use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until it is deemed bug-free, rather than being proven to possess certain properties. This approach has worked well in most cases but, unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics, such as set theory, automata theory, and formal logic, as opposed to continuous mathematics like calculus) together with software synthesis, it is possible to reduce this risk by proving that certain software properties hold. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle, resulting in a more streamlined and accurate development process.

  5. A new theoretical approach to analyze complex processes in cytoskeleton proteins.

    PubMed

    Li, Xin; Kolomeisky, Anatoly B

    2014-03-20

    Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions driven by hydrolysis reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins, but the description is only qualitative under biologically relevant conditions because the mean-field models they employ neglect correlations. We develop a new theoretical method for describing dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on analyzing the probabilities of different clusters of subunits, which allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, we show that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.

  6. An image-processing methodology for extracting bloodstain pattern features.

    PubMed

    Arthur, Ravishka M; Humburg, Philomena J; Hoogenboom, Jerry; Baiker, Martin; Taylor, Michael C; de Bruin, Karla G

    2017-08-01

    There is a growing trend in forensic science to develop methods to make forensic pattern comparison tasks more objective. This has generally involved the application of suitable image-processing methods to provide numerical data for identification or comparison. This paper outlines a unique image-processing methodology that can be utilised by analysts to generate reliable pattern data that will assist them in forming objective conclusions about a pattern. A range of features were defined and extracted from a laboratory-generated impact spatter pattern. These features were based in part on bloodstain properties commonly used in the analysis of spatter bloodstain patterns. The values of these features were consistent with properties reported qualitatively for such patterns. The image-processing method developed shows considerable promise as a way to establish measurable discriminating pattern criteria that are lacking in current bloodstain pattern taxonomies. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. A Novel Calibration-Minimum Method for Prediction of Mole Fraction in Non-Ideal Mixture.

    PubMed

    Shibayama, Shojiro; Kaneko, Hiromasa; Funatsu, Kimito

    2017-04-01

    This article proposes a novel concentration prediction model that requires little training data and is useful for rapid process understanding. Process analytical technology is currently popular, especially in the pharmaceutical industry, for enhancing process understanding and process control. A calibration-free method, iterative optimization technology (IOT), was previously proposed for predicting pure-component concentrations, because calibration methods such as partial least squares require a large number of training samples, leading to high costs. However, IOT cannot be applied to concentration prediction in non-ideal mixtures because its basic equation is derived from the Beer-Lambert law, which does not hold for such mixtures. We propose a method that predicts pure-component concentrations in mixtures from a small number of training samples, assuming that spectral changes arising from molecular interactions can be expressed as a function of concentration. The proposed method is named IOT with virtual molecular interaction spectra (IOT-VIS) because it takes the spectral change into account as a virtual spectrum x_nonlin,i. Two case studies confirmed that the predictive accuracy of IOT-VIS was the highest among the existing IOT methods.
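    For background, the Beer-Lambert assumption underlying IOT amounts to a linear mixing model: a mixture spectrum is a weighted sum of pure-component spectra. A minimal classical least-squares sketch on synthetic Gaussian spectra (our illustration; IOT-VIS itself goes further by adding concentration-dependent virtual interaction spectra):

```python
import numpy as np

# Background sketch (synthetic Gaussian "pure spectra", our illustration):
# under the Beer-Lambert law a mixture spectrum is a linear blend of pure-
# component spectra, so mole fractions follow from classical least squares.
def predict_fractions(mixture, pure_spectra):
    """Solve mixture ~= pure_spectra @ c for nonnegative fractions summing to 1."""
    c, *_ = np.linalg.lstsq(pure_spectra, mixture, rcond=None)
    c = np.clip(c, 0.0, None)
    return c / c.sum()

wavelengths = np.linspace(0.0, 1.0, 200)
pure = np.stack([np.exp(-((wavelengths - mu) / 0.05) ** 2) for mu in (0.3, 0.7)], axis=1)
true_c = np.array([0.25, 0.75])
mix = pure @ true_c                 # ideal (Beer-Lambert) mixture spectrum
pred = predict_fractions(mix, pure)
```

    In a non-ideal mixture the measured spectrum deviates from `pure @ c`, which is exactly the regime where this linear model, and plain IOT, breaks down.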

  8. Method for measuring the alternating current half-wave voltage of a Mach-Zehnder modulator based on opto-electronic oscillation.

    PubMed

    Hong, Jun; Chen, Dongchu; Peng, Zhiqiang; Li, Zulin; Liu, Haibo; Guo, Jian

    2018-05-01

    A new method for measuring the alternating current (AC) half-wave voltage of a Mach-Zehnder modulator is proposed and verified experimentally in this paper. Based on opto-electronic self-oscillation, the physical relationship between the saturation output power of the oscillating signal and the AC half-wave voltage is derived, and the AC half-wave voltage is obtained by measuring that saturation output power. The experimental results show that measurements made with the new method agree with those of a traditional method, while requiring neither an external microwave signal source nor calibration at different measurement frequencies. The measuring process is thus simplified without sacrificing accuracy, which gives the method good practical value.

  9. Statistical Comparisons of Meso- and Small-Scale Field-Aligned Currents with Auroral Electron Acceleration Mechanisms from FAST Observations

    NASA Astrophysics Data System (ADS)

    Dombeck, J. P.; Cattell, C. A.; Prasad, N.; Sakher, A.; Hanson, E.; McFadden, J. P.; Strangeway, R. J.

    2016-12-01

    Field-aligned currents (FACs) provide a fundamental driver and means of magnetosphere-ionosphere (M-I) coupling. These currents must be supported by local physics along the entire field line, generally by quasi-static potential structures, while the time evolution of those structures and currents also produces Alfvén waves and Alfvénic electron acceleration. In regions of upward current, precipitating auroral electrons are accelerated earthward. These processes can result in ion outflow and changes in ionospheric conductivity, and they affect the particle distributions along the field line, thereby influencing the M-I coupling processes that support the individual FACs and potentially the entire FAC system. The FAST mission was well suited to studying both FACs and auroral electron acceleration processes. We present comparisons between meso- and small-scale FACs determined from FAST using the method of Peria et al. (2000) and our FAST auroral acceleration mechanism study, where such identification is possible, over the entire ~13-year FAST mission. We also present the latest results on the ionospheric input of electron energy (and number) flux as a function of acceleration mechanism (and FAC characteristics) from our FAST auroral acceleration mechanism study.

  10. Effects of electrode settings on chlorine generation efficiency of electrolyzing seawater.

    PubMed

    Hsu, Guoo-Shyng Wang; Hsia, Chih-Wei; Hsu, Shun-Yao

    2015-12-01

    Electrolyzed water has significant disinfection effects, can comply with food safety regulations, and is environmentally friendly. We investigated the effects of electrode immersion depth, stirring, electrode size, and electrode gap on the properties and chlorine generation efficiency of electrolyzed seawater and on its storage stability. Results indicated that the temperature and oxidation-reduction potential (ORP) of the seawater increased gradually during electrolysis, whereas electrical conductivity decreased steadily. During the electrolysis process, pH values and electric currents also decreased slightly within small ranges. Additional stirring, or immersing the electrodes deeper in the seawater, significantly increased current density without affecting electric efficiency or current efficiency. Decreasing the electrode size or increasing the electrode gap decreased chlorine production and the electric current of the process, again without affecting electric efficiency or current efficiency. Less than 35% of the chlorine in the electrolyzed seawater was lost over a 3-week storage period, and the decline leveled off after the 1st week of storage. The electrolyzing system is a convenient and economical method for producing high-chlorine seawater, which has high potential for applications in agriculture, aquaculture, and food processing. Copyright © 2015. Published by Elsevier B.V.
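    For reference, the current-efficiency figure of merit used in such studies can be computed from Faraday's law. This is a generic textbook calculation with hypothetical numbers, not data from the paper:

```python
# Generic textbook calculation (hypothetical numbers, not data from the paper):
# theoretical chlorine yield from Faraday's law and the resulting current efficiency.
F = 96485.0      # Faraday constant, C/mol
M_CL2 = 70.906   # molar mass of Cl2, g/mol
Z = 2            # electrons transferred per Cl2 molecule

def theoretical_cl2_g(current_a, time_s):
    """Maximum chlorine mass (g) producible by a given charge."""
    return current_a * time_s * M_CL2 / (Z * F)

def current_efficiency(measured_g, current_a, time_s):
    """Fraction of the charge that actually went into chlorine generation."""
    return measured_g / theoretical_cl2_g(current_a, time_s)
```

    For example, 2 A passed for one hour can yield at most about 2.65 g of Cl2; a measured yield below that corresponds to a current efficiency below 1.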

  11. Eddy current characterization of magnetic treatment of nickel 200

    NASA Technical Reports Server (NTRS)

    Chern, E. J.

    1993-01-01

    Eddy current methods have been applied to characterize the effect of magnetic treatments intended to extend component service life. Coil impedance measurements were acquired and analyzed on nickel 200 specimens that had been subjected to a range of mechanical and magnetic engineering processes: annealing, applied strain, magnetic field, shot peening, and magnetic field after peening. Experimental results demonstrated a functional relationship between coil impedance (resistance and reactance) and the process each specimen had undergone, and showed that magnetic treatment does induce changes in the electromagnetic properties of nickel 200 consistent with stress relief. However, further fundamental studies are necessary for a thorough understanding of the exact mechanism by which magnetic field processing affects machine-tool service life.

  12. Classification of ligand molecules in PDB with graph match-based structural superposition.

    PubMed

    Shionyu-Mitsuyama, Clara; Hijikata, Atsushi; Tsuji, Toshiyuki; Shirai, Tsuyoshi

    2016-12-01

    The fast heuristic graph-match algorithm for small molecules, COMPLIG, was improved by adding a structural superposition step to verify the atom-atom matching. The modified method was used to classify the small-molecule ligands in the Protein Data Bank (PDB) by their three-dimensional structures: 16,660 types of ligands in the PDB were classified into 7561 clusters, whereas a classification by the previous method (without structural superposition) generated 3371 clusters from the same ligand set. The characteristic feature of the current classification is the increased number of singleton clusters, which contain only one ligand molecule each. Inspection of singletons present in the current classification but not in the previous one implied that the major factors for their isolation were differences in chirality, cyclic conformations, separation of substructures, and bond length. Comparisons between the current and previous classifications revealed that the superposition-based classification was effective in clustering functionally related ligands, such as drugs targeted to specific biological processes, owing to the strictness of the atom-atom matching.
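    A hedged sketch of the kind of superposition check such a verification could use (the standard Kabsch algorithm on matched atom coordinates; COMPLIG's actual implementation may differ):

```python
import numpy as np

# Hedged sketch of a structural-superposition check (standard Kabsch algorithm
# on matched atom coordinates; COMPLIG's actual implementation may differ).
def kabsch_rmsd(P, Q):
    """RMSD of (N,3) point sets P, Q after optimal rigid superposition."""
    P = P - P.mean(axis=0)                  # remove translation
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)       # optimal rotation from the SVD
    d = np.sign(np.linalg.det(V @ Wt))      # guard against improper rotation
    R = V @ np.diag([1.0, 1.0, d]) @ Wt
    diff = P @ R - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))
```

    Rotated copies superpose to RMSD near zero, while a mirror image (a chirality flip) generally cannot, consistent with chirality differences isolating singletons in the new classification.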

  13. Eddy current testing for blade edge micro cracks of aircraft engine

    NASA Astrophysics Data System (ADS)

    Zhang, Wei-min; Xu, Min-dong; Gao, Xuan-yi; Jin, Xin; Qin, Feng

    2017-10-01

    To address the low efficiency of micro-crack detection in aircraft engine blades, a differential-excitation eddy current testing system was designed and developed. The function and working principle of the system are described, including the fabrication of simulated cracks, signal generation, signal processing, and signal display. A detection test was carried out using an aircraft engine blade with simulated cracks as the test specimen. The test data were digitally low-pass filtered in the computer, and the crack signals were displayed both in the time domain and as Lissajous figures. Comparison of the results verified that the Lissajous-figure display outperforms the time-domain display when the crack angle is small. The results show that the eddy current testing system designed in this paper is feasible for detecting micro cracks on aero-engine blades and can effectively improve detection efficiency in practical inspection work.
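    A minimal sketch of the signal-processing chain described, on synthetic signals rather than blade data: a moving-average low-pass filter applied to two demodulated channels, which are then paired into a Lissajous (X-Y) trace for display.

```python
import numpy as np

# Synthetic sketch (not blade data): a simple moving-average low-pass filter on
# two demodulated eddy-current channels, paired into a Lissajous (X-Y) trace.
def low_pass(signal, window=15):
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

t = np.linspace(0.0, 1.0, 2000)
noise = 0.3 * np.random.default_rng(0).normal(size=t.size)
x = np.sin(2 * np.pi * 5 * t) + noise                    # noisy in-phase channel
y = np.sin(2 * np.pi * 5 * t + np.pi / 4)                # phase-shifted quadrature channel
lissajous = np.column_stack([low_pass(x), low_pass(y)])  # plot columns as X vs Y
```

    Plotting the two filtered columns against each other gives the elliptical Lissajous trace whose shape and orientation carry the phase information that makes small-angle cracks easier to distinguish than in a time-domain view.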

  14. Machine Learning for Discriminating Quantum Measurement Trajectories and Improving Readout.

    PubMed

    Magesan, Easwar; Gambetta, Jay M; Córcoles, A D; Chow, Jerry M

    2015-05-22

    Current methods for classifying measurement trajectories in superconducting qubit systems produce fidelities systematically lower than those predicted by experimental parameters. Here, we place current classification methods within the framework of machine learning (ML) algorithms and improve on them by investigating more sophisticated ML approaches. We find that nonlinear algorithms and clustering methods produce significantly higher assignment fidelities that help close the gap to the fidelity possible under ideal noise conditions. Clustering methods group trajectories into natural subsets within the data, which allows for the diagnosis of systematic errors. We find large clusters in the data associated with T1 processes and show these are the main source of discrepancy between our experimental and ideal fidelities. These error diagnosis techniques help provide a path forward to improve qubit measurements.
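    As an illustrative sketch on simulated two-state readout data (not the experimental trajectories), a minimal k-means clustering of integrated I/Q points shows the kind of cluster-based discrimination described:

```python
import numpy as np

# Illustrative sketch on simulated two-state readout data (not the experimental
# trajectories): a minimal k-means clustering of integrated I/Q points, the kind
# of cluster analysis used to group trajectories and expose systematic errors.
def kmeans(X, k=2, iters=50):
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()  # spread-out init
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

rng = np.random.default_rng(1)
ground = rng.normal([0.0, 0.0], 0.1, (200, 2))   # simulated |0> I/Q outcomes
excited = rng.normal([1.0, 1.0], 0.1, (200, 2))  # simulated |1> I/Q outcomes
X = np.vstack([ground, excited])
labels, centers = kmeans(X)
```

    In real data, points that fall into unexpected clusters, such as trajectories that decay mid-measurement via T1 processes, show up as additional groups rather than being silently misassigned.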

  15. Is transcranial direct current stimulation a potential method for improving response inhibition?

    PubMed Central

    Kwon, Yong Hyun; Kwon, Jung Won

    2013-01-01

    Inhibitory control of movement in motor learning requires the ability to suppress an inappropriate action, a skill needed to stop a planned or ongoing motor response in response to changes in a variety of environments. This study used a stop-signal task to determine whether transcranial direct-current stimulation (tDCS) over the pre-supplementary motor area alters reaction time in motor inhibition. Forty healthy subjects were recruited and randomly assigned to either a tDCS condition or a sham-tDCS condition. All subjects consecutively performed the stop-signal task before, during, and after the delivery of anodal tDCS over the pre-supplementary motor area (pre-tDCS, tDCS, and post-tDCS phases). Compared with the sham condition, stop-signal processing times were significantly reduced during and after tDCS, and the change times were significantly greater in the tDCS condition. There was no significant change in go processing times during or after stimulation in either condition. Anodal tDCS was thus associated with an improvement in inhibitory control, reflected in a decrease in the stop-signal processing time required for appropriate responses between motor execution and inhibition. However, tDCS had no effect on the no-signal reaction time during the stop-signal task. Transcranial direct current stimulation can adjust certain behaviors, and it could be a useful clinical intervention for patients who have difficulties with response inhibition. PMID:25206399

  16. Is transcranial direct current stimulation a potential method for improving response inhibition?

    PubMed

    Kwon, Yong Hyun; Kwon, Jung Won

    2013-04-15

    Inhibitory control of movement in motor learning requires the ability to suppress an inappropriate action, a skill needed to stop a planned or ongoing motor response in response to changes in a variety of environments. This study used a stop-signal task to determine whether transcranial direct-current stimulation (tDCS) over the pre-supplementary motor area alters reaction time in motor inhibition. Forty healthy subjects were recruited and randomly assigned to either a tDCS condition or a sham-tDCS condition. All subjects consecutively performed the stop-signal task before, during, and after the delivery of anodal tDCS over the pre-supplementary motor area (pre-tDCS, tDCS, and post-tDCS phases). Compared with the sham condition, stop-signal processing times were significantly reduced during and after tDCS, and the change times were significantly greater in the tDCS condition. There was no significant change in go processing times during or after stimulation in either condition. Anodal tDCS was thus associated with an improvement in inhibitory control, reflected in a decrease in the stop-signal processing time required for appropriate responses between motor execution and inhibition. However, tDCS had no effect on the no-signal reaction time during the stop-signal task. Transcranial direct current stimulation can adjust certain behaviors, and it could be a useful clinical intervention for patients who have difficulties with response inhibition.
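    For context, stop-signal performance is usually summarized by the stop-signal reaction time (SSRT); a common estimator is the integration method, sketched here on simulated data (the values are illustrative, not the study's measurements):

```python
import numpy as np

# Sketch of the standard "integration method" for estimating stop-signal
# reaction time (SSRT), the usual summary measure for stop-signal tasks.
# Data are simulated; the study reports its own measured values.
def ssrt_integration(go_rts, ssds, p_respond):
    """SSRT = p_respond-th quantile of the go-RT distribution minus mean SSD."""
    nth_rt = np.percentile(go_rts, 100.0 * p_respond)
    return nth_rt - np.mean(ssds)

rng = np.random.default_rng(0)
go_rts = rng.normal(450.0, 60.0, 500)   # simulated go reaction times, ms
ssds = np.array([200.0, 250.0, 300.0])  # staircased stop-signal delays, ms
ssrt = ssrt_integration(go_rts, ssds, p_respond=0.5)
```

    With a 50% respond rate the estimate is simply the median go RT minus the mean stop-signal delay, here roughly 200 ms; a shortened SSRT is what a reduction in stop-signal processing time would look like in this measure.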

  17. The pyroelectric behavior of lead free ferroelectric ceramics in thermally stimulated depolarization current measurements

    NASA Astrophysics Data System (ADS)

    González-Abreu, Y.; Peláiz-Barranco, A.; Garcia-Wong, A. C.; Guerra, J. D. S.

    2012-06-01

    The present paper presents a detailed analysis of the thermally stimulated processes in barium-modified SrBi2Nb2O9, a ferroelectric bi-layered perovskite that is among the most promising candidates for non-volatile random access memory applications because of its excellent fatigue-resistant properties. A numerical method is used to separate the true pyroelectric current from the other thermally stimulated contributions. A discharge due to the space charge injected during poling, the pyroelectric response, and a conductive process are discussed over a wide temperature range spanning the ferroelectric and paraelectric phases. The pyroelectric response is separated from the other components to evaluate the polarization behavior and several pyroelectric parameters: the remanent polarization, the pyroelectric coefficient, and the figure of merit are evaluated, with good results.

  18. Development and Application of Eddy Current Sensor Arrays for Process Integrated Inspection of Carbon Fibre Preforms.

    PubMed

    Berger, Dietrich; Lanza, Gisela

    2017-12-21

    This publication presents the realisation of a sensor concept, based on eddy current testing, for detecting textile defects during the preforming of semi-finished carbon fibre parts. The system has the potential to enable 100% inspection of manufactured carbon-fibre-based components, allowing the immediate exclusion of defective parts from further process steps. The core innovation of the system is its high degree of process integration, which has not previously been implemented in the state of the art. The publication presents the functional principle of the sensor, which is based on half-transmission probes, as well as the signals gained from its application. Furthermore, a method for determining the optimum sensor resolution is presented, together with the sensor housing and its integration into the preforming process.

  19. An overview of thixoforming process

    NASA Astrophysics Data System (ADS)

    Husain, N. H.; Ahmad, A. H.; Rashidi, M. M.

    2017-10-01

    Thixoforming is a forming process that exploits the rheological behaviour of metals between their solidus and liquidus temperatures. Much current thixoforming research focuses on the raw material used, aiming to produce components with superior mechanical properties and excellent formability, especially for the automotive industry. The thixoforming process also produces components with fewer casting defects such as macrosegregation, shrinkage, and porosity. These advantages are sufficient to attract further exploration of thixoforming operations. However, weaknesses of the process, such as high production cost due to leftover billet that cannot be recycled, encourage researchers to overcome its limitations using various methods. The most widely used thixoforming methods are thixocasting, thixoforging, thixorolling, thixoextrusion, and thixomoulding. Each method yields different final-product characteristics, offering extensive possibilities for component design. New thixoforming methods, in turn, prompt research on microstructure evolution, heating and pouring temperature, die temperature, mechanical properties, viscosity, and final product quality. This review presents findings on the rheological behaviour of thixoformed materials, the advantages and disadvantages of thixoforming, the parameters affecting thixoforming operations, the morphology of thixoformed microstructures, and the various methods that have been used in this research area.

  20. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration, as well as toward automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (i.e., at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative is to use formal process definitions, which can serve as the basis for a variety of analyses because they offer precision in representing all possible ways a process can be carried out, in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes, nor to prospectively identify potential errors in the transfusion process. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
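    To make the contrast with flow charts concrete, here is an illustrative sketch only (hypothetical steps, not the authors' formal notation): even a minimal state-machine encoding of a transfusion process makes exceptional paths first-class rather than leaving them implicit.

```python
# Illustrative sketch only (hypothetical steps, not the authors' formal notation):
# a minimal state-machine encoding of a transfusion process in which exceptional
# paths are defined explicitly, something informal flow charts typically omit.
ALLOWED = {
    ("order_received", "verify_patient_id"): "id_verified",
    ("id_verified", "crossmatch_check"): "unit_ready",
    ("unit_ready", "bedside_double_check"): "transfusing",
    ("transfusing", "complete"): "done",
    # exceptional transitions are first-class, not an afterthought:
    ("id_verified", "id_mismatch"): "order_received",
    ("transfusing", "adverse_reaction"): "stopped",
}

def step(state, action):
    """Advance the process; any undefined (state, action) pair is a process error."""
    try:
        return ALLOWED[(state, action)]
    except KeyError:
        raise ValueError(f"process error: {action!r} not allowed in state {state!r}")
```

    Because every permitted transition is enumerated, an analysis tool (or a simple test) can exhaustively check which failure states are reachable and which steps can be skipped, which is the kind of precision informal flow charts cannot offer.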

  1. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.
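    The estimate-measure-correct loop described above can be sketched as a scalar predictor-corrector; the single-state model and all numbers here are hypothetical illustrations, not the patented controller:

```python
import numpy as np

# Toy sketch (hypothetical numbers, not the patented controller): the
# estimate/measure/correct loop behind model-based remelting control, here a
# scalar predictor-corrector tracking electrode mass from noisy weight readings.
def update_estimate(mass_est, melt_rate, measured_mass, dt, gain=0.3):
    predicted = mass_est - melt_rate * dt                  # process-model prediction
    return predicted + gain * (measured_mass - predicted)  # measurement correction

true_mass, est = 500.0, 480.0        # kg; estimator deliberately starts 20 kg off
melt_rate, dt = 0.05, 1.0            # kg/s melt rate, 1 s update interval
rng = np.random.default_rng(0)
for _ in range(120):
    true_mass -= melt_rate * dt
    measured = true_mass + rng.normal(0.0, 0.5)            # noisy load-cell reading
    est = update_estimate(est, melt_rate, measured, dt)
```

    The corrected estimate, rather than the raw noisy measurement, is what feedback and feedforward devices would act on; a full implementation would estimate many coupled variables (gap, melt rate, temperatures) from a physics-based process model.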

  2. On eco-efficient technologies to minimize industrial water consumption

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will place further stress on available water systems and decrease water security in many areas. Innovative methods to minimize industrial water usage and waste production are therefore of paramount importance for extending fresh water resources, which are the main life-support systems in many arid regions of the world. This paper demonstrates that many industries have good opportunities to save water and decrease waste water in the softening process by substituting traditional methods with eco-friendly ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. Optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that many industries have a good opportunity to save water and decrease waste water in the softening process by substituting the traditional method with the puffing method, a patented eco-efficient technology. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method to monitor the lime softening process saves a considerable amount of water while reducing chemical sludge.

  3. Adapting Western research methods to indigenous ways of knowing.

    PubMed

    Simonds, Vanessa W; Christopher, Suzanne

    2013-12-01

    Indigenous communities have long experienced exploitation by researchers and increasingly require participatory and decolonizing research processes. We present a case study of an intervention research project to exemplify a clash between Western research methodologies and Indigenous methodologies and how we attempted reconciliation. We then provide implications for future research based on lessons learned from Native American community partners who voiced concern over methods of Western deductive qualitative analysis. Decolonizing research requires constant reflective attention and action, and there is an absence of published guidance for this process. Continued exploration is needed for implementing Indigenous methods alone or in conjunction with appropriate Western methods when conducting research in Indigenous communities. Currently, examples of Indigenous methods and theories are not widely available in academic texts or published articles, and are often not perceived as valid.

  4. Modeling of heat transfer in a vascular tissue-like medium during an interstitial hyperthermia process.

    PubMed

    Hassanpour, Saeid; Saboonchi, Ahmad

    2016-12-01

    This paper evaluates the role of small vessels in the heat transfer mechanisms of a tissue-like medium during local intensive heating, for example an interstitial hyperthermia treatment. To this purpose, a cylindrical tissue with co-current and counter-current vascular networks and a central heat source is introduced. The energy equations of the tissue, the supply fluid (arterial blood), and the return fluid (venous blood) are derived using a porous-media approach. A 2D computer code is then developed to predict the temperatures of the blood (fluid phase) and tissue (solid phase) by both the conventional volume-averaging method and a more realistic solution method; in the latter, despite the volume averaging, the blood in interconnecting capillaries is treated separately from the arterial and venous blood phases. It is found that, in addition to the blood perfusion rate, the arrangement of the vascular network has a considerable effect on the pattern and magnitude of the resulting temperature field. In contrast to the counter-current network, the co-current network of vessels leads to markedly asymmetric temperature contours and a relocation of the heat-affected zone along the blood flow direction; this relocation can, however, be prevented by changing the site of the hyperthermia heat source. The results show that the cooling effect of co-current blood vessels during interstitial heating is more efficient. Despite many anatomical dissimilarities, these findings can be useful in designing hyperthermia protocols for the treatment of cancer in living tissue. Copyright © 2016 Elsevier Ltd. All rights reserved.
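    A minimal sketch of the kind of bioheat calculation involved (a 1D Pennes equation with uniform perfusion, solved by explicit finite differences; coefficients are typical textbook values and the source strength is hypothetical, not the paper's porous-media model):

```python
import numpy as np

# Minimal sketch, not the authors' porous-media model: 1D Pennes bioheat
# equation  rho*c*dT/dt = k*d2T/dx2 + w_b*rho_b*c_b*(T_a - T) + q(x),
# solved by explicit finite differences with typical textbook coefficients.
k, rho, c = 0.5, 1050.0, 3600.0                    # tissue: W/m/K, kg/m^3, J/kg/K
w_b, rho_b, c_b, T_a = 0.002, 1050.0, 3800.0, 37.0 # perfusion 1/s, blood props, arterial T (C)
nx, dx, dt = 101, 1e-3, 0.05                       # 10 cm domain; dt << dx^2*rho*c/(2k) ~ 3.8 s
T = np.full(nx, 37.0)
q = np.zeros(nx); q[45:56] = 1e5                   # W/m^3 heat source, ~1 cm wide, centred
for _ in range(2000):                              # 100 s of heating
    d2T = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
    T[1:-1] += dt / (rho * c) * (k * d2T + w_b * rho_b * c_b * (T_a - T[1:-1]) + q[1:-1])
    T[0] = T[-1] = 37.0                            # far boundaries held at core temperature
```

    In this homogeneous-perfusion model the heated zone stays centred on the source; capturing the downstream shift of the heat-affected zone reported for co-current networks requires the directional blood-flow (advection) terms of the full model.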

  5. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Su, Gloria H; Emmert-Buck, Michael R

    2005-02-01

    Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This article reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productively linking phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies, and provide detailed example protocols for tissue component handling upstream and downstream of microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing, and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimizing its overall process of moving from collected tissue to high-quality, appropriately anatomically tagged scientific results. A lack of optimized protocols is a source of inefficiency in current life science research, and improvement in this area will significantly increase life science quality and productivity. The article is divided into introduction, materials, protocols, and notes sections. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. To get the greatest benefit from this article, readers are advised to read through the entire article first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread the entries in each section pertaining to each of those protocols.

  6. Clarifying values: an updated review

    PubMed Central

    2013-01-01

    Background Consensus guidelines have recommended that decision aids include a process for helping patients clarify their values. We sought to examine the theoretical and empirical evidence related to the use of values clarification methods in patient decision aids. Methods Building on the International Patient Decision Aid Standards (IPDAS) Collaboration’s 2005 review of values clarification methods in decision aids, we convened a multi-disciplinary expert group to examine key definitions, decision-making process theories, and empirical evidence about the effects of values clarification methods in decision aids. To summarize the current state of theory and evidence about the role of values clarification methods in decision aids, we undertook a process of evidence review and summary. Results Values clarification methods (VCMs) are best defined as methods to help patients think about the desirability of options or attributes of options within a specific decision context, in order to identify which option he/she prefers. Several decision making process theories were identified that can inform the design of values clarification methods, but no single “best” practice for how such methods should be constructed was determined. Our evidence review found that existing VCMs were used for a variety of different decisions, rarely referenced underlying theory for their design, but generally were well described in regard to their development process. Listing the pros and cons of a decision was the most common method used. The 13 trials that compared decision support with or without VCMs reached mixed results: some found that VCMs improved some decision-making processes, while others found no effect. Conclusions Values clarification methods may improve decision-making processes and potentially more distal outcomes. 
However, the small number of evaluations of VCMs and, where evaluations exist, the heterogeneity in outcome measures makes it difficult to determine their overall effectiveness or the specific characteristics that increase effectiveness. PMID:24625261

  7. Making the Failure More Productive: Scaffolding the Invention Process to Improve Inquiry Behaviors and Outcomes in Invention Activities

    ERIC Educational Resources Information Center

    Holmes, N. G.; Day, James; Park, Anthony H. K.; Bonn, D. A.; Roll, Ido

    2014-01-01

    Invention activities are Productive Failure activities in which students attempt (and often fail) to invent methods that capture deep properties of a construct before being taught expert solutions. The current study evaluates the effect of scaffolding on the invention processes and outcomes, given that students are not expected to succeed in their…

  8. Application of Formalised Developmental Feedback for Feed-Forward to Foster Student Ownership of the Learning Process

    ERIC Educational Resources Information Center

    Todd, Valerie J.; McIlroy, David

    2014-01-01

    There has been considerable criticism of assessment methods because of inconsistencies across modules and a focus on the measurement of learning rather than assessment for learning. The aim of the current study was to formalise the process of assessment feedback to feed-forward, and assess the impact on student learning. A cohort of undergraduate…

  9. How Students Choose a College: Understanding the Role of Internet Based Resources in the College Choice Process

    ERIC Educational Resources Information Center

    Burdett, Kimberli R.

    2013-01-01

    The purpose of this study was to gain a better understanding of how current internet-based resources are affecting the college choice process. An explanatory mixed methods design was used, and the study involved collecting qualitative data after a quantitative phase to explain the quantitative data in greater depth. An additional study was…

  10. Effect of Warning Placement on the Information Processing of College Students Reading an OTC Drug Facts Panel

    ERIC Educational Resources Information Center

    Bhansali, Archita H.; Sangani, Darshan S.; Mhatre, Shivani K.; Sansgiry, Sujit S.

    2018-01-01

    Objective: To compare three over-the-counter (OTC) Drug Facts panel versions for information processing optimization among college students.Participants: University of Houston students (N = 210) participated in a cross-sectional survey from January to May 2010.Methods: A current FDA label was compared to two experimental labels developed using the…

  11. Low cost hydrogen/novel membrane technology for hydrogen separation from synthesis gas. Task 1, Literature survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-02-01

    To make the coal-to-hydrogen route economically attractive, improvements are being sought in each step of the process: coal gasification, water-carbon monoxide shift reaction, and hydrogen separation. This report addresses the use of membranes in the hydrogen separation step. The separation of hydrogen from synthesis gas is a major cost element in the manufacture of hydrogen from coal. Separation by membranes is an attractive, new, and still largely unexplored approach to the problem. Membrane processes are inherently simple and efficient and often have lower capital and operating costs than conventional processes. In this report, current and future trends in hydrogen production and use are first summarized. Methods of producing hydrogen from coal are then discussed, with particular emphasis on the Texaco entrained flow gasifier and on current methods of separating hydrogen from this gas stream. The potential for membrane separations in the process is then examined. In particular, the use of membranes for H{sub 2}/CO{sub 2}, H{sub 2}/CO, and H{sub 2}/N{sub 2} separations is discussed. 43 refs., 14 figs., 6 tabs.

  12. Patch-based models and algorithms for image processing: a review of the basic principles and methods, and their application in computed tomography.

    PubMed

    Karimi, Davood; Ward, Rabab K

    2016-10-01

    Image models are central to all image processing tasks. The great advancements in digital image processing would not have been made possible without powerful models which, themselves, have evolved over time. In the past decade, "patch-based" models have emerged as one of the most effective models for natural images. Patch-based methods have outperformed other competing methods in many image processing tasks. These developments have come at a time when greater availability of powerful computational resources and growing concerns over the health risks of ionizing radiation encourage research on image processing algorithms for computed tomography (CT). The goal of this paper is to explain the principles of patch-based methods and to review some of their recent applications in CT. We first review the central concepts in patch-based image processing and explain some of the state-of-the-art algorithms, with a focus on aspects that are more relevant to CT. Then, we review some of the recent applications of patch-based methods in CT. Patch-based methods have already transformed the field of image processing, leading to state-of-the-art results in many applications. More recently, several studies have proposed patch-based algorithms for various image processing tasks in CT, from denoising and restoration to iterative reconstruction. Although these studies have reported good results, the true potential of patch-based methods for CT has not yet been fully appreciated. Patch-based methods can play a central role in image reconstruction and processing for CT. They have the potential to lead to substantial improvements in the current state of the art.
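The core patch-based operation the review describes, decomposing an image into overlapping patches, processing them, and aggregating the overlaps, can be illustrated with a minimal sketch: a plain patch-mean denoiser on synthetic data. The function name, patch size, and noise level are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def patch_average_denoise(img, size=3):
    """Denoise by replacing each pixel with the mean of all overlapping
    size x size patch means that cover it (a toy patch-based estimator)."""
    h, w = img.shape
    acc = np.zeros_like(img, dtype=float)   # accumulated patch-mean estimates
    cnt = np.zeros_like(img, dtype=float)   # how many patches cover each pixel
    for i in range(h - size + 1):
        for j in range(w - size + 1):
            patch = img[i:i + size, j:j + size]
            acc[i:i + size, j:j + size] += patch.mean()
            cnt[i:i + size, j:j + size] += 1
    return acc / cnt

rng = np.random.default_rng(0)
clean = np.ones((16, 16))
noisy = clean + rng.normal(0, 0.2, clean.shape)
denoised = patch_average_denoise(noisy)
```

Real patch-based algorithms (e.g., non-local means or dictionary learning) weight or transform the patches rather than simply averaging them, but the extract-process-aggregate structure is the same.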

  13. Method for Examination and Documentation of Basic Information and Metadata from Published Reports Relevant to the Study of Stormwater Runoff Quality

    USGS Publications Warehouse

    Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.

    1999-01-01

    A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report-review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. Therefore, the reviews are repeatable and the methods can be used by transportation research organizations to catalog new reports as they are published.

  14. Bridging the clinician/researcher gap with systemic research: the case for process research, dyadic, and sequential analysis.

    PubMed

    Oka, Megan; Whiting, Jason

    2013-01-01

    In Marriage and Family Therapy (MFT), as in many clinical disciplines, concern surfaces about the clinician/researcher gap. This gap includes a lack of accessible, practical research for clinicians. MFT clinical research often borrows from the medical tradition of randomized control trials, which typically use linear methods, or follow procedures distanced from "real-world" therapy. We review traditional research methods and their use in MFT and propose increased use of methods that are more systemic in nature and more applicable to MFTs: process research, dyadic data analysis, and sequential analysis. We will review current research employing these methods, as well as suggestions and directions for further research. © 2013 American Association for Marriage and Family Therapy.

  15. Advanced Near Net Shape Technology

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The objective of the Advanced Near Net Shape Technology (ANNST) project is to radically improve near net shape manufacturing methods from the current Technology/Manufacturing Readiness Levels (TRL/MRL 3-4) to the point where they are viable candidates (TRL/MRL 6) for shortening the time and cost for insertion of new aluminum alloys and revolutionary manufacturing methods into the development/improvement of space structures. Conventional cryotank manufacturing processes require fabrication of multiple pieces welded together to form a complete tank. A variety of near net shape manufacturing processes have demonstrated excellent potential for enabling single-piece construction of components such as domes, barrels, and ring frames. Utilization of such processes can dramatically reduce the extent of welding and joining needed to construct cryogenic tanks and other aerospace structures. The specific focus of this project is to successfully mature the integrally stiffened cylinder (ISC) process, in which a single-piece cylinder with integral stiffeners is formed in one spin/flow forming process. Structural launch vehicle components, like cryogenic fuel tanks (e.g., the space shuttle external tank), are currently fabricated via multipiece assembly of parts produced through subtractive manufacturing techniques. Stiffened structural panels are heavily machined from thick plate, which results in excessive scrap rates. Multipiece construction requires welds to assemble the structure, which increases the risk for defects and catastrophic failures.

  16. A Review of Algorithms for Segmentation of Optical Coherence Tomography from Retina

    PubMed Central

    Kafieh, Raheleh; Rabbani, Hossein; Kermani, Saeed

    2013-01-01

    Optical coherence tomography (OCT) is a recently established imaging technique that provides information about the internal structures of an object and can image various aspects of biological tissues. OCT image segmentation is mostly applied to retinal OCT to localize the intra-retinal boundaries. Here, we review some of the important image segmentation methods for processing retinal OCT images. We may classify the OCT segmentation approaches into five distinct groups according to the image domain subjected to the segmentation algorithm. Current research in OCT segmentation is mostly aimed at improving accuracy and precision and at reducing the required processing time. There is no doubt that current 3-D imaging modalities are now moving research projects toward volume segmentation along with 3-D rendering and visualization. It is also important to develop robust methods capable of dealing with pathologic cases in OCT imaging. PMID:24083137

  17. Market-Based Approaches to Managing Science Return from Planetary Missions

    NASA Technical Reports Server (NTRS)

    Wessen, Randii R.; Porter, David; Hanson, Robin

    1996-01-01

    A research plan is described for the design and testing of a method for the planning and negotiation of science observations. The research plan is presented in relation to the fact that the current method, which involves a hierarchical process of science working groups, is unsuitable for the planning of the Cassini mission. The research plan involves the market-based approach in which participants are allocated budgets of scheduling points. The points are used to provide an intensity of preference for the observations being scheduled. In this way, the schedulers do not have to limit themselves to solving major conflicts, but try to maximize the number of scheduling points that result in a conflict-free timeline. Incentives are provided for the participants by the fixed budget concerning their tradeoff decisions. A degree of feedback is provided in the process so that the schedulers may rebid based on the current timeline.
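The mechanism described above, in which participants spend a fixed budget of scheduling points to express intensity of preference and the scheduler builds a conflict-free timeline that maximizes the points honored, can be sketched as a toy greedy allocator. The participant names, observations, slots, and point values below are hypothetical, and the real process also includes the feedback/rebidding step the abstract mentions.

```python
# Each bid: (participant, observation, time_slot, points spent from a fixed budget).
bids = [
    ("team_A", "obs1", 1, 40),
    ("team_B", "obs2", 1, 25),
    ("team_A", "obs3", 2, 10),
    ("team_B", "obs4", 2, 30),
]

def build_timeline(bids):
    """Greedy conflict resolution: award each time slot to the highest-point bid,
    so the timeline maximizes the scheduling points that survive."""
    timeline = {}
    for participant, obs, slot, pts in sorted(bids, key=lambda b: -b[3]):
        if slot not in timeline:          # slot still free: bid wins
            timeline[slot] = (participant, obs, pts)
    return timeline

timeline = build_timeline(bids)
```

Losing bidders would then see the current timeline and rebid their remaining points, which is the feedback loop the abstract describes.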

  18. Neurocognitive poetics: methods and models for investigating the neuronal and cognitive-affective bases of literature reception.

    PubMed

    Jacobs, Arthur M

    2015-01-01

    A long tradition of research including classical rhetoric, esthetics and poetics theory, formalism and structuralism, as well as current perspectives in (neuro)cognitive poetics has investigated structural and functional aspects of literature reception. Despite a wealth of literature published in specialized journals like Poetics, however, little is known about how the brain processes and creates literary and poetic texts. Such stimulus material might be better suited than other genres for demonstrating the complexities with which our brain constructs the world in and around us, because it unifies thought and language, music and imagery in a clear, manageable way, most often with play, pleasure, and emotion (Schrott and Jacobs, 2011). In this paper, I discuss methods and models for investigating the neuronal and cognitive-affective bases of literary reading together with pertinent results from studies on poetics, text processing, emotion, or neuroaesthetics, and outline current challenges and future perspectives.

  19. Electrochemical and mechanical polishing and shaping method and system

    NASA Technical Reports Server (NTRS)

    Engelhaupt, Darell E. (Inventor); Gubarev, Mikhail V. (Inventor); Jones, William David (Inventor); Ramsey, Brian D. (Inventor); Benson, Carl M. (Inventor)

    2011-01-01

    A method and system are provided for the shaping and polishing of the surface of a material selected from the group consisting of electrically semi-conductive materials and conductive materials. An electrically non-conductive polishing lap incorporates a conductive electrode such that, when the polishing lap is placed on the material's surface, the electrode is placed in spaced-apart juxtaposition with respect to the material's surface. A liquid electrolyte is disposed between the material's surface and the electrode. The electrolyte has an electrochemical stability constant such that cathodic material deposition on the electrode is not supported when a current flows through the electrode, the electrolyte and the material. As the polishing lap and the material surface experience relative movement, current flows through the electrode based on (i) adherence to Faraday's Law, and (ii) a pre-processing profile of the surface and a desired post-processing profile of the surface.
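Because the current drawn during this electrochemical polishing adheres to Faraday's law, the mass of material removed can be estimated from the charge passed. A minimal illustrative calculation follows; the metal, current, and duration are hypothetical example values, not taken from the patent.

```python
FARADAY = 96485.0   # Faraday constant, C/mol

def mass_removed(current_a, time_s, molar_mass_g_per_mol, valence):
    """Faraday's law of electrolysis: m = (I * t * M) / (z * F),
    the mass dissolved anodically by a charge Q = I * t."""
    return current_a * time_s * molar_mass_g_per_mol / (valence * FARADAY)

# Hypothetical example: nickel (M = 58.69 g/mol, z = 2) at 0.5 A for 10 minutes.
m = mass_removed(0.5, 600.0, 58.69, 2)   # grams dissolved
```

In the patented system the current is additionally shaped by the pre- and post-processing surface profiles, so removal varies locally rather than being a single bulk figure as in this sketch.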

  20. 2D all-solid state fabric supercapacitor fabricated via an all solution process for use in smart textiles

    NASA Astrophysics Data System (ADS)

    Jang, Yunseok; Jo, Jeongdai; Woo, Kyoohee; Lee, Seung-Hyun; Kwon, Sin; Kim, Kwang-Young; Kang, Dongwoo

    2017-05-01

    We propose a method to fabricate a supercapacitor for smart textiles using silver (Ag) nanoparticle (NP) ink, simple spray patterning systems, and intense pulsed light (IPL) sintering systems. The Ag NP current collectors provided conductivity as high as that of metal current collectors. The spray patterning technique is useful for fabricating supercapacitors because it is simple, fast, and cheap. IPL systems reduced the sintering temperature of the Ag NPs and prevented thermal damage to the textiles during the Ag NP sintering process. The two-dimensional (2D) all-solid-state fabric supercapacitor with an interdigitated configuration, developed here, exhibited a specific capacitance of 25.7 F/g and an energy density of 1.5 Wh/kg at a power density of 64.3 W/kg. These results support the utility of our proposed method in the development of energy textiles.

  1. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  2. Visualization of DNA in highly processed botanical materials.

    PubMed

    Lu, Zhengfei; Rubinsky, Maria; Babajanian, Silva; Zhang, Yanjun; Chang, Peter; Swanson, Gary

    2018-04-15

    DNA-based methods have been gaining recognition as a tool for botanical authentication in herbal medicine; however, their application in processed botanical materials is challenging due to the low quality and quantity of DNA left after extensive manufacturing processes. The low amount of DNA recovered from processed materials, especially extracts, is "invisible" to current technology, which has cast doubt on the presence of amplifiable botanical DNA. A method using adapter-ligation and PCR amplification was successfully applied to visualize the "invisible" DNA in botanical extracts. The size of the "invisible" DNA fragments in botanical extracts was around 20-220 bp, compared to fragments of around 600 bp for the more easily visualized DNA in botanical powders. This technique is the first to allow characterization and visualization of small fragments of DNA in processed botanical materials and will provide key information to guide the development of appropriate DNA-based botanical authentication methods in the future. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes

    PubMed Central

    Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin

    2012-01-01

    Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
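Two of the processing steps listed above, baseline drift removal and temporal filtering, can be sketched on a synthetic trace. The sampling rate, linear drift model, and window length below are illustrative assumptions; the review itself discusses a range of alternatives for each step.

```python
import numpy as np

fs = 1000.0                              # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)

# Synthetic optical action potential train plus slow drift and sensor noise.
rng = np.random.default_rng(1)
pulses = (np.sin(2 * np.pi * 4 * t) > 0.9).astype(float)
raw = pulses + 0.5 * t + rng.normal(0, 0.05, t.size)   # drift mimics photobleaching

# Baseline drift removal: subtract a low-order polynomial fit of the trace.
coeffs = np.polyfit(t, raw, 1)
detrended = raw - np.polyval(coeffs, t)

# Temporal filtering: simple moving-average smoothing of the detrended trace.
win = 11
filtered = np.convolve(detrended, np.ones(win) / win, mode="same")
```

In practice the polynomial order, filter type, and window length must be chosen carefully, since (as the review stresses) over-filtering distorts the optical action potential upstroke.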

  4. A Study of Light Level Effect on the Accuracy of Image Processing-based Tomato Grading

    NASA Astrophysics Data System (ADS)

    Prijatna, D.; Muhaemin, M.; Wulandari, R. P.; Herwanto, T.; Saukat, M.; Sugandi, W. K.

    2018-05-01

    Image processing methods have been used in non-destructive tests of agricultural products. Compared to manual methods, image processing may produce more objective and consistent results. The image capturing box installed in the currently used tomato grading machine (TEP-4) is equipped with four fluorescence lamps to illuminate the processed tomatoes. Since the performance of any lamp decreases once its service time has exceeded its lifetime, it is predicted that this will affect tomato classification. The objective of this study was to determine the minimum light levels that affect classification accuracy. The study was conducted by varying the light level from minimum to maximum on tomatoes in the image capturing box and then investigating its effects on image characteristics. The results showed that light intensity affects two variables that are important for classification: the area and color of the captured image. The image processing program was able to correctly determine the weight and class of tomatoes when the light level was between 30 lx and 140 lx.
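The effect the study measures, the dependence of a thresholded object's apparent area on illumination, can be reproduced with a toy synthetic image. The gray levels, disk radius, and threshold below are invented for illustration and are not the study's calibration values.

```python
import numpy as np

def object_area(light_scale, threshold=100):
    """Pixel area of a bright synthetic 'tomato' disk after fixed thresholding,
    under an illumination scaling factor."""
    yy, xx = np.mgrid[0:64, 0:64]
    disk = ((yy - 32) ** 2 + (xx - 32) ** 2) <= 20 ** 2
    img = np.where(disk, 180.0, 30.0) * light_scale  # object vs background gray
    return int((img > threshold).sum())

full = object_area(1.0)   # adequate illumination: disk pixels exceed threshold
dim = object_area(0.4)    # aged lamp: object falls below the fixed threshold
```

With a fixed threshold, dimming the lamp shrinks (and eventually zeroes) the measured area, which is why degraded lamps would corrupt both the area and color features the grader relies on.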

  5. Determination of Mercury in Aqueous and Geologic Materials by Continuous Flow-Cold Vapor-Atomic Fluorescence Spectrometry (CVAFS)

    USGS Publications Warehouse

    Hageman, Philip L.

    2007-01-01

    New methods for the determination of total mercury in geologic materials and dissolved mercury in aqueous samples have been developed that will replace the methods currently (2006) in use. The new methods eliminate the use of sodium dichromate (Na2Cr2O7 ?2H2O) as an oxidizer and preservative and significantly lower the detection limit for geologic and aqueous samples. The new methods also update instrumentation from the traditional use of cold vapor-atomic absorption spectrometry to cold vapor-atomic fluorescence spectrometry. At the same time, the new digestion procedures for geologic materials use the same size test tubes, and the same aluminum heating block and hot plate as required by the current methods. New procedures for collecting and processing of aqueous samples use the same procedures that are currently (2006) in use except that the samples are now preserved with concentrated hydrochloric acid/bromine monochloride instead of sodium dichromate/nitric acid. Both the 'old' and new methods have the same analyst productivity rates. These similarities should permit easy migration to the new methods. Analysis of geologic and aqueous reference standards using the new methods show that these procedures provide mercury recoveries that are as good as or better than the previously used methods.

  6. Particle Acceleration and Heating by Turbulent Reconnection

    NASA Astrophysics Data System (ADS)

    Vlahos, Loukas; Pisokas, Theophilos; Isliker, Heinz; Tsiolis, Vassilis; Anastasiadis, Anastasios

    2016-08-01

    Turbulent flows in the solar wind, large-scale current sheets, multiple current sheets, and shock waves lead to the formation of environments in which a dense network of current sheets is established and sustains “turbulent reconnection.” We constructed a 2D grid on which a number of randomly chosen grid points are acting as scatterers (i.e., magnetic clouds or current sheets). Our goal is to examine how test particles respond inside this large-scale collection of scatterers. We study the energy gain of individual particles, the evolution of their energy distribution, and their escape time distribution. We have developed a new method to estimate the transport coefficients from the dynamics of the interaction of the particles with the scatterers. Replacing the “magnetic clouds” with current sheets, we have proven that the energization processes can be more efficient depending on the strength of the effective electric fields inside the current sheets and their statistical properties. Using the estimated transport coefficients and solving the Fokker-Planck (FP) equation, we can recover the energy distribution of the particles only for the stochastic Fermi process. We have shown that the evolution of the particles inside a turbulent reconnecting volume is not a solution of the FP equation, since the interaction of the particles with the current sheets is “anomalous,” in contrast to the case of the second-order Fermi process.
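The stochastic (second-order) Fermi process against which the authors compare their results can be illustrated with a toy multiplicative-kick model: each encounter with a scatterer changes a particle's energy by a small random fraction, so the ensemble heats systematically while a high-energy tail develops. The kick statistics below are invented for illustration and are not taken from the paper's simulations.

```python
import numpy as np

rng = np.random.default_rng(42)
n_particles, n_kicks = 5000, 60

# Fractional energy change per scatterer encounter: random kicks with a small
# systematic gain, the signature of second-order Fermi acceleration.
kicks = rng.normal(loc=0.01, scale=0.1, size=(n_particles, n_kicks))

# Energy ratio E/E0 after all encounters (clip keeps energies positive).
energy = np.prod(1.0 + np.clip(kicks, -0.9, None), axis=1)

mean_energy = energy.mean()   # ensemble heating relative to the initial energy
```

In this diffusive regime the energy distribution is well described by a Fokker-Planck equation; the paper's point is that replacing the scatterers with current sheets makes the transport "anomalous," so this FP description breaks down.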

  7. PARTICLE ACCELERATION AND HEATING BY TURBULENT RECONNECTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlahos, Loukas; Pisokas, Theophilos; Isliker, Heinz

    2016-08-10

    Turbulent flows in the solar wind, large-scale current sheets, multiple current sheets, and shock waves lead to the formation of environments in which a dense network of current sheets is established and sustains “turbulent reconnection.” We constructed a 2D grid on which a number of randomly chosen grid points are acting as scatterers (i.e., magnetic clouds or current sheets). Our goal is to examine how test particles respond inside this large-scale collection of scatterers. We study the energy gain of individual particles, the evolution of their energy distribution, and their escape time distribution. We have developed a new method to estimate the transport coefficients from the dynamics of the interaction of the particles with the scatterers. Replacing the “magnetic clouds” with current sheets, we have proven that the energization processes can be more efficient depending on the strength of the effective electric fields inside the current sheets and their statistical properties. Using the estimated transport coefficients and solving the Fokker–Planck (FP) equation, we can recover the energy distribution of the particles only for the stochastic Fermi process. We have shown that the evolution of the particles inside a turbulent reconnecting volume is not a solution of the FP equation, since the interaction of the particles with the current sheets is “anomalous,” in contrast to the case of the second-order Fermi process.

  8. The Highly Robust Electrical Interconnects and Ultrasensitive Biosensors Based on Embedded Carbon Nanotube Arrays

    NASA Technical Reports Server (NTRS)

    Li, Jun; Cassell, Alan; Koehne, Jessica; Chen, Hua; Ng, Hou Tee; Ye, Qi; Stevens, Ramsey; Han, Jie; Meyyappan, M.

    2003-01-01

    We report on our recent breakthroughs in two different applications using well-aligned carbon nanotube (CNT) arrays on Si chips, including (1) a novel processing solution for highly robust electrical interconnects in integrated circuit manufacturing, and (2) the development of ultrasensitive electrochemical DNA sensors. Both rely on the invention of a bottom-up fabrication scheme comprising six steps: (a) lithographic patterning, (b) depositing bottom conducting contacts, (c) depositing metal catalysts, (d) CNT growth by plasma enhanced chemical vapor deposition (PECVD), (e) dielectric gap-filling, and (f) chemical mechanical polishing (CMP). Such processes produce a stable planarized surface with only the open ends of the CNTs exposed, which can be further processed or modified for different applications. By depositing patterned top contacts, the CNTs can serve as vertical interconnects between the two conducting layers. This method is fundamentally different from current damascene processes and avoids problems associated with etching and filling of high-aspect-ratio holes at the nanoscale. In addition, multiwalled CNTs (MWCNTs) are highly robust and can carry a current density of 10(exp 9) A/square centimeter without degradation. This has great potential to help extend current Si technology. The embedded MWCNT array without the top contact layer can also be used as a nanoelectrode array in electrochemical biosensors. The cell time-constant and sensitivity can be dramatically improved. By functionalizing the tube ends with specific oligonucleotide probes, specific DNA targets can be detected with electrochemical methods down to subattomole levels.

  9. Chemical coagulation-based processes for trace organic contaminant removal: current state and future potential.

    PubMed

    Alexander, Jonathan T; Hai, Faisal I; Al-Aboud, Turki M

    2012-11-30

    Trace organic contaminants have become an increasing cause of concern for governments and water authorities as they attempt to respond to the potential challenges posed by climate change by implementing sustainable water cycle management practices. The augmentation of potable water supplies through indirect potable water reuse is one such method currently being employed. Given the uncertainty surrounding the potential human health impacts of prolonged ingestion of trace organic contaminants, it is vital that effective and sustainable treatment methods are utilized. The purpose of this article is to provide a comprehensive literature review of the performance of the chemical coagulation process in removing trace organic contaminants from water. This study evaluated the removal data collated from recent research relating to various trace organic contaminants during the coagulation process. It was observed that there is limited research data relating to the removal of trace organic contaminants using coagulation. The findings of this study suggest that there is a gap in the current research investigating the potential of new types of coagulants and exploring coagulation-based hybrid processes to remove trace organic contaminants from water. The data analysed in this study regarding removal efficiency suggests that, even for the significantly hydrophobic compounds, hydrophobicity is not the sole factor governing removal of trace organic contaminants by coagulation. This has important implications in that the usual practice of screening coagulants based on turbidity (suspended solid) removal proves inadequate in the case of trace organic contaminant removal. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. PRESERVATION OF FOOD BY LOW-DOSE IONIZING ENERGY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1961-01-01

    A review is presented of the current status of investigations on the radiation processing of foods. The technical feasibility of this preservation method is well established and the economic feasibility of the method appears promising, particularly in low-dose applications. The current status of development of radiation sources is discussed. Pork has responded best among the meats tested for radiation processing. Sausage, luncheon meats, and chicken demonstrate good potential. Beef appears acceptable at low radiation dose ranges but presents flavor problems at high dosages. The storage life of refrigerated and unrefrigerated marine products is increased by radiation processing. Vegetables are easily damaged by comparatively small doses of radiation. Shredded cabbage treated at 300,000 rad is an excellent product, and asparagus, snap beans, lima beans, broccoli, carrots, and corn are promising vegetables for radiation processing. Radiation treatment inhibits sprouting of potatoes and onions. Radiation processing of strawberries, grapes, peaches, tomatoes, and citrus fruits at doses between 200,000 and 800,000 rad affects molds that cause rotting and increases the storage life of these fruits. Radiation processing of cereal grains, cereal products, and military ration components destroys adult insects, larvae, and eggs of insect pests that infest these foods. No radioactivity has been induced in food products by high radiation doses. Extensive studies have shown that radiation processing has no effect on the wholesomeness of foods. The economic feasibility and potentialities of the radiation processing of foods are discussed. (C.H.)

  11. Improving the medical records department processes by lean management

    PubMed Central

    Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine

    2015-01-01

    Background: Lean management is a process improvement technique used to identify wasteful actions and processes and eliminate them. The benefits of Lean for healthcare organizations are, first, that the quality of outcomes in terms of mistakes and errors improves, and second, that the time taken through the whole process significantly improves. Aims: The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran by utilizing Lean management. Materials and Methods: This was an applied, interventional study. The data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. Statistical Analysis Used: The MRD staff were initially taught the concepts of Lean management and then formed into the MRD Lean team. The team then identified and reviewed the current processes; subsequently, they identified wastes and values, and proposed solutions. Results: In the MRD units (Archive, Coding, Statistics, and Admission), 17 current processes, 28 wastes, and 11 values were identified, and 27 suggestions for eliminating the wastes were offered. Conclusion: The MRD is a critical department for the hospital information system and, therefore, the continuous improvement of its services and processes, through scientific methods such as Lean management, is essential. Originality/Value: The study represents one of the few attempts to eliminate wastes in the MRD. PMID:26097862

  12. Biotechnology in Food Production and Processing

    NASA Astrophysics Data System (ADS)

    Knorr, Dietrich; Sinskey, Anthony J.

    1985-09-01

    The food processing industry is the oldest and largest industry using biotechnological processes. Further development of food products and processes based on biotechnology depends upon the improvement of existing processes, such as fermentation, immobilized biocatalyst technology, and production of additives and processing aids, as well as the development of new opportunities for food biotechnology. Improvements are needed in the characterization, safety, and quality control of food materials, in processing methods, in waste conversion and utilization processes, and in currently used food microorganism and tissue culture systems. Also needed are fundamental studies of the structure-function relationship of food materials and of the cell physiology and biochemistry of raw materials.

  13. Analytical evaluation of current starch methods used in the international sugar industry: Part I.

    PubMed

    Cole, Marsha; Eggleston, Gillian; Triplett, Alexa

    2017-08-01

    Several analytical starch methods exist in the international sugar industry to mitigate starch-related processing challenges and assess the quality of traded end-products. These methods use iodometric chemistry, mostly potato starch standards, and similar solubilization strategies, but had not been comprehensively compared. In this study, industrial starch methods were compared to the USDA Starch Research method using simulated raw sugars. The type of starch standard, solubilization approach, iodometric reagents, and detection wavelength all affected total starch determination in simulated raw sugars. Simulated sugars containing potato starch were more accurately detected by the industrial methods, whereas those containing corn starch, a better model for sugarcane starch, were only accurately measured by the USDA Starch Research method. Use of a potato starch standard curve over-estimated starch concentrations. Among the variables studied, the starch standard, solubilization approach, and detection wavelength had the greatest effect on the sensitivity, accuracy/precision, and limits of detection/quantification of the current industry starch methods. Published by Elsevier Ltd.
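The over-estimation from a potato standard curve can be illustrated with a minimal sketch: fit a line to hypothetical potato starch standards, then read a corn starch sample off that curve. Every number below is invented for illustration (not data from the paper); the only premise, itself an assumption here, is that corn starch gives a stronger colour response per ppm than potato starch.

```python
# Illustrative sketch of calibration bias: quantifying corn starch against
# a potato starch standard curve. All values are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical potato starch standards (ppm vs absorbance, slope 0.008).
potato_ppm = [0, 50, 100, 150, 200]
potato_abs = [0.00, 0.40, 0.80, 1.20, 1.60]
a, b = fit_line(potato_ppm, potato_abs)

# Assume corn starch responds more strongly per ppm (e.g. from a higher
# amylose content), so its absorbance reads high on the potato curve.
corn_slope = 0.010
true_corn_ppm = 100.0
measured_abs = corn_slope * true_corn_ppm
estimated_ppm = (measured_abs - b) / a

print(round(estimated_ppm, 1))  # over-estimates the true 100 ppm
```

The bias scales with the ratio of the two response slopes, which is why the choice of standard matters more than the iodometric reagents themselves.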

  14. [Development and validation of an analytical method to quantify residues of cleaning products able to inactivate prion].

    PubMed

    Briot, T; Robelet, A; Morin, N; Riou, J; Lelièvre, B; Lebelle-Dehaut, A-V

    2016-07-01

    In this study, a novel analytical method has been developed to quantify prion-inactivating detergent in rinsing waters coming from the washer-disinfector of a hospital sterilization unit. The final aim was to obtain an easy and functional method for routine hospital use which does not depend on the cleaning product manufacturer's services. An ICP-MS method based on potassium dosage of the washer-disinfector's rinsing waters was developed. Potassium hydroxide is present in the composition of the three prion-inactivating detergents currently on the French market. The detergent used in this study was Actanios LDI(®) (Anios laboratories). A Passing and Bablok regression was used to compare concentrations measured with the developed method and with the manufacturer's HPLC-UV method. According to the results obtained, the developed method is easy to use in a routine hospital process. The Passing and Bablok regression showed that there is no statistical difference between the two analytical methods during the second rinsing step. Besides, both methods were linear on the third rinsing step, with a 1.5 ppm difference between the concentrations measured by each method. This study shows that the ICP-MS method developed is nonspecific for the detergent but specific for the potassium element, which is present in all prion-inactivating detergents currently on the French market. The method should therefore work for any prion-inactivating detergent containing potassium, provided its sensitivity is sufficient when the potassium concentration in the detergent formulation is very low. Copyright © 2016. Published by Elsevier Masson SAS.

  15. Ordinal preference elicitation methods in health economics and health services research: using discrete choice experiments and ranking methods.

    PubMed

    Ali, Shehzad; Ronaldson, Sarah

    2012-09-01

    The predominant method of economic evaluation is cost-utility analysis, which uses cardinal preference elicitation methods, including the standard gamble and time trade-off. However, such an approach is not suitable for understanding trade-offs between process attributes, non-health outcomes and health outcomes in order to evaluate current practices, develop new programmes and predict demand for services and products. Ordinal preference elicitation methods, including discrete choice experiments and ranking methods, are therefore commonly used in health economics and health services research. Cardinal methods have been criticized on the grounds of cognitive complexity, difficulty of administration, contamination by risk and preference attitudes, and potential violation of underlying assumptions. Ordinal methods have gained popularity because of their reduced cognitive burden, lower degree of abstract reasoning, reduced measurement error, ease of administration and ability to use both health and non-health outcomes. The underlying assumptions of ordinal methods may be violated when respondents use cognitive shortcuts, cannot comprehend the ordinal task or interpret attributes and levels, use 'irrational' choice behaviour, or refuse to trade off certain attributes. CURRENT USE AND GROWING AREAS: Ordinal methods are commonly used to evaluate preferences for attributes of health services, products, practices, interventions and policies and, more recently, to estimate utility weights. AREAS FOR ON-GOING RESEARCH: There is growing research on developing optimal designs, evaluating the rationalization process, using qualitative tools to develop ordinal methods, evaluating consistency with utility theory, appropriate statistical methods for analysis, generalizability of results and comparing ordinal methods against each other and with cardinal measures.

  16. Current advances in molecular methods for detection of nitrite-dependent anaerobic methane oxidizing bacteria in natural environments.

    PubMed

    Chen, Jing; Dick, Richard; Lin, Jih-Gaw; Gu, Ji-Dong

    2016-12-01

    The nitrite-dependent anaerobic methane oxidation (n-damo) process uniquely links the microbial nitrogen and carbon cycles. Research on n-damo bacteria is progressing quickly, with experimental evidence from enrichment cultures. Polymerase chain reaction (PCR)-based methods for detecting these bacteria in various natural ecosystems and engineered systems play a very important role in characterizing their distribution, abundance, and biodiversity. Important characteristics of n-damo enrichments have been obtained and their key significance in the microbial nitrogen and carbon cycles investigated. The molecular methods currently used to detect n-damo bacteria are comprehensively reviewed, and their strengths and limitations in application to a wide range of samples discussed. The pmoA gene-based PCR primers for n-damo bacterial detection are evaluated and, in particular, several incorrectly stated PCR primer nucleotide sequences in published papers are pointed out to allow correct application of these primers in current and future investigations. Furthermore, this review offers future perspectives on n-damo bacteria based on the information and methods currently available, toward a better acquisition of new knowledge about this group of bacteria.

  17. [Oxidative stress. Should it be measured in the diabetic patient?].

    PubMed

    Villa-Caballero, L; Nava-Ocampo, A A; Frati-Munari, A C; Ponce-Monter, H

    2000-01-01

    Oxidative stress has been defined as a loss of counterbalance between free radical or reactive oxygen species production and the antioxidant systems, with negative effects on carbohydrates, lipids, and proteins. It is also involved in the progression of different chronic diseases and apoptosis. Diabetes mellitus is associated with a high oxidative stress level through different biochemical pathways, i.e. protein glycosylation, glucose auto-oxidation, and the polyol pathway, mainly induced by hyperglycemia. Oxidative stress could also be involved in the pathogenesis of atherosclerotic lesions and other chronic diabetic complications. Measurement of oxidative stress could be useful to investigate its role in the initiation and development of chronic diabetic complications and also to evaluate preventive actions, including antioxidative therapy. Different attempts have been made to obtain a practical, accurate, specific, and sensitive method to evaluate oxidative stress in clinical practice. However, such an ideal method is not currently available, and the usefulness of the existing methods needs to be confirmed in daily practice. We suggest quantifying oxidized and reduced glutathione (GSSG/GSH) and thiobarbituric acid reactive substances (TBARS) with the currently available alternative methods while we await better options.

  18. Parameter optimization of flux-aided backing-submerged arc welding by using Taguchi method

    NASA Astrophysics Data System (ADS)

    Pu, Juan; Yu, Shengfu; Li, Yuanyuan

    2017-07-01

    Flux-aided backing-submerged arc welding has been conducted on D36 steel with thickness of 20 mm. The effects of processing parameters such as welding current, voltage, welding speed and groove angle on welding quality were investigated by Taguchi method. The optimal welding parameters were predicted and the individual importance of each parameter on welding quality was evaluated by examining the signal-to-noise ratio and analysis of variance (ANOVA) results. The importance order of the welding parameters for the welding quality of weld bead was: welding current > welding speed > groove angle > welding voltage. The welding quality of weld bead increased gradually with increasing welding current and welding speed and decreasing groove angle. The optimum values of the welding current, welding speed, groove angle and welding voltage were found to be 1050 A, 27 cm/min, 40∘ and 34 V, respectively.
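The Taguchi ranking described in this record can be sketched in a few lines: compute a larger-the-better signal-to-noise (S/N) ratio per run, then rank each factor by the range (delta) of its mean S/N across levels. The two-factor design and weld-quality scores below are invented for illustration and are not data from the study.

```python
import math

def sn_larger_is_better(ys):
    """Taguchi S/N for a larger-the-better response: -10*log10(mean(1/y^2))."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in ys) / len(ys))

# Hypothetical L4-style design: two factors (current, speed) at two levels,
# with repeated weld-quality scores per run.
runs = [
    {"current": 1, "speed": 1, "quality": [6.1, 6.3]},
    {"current": 1, "speed": 2, "quality": [7.0, 7.2]},
    {"current": 2, "speed": 1, "quality": [7.8, 8.0]},
    {"current": 2, "speed": 2, "quality": [8.9, 9.1]},
]
for r in runs:
    r["sn"] = sn_larger_is_better(r["quality"])

def level_means(factor):
    """Mean S/N at each level of a factor."""
    out = {}
    for lvl in (1, 2):
        sns = [r["sn"] for r in runs if r[factor] == lvl]
        out[lvl] = sum(sns) / len(sns)
    return out

# A factor's importance is judged by its delta: a larger range of mean S/N
# across levels means a stronger effect on welding quality.
for f in ("current", "speed"):
    m = level_means(f)
    print(f, round(abs(m[2] - m[1]), 2))
```

In the actual study, ANOVA on the S/N ratios quantifies each factor's percentage contribution; the delta ranking above is the quick first pass.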

  19. A kinetic Monte Carlo simulation method of van der Waals epitaxy for atomistic nucleation-growth processes of transition metal dichalcogenides.

    PubMed

    Nie, Yifan; Liang, Chaoping; Cha, Pil-Ryung; Colombo, Luigi; Wallace, Robert M; Cho, Kyeongjae

    2017-06-07

    Controlled growth of crystalline solids is critical for device applications, and atomistic modeling methods have been developed for bulk crystalline solids. The kinetic Monte Carlo (KMC) simulation method provides detailed atomic-scale processes during solid growth over realistic time scales, but its application to growth modeling of van der Waals (vdW) heterostructures has not yet been developed. Specifically, the growth of single-layered transition metal dichalcogenides (TMDs) currently faces tremendous challenges, and a detailed understanding based on KMC simulations would provide critical guidance to enable controlled growth of vdW heterostructures. In this work, a KMC simulation method is developed for growth modeling of the vdW epitaxy of TMDs. The KMC method introduces the full set of material parameters for TMDs in bottom-up synthesis: metal and chalcogen adsorption/desorption/diffusion on the substrate and the grown TMD surface, TMD stacking sequence, chalcogen/metal ratio, flake edge diffusion and vacancy diffusion. The KMC processes result in multiple kinetic behaviors associated with various growth behaviors observed in experiments. Different phenomena observed during the vdW epitaxy process are analysed in terms of complex competitions among multiple kinetic processes. The KMC method is used in the investigation and prediction of growth mechanisms, providing qualitative suggestions to guide experimental studies.
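The core loop of any rejection-free KMC scheme of the kind this work builds on can be sketched as follows: pick one event with probability proportional to its rate, then advance the clock by an exponentially distributed waiting time. The three-event catalogue and rates here are placeholders, not the actual TMD adsorption/desorption/diffusion parameters of the paper.

```python
import math
import random

def kmc_step(rates, rng):
    """One rejection-free KMC step: choose an event index with probability
    proportional to its rate, and return it with the time increment."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            chosen = i
            break
    dt = -math.log(rng.random()) / total  # exponential waiting time
    return chosen, dt

rng = random.Random(0)
# Placeholder event catalogue: [adsorption, desorption, edge diffusion]
rates = [2.0, 0.5, 1.5]
counts = [0, 0, 0]
t = 0.0
for _ in range(10000):
    i, dt = kmc_step(rates, rng)
    counts[i] += 1
    t += dt

# Event frequencies track the rate ratios (2.0 : 0.5 : 1.5).
print(counts)
```

A real growth model replaces the fixed rate list with a catalogue recomputed from the evolving lattice configuration after every step; the selection and clock-advance logic stay the same.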

  20. Stem mortality in surface fires: Part II, experimental methods for characterizing the thermal response of tree stems to heating by fires

    Treesearch

    D. M. Jimenez; B. W. Butler; J. Reardon

    2003-01-01

    Current methods for predicting fire-induced plant mortality in shrubs and trees are largely empirical. These methods are not readily linked to duff burning, soil heating, and surface fire behavior models. In response to the need for a physics-based model of this process, a detailed model for predicting the temperature distribution through a tree stem as a function of...

  1. Estimating Logistics Support of Reusable Launch Vehicles During Conceptual Design

    NASA Technical Reports Server (NTRS)

    Morris, W. D.; White, N. H.; Davies, W. T.; Ebeling, C. E.

    1997-01-01

    Methods exist to define the logistics support requirements for new aircraft concepts but are not directly applicable to new launch vehicle concepts. In order to define the support requirements and to discriminate among new technologies and processing choices for these systems, NASA Langley Research Center (LaRC) is developing new analysis methods. This paper describes several methods under development, gives their current status, and discusses the benefits and limitations associated with their use.

  2. Air Emissions Damages from Municipal Drinking Water Treatment Under Current and Proposed Regulatory Standards.

    PubMed

    Gingerich, Daniel B; Mauter, Meagan S

    2017-09-19

    Water treatment processes present intersectoral and cross-media risk trade-offs that are not presently considered in Safe Drinking Water Act regulatory analyses. This paper develops a method for assessing the air emission implications of common municipal water treatment processes used to comply with recently promulgated and proposed regulatory standards, including concentration limits for lead and copper, disinfection byproducts, chromium(VI), strontium, and PFOA/PFOS. Life-cycle models of electricity and chemical consumption for individual drinking water unit processes are used to estimate embedded NOx, SO2, PM2.5, and CO2 emissions on a cubic meter basis. We estimate air emission damages from currently installed treatment processes at U.S. drinking water facilities to be on the order of $500 million USD annually. Fully complying with six promulgated and proposed rules would increase baseline air emission damages by approximately 50%, with three-quarters of these damages originating from chemical manufacturing. Despite the magnitude of these air emission damages, the net benefit of currently implemented rules remains positive. For some proposed rules, however, the promise of net benefits remains contingent on technology choice.
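The damage estimate described in this record amounts to multiplying embedded emissions per cubic meter of treated water by a marginal damage value per pollutant and scaling by treated volume. A back-of-envelope sketch, with every factor invented for illustration (the paper's life-cycle factors and damage valuations are not reproduced here):

```python
# Hypothetical embedded emissions per m^3 of treated water (kg/m^3).
emission_factors = {
    "NOx": 2.0e-4,
    "SO2": 3.5e-4,
    "PM2.5": 4.0e-5,
    "CO2": 0.15,
}
# Hypothetical marginal damage values (USD per kg emitted).
marginal_damages = {
    "NOx": 7.0,
    "SO2": 15.0,
    "PM2.5": 80.0,
    "CO2": 0.04,
}

def damages_per_m3(ef, md):
    """Sum pollutant-by-pollutant damages embedded in one m^3 of water."""
    return sum(ef[p] * md[p] for p in ef)

annual_volume_m3 = 1.0e9  # hypothetical utility-wide treated volume
annual_damages = damages_per_m3(emission_factors, marginal_damages) * annual_volume_m3
print(round(annual_damages))
```

The structure makes the paper's headline result intuitive: even sub-cent damages per cubic meter aggregate to large annual figures at national treatment volumes.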

  3. Study protocol: improving the transition of care from a non-network hospital back to the patient's medical home.

    PubMed

    Ayele, Roman A; Lawrence, Emily; McCreight, Marina; Fehling, Kelty; Peterson, Jamie; Glasgow, Russell E; Rabin, Borsika A; Burke, Robert; Battaglia, Catherine

    2017-02-10

    The process of transitioning Veterans to primary care following a non-Veterans Affairs (VA) hospitalization can be challenging. Poor transitions result in medical complications and increased hospital readmissions. The goal of this transition of care quality improvement (QI) project is to identify gaps in the current transition process and implement an intervention that bridges the gap and improves the current transition of care process within the Eastern Colorado Health Care System (ECHCS). We will employ qualitative methods to understand the current transition of care process back to VA primary care for Veterans who received care in a non-VA hospital in ECHCS. We will conduct in-depth semi-structured interviews with Veterans hospitalized in 2015 in non-VA hospitals as well as both VA and non-VA providers, staff, and administrators involved in the current care transition process. Participants will be recruited using convenience and snowball sampling. Qualitative data analysis will be guided by conventional content analysis and Lean Six Sigma process improvement tools. We will use VA claim data to identify the top ten non-VA hospitals serving rural and urban Veterans by volume and Veterans that received inpatient services at non-VA hospitals. Informed by both qualitative and quantitative data, we will then develop a transitions care coordinator led intervention to improve the transitions process. We will test the transition of care coordinator intervention using repeated improvement cycles incorporating salient factors in value stream mapping that are important for an efficient and effective transition process. 
Furthermore, we will complete a value stream map of the transition process at two other VA Medical Centers and test whether an implementation strategy of audit and feedback (the value stream map of the current transition process with the Transition of Care Dashboard) versus audit and feedback with Transition Nurse facilitation of the process using the Resource Guide and Transition of Care Dashboard improves the transition process, continuity of care, patient satisfaction and clinical outcomes. Our current transition of care process has shortcomings. An intervention utilizing a transition care coordinator has the potential to improve this process. Transitioning Veterans to primary care following a non-VA hospitalization is a crucial step for improving care coordination for Veterans.

  4. Application of multi response optimization with grey relational analysis and fuzzy logic method

    NASA Astrophysics Data System (ADS)

    Winarni, Sri; Wahyu Indratno, Sapto

    2018-01-01

    Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is the conversion of the signal-to-noise ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR and SR was a gap voltage of 70 V, a peak current of 9 A and a duty factor of 0.8.
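The grey relational step can be sketched as follows: normalize each response toward its ideal, compute grey relational coefficients, and average them into a grade (GRG); the run with the highest grade is the multi-response optimum. The MRR/SR values and the equal weighting are assumptions for illustration, and the fuzzy-logic conversion of S/N ratios used in the study is omitted here.

```python
def normalize(values, larger_is_better):
    """Rescale a response to [0, 1], with 1 as the ideal direction."""
    lo, hi = min(values), max(values)
    if larger_is_better:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def grey_coeff(norm, zeta=0.5):
    """Grey relational coefficients against the ideal sequence (all 1s);
    zeta is the usual distinguishing coefficient."""
    deltas = [1.0 - x for x in norm]
    dmin, dmax = min(deltas), max(deltas)
    return [(dmin + zeta * dmax) / (d + zeta * dmax) for d in deltas]

# Hypothetical EDM runs: material removal rate (maximize) and surface
# roughness (minimize).
mrr = [10.0, 14.0, 12.0, 16.0]
sr = [3.2, 2.8, 3.6, 2.5]

c_mrr = grey_coeff(normalize(mrr, True))
c_sr = grey_coeff(normalize(sr, False))
grg = [(a + b) / 2 for a, b in zip(c_mrr, c_sr)]  # equal weights assumed

best = max(range(len(grg)), key=grg.__getitem__)
print(best)  # run with the highest grade is the multi-response optimum
```

In a full Taguchi/GRA workflow the grades would then be averaged per factor level, exactly as single-response S/N ratios are, to pick the optimum parameter combination.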

  5. Extraction and purification methods in downstream processing of plant-based recombinant proteins.

    PubMed

    Łojewska, Ewelina; Kowalczyk, Tomasz; Olejniczak, Szymon; Sakowicz, Tomasz

    2016-04-01

    During the last two decades, the production of recombinant proteins in plant systems has been receiving increased attention. Currently, proteins are considered as the most important biopharmaceuticals. However, high costs and problems with scaling up the purification and isolation processes make the production of plant-based recombinant proteins a challenging task. This paper presents a summary of the information regarding the downstream processing in plant systems and provides a comprehensible overview of its key steps, such as extraction and purification. To highlight the recent progress, mainly new developments in the downstream technology have been chosen. Furthermore, besides most popular techniques, alternative methods have been described. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Evaluation criteria for commercially oriented materials processing in space proposals

    NASA Technical Reports Server (NTRS)

    Moore, W. F.; Mcdowell, J. R.

    1979-01-01

    An approach and criteria for evaluating NASA-funded experiments and demonstrations which have commercial potential were developed. Methods for ensuring quick initial screening of commercial proposals are presented. Recommendations are given for modifying the current evaluation approach. New criteria for evaluating commercially oriented materials processing in space (MPS) proposals are introduced. The process for selecting qualified individuals to evaluate the phases of this approach and criteria is considered, and guidelines are set for its implementation.

  7. [Physiotherapeutic care marketing research: current state-of-the art].

    PubMed

    Babaskin, D V

    2011-01-01

    Successful introduction of modern technologies into the national health care systems strongly depends on the current pharmaceutical market situation. The present article is focused on the peculiarities of marketing research with special reference to physiotherapeutic services and commodities. Analysis of the structure and sequence of marketing research processes is described along with the methods applied for the purpose including their support by the use of Internet resources and technologies.

  8. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high-speed train, it is relatively difficult to effectively simulate the entire system's dynamic behavior because it involves multi-disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directionally coupled subsystem simulations. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupled simulation of a given complex mechatronic system across multiple subsystems on different platforms. The method consists of (1) a coupler-based coupling mechanism that defines the interfacing and interaction mechanisms among subsystems, and (2) a simulation process control algorithm that realizes the coupled simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method (1) can be used to simulate subsystem interactions under different simulation conditions in an engineering system, and (2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system designs and simulations with improved efficiency and effectiveness.
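The coupler idea can be sketched as a Jacobi-style co-simulation loop: at each communication point, both subsystem solvers advance over the same macro step using interface values frozen at the start of the window, which keeps them synchronized in time. The two first-order toy subsystems below stand in for the multi-disciplinary solvers; nothing here reproduces the paper's actual coupling algorithm.

```python
def make_subsystem(gain, x0):
    """Toy first-order subsystem solver: dx/dt = gain * (u - x),
    advanced with explicit Euler steps."""
    state = {"x": x0}
    def step(u, dt):
        state["x"] += dt * gain * (u - state["x"])
        return state["x"]
    return step

sub_a = make_subsystem(1.0, 0.0)  # stands in for, e.g., a mechanical solver
sub_b = make_subsystem(2.0, 1.0)  # stands in for, e.g., an electrical solver

dt, n_steps = 0.01, 100           # macro step and number of communication points
ya, yb = 0.0, 1.0                 # interface values exchanged by the coupler
for _ in range(n_steps):
    # Jacobi-style coupling: both subsystems advance over the same window
    # using the interface values frozen at its start, so neither solver
    # ever runs ahead of the other in simulated time.
    ya_new = sub_a(yb, dt)
    yb_new = sub_b(ya, dt)
    ya, yb = ya_new, yb_new

# The two interface trajectories converge as the coupled system relaxes.
print(round(ya, 3), round(yb, 3))
```

Freezing the exchanged values per window is the simplest synchronization policy; tighter schemes iterate each window to convergence before moving the shared clock forward.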

  9. Removal of Giardia and Cryptosporidium in drinking water treatment: a pilot-scale study.

    PubMed

    Hsu, Bing Mu; Yeh, Hsuan Hsien

    2003-03-01

    Giardia and Cryptosporidium have emerged as waterborne pathogens of concern for public health. The aim of this study was to examine both parasites in water samples taken from three pilot-scale plant processes located in southern Taiwan, in order to upgrade the current facilities. The three processes were: a conventional process without prechlorination (Process 1), a conventional process plus ozonation and pellet softening (Process 2), and an integrated membrane process (MF plus NF) following a conventional process (Process 3). The detection methods for both parasites were modified from USEPA Methods 1622 and 1623. Results indicated that coagulation, sedimentation and filtration removed the largest percentage of both protozoan parasites. The pre-ozonation step can inactivate both parasites, especially Giardia cysts. The microfiltration systems intercepted Giardia cysts and Cryptosporidium oocysts completely. A significant correlation between water turbidity and Cryptosporidium oocysts was found in this study. Similar results were also found between three particle size classes (φ = 3-5, 5-8 and 8-10 μm) and Cryptosporidium oocysts.

  10. A two-step process of nitrous oxide before carbon dioxide for humanely euthanizing piglets: on-farm trials

    USDA-ARS?s Scientific Manuscript database

    The current methods of euthanizing neonatal piglets are raising concerns from the public and scientists. Our experiment tests the use of a two-step euthanasia method using nitrous oxide (N2O) for six minutes and then carbon dioxide (CO2) as a more humane way to euthanize piglets compared to just usi...

  11. The Most Preferred and Effective Reviewer of L2 Writing among Automated Grading System, Peer Reviewer and Teacher

    ERIC Educational Resources Information Center

    Tsai, Min-Hsiu

    2017-01-01

    Who is the most preferred and deemed the most helpful reviewer in improving student writing? This study exercised a blended teaching method which consists of three currently prevailing reviewers: the automated grading system (AGS, a web-based method), the peer review (a process-oriented approach), and the teacher grading technique (the…

  12. Do Teachers Make Decisions Like Firefighters? Applying Naturalistic Decision-Making Methods to Teachers' In-Class Decision Making in Mathematics

    ERIC Educational Resources Information Center

    Jazby, Dan

    2014-01-01

    Research into human decision making (DM) processes from outside of education paint a different picture of DM than current DM models in education. This pilot study assesses the use of critical decision method (CDM)--developed from observations of firefighters' DM -- in the context of primary mathematics teachers' in-class DM. Preliminary results…

  13. Coupling the biophysical and social dimensions of wildfire risk to improve wildfire mitigation planning

    Treesearch

    Alan A. Ager; Jeffrey D. Kline; A. Paige Fisher

    2015-01-01

    We describe recent advances in biophysical and social aspects of risk and their potential combined contribution to improve mitigation planning on fire-prone landscapes. The methods and tools provide an improved method for defining the spatial extent of wildfire risk to communities compared to current planning processes. They also propose an expanded role for social...

  14. New charging strategy for lithium-ion batteries based on the integration of Taguchi method and state of charge estimation

    NASA Astrophysics Data System (ADS)

    Vo, Thanh Tu; Chen, Xiaopeng; Shen, Weixiang; Kapoor, Ajay

    2015-01-01

    In this paper, a new charging strategy for lithium-polymer batteries (LiPBs) is proposed based on the integration of the Taguchi method (TM) and state-of-charge (SOC) estimation. The TM is applied to search for an optimal charging current pattern. An adaptive switching gain sliding mode observer (ASGSMO) is adopted to estimate the SOC, which controls and terminates the charging process. The experimental results demonstrate that the proposed charging strategy can successfully charge the same types of LiPBs with different capacities and cycle lives. The proposed charging strategy also provides a much shorter charging time, narrower temperature variation and slightly higher energy efficiency than the equivalent constant current constant voltage charging method.
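For reference, the constant current constant voltage (CC-CV) baseline that the proposed strategy is compared against can be sketched with a toy cell model (ideal capacity plus ohmic resistance, linear open-circuit voltage). All parameters below are invented; no LiPB data or the paper's Taguchi-optimized current pattern are reproduced.

```python
capacity_ah = 2.0      # hypothetical cell capacity (Ah)
r_int = 0.05           # hypothetical internal resistance (ohm)
v_max = 4.2            # CV phase voltage limit (V)
i_cc = 2.0             # 1C constant current (A)
i_cutoff = 0.1         # termination current (A)
dt_h = 1.0 / 3600.0    # 1 s time step, in hours

def ocv(soc):
    """Toy open-circuit voltage curve, linear in SOC (an assumption)."""
    return 3.4 + 0.8 * soc

soc, t_h, phase = 0.0, 0.0, "CC"
while True:
    if phase == "CC":
        i = i_cc
        # Switch to CV once the terminal voltage would exceed the limit.
        if ocv(soc) + i * r_int >= v_max:
            phase = "CV"
    if phase == "CV":
        # Hold the terminal voltage at v_max; current tapers as SOC rises.
        i = (v_max - ocv(soc)) / r_int
        if i <= i_cutoff:
            break
    soc = min(1.0, soc + i * dt_h / capacity_ah)
    t_h += dt_h

print(round(soc, 3), round(t_h, 2))  # final SOC and charge time in hours
```

The long current taper in the CV phase is where most of the charging time is spent, which is exactly what pattern-based strategies such as the one in this record try to shorten.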

  15. Thermoelectrics in Coulomb-coupled quantum dots: Cotunneling and energy-dependent lead couplings

    NASA Astrophysics Data System (ADS)

    Walldorf, Nicklas; Jauho, Antti-Pekka; Kaasbjerg, Kristen

    2017-09-01

    We study thermoelectric effects in Coulomb-coupled quantum-dot (CCQD) systems beyond lowest-order tunneling processes and the often applied wide-band approximation. To this end, we present a master-equation (ME) approach based on a perturbative T -matrix calculation of the charge and heat tunneling rates and transport currents. Applying the method to transport through a noninteracting single-level QD, we demonstrate excellent agreement with the Landauer-Büttiker theory when higher-order (cotunneling) processes are included in the ME. Next, we study the effect of cotunneling and energy-dependent lead couplings on the heat currents in a system of two CCQDs. We find that cotunneling processes (i) can dominate the off-resonant heat currents at low temperature and bias compared to the interdot interaction, and (ii) give rise to a pronounced reduction of the cooling power achievable with the recently demonstrated Maxwell's demon cooling mechanism. Furthermore, we demonstrate that the cooling power can be boosted significantly by carefully engineering the energy dependence of the lead couplings to filter out undesired transport processes. Our findings emphasize the importance of higher-order cotunneling processes as well as engineered energy-dependent lead couplings in the optimization of the thermoelectric performance of CCQD systems.

  16. Electron beam additive manufacturing with wire - Analysis of the process

    NASA Astrophysics Data System (ADS)

    Weglowski, Marek St.; Błacha, Sylwester; Pilarczyk, Jan; Dutkiewicz, Jan; Rogal, Łukasz

    2018-05-01

    The electron beam additive manufacturing (EBAM) process with wire is part of a global trend to find fast and efficient methods for producing complex-shaped elements from costly metal alloys such as stainless steels, nickel alloys, titanium alloys, etc., whose production by other conventional technologies is unprofitable or technically impossible. Demand for additive manufacturing is linked to the development of new technologies in the automotive, aerospace and machinery industries. The aim of the presented work was to carry out research on electron beam additive manufacturing with a wire as the deposited (filler) material. The scope of the work was to investigate the influence of selected technological parameters, such as wire feed rate, beam current, travelling speed and accelerating voltage, on the stability of the deposition process and the geometric dimensions of the padding welds. The research revealed that, at low beam currents, the deposition process is unstable and the padding weld reinforcement is non-uniform: irregularity of the width, height and straightness of the padding welds can be observed. At too high an accelerating voltage and beam current, burn-through of the plate and excessive weld penetration can occur. The achieved results and gained knowledge made it possible to produce a whole stainless steel structure by the wire-based EBAM process.

  17. The maturing of the quality improvement paradigm in the SEL

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1993-01-01

    The Software Engineering Laboratory (SEL) uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and products. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. The quality improvement paradigm, as it is currently defined, can be broken into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for the project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects, and save it in an experience base to be reused on future projects.

  18. A review of bioinformatic methods for forensic DNA analyses.

    PubMed

    Liu, Yao-Yuan; Harbison, SallyAnn

    2018-03-01

    Short tandem repeats, single nucleotide polymorphisms, and whole mitochondrial analyses are three classes of markers which will play an important role in the future of forensic DNA typing. The arrival of massively parallel sequencing platforms in forensic science reveals new information such as insights into the complexity and variability of the markers that were previously unseen, along with amounts of data too immense for analyses by manual means. Along with the sequencing chemistries employed, bioinformatic methods are required to process and interpret this new and extensive data. As more is learnt about the use of these new technologies for forensic applications, development and standardization of efficient, favourable tools for each stage of data processing is being carried out, and faster, more accurate methods that improve on the original approaches have been developed. As forensic laboratories search for the optimal pipeline of tools, sequencer manufacturers have incorporated pipelines into sequencer software to make analyses convenient. This review explores the current state of bioinformatic methods and tools used for the analyses of forensic markers sequenced on the massively parallel sequencing (MPS) platforms currently most widely used. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. ASTEP user's guide and software documentation

    NASA Technical Reports Server (NTRS)

    Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.

    1974-01-01

    The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or an interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capabilities. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.

  20. Development of processing procedures for advanced silicon solar cells. [antireflection coatings and short circuit currents

    NASA Technical Reports Server (NTRS)

    Scott-Monck, J. A.; Stella, P. M.; Avery, J. E.

    1975-01-01

    Ten ohm-cm silicon solar cells, 0.2 mm thick, were produced with short circuit current efficiencies up to thirteen percent using a combination of recent technical advances. The cells were fabricated in conventional and wraparound contact configurations. Improvement in cell collection efficiency in both the short- and long-wavelength regions of the solar spectrum was obtained by coupling a shallow junction and an optically transparent antireflection coating with back surface field technology. Both boron diffusion and aluminum alloying techniques were evaluated for forming back surface field cells. The latter method is less complicated and is compatible with wraparound cell processing.

  1. Linen Most Useful: Perspectives on Structure, Chemistry, and Enzymes for Retting Flax

    PubMed Central

    Akin, Danny E.

    2013-01-01

    The components of flax (Linum usitatissimum) stems are described and illustrated, with reference to the anatomy and chemical makeup and to applications in processing and products. Bast fiber, which is a major economic product of flax along with linseed and linseed oil, is described with particular reference to its application in textiles, composites, and specialty papers. A short history of retting methods, which is the separation of bast fiber from nonfiber components, is presented with emphasis on water retting, field retting (dew retting), and experimental methods. Past research on enzyme retting, particularly by the use of pectinases as a potential replacement for the current commercial practice of field retting, is reviewed. The importance and mechanism of Ca2+ chelators with pectinases in retting are described. Protocols are provided for retting of both fiber-type and linseed-type flax stems with different types of pectinases. Current and future applications are listed for use of a wide array of enzymes to improve processed fibers and blended yarns. Finally, potential lipid and aromatic coproducts derived from the dust and shive waste streams of fiber processing are indicated. PMID:25969769

  2. Event-scale relationships between surface velocity, temperature and chlorophyll in the coastal ocean, as seen by satellite

    NASA Technical Reports Server (NTRS)

    Strub, P. Ted

    1991-01-01

    The overall goal of this project was to increase our understanding of the processes which determine the temporally varying distributions of surface chlorophyll pigment concentration and surface temperature in the California Current System (CCS) on the time scale of 'events', i.e., several days to several weeks. We also proposed to investigate seasonal and regional differences in these events. Additionally, we proposed to evaluate methods of estimating surface velocities and the horizontal transport of pigment and heat from sequences of AVHRR and CZCS images. The four specific objectives stated in the original proposal were to: (1) test surface current estimates made from sequences of both SST and color images using variations of the statistical method of Emery et al. (1986) and estimate the uncertainties in these satellite-derived surface currents; (2) characterize the spatial and temporal relationships of chlorophyll and temperature in rapidly evolving features for which adequate imagery exists and evaluate the contribution of these events to monthly and seasonal averages; (3) use the methods tested in (1) to determine the nature of the velocity fields in the CCS; and (4) compare the chlorophyll, temperature, and currents in different seasons and in different geographic regions.
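
    The statistical method of Emery et al. (1986) cited in objective (1) is the maximum cross-correlation (MCC) technique: a feature's displacement between two images, divided by the time separation, gives an advective surface velocity. A minimal brute-force sketch on synthetic data (pure NumPy; a production version would use FFT correlation and sub-pixel fitting):

```python
import numpy as np

def mcc_displacement(patch0, image1):
    """Slide patch0 over image1 and return the (row, col) offset with the
    maximum normalized cross-correlation."""
    best, best_off = -np.inf, (0, 0)
    ph, pw = patch0.shape
    p = (patch0 - patch0.mean()) / patch0.std()
    for dy in range(image1.shape[0] - ph + 1):
        for dx in range(image1.shape[1] - pw + 1):
            w = image1[dy:dy + ph, dx:dx + pw]
            ws = w.std()
            if ws == 0:
                continue  # flat window: correlation undefined
            c = np.mean(p * (w - w.mean()) / ws)
            if c > best:
                best, best_off = c, (dy, dx)
    return best_off
```

    Feeding it a patch cut from one image and a uniformly shifted copy of that image recovers the shift exactly, which is the basic sanity check used when tuning window sizes against cloud contamination.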

  3. Review: magnetically assisted resistance spot welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y. B.; Li, D. L.; Lin, Z. Q.

    2016-02-25

    Currently, the use of advanced high strength steels (AHSSs) is the most cost effective means of reducing vehicle body weight and maintaining structural integrity at the same time. However, AHSSs present a big challenge to the traditional resistance spot welding (RSW) widely applied in automotive industries because the rapid heating and cooling procedures during RSW produce hardened weld microstructures, which lower the ductility and fatigue properties of welded joints and raise the probability of interfacial failure under external loads. Changing process parameters or post-weld heat treatment may reduce the weld brittleness, but those traditional quality control methods also increase energy consumption and prolong cycle time. In recent years, a magnetically assisted RSW (MA-RSW) method was proposed, in which an externally applied magnetic field would interact with the conduction current to produce a Lorentz force that would affect weld nugget formation. This paper is a review of an experimental MA-RSW platform, the mode of the external magnetic field and the mechanism that controls nugget shape, weld microstructures and joint performance. In conclusion, the advantages of the MA-RSW method in improving the weldability of AHSSs are given, a recent application of the MA-RSW process to light metals is described and the outlook for the MA-RSW process is presented.

  4. Improving operational anodising process performance using simulation approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liong, Choong-Yeun, E-mail: lg@ukm.edu.my; Ghazali, Syarah Syahidah, E-mail: syarah@gapps.kptm.edu.my

    The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering sectors. Anodizing is therefore an important process for making aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in the manufacturing and supply of aluminium extrusions. The data required for the development of the model were collected from observations and interviews conducted in the study. To study the current system, the anodizing operations are modeled using Arena 14.5 simulation software. They consist of five main processes, namely degreasing, etching, desmutting, anodizing and sealing, plus 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that could be implemented on the original model. Based on the comparisons made between the improvement methods, productivity could be increased by reallocating the workers and reducing loading time.
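
    The bottleneck logic behind such a simulation study can be illustrated with a back-of-the-envelope serial-line model: steady-state throughput is set by the slowest station, so reallocating workers pays off exactly where it relieves that station. All station times below are hypothetical, not data from the study.

```python
# Illustrative serial anodizing line: minutes per load at each station (assumed).
stations = {
    "degrease": 8.0,
    "etch": 12.0,
    "desmut": 6.0,
    "anodize": 25.0,   # bottleneck in this sketch
    "seal": 10.0,
}

def throughput_per_hour(times):
    """Loads per hour of a serial line, governed by the slowest station."""
    return 60.0 / max(times.values())

base = throughput_per_hour(stations)
# Reallocating a worker to run two anodizing tanks in parallel halves the
# effective anodizing time in this sketch.
improved = dict(stations, anodize=12.5)
gain = throughput_per_hour(improved) / base
```

    In this toy case the change doubles throughput; a discrete-event model like the Arena one additionally captures queueing, variability, and loading-time effects that this static calculation ignores.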

  5. Bring on the heat

    NASA Astrophysics Data System (ADS)

    Schierning, Gabi

    2018-02-01

    One third of industrial processes occur at high temperatures above 1300 K, but current methods of waste heat recovery at these temperatures are limited. Now, reduced graphene oxide is shown to be a highly efficient and reliable thermoelectric material up to 3000 K.

  6. Probing amyloid protein aggregation with optical superresolution methods: from the test tube to models of disease

    PubMed Central

    Kaminski, Clemens F.; Kaminski Schierle, Gabriele S.

    2016-01-01

    The misfolding and self-assembly of intrinsically disordered proteins into insoluble amyloid structures are central to many neurodegenerative diseases such as Alzheimer’s and Parkinson’s diseases. Optical imaging of this self-assembly process in vitro and in cells is revolutionizing our understanding of the molecular mechanisms behind these devastating conditions. In contrast to conventional biophysical methods, optical imaging and, in particular, optical superresolution imaging, permits the dynamic investigation of the molecular self-assembly process in vitro and in cells, at molecular-level resolution. In this article, current state-of-the-art imaging methods are reviewed and discussed in the context of research into neurodegeneration. PMID:27413767

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Townsend, D.W.; Linnhoff, B.

    In Part I, criteria for heat engine and heat pump placement in chemical process networks were derived, based on the ''temperature interval'' (T.I.) analysis of the heat exchanger network problem. Using these criteria, this paper gives a method for identifying the best outline design for any combined system of chemical process, heat engines, and heat pumps. The method eliminates inferior alternatives early, and positively leads on to the most appropriate solution. A graphical procedure based on the T.I. analysis forms the heart of the approach, and the calculations involved are simple enough to be carried out on, say, a programmable calculator. Application to a case study is demonstrated. Optimization methods based on this procedure are currently under research.
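
    The T.I. analysis referred to here is the problem-table cascade of pinch analysis, which is indeed simple enough for a programmable calculator. A minimal sketch with hypothetical stream data (supply/target temperatures and heat-capacity flowrates are illustrative, not from the paper):

```python
def problem_table(streams, dTmin=10.0):
    """streams: iterable of (supply T, target T, CP). Hot streams are shifted
    down by dTmin/2 and cold streams up, then heat is cascaded down the
    temperature intervals; the most negative cascade value sets the minimum
    hot utility, and the overall balance gives the minimum cold utility."""
    shifted = [(Ts - dTmin / 2, Tt - dTmin / 2, CP) if Ts > Tt
               else (Ts + dTmin / 2, Tt + dTmin / 2, CP)
               for Ts, Tt, CP in streams]
    temps = sorted({T for Ts, Tt, _ in shifted for T in (Ts, Tt)}, reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(temps, temps[1:]):
        net = sum(CP * (hi - lo) * (1 if Ts > Tt else -1)
                  for Ts, Tt, CP in shifted
                  if max(Ts, Tt) >= hi and min(Ts, Tt) <= lo)
        heat += net
        cascade.append(heat)
    qh_min = max(0.0, -min(cascade))   # minimum hot utility
    qc_min = cascade[-1] + qh_min      # minimum cold utility
    return qh_min, qc_min
```

    For one hot stream (180 to 80, CP = 2) and one cold stream (60 to 160, CP = 3) at dTmin = 10, the cascade gives a minimum hot utility of 100 and a minimum cold utility of 0, matching the overall 100-unit enthalpy deficit; the engine/pump placement criteria of Part I are then read off relative to the pinch.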

  8. Elasticity measurement of nasal cartilage as a function of temperature using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Liu, Chih Hao; Skryabina, M. N.; Singh, Manmohan; Li, Jiasong; Wu, Chen; Sobol, E.; Larin, Kirill V.

    2015-03-01

    Current clinical methods of reconstruction surgery involve laser reshaping of nasal cartilage. The process of stress relaxation caused by laser heating is the primary mechanism for achieving nasal cartilage reshaping. Based on this, a rapid, non-destructive and accurate elasticity measurement would allow for a more robust reshaping procedure. In this work, we have utilized phase-stabilized swept source optical coherence elastography (PhS-SSOCE) to quantify the Young's modulus of porcine nasal septal cartilage during the relaxation process induced by heating. The results show that PhS-SSOCE was able to monitor changes in the elasticity of hyaline cartilage, and this method could potentially be applied in vivo during laser reshaping therapies.

  9. Experimental investigation of the excess charge and time constant of minority carriers in the thin diffused layer of 0.1 ohm-cm silicon solar cells

    NASA Technical Reports Server (NTRS)

    Godlewski, M. P.; Brandhorst, H. W., Jr.; Lindholm, F. A.; Sah, C. T.

    1976-01-01

    An experimental method is presented that can be used to interpret the relative roles of bandgap narrowing and recombination processes in the diffused layer. This method involves measuring the device time constant by open-circuit voltage decay and the base region diffusion length by X-ray excitation. A unique illuminated diode method is used to obtain the diode saturation current. These data are interpreted using a simple model to determine individually the minority carrier lifetime and the excess charge. These parameters are then used to infer the relative importance of bandgap narrowing and recombination processes in the diffused layer.
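
    The open-circuit voltage decay (OCVD) measurement mentioned above relates the linear region of the Voc decay to an effective device time constant. A sketch of the standard relation, assuming an ideality factor of 1 and room temperature (neither is stated in the abstract):

```python
KT_Q_300K = 0.02585  # thermal voltage kT/q at 300 K, in volts

def ocvd_lifetime(dvoc_dt):
    """Effective minority-carrier lifetime from the linear OCVD slope:
    tau = (kT/q) / |dVoc/dt|, for a junction-limited exponential decay."""
    return KT_Q_300K / abs(dvoc_dt)
```

    For example, a measured slope of -2585 V/s would imply an effective lifetime of 10 microseconds under these assumptions; in the paper this time constant is combined with the X-ray diffusion-length measurement to separate lifetime from excess charge.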

  10. Current Understanding of the Pathophysiology of Myocardial Fibrosis and Its Quantitative Assessment in Heart Failure

    PubMed Central

    Liu, Tong; Song, Deli; Dong, Jianzeng; Zhu, Pinghui; Liu, Jie; Liu, Wei; Ma, Xiaohai; Zhao, Lei; Ling, Shukuan

    2017-01-01

    Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis. PMID:28484397

  11. Rapid curing of solution-processed zinc oxide films by pulse-light annealing for thin-film transistor applications

    NASA Astrophysics Data System (ADS)

    Kim, Dong Wook; Park, Jaehoon; Hwang, Jaeeun; Kim, Hong Doo; Ryu, Jin Hwa; Lee, Kang Bok; Baek, Kyu Ha; Do, Lee-Mi; Choi, Jong Sun

    2015-01-01

    In this study, a pulse-light annealing method is proposed for the rapid fabrication of solution-processed zinc oxide (ZnO) thin-film transistors (TFTs). Transistors that were fabricated by the pulse-light annealing method, with the annealing being carried out at 90 °C for 15 s, exhibited a mobility of 0.05 cm²/V·s and an on/off current ratio of 10⁶. Such electrical properties are quite close to those of devices that are thermally annealed at 165 °C for 40 min. X-ray photoelectron spectroscopy analysis of the ZnO films showed that the activation energy required to form a Zn-O bond is entirely supplied within 15 s of pulse-light exposure. We conclude that the pulse-light annealing method is viable for rapidly curing solution-processable oxide semiconductors for TFT applications.

  12. Application of GIS technologies to monitor secondary radioactive contamination in the Degelen mountain massif

    NASA Astrophysics Data System (ADS)

    Alipbeki, O.; Kabzhanova, G.; Kurmanova, G.; Alipbekova, Ch.

    2016-06-01

    The Degelen mountain massif lies within the territory of the former Semipalatinsk nuclear test site and is an area of ecological disaster. Currently, a process of secondary radioactive contamination is under way, caused by geodynamic processes activated in the Degelen massif, disruption of underground hydrological cycles and, as a consequence, water seepage into the tunnels. Geodynamic processes can be monitored using modern geographic information system (GIS) technology, satellite radar interferometry and high-accuracy satellite navigation in conjunction with radioecological methods. This paper discusses the creation of a GIS project for the Degelen massif, which facilitates geospatial analysis of the situation and simulation of the phenomena in order to provide the most objective possible assessment of the radiation situation in this protected area.

  13. Frozen Fractals all Around: Solar flares, Ampere’s Law, and the Search for Units in Scale-Free Processes.

    NASA Astrophysics Data System (ADS)

    McAteer, R. T. James

    2015-08-01

    My soul is spiraling in frozen fractals all around, And one thought crystallizes like an icy blast, I'm never going back, the past is in the past. Elsa, from Disney’s Frozen, characterizes two fundamental aspects of scale-free processes in Nature: fractals are everywhere in space; fractals can be used to probe changes in time. Self-Organized Criticality provides a powerful set of tools to study scale-free processes. It connects spatial fractals (more generically, multifractals) to temporal evolution. The drawback is that this usually results in scale-free, unit-less indices, which can be difficult to connect to everyday physics. Here, I show a novel method that connects one of the most powerful SOC tools - the wavelet transform modulus maxima approach to calculating multifractality - to one of the most powerful equations in all of physics - Ampere’s law. In doing so I show how the multifractal spectra can be expressed in terms of current density, and how current density can then be used for the prediction of future energy release from such a system. Our physical understanding of the solar magnetic field structure, and hence our ability to predict solar activity, is limited by the type of data currently available. I show that the multifractal spectrum provides a powerful physical connection between the details of photospheric magnetic gradients in current data and the coronal magnetic structure. By decomposing Ampere’s law and comparing it to the wavelet transform modulus maxima method, I show how the scale-free Hölder exponent provides a direct measure of current density across all relevant sizes. The prevalence of this current density across various scales is connected to its stability in time, and hence to the ability of the magnetic structure to store and then release energy. Hence (spatial) multifractals inform us of (future) solar activity. Finally, I discuss how such an approach can be used in any study of scale-free processes, and I highlight the key steps in identifying the nature of the mother wavelet needed to ensure the viability of this powerful connection.
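
    The decomposition of Ampere's law referred to above reduces, for photospheric vector magnetograms, to the vertical current density Jz = (1/mu0)(dBy/dx - dBx/dy). A sketch on a synthetic, uniformly twisted field (grid, spacing, and twist strength are illustrative):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, SI units

def vertical_current_density(Bx, By, dx, dy):
    """z-component of Ampere's law: Jz = (1/mu0) * (dBy/dx - dBx/dy),
    evaluated with finite differences on the magnetogram grid."""
    dBy_dx = np.gradient(By, dx, axis=1)
    dBx_dy = np.gradient(Bx, dy, axis=0)
    return (dBy_dx - dBx_dy) / MU0
```

    For the linear test field Bx = -k*y, By = k*x the finite differences are exact and Jz = 2k/mu0 everywhere, which is the check one runs before turning the same gradient maps over to the multifractal (WTMM) analysis.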

  14. Decoding of Ankle Flexion and Extension from Cortical Current Sources Estimated from Non-invasive Brain Activity Recording Methods.

    PubMed

    Mejia Tobar, Alejandra; Hyoudou, Rikiya; Kita, Kahori; Nakamura, Tatsuhiro; Kambara, Hiroyuki; Ogata, Yousuke; Hanakawa, Takashi; Koike, Yasuharu; Yoshimura, Natsue

    2017-01-01

    The classification of ankle movements from non-invasive brain recordings can be applied in a brain-computer interface (BCI) to control exoskeletons, prostheses, and functional electrical stimulators for the benefit of patients with walking impairments. In this research, ankle flexion and extension tasks at two force levels in both legs were classified from cortical current sources estimated by a hierarchical variational Bayesian method, using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) recordings. The hierarchical prior for the current source estimation from EEG was obtained from activated brain areas and their intensities in an fMRI group (second-level) analysis. The fMRI group analysis was performed on regions of interest defined over the primary motor cortex, the supplementary motor area, and the somatosensory area, which are well known to contribute to movement control. A sparse logistic regression method was applied for a nine-class classification (eight active tasks and a resting control task), obtaining a mean accuracy of 65.64% for time series of current sources estimated from the EEG and fMRI signals using a variational Bayesian method, and a mean accuracy of 22.19% for the classification of the pre-processed EEG sensor signals, with a chance level of 11.11%. The higher classification accuracy of current sources, when compared to the EEG classification accuracy, was attributed to the high number of sources and the different signal patterns obtained at the same vertex for different motor tasks. Since the inverse filter for the current source estimation can be computed offline with the present method, the method is applicable to real-time BCIs. 
Finally, due to the highly enhanced spatial distribution of current sources over the brain cortex, this method has the potential to identify activation patterns to design BCIs for the control of an affected limb in patients with stroke, or BCIs from motor imagery in patients with spinal cord injury.
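
    The sparse logistic regression step is not specified further in the abstract; a minimal two-class sketch (the study uses nine classes) is L1-penalized logistic regression fitted by proximal gradient descent (ISTA), shown here on synthetic data where only the first feature is informative:

```python
import numpy as np

def l1_logistic(X, y, lam=0.1, lr=0.1, iters=500):
    """Gradient step on the mean logistic loss, then soft-thresholding;
    the L1 penalty drives weights of uninformative features to zero,
    which is the 'sparse' selection over current sources."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * (X.T @ (p - y)) / len(y)
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w
```

    On such data the fitted weight vector concentrates on the informative feature, so classification accuracy far exceeds the 50% two-class chance level, mirroring the paper's 65.64% versus the 11.11% nine-class chance level.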

  15. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boddu, S; Morrow, A; Krishnamurthy, N

    Purpose: Our goal is to implement lean methodology to make our current process, from CT simulation to treatment, more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowcharts in Excel for process mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from the scheduling of patient CT simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first-time quality, undesirable effects (UDEs), and wait times from relevant members of each task. UDEs were binned by the frequency of their occurrence. We huddled to map the future state and to find solutions to high-frequency UDEs. We implemented visual controls and hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We identified approximately 64 UDEs in our current workflow that could cause delays or re-work, compromise the quality and safety of patient treatments, or cause wait times of between 1 and 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were of low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/Rapid Arc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed, which solved 20% of the issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment-site-specific workflows to identify bottlenecks, potential breakdowns and personnel allocation, and will employ tools like failure mode and effects analysis to mitigate risk factors and make this process efficient.

  16. About an Extreme Achievable Current in Plasma Focus Installation of Mather Type

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikulin, V. Ya.; Polukhin, S. N.; Vikhrev, V. V.

    A computer simulation and analytical analysis of the discharge process in a Plasma Focus have shown that there is an upper limit to the current which can be achieved in a Plasma Focus installation of Mather type by only increasing the capacity of the condenser bank. The maximum current achieved for various plasma focus installations of the 1 MJ level is discussed. For example, for the PF-1000 (IFPiLM) and the 1 MJ Frascati PF, the maximum current is near 2 MA. Thus, the commonly used method of increasing the energy of the PF installation by increasing its capacity has no merit. Alternative options to increase the current are discussed.

  17. Fundamentals in Biostatistics for Investigation in Pediatric Dentistry: Part II -Biostatistical Methods.

    PubMed

    Pozos-Guillén, Amaury; Ruiz-Rodríguez, Socorro; Garrocho-Rangel, Arturo

    The main purpose of the second part of this series is to provide the reader with some basic aspects of the most common biostatistical methods employed in the health sciences, in order to better understand the validity, significance and reliability of the results of any article on Pediatric Dentistry. Currently, as mentioned in the first paper, Pediatric Dentists need basic biostatistical knowledge to be able to apply it when critically appraising a dental article during the Evidence-Based Dentistry (EBD) process, or when participating in the development of a clinical study with pediatric dental patients. The EBD process provides a systematic approach to collecting, reviewing and analyzing current and relevant published evidence about oral health care in order to answer a particular clinical question; this evidence should then be applied in everyday practice. This second report describes the statistical methods most commonly used for analyzing and interpreting collected data, and the methodological criteria to be considered when choosing the most appropriate tests for a specific study. These are available to Pediatric Dentistry practitioners interested in reading or designing original clinical or epidemiological studies.

  18. Ion beam figuring of small optical components

    NASA Astrophysics Data System (ADS)

    Drueding, Thomas W.; Fawcett, Steven C.; Wilson, Scott R.; Bifano, Thomas G.

    1995-12-01

    Ion beam figuring provides a highly deterministic method for the final precision figuring of optical components, with advantages over conventional methods. The process involves bombarding a component with a stable beam of accelerated particles that selectively removes material from the surface. Figure corrections are achieved by rastering the fixed-current beam across the workpiece at appropriate, time-varying velocities. Unlike conventional methods, ion figuring is a noncontact technique and thus avoids such problems as edge rolloff effects, tool wear, and force loading of the workpiece. This work is directed toward the development of the precision ion machining system at NASA's Marshall Space Flight Center. This system is designed for processing small (approximately 10-cm-diam) optical components. Initial experiments were successful in figuring 8-cm-diam fused silica and chemical-vapor-deposited SiC samples. The experiments, procedures, and results of figuring the sample workpieces to shallow spherical, parabolic (concave and convex), and non-axially-symmetric shapes are discussed. Several difficulties and limitations encountered with the current system are discussed, and the use of a 1-cm aperture for making finer corrections on optical components is also reported.
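
    The time-varying raster velocities follow, to first order, from a dwell relation: removed depth is roughly the removal rate times the dwell time, and the dwell time per point is the beam footprint divided by the scan velocity. A sketch with hypothetical numbers (a real system deconvolves the full beam profile from the measured error map rather than using this point-wise rule):

```python
def scan_velocity(depth_nm, rate_nm_per_s, beam_mm=1.0, v_max_mm_s=50.0):
    """depth = rate * (beam / v)  =>  v = beam * rate / depth.
    Deeper removal means a slower pass; the velocity is capped at the
    stage maximum where little or no removal is wanted."""
    if depth_nm <= 0.0:
        return v_max_mm_s
    return min(v_max_mm_s, beam_mm * rate_nm_per_s / depth_nm)
```

    With an assumed 100 nm/s removal rate and a 1 mm beam, a point needing 10 nm of removal is scanned at 10 mm/s, while points needing 1 nm or less simply run at the stage cap.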

  19. Method to Eliminate Flux Linkage DC Component in Load Transformer for Static Transfer Switch

    PubMed Central

    2014-01-01

    Many industrial and commercial sensitive loads are subject to voltage sags and interruptions. The static transfer switch (STS), based on thyristors, is applied to improve power quality and reliability. However, the transfer will result in severe inrush current in the load transformer because of the DC component in the magnetic flux generated in the transfer process. The inrush current, which can reach 2-30 p.u., can cause the maloperation of relay protective devices and bring potential damage to the transformer. The way to eliminate the DC component is to transfer the related phases when the residual flux linkage of the load transformer and the prospective flux linkage of the alternate source are equal. This paper analyzes how the flux linkage of each winding in the load transformer changes in the transfer process. Based on the residual flux linkage when the preferred source is completely disconnected, a method to calculate the proper time point to close each phase of the alternate source is developed. Simulation and laboratory experiment results are presented to show the effectiveness of the transfer method. PMID:25133255
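
    The transfer criterion in the abstract (close a phase when the alternate source's prospective flux linkage equals the residual flux linkage) can be sketched numerically. Since flux linkage is the integral of the winding voltage, a source v(t) = V sin(wt + phi) drives a steady-state flux of -(V/w) cos(wt + phi); amplitude, frequency and phase below are illustrative:

```python
import numpy as np

F = 50.0                 # grid frequency in Hz (assumed)
W = 2.0 * np.pi * F
V = 1.0                  # per-unit voltage amplitude (assumed)

def prospective_flux(t, phase):
    """Steady-state flux linkage driven by v(t) = V*sin(W*t + phase)."""
    return -(V / W) * np.cos(W * t + phase)

def transfer_instant(residual_flux, phase, t0=0.0):
    """Earliest time >= t0 at which the alternate source's prospective flux
    crosses the residual flux (assumes |residual_flux| <= V/W, so a crossing
    exists within one period)."""
    t = np.linspace(t0, t0 + 1.0 / F, 20001)
    err = prospective_flux(t, phase) - residual_flux
    idx = np.nonzero(np.diff(np.sign(err)))[0][0]
    return t[idx]
```

    For zero residual flux and zero phase the crossing falls at a quarter period (5 ms at 50 Hz), i.e. at the voltage peak; closing there leaves no DC flux offset and hence no inrush in this idealized single-phase picture.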

  20. Method to eliminate flux linkage DC component in load transformer for static transfer switch.

    PubMed

    He, Yu; Mao, Chengxiong; Lu, Jiming; Wang, Dan; Tian, Bing

    2014-01-01

    Many industrial and commercial sensitive loads are subject to voltage sags and interruptions. The static transfer switch (STS), based on thyristors, is applied to improve power quality and reliability. However, the transfer will result in severe inrush current in the load transformer because of the DC component in the magnetic flux generated in the transfer process. The inrush current, which can reach 2-30 p.u., can cause the maloperation of relay protective devices and bring potential damage to the transformer. The way to eliminate the DC component is to transfer the related phases when the residual flux linkage of the load transformer and the prospective flux linkage of the alternate source are equal. This paper analyzes how the flux linkage of each winding in the load transformer changes in the transfer process. Based on the residual flux linkage when the preferred source is completely disconnected, a method to calculate the proper time point to close each phase of the alternate source is developed. Simulation and laboratory experiment results are presented to show the effectiveness of the transfer method.

  1. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development, software engineers take small implementation steps and, in some cases, program in pairs. In addition, they develop automated tests before implementing each small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method, nor is it rapid prototyping (although the latter is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper discusses some of the initial uses of the Agile Development methodology, the spread of this method, and the status of its incorporation into current JPL development policies and processes.

  2. A new sono-electrochemical method for enhanced detoxification of hydrophilic chloroorganic pollutants in water.

    PubMed

    Yasman, Yakov; Bulatov, Valery; Gridin, Vladimir V; Agur, Sabina; Galil, Noah; Armon, Robert; Schechter, Israel

    2004-09-01

    A new method for detoxification of hydrophilic chloroorganic pollutants in effluent water was developed, using a combination of ultrasound waves, electrochemistry and Fenton's reagent. The advantages of the method are exemplified using two target compounds: the common herbicide 2,4-dichlorophenoxyacetic acid (2,4-D) and its derivative 2,4-dichlorophenol (2,4-DCP). The high degradation power of this process is due to the large production of oxidizing hydroxyl radicals and the high mass transfer due to sonication. Application of this sono-electrochemical Fenton (SEF) treatment (at 20 kHz) with a small current density accomplished almost 50% oxidation of a 2,4-D solution (300 ppm, 1.2 mM) in just 60 s. Similar treatments run for 600 s resulted in practically full degradation of the herbicide; sizable oxidation of 2,4-DCP also occurs. The main intermediate compounds produced in the SEF process were identified, their kinetic profiles were measured, and a chemical reaction scheme was suggested. The efficiency of the SEF process appears to be much higher than that of reference degradation methods, and the time required for full degradation is considerably shorter. The SEF process maintains high performance up to concentrations higher than those handled by reference methods. The optimum concentration of Fe2+ ions required for this process was found to be about 2 mM, which is lower than that in reference techniques. These findings indicate that the SEF process may be an effective method for detoxification of environmental water.
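
    As a rough consistency check (an illustration, not a calculation from the paper): if 2,4-D degradation were first-order, the reported ~50% conversion at 60 s fixes a rate constant whose 600 s prediction agrees with the reported near-complete degradation:

```python
import math

# If 2,4-D degradation were first-order, the reported ~50% conversion at
# 60 s fixes the rate constant; the 600 s prediction can then be compared
# with the reported "practically full degradation".
k = math.log(2) / 60.0                  # s^-1, from 50% conversion in 60 s
remaining_600s = math.exp(-k * 600.0)   # fraction of 2,4-D left after 600 s
degraded_600s = 1.0 - remaining_600s    # ~0.999: over 99.9% degraded
```

    Ten half-lives fit into 600 s, so the predicted residual fraction is 2^-10, consistent with the abstract's "practically full degradation".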

  3. Improved InGaN LED System Efficacy and Cost via Droop Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wildeson, Isaac

    Efficiency droop is a non-thermal process intrinsic to indium gallium nitride light emitting diodes (LEDs) in which the external quantum efficiency (EQE) decreases with increasing drive current density. Mitigating droop would allow one to reduce the size of LEDs driven at a given current or to drive LEDs of given size at higher current while maintaining high efficiencies. In other words, droop mitigation can lead to significant gains in light output per dollar and/or light output per watt of input power. This project set an EQE improvement goal at high drive current density which was to be attained by improving the LED active region design and growth process following a droop mitigation strategy. The interactions between LED active region design parameters and efficiency droop were studied by modeling and experiments. The crystal defects that tend to form in more complex LED designs intended to mitigate droop were studied with advanced characterization methods that provided insight into the structural and electronic properties of the material. This insight was applied to improve the epitaxy process both in terms of active region design and optimization of growth parameters. The final project goals were achieved on schedule and an epitaxy process leading to LEDs with EQE exceeding the project target was demonstrated.
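
    Efficiency droop is commonly described with the ABC recombination model, in which only the radiative Bn² channel produces light. The abstract does not state which model or coefficients the project used, so the sketch below is the standard textbook form with illustrative order-of-magnitude coefficients:

```python
def iqe_abc(n, A=1e7, B=1e-10, C=1e-29):
    """Internal quantum efficiency from the ABC recombination model:
    Shockley-Read-Hall (A*n), radiative (B*n**2) and Auger (C*n**3)
    channels; only the radiative channel emits light. Coefficients are
    illustrative order-of-magnitude values; n is in carriers/cm^3."""
    radiative = B * n * n
    total = A * n + radiative + C * n ** 3
    return radiative / total

# Efficiency rises with carrier density, peaks near sqrt(A/C) = 1e18 cm^-3,
# then droops as Auger recombination takes over at high drive.
low, peak, high = iqe_abc(1e16), iqe_abc(1e18), iqe_abc(1e20)
```

    The characteristic droop curve (rise, peak, decline) falls out of the competition between the n, n² and n³ terms.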

  4. NDE Process Development Specification for SRB Composite Nose Cap

    NASA Technical Reports Server (NTRS)

    Suits, M.

    1999-01-01

    The Shuttle Upgrade program is a continuing improvement process to enable the Space Shuttle to be an effective space transportation vehicle for the next few decades. The Solid Rocket Booster (SRB), as a component of that system, is currently undergoing such an improvement. Advanced materials, such as composites, have given us a chance to improve performance and to reduce weight. The SRB Composite Nose Cap (CNC) program aims to replace the current aluminum nose cap, which is coated with a Thermal Protection System and poses a possible debris hazard, with a lighter, stronger, CNC. For the next 2 years, this program will evaluate the design, material selection, properties, and verification of the CNC. This particular process specification cites the methods and techniques for verifying the integrity of such a nose cap with nondestructive evaluation.

  5. Community detection for fluorescent lifetime microscopy image segmentation

    NASA Astrophysics Data System (ADS)

    Hu, Dandan; Sarder, Pinaki; Ronhovde, Peter; Achilefu, Samuel; Nussinov, Zohar

    2014-03-01

    A multiresolution community detection (CD) method was suggested in a recent work as an efficient approach to unsupervised segmentation of fluorescence lifetime (FLT) images of live cells containing fluorescent molecular probes.1 In the current paper, we further explore this method in FLT images of ex vivo tissue slices. The image processing problem is framed as identifying clusters with respective average FLTs against a background or "solvent" in FLT imaging microscopy (FLIM) images derived using NIR fluorescent dyes. We have identified significant multiresolution structures using replica correlations in these images, where such correlations are manifested by information-theoretic overlaps of the independent solutions ("replicas") attained using the multiresolution CD method from different starting points. In this paper, our method is found to be more efficient than a current state-of-the-art image segmentation method based on a mixture of Gaussian distributions. It offers more than 1.25 times the diversity, based on the Shannon index, of the latter method in selecting clusters with distinct average FLTs in NIR FLIM images.
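
    The Shannon-index comparison can be illustrated on toy label maps (hypothetical data; the paper's images and cluster counts differ). The index is H = -Σ p_i ln p_i over cluster label frequencies, and the reported 1.25x figure is a ratio of two such indices:

```python
import math
from collections import Counter

def shannon_index(labels):
    """Shannon diversity H = -sum(p_i * ln(p_i)) over cluster label frequencies."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Toy segmentations of a 100-pixel image:
seg_a = [0, 1, 2, 3] * 25      # 4 balanced clusters of 25 pixels
seg_b = [0, 1] * 50            # 2 balanced clusters of 50 pixels
ratio = shannon_index(seg_a) / shannon_index(seg_b)   # ln(4)/ln(2) = 2.0
```

    A segmentation that resolves more, evenly populated clusters scores a higher index, which is why the ratio serves as a diversity comparison between two methods.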

  6. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

    Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low, however, human material always carries the risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine what type of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator and environmental controls. Evaluation of the predictability of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100 % sensitivity by doubling tissue controls, while maintaining high specificity (77 %).
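
    The reported quality-control metrics are standard 2x2-table quantities. The sketch below uses toy counts chosen to echo the reported 100% sensitivity and 77% specificity (not the study's actual data) and computes a one-sided Fisher exact p-value from first principles:

```python
from math import comb

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    the probability, under the hypergeometric null with fixed margins,
    of a table at least as extreme (a' >= a)."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for x in range(a, min(row1, col1) + 1):
        if row1 - x > n - col1:        # infeasible table, skip
            continue
        p += comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    return p

# Toy counts echoing the reported figures: 10/10 contaminated grafts flagged,
# 17/22 clean grafts correctly passed.
sens, spec = sensitivity_specificity(tp=10, fn=0, tn=17, fp=5)
```

    With these hypothetical counts, sensitivity is 1.0 and specificity about 0.77, matching the figures quoted in the abstract.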

  7. Thermally stimulated processes in samarium-modified lead titanate ferroelectric ceramics

    NASA Astrophysics Data System (ADS)

    Peláiz-Barranco, A.; García-Wong, A. C.; González-Abreu, Y.; Gagou, Y.; Saint-Grégoire, P.

    2013-08-01

    The thermally stimulated processes in a samarium-modified lead titanate ferroelectric system are analyzed from the thermally stimulated depolarization discharge current. The discharge due to the space charge injected during the poling process, the pyroelectric response and a conduction process related to oxygen vacancies are evaluated by means of a theoretical decomposition performed numerically. The pyroelectric response is separated from the other components to evaluate the polarization behavior and some pyroelectric parameters. High values of remanent polarization, pyroelectric coefficient and figure of merit are obtained at room temperature.

  8. An Analysis of the United States Air Force Energy Savings Performance Contracts

    DTIC Science & Technology

    2007-12-01

    key element of the ESPC system. Chapter IV uses the standard contracting processes to review the USAF implementations of strategic purchasing with...process and each level facilitates regionalization, which is the current implementation method of strategic purchasing for energy service management...the existing regulations that are inconsistent with the ESPC intent, and 3) to formulate substitute regulations consistent with laws governing Federal

  9. Green chemistry: development trajectory

    NASA Astrophysics Data System (ADS)

    Moiseev, I. I.

    2013-07-01

    Examples of applications of green chemistry methods in heavy organic synthesis are analyzed. Compounds, which can be produced by the processing of the biomass, and the criteria for the selection of the most promising products are summarized. The current status of the ethanol production and processing is considered. The possibilities of the use of high fatty acid triglycerides, glycerol, succinic acid, and isoprene are briefly discussed. The bibliography includes 67 references.

  10. Process methods and levels of automation of wood pallet repair in the United States

    Treesearch

    Jonghun Park; Laszlo Horvath; Robert J. Bush

    2016-01-01

    This study documented the current status of wood pallet repair in the United States by identifying the types of processing and equipment usage in repair operations from an automation perspective. The wood pallet repair firms included in the study received an average of approximately 1.28 million cores (i.e., used pallets) for recovery in 2012. A majority of the cores...

  11. A Comparison of Mother-Tongue Curricula of Successful Countries in PISA and Turkey by Higher-Order Thinking Processes

    ERIC Educational Resources Information Center

    Cer, Erkan

    2018-01-01

    Purpose: The purpose of the current study is to reveal general qualities of the objectives in the mother-tongue curricula of Hong Kong and Shanghai-China, South Korea, Singapore, and Turkey in terms of higher-order thinking processes specified by PISA tests. Research Methods: In this study, the researcher used a qualitative research design.…

  12. Functional relationship-based alarm processing

    DOEpatents

    Corsberg, D.R.

    1987-04-13

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). 11 figs.
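
    A minimal sketch of the relationship-based filtering idea, with hypothetical alarm names and rules (the patent's actual rule base and importance scale are not given in the abstract): each re-evaluation demotes active alarms that are superseded by a higher setting, explained by a causal precursor, or expected in the current context.

```python
# Hypothetical alarm names and relationship tables, for illustration only.
LEVEL_PRECURSOR = {"LEVEL_HI": "LEVEL_HIHI"}    # two settings on one parameter
DIRECT_PRECURSOR = {"PUMP_TRIP": "FLOW_LOW"}    # causal link between two alarms
BLOCKING = {"FLOW_LOW": {"MAINT_MODE"}}         # contexts where the alarm is expected

def prioritize(active):
    """Re-rank every currently active alarm against the others."""
    importance = {alarm: "high" for alarm in active}
    for pre, succ in LEVEL_PRECURSOR.items():
        if pre in active and succ in active:
            importance[pre] = "low"     # superseded by the higher-level setting
    for cause, effect in DIRECT_PRECURSOR.items():
        if cause in active and effect in active:
            importance[effect] = "low"  # explained by its causal precursor
    for alarm, blockers in BLOCKING.items():
        if alarm in active and blockers & active:
            importance[alarm] = "low"   # normally expected in this context
    return importance

ranks = prioritize({"LEVEL_HI", "LEVEL_HIHI", "PUMP_TRIP", "FLOW_LOW"})
```

    Calling `prioritize` again whenever an alarm activates or clears gives the continuous importance updating the abstract describes.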

  13. Mismatch Negativity with Visual-only and Audiovisual Speech

    PubMed Central

    Ponton, Curtis W.; Bernstein, Lynne E.; Auer, Edward T.

    2009-01-01

    The functional organization of cortical speech processing is thought to be hierarchical, increasing in complexity and proceeding from primary sensory areas centrifugally. The current study used the mismatch negativity (MMN) obtained with electrophysiology (EEG) to investigate the early latency period of visual speech processing under both visual-only (VO) and audiovisual (AV) conditions. Current density reconstruction (CDR) methods were used to model the cortical MMN generator locations. MMNs were obtained with VO and AV speech stimuli at early latencies (approximately 82-87 ms peak in time waveforms relative to the acoustic onset) and in regions of the right lateral temporal and parietal cortices. Latencies were consistent with bottom-up processing of the visible stimuli. We suggest that a visual pathway extracts phonetic cues from visible speech, and that previously reported effects of AV speech in classical early auditory areas, given later reported latencies, could be attributable to modulatory feedback from visual phonetic processing. PMID:19404730

  14. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

    The objective of quantum information processing is to harness the paradigm shift offered by quantum computing to solve classically hard and computationally challenging problems. Some of our computationally challenging problems of interest include: rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that quantum computers are difficult to realize due to poor scalability and a high incidence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.
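
    Both simulators evaluate quantum algorithms on classical hardware by linear algebra on the state vector. A toy example of that underlying computation (not the interface of either tool): applying a Hadamard gate to |0> and reading out the measurement probabilities.

```python
import math

# State-vector simulation of one gate: Hadamard on |0>.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Multiply a gate matrix into a state vector (lists of amplitudes)."""
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(gate))]

state = apply_gate(H, [1.0, 0.0])        # (|0> + |1>)/sqrt(2)
probs = [abs(amp) ** 2 for amp in state] # measurement probabilities
```

    The exponential cost of this representation with qubit count is exactly why specialized simulators such as QuIDD Pro use compressed data structures.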

  15. Medicare Part D Beneficiaries' Plan Switching Decisions and Information Processing.

    PubMed

    Han, Jayoung; Urmie, Julie

    2017-03-01

    Medicare Part D beneficiaries tend not to switch plans despite the government's efforts to engage beneficiaries in the plan switching process. Understanding current and alternative plan features is a necessary step to make informed plan switching decisions. This study explored beneficiaries' plan switching using a mixed-methods approach, with a focus on the concept of information processing. We found large variation in beneficiary comprehension of plan information among both switchers and nonswitchers. Knowledge about alternative plans was especially poor, with only about half of switchers and 2 in 10 nonswitchers being well informed about plans other than their current plan. We also found that helpers had a prominent role in plan decision making-nearly twice as many switchers as nonswitchers worked with helpers for their plan selection. Our study suggests that easier access to helpers as well as helpers' extensive involvement in the decision-making process promote informed plan switching decisions.

  16. Comparing Sensory Information Processing and Alexithymia between People with Substance Dependency and Normal

    PubMed Central

    Bashapoor, Sajjad; Hosseini-Kiasari, Seyyedeh Tayebeh; Daneshvar, Somayeh; Kazemi-Taskooh, Zeinab

    2015-01-01

    Background: Sensory information processing and alexithymia are two important factors in determining behavioral reactions. Some studies explain the effect of the sensitivity of sensory processing and alexithymia on the tendency toward substance abuse. Given that, the aim of the current study was to compare the styles of sensory information processing and alexithymia between substance-dependent people and normal ones. Methods: The research method was cross-sectional, and the statistical population of the current study comprised all substance-dependent men present in substance quitting camps of Masal, Iran, in October 2013 (n = 78). 36 persons were selected randomly by simple random sampling from this population as the study group, and 36 persons were also selected among the normal population in the same way as the comparison group. Both groups were evaluated using the Toronto alexithymia scale (TAS) and the adult sensory profile, and the multivariate analysis of variance (MANOVA) test was applied to analyze the data. Findings: The results showed that there are significant differences between the two groups in low registration (P < 0.020, F = 5.66), sensation seeking (P < 0.050, F = 1.92), and sensory avoidance (P < 0.008, F = 7.52) as components of sensory processing, and in difficulty describing emotions (P < 0.001, F = 15.01) and difficulty identifying emotions (P < 0.002, F = 10.54) as components of alexithymia. However, no significant differences were found between the two groups in the components of sensory sensitivity (P < 0.170, F = 1.92) and externally oriented thinking style (P < 0.060, F = 3.60). Conclusion: These results showed that substance-dependent people process sensory information in a different way than normal people and show more alexithymia features than them. PMID:26885354

  17. Mapping DNA methylation by transverse current sequencing: Reduction of noise from neighboring nucleotides

    NASA Astrophysics Data System (ADS)

    Alvarez, Jose; Massey, Steven; Kalitsov, Alan; Velev, Julian

    Nanopore sequencing via transverse current has emerged as a competitive candidate for mapping DNA methylation without the need for bisulfite treatment, fluorescent tags, or PCR amplification. By eliminating the error-producing amplification step, long read lengths become feasible, which greatly simplifies the assembly process and reduces the time and cost inherent in current technologies. However, due to the large error rates of nanopore sequencing, single-base resolution has not been reached. A very important source of noise is the intrinsic structural noise in the electric signature of the nucleotide arising from the influence of neighboring nucleotides. In this work we perform calculations of the tunneling current through DNA molecules in nanopores using the non-equilibrium electron transport method within an effective multi-orbital tight-binding model derived from first-principles calculations. We develop a base-calling algorithm accounting for the correlations of the current through neighboring bases, which in principle can reduce the error rate below any desired precision. Using this method we show that we can clearly distinguish DNA methylation and other base modifications based on the reading of the tunneling current.

  18. Advanced image based methods for structural integrity monitoring: Review and prospects

    NASA Astrophysics Data System (ADS)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics brought about a paradigm change in the sensing of phenomena. Hence, several widely applicable optical approaches are playing a significant role in support of experiments. The current review manuscript describes advanced image-based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.
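
    As an illustration of the image-processing core of such methods: subset-based DIC matches a reference image patch against the deformed image using a correlation criterion. A common choice is zero-normalized cross-correlation, which is insensitive to uniform brightness and contrast changes (a generic sketch, not any specific package's implementation):

```python
import math

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length intensity
    subsets: +1 for a perfect match up to brightness/contrast changes."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    da = [x - mean_a for x in a]
    db = [y - mean_b for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den

ref = [10.0, 20.0, 30.0, 40.0]
brighter = [x + 5.0 for x in ref]       # uniform lighting change
score = zncc(ref, brighter)             # still a perfect match: 1.0
```

    In full DIC this score is maximized over candidate displacements (and subset shape functions) to recover the displacement field.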

  19. Adapting Western Research Methods to Indigenous Ways of Knowing

    PubMed Central

    Christopher, Suzanne

    2013-01-01

    Indigenous communities have long experienced exploitation by researchers and increasingly require participatory and decolonizing research processes. We present a case study of an intervention research project to exemplify a clash between Western research methodologies and Indigenous methodologies and how we attempted reconciliation. We then provide implications for future research based on lessons learned from Native American community partners who voiced concern over methods of Western deductive qualitative analysis. Decolonizing research requires constant reflective attention and action, and there is an absence of published guidance for this process. Continued exploration is needed for implementing Indigenous methods alone or in conjunction with appropriate Western methods when conducting research in Indigenous communities. Currently, examples of Indigenous methods and theories are not widely available in academic texts or published articles, and are often not perceived as valid. PMID:23678897

  20. Automatic processing of high-rate, high-density multibeam echosounder data

    NASA Astrophysics Data System (ADS)

    Calder, B. R.; Mayer, L. A.

    2003-06-01

    Multibeam echosounders (MBES) are currently the best way to determine the bathymetry of large regions of the seabed with high accuracy. They are becoming the standard instrument for hydrographic surveying and are also used in geological studies, mineral exploration and scientific investigation of the earth's crustal deformations and life cycle. The greatly increased data density provided by an MBES has significant advantages in accurately delineating the morphology of the seabed, but comes with the attendant disadvantage of having to handle and process a much greater volume of data. Current data processing approaches typically involve (computer aided) human inspection of all data, with time-consuming and subjective assessment of all data points. As data rates increase with each new generation of instrument and required turn-around times decrease, manual approaches become unwieldy and automatic methods of processing become essential. We propose a new method for automatically processing MBES data that attempts to address concerns of efficiency, objectivity, robustness and accuracy. The method attributes each sounding with an estimate of vertical and horizontal error, and then uses a model of information propagation to transfer information about the depth from each sounding to its local neighborhood. Embedded in the survey area are estimation nodes that aim to determine the true depth at an absolutely defined location, along with its associated uncertainty. As soon as soundings are made available, the nodes independently assimilate propagated information to form depth hypotheses which are then tracked and updated on-line as more data is gathered. Consequently, we can extract at any time a "current-best" estimate for all nodes, plus co-located uncertainties and other metrics. The method can assimilate data from multiple surveys, multiple instruments or repeated passes of the same instrument in real-time as data is being gathered. 
The data assimilation scheme is sufficiently robust to deal with typical survey echosounder errors. Robustness is improved by pre-conditioning the data, and allowing the depth model to be incrementally defined. A model monitoring scheme ensures that inconsistent data are maintained as separate but internally consistent depth hypotheses. A disambiguation of these competing hypotheses is only carried out when required by the user. The algorithm has a low memory footprint, runs faster than data can currently be gathered, and is suitable for real-time use. We call this algorithm CUBE (Combined Uncertainty and Bathymetry Estimator). We illustrate CUBE on two data sets gathered in shallow water with different instruments and for different purposes. We show that the algorithm is robust to even gross failure modes, and reliably processes the vast majority of the data. In both cases, we confirm that the estimates made by CUBE are statistically similar to those generated by hand.
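
    The assimilation step can be caricatured as a precision-weighted update with a consistency gate (a simplification for illustration; CUBE's actual information-propagation and hypothesis-management machinery is more elaborate): consistent soundings refine the node's depth estimate, and an inconsistent sounding is rejected, which in CUBE would spawn a separate hypothesis.

```python
def fuse(est, sounding, gate=3.0):
    """est and sounding are (depth, variance) pairs; est may be None.
    Returns (updated_est, accepted). A sounding outside the consistency
    gate leaves the estimate untouched."""
    if est is None:
        return sounding, True
    d0, v0 = est
    d1, v1 = sounding
    if abs(d1 - d0) > gate * (v0 + v1) ** 0.5:
        return est, False               # would become a competing hypothesis
    w0, w1 = 1.0 / v0, 1.0 / v1         # precision weights
    depth = (w0 * d0 + w1 * d1) / (w0 + w1)
    return (depth, 1.0 / (w0 + w1)), True

est = None
for s in [(20.1, 0.04), (20.0, 0.04), (35.0, 0.04)]:   # last one inconsistent
    est, accepted = fuse(est, s)
```

    After the two consistent soundings the node holds roughly (20.05 m, 0.02 m²), and the 35 m outlier is gated out rather than corrupting the estimate.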

  1. Method of Manufacturing a Micromechanical Oscillating Mass Balance

    NASA Technical Reports Server (NTRS)

    Altemir, David A. (Inventor)

    1999-01-01

    A micromechanical oscillating mass balance and method adapted for measuring minute quantities of material deposited at a selected location, such as during a vapor deposition process. The invention comprises a vibratory composite beam which includes a dielectric layer sandwiched between two conductive layers. The beam is positioned in a magnetic field. An alternating current passed through one conductive layer makes the beam oscillate, inducing an output current in the second conductive layer, which is analyzed to determine the resonant frequency of the beam. As material is deposited on the beam, the mass of the beam increases and the resonant frequency shifts; from this shift the added mass is determined.
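
    The mass readout follows from the oscillator relation f = (1/2π)·sqrt(k/m): as mass is added the resonant frequency falls, and the added mass can be recovered from the shift. The numbers below are hypothetical; the patent does not give stiffness or frequency values.

```python
import math

def added_mass(k, f0, f1):
    """Mass added to an oscillator of stiffness k (N/m) when its resonant
    frequency falls from f0 to f1 (Hz), using f = (1/(2*pi))*sqrt(k/m)."""
    mass = lambda f: k / (2.0 * math.pi * f) ** 2
    return mass(f1) - mass(f0)

# Hypothetical numbers: a 1% downward shift of a 10 kHz resonance.
dm = added_mass(k=100.0, f0=1.0e4, f1=0.99e4)   # kg
```

    Inverting m = k/(2πf)² this way is the same principle used by quartz crystal microbalances, where tiny frequency shifts resolve sub-microgram deposits.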

  2. Enhancing Neurosurgical Education in Low- and Middle-income Countries: Current Methods and New Advances

    PubMed Central

    LIANG, Kevin E; BERNSTEIN, Ilia; KATO, Yoko; KAWASE, Takeshi; HODAIE, Mojgan

    2016-01-01

    Low- and middle-income countries (LMICs) face a critical shortage of basic surgical services. Adequate neurosurgical services can have a far-reaching positive impact on society’s health care and, consequently, the economic development in LMICs. Yet surgery, and specifically neurosurgery has been a long neglected sector of global health. This article reviews the current efforts to enhance neurosurgery education in LMICs and outlines ongoing approaches for improvement. In addition, we introduce the concept of a sustainable and cost-effective model to enhance neurosurgical resources in LMICs and describe the process and methods of online curriculum development. PMID:27616319

  3. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA

    2007-07-17

    A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations are consistent with a specified model within a given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection, by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
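
    A minimal SPRT for count data (illustrative rates and error targets; the patent's dynamic background estimation and model adjustment are omitted): the cumulative log-likelihood ratio of "elevated source" versus "background" is compared against Wald's thresholds after each counting interval.

```python
import math

def sprt(counts, lam0=5.0, lam1=10.0, alpha=0.01, beta=0.01):
    """Wald's SPRT on Poisson counts per interval: background rate lam0
    vs. elevated rate lam1, with target error rates alpha and beta."""
    upper = math.log((1.0 - beta) / alpha)   # decide "source"
    lower = math.log(beta / (1.0 - alpha))   # decide "background"
    llr = 0.0
    for k in counts:
        # log-likelihood ratio of one Poisson observation k
        llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "source"
        if llr <= lower:
            return "background"
    return "undecided"

decision = sprt([12, 11, 13, 12, 14])   # counts well above background
```

    Because the test stops as soon as either threshold is crossed, it reaches a decision with far fewer samples on average than a fixed-length test at the same error rates, which is what maximizes detection range.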

  4. Automatic identification of bacterial types using statistical imaging methods

    NASA Astrophysics Data System (ADS)

    Trattner, Sigal; Greenspan, Hayit; Tepper, Gapi; Abboud, Shimon

    2003-05-01

    The objective of the current study is to develop an automatic tool to identify bacterial types using computer-vision and statistical modeling techniques. Bacteriophage (phage) typing methods are used to identify and extract representative profiles of bacterial types, such as Staphylococcus aureus. Current systems rely on the subjective reading of plaque profiles by a human expert. This process is time-consuming and prone to errors, especially as technology enables an increase in the number of phages used for typing. The statistical methodology presented in this work provides an automated, objective and robust analysis of visual data, along with the ability to cope with increasing data volumes.

  5. Fabrication of fiber supported ionic liquids and methods of use

    DOEpatents

    Luebke, David R; Wickramanayake, Shan

    2013-02-26

    One or more embodiments relates to the production of a fabricated fiber having an asymmetric polymer network and an immobilized liquid, such as an ionic liquid, within the pores of the polymer network. The process produces the fabricated fiber in a dry-wet spinning process using a homogeneous dope solution, providing a significant advantage over current fabrication methods for liquid-supporting polymers. The fabricated fibers may be effectively utilized for the separation of a chemical species from a mixture based on the selection of the polymer, the liquid, and the solvent utilized in the dope.

  6. Influence of High-Current-Density Impulses on the Compression Behavior: Experiments with Iron and a Nickel-Based Alloy

    NASA Astrophysics Data System (ADS)

    Demler, E.; Gerstein, G.; Dalinger, A.; Epishin, A.; Rodman, D.; Nürnberger, F.

    2017-01-01

    Difficulties in processing high-strength and/or brittle materials by plastic deformation, e.g., by forging, require the development of new industrial technologies. In particular, the feasible deformation rates are limited for low-ductility metallic materials. For this reason, processes in which electrical impulses are applied to lower the yield strength were investigated as a way to improve deformability. However, owing to the impulse duration and low current densities, concomitant effects always occur, e.g., as a result of Joule heating. Current developments in power electronics now allow high currents to be transmitted as short pulses. By reducing the impulse duration and increasing the current density, the plasticity of metallic materials can be correspondingly increased. Using the examples of polycrystalline iron and a single-crystal, nickel-based alloy (PWA 1480), current advances in the development of methods for forming materials by means of high-current-density impulses are demonstrated. For this purpose, appropriate specimens were loaded in compression and, using novel testing equipment, subjected to a current of 10 kA with an impulse duration of 2 ms. For a pre-defined strain, the test results show a significant decrease in the compressive stress during the compression test and a significant change in the dislocation distribution following the current impulse treatment.

  7. Kinetics of porous silicon growth studied using flicker-noise spectroscopy

    NASA Astrophysics Data System (ADS)

    Parkhutik, V.; Timashev, S.

    2000-05-01

    The mechanism of the formation of porous silicon (PS) is studied using flicker-noise spectroscopy (FNS), a phenomenological method for analyzing the evolution of nonlinear dissipative systems in time, space, and energy. FNS is based on the ideas of deterministic chaos in complex macro- and microsystems. It yields a set of empirical parameters ("passport data") that characterize the state of the system and the change of its properties as it evolves in time, energy, and space. The FNS method provides new information about the growth kinetics of PS and its properties. The PS formation mechanisms at n-Si and p-Si, as revealed by FNS, appear to be essentially different: p-Si shows a larger "memory" in the sequence of individual events involved in PS growth than n-Si (if anodized without illumination). The influence of the anodization variables (current density, HF concentration, process duration, illumination) on the "passport data" of PS is examined. Increasing the current density increases the memory of the PS formation process, with each successive event more strongly correlated with the preceding one; it also triggers electrochemical reactions that are negligible at lower currents. Illumination likewise has a positive effect on the "memory" of the system. FNS makes it possible to distinguish different stages of the continuous anodization process, apparently associated with increasing pore length. FNS is thus a very sensitive tool for analyzing PS formation and other complex electrochemical systems.
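    One quantity FNS builds on is the difference moment (transient structure function) of the measured signal, whose growth with lag reflects the "memory" discussed above. A minimal sketch, with synthetic signals standing in for real anodization current records (all data below are assumed for illustration):

```python
import numpy as np

def difference_moment(signal, lags, order=2):
    """Difference moment Phi(tau) = <|V(t+tau) - V(t)|^order>,
    one of the quantities flicker-noise spectroscopy builds on."""
    signal = np.asarray(signal, dtype=float)
    return np.array([np.mean(np.abs(signal[lag:] - signal[:-lag]) ** order)
                     for lag in lags])

# A signal with long "memory" (a random walk) grows with lag;
# white noise stays flat -- the kind of distinction FNS exploits.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(10_000))
noise = rng.standard_normal(10_000)
phi_walk = difference_moment(walk, lags=[1, 10, 100])
phi_noise = difference_moment(noise, lags=[1, 10, 100])
```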

  8. Eddy current technique for predicting burst pressure

    DOEpatents

    Petri, Mark C.; Kupperman, David S.; Morman, James A.; Reifman, Jaques; Wei, Thomas Y. C.

    2003-01-01

    A signal processing technique which correlates eddy current inspection data from a tube having a critical tubing defect with a range of predicted burst pressures for the tube is provided. The method can directly correlate the raw eddy current inspection data representing the critical tubing defect with the range of burst pressures using a regression technique, preferably an artificial neural network. Alternatively, the technique deconvolves the raw eddy current inspection data into a set of undistorted signals, each of which represents a separate defect of the tube. The undistorted defect signal which represents the critical tubing defect is related to a range of burst pressures utilizing a regression technique.
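    The correlation step can be illustrated with a toy regression: synthetic eddy-current signal features mapped to burst pressures by ordinary least squares, a simple stand-in for the artificial neural network the patent prefers. All data and the feature/pressure numbers below are fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical defect features extracted from eddy current signals:
# [amplitude, phase], arbitrary units; deeper defects -> lower burst pressure.
features = rng.uniform(0.0, 1.0, size=(200, 2))
true_w = np.array([-30.0, -10.0])
pressure = 100.0 + features @ true_w + rng.normal(0.0, 1.0, 200)  # MPa, synthetic

# Least-squares fit with an intercept column (a linear stand-in for the ANN).
X = np.column_stack([features, np.ones(len(features))])
w, *_ = np.linalg.lstsq(X, pressure, rcond=None)
predicted = X @ w
rmse = float(np.sqrt(np.mean((predicted - pressure) ** 2)))
```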

  9. Electrochemical Solution Growth of Magnetic Nitrides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monson, Todd C.; Pearce, Charles

    Magnetic nitrides, if manufactured in bulk form, would provide designers of transformers and inductors with a new class of better-performing and affordable soft magnetic materials. According to experimental results from thin films and/or theoretical calculations, magnetic nitrides would have magnetic moments well in excess of current state-of-the-art soft magnets. Furthermore, magnetic nitrides would have higher resistivities than current transformer core materials and therefore not require the use of laminates of inactive material to limit eddy current losses. However, almost all of the magnetic nitrides have been elusive except in difficult-to-reproduce thin films or as inclusions in another material. Now, through its ability to reduce atmospheric nitrogen, the electrochemical solution growth (ESG) technique can bring highly sought after (and previously inaccessible) new magnetic nitrides into existence in bulk form. This method utilizes a molten salt as a solvent to solubilize metal cations and nitrogen ions produced electrochemically and form nitrogen compounds. Unlike other growth methods, the scalable ESG process can sustain high growth rates (~mm/hr) even under reasonable operating conditions (atmospheric pressure and 500 °C). Ultimately, this translates into a high-throughput, low-cost manufacturing process. The ESG process has already been used successfully to grow high-quality GaN. Below, the experimental results of an exploratory express LDRD project to assess the viability of the ESG technique to grow magnetic nitrides are presented.

  10. Evaluation of current Australian health service accreditation processes (ACCREDIT-CAP): protocol for a mixed-method research project.

    PubMed

    Hinchcliff, Reece; Greenfield, David; Moldovan, Max; Pawsey, Marjorie; Mumford, Virginia; Westbrook, Johanna Irene; Braithwaite, Jeffrey

    2012-01-01

    Accreditation programmes aim to improve the quality and safety of health services, and have been widely implemented. However, there is conflicting evidence regarding the outcomes of existing programmes. The Accreditation Collaborative for the Conduct of Research, Evaluation and Designated Investigations through Teamwork-Current Accreditation Processes (ACCREDIT-CAP) project is designed to address key gaps in the literature by evaluating the current processes of three accreditation programmes used across Australian acute, primary and aged care services. The project comprises three mixed-method studies involving documentary analyses, surveys, focus groups and individual interviews. Study samples will comprise stakeholders from across the Australian healthcare system: accreditation agencies; federal and state government departments; consumer advocates; professional colleges and associations; and staff of acute, primary and aged care services. Sample sizes have been determined to ensure results allow robust conclusions. Qualitative information will be thematically analysed, supported by the use of textual grouping software. Quantitative data will be subjected to a variety of analytical procedures, including descriptive and comparative statistics. The results are designed to inform health system policy and planning decisions in Australia and internationally. The project has been approved by the University of New South Wales Human Research Ethics Committee (approval number HREC 10274). Results will be reported to partner organisations, healthcare consumers and other stakeholders via peer-reviewed publications, conference and seminar presentations, and a publicly accessible website.

  11. One-step fabrication of an organ-on-a-chip with spatial heterogeneity using a 3D bioprinting technology.

    PubMed

    Lee, Hyungseok; Cho, Dong-Woo

    2016-07-05

    Although various types of organs-on-chips have been introduced recently as tools for drug discovery, the current studies are limited in terms of fabrication methods. The fabrication methods currently available not only need a secondary cell-seeding process and result in severe protein absorption due to the material used, but also have difficulties in providing various cell types and extracellular matrix (ECM) environments for spatial heterogeneity in the organs-on-chips. Therefore, in this research, we introduce a novel 3D bioprinting method for organ-on-a-chip applications. With our novel 3D bioprinting method, it was possible to prepare an organ-on-a-chip in a simple one-step fabrication process. Furthermore, protein absorption on the printed platform was very low, which will lead to accurate measurement of metabolism and drug sensitivity. Moreover, heterotypic cell types and biomaterials were successfully used and positioned at the desired position for various organ-on-a-chip applications, which will promote full mimicry of the natural conditions of the organs. The liver organ was selected for the evaluation of the developed method, and liver function was shown to be significantly enhanced on the liver-on-a-chip, which was prepared by 3D bioprinting. Consequently, the results demonstrate that the suggested 3D bioprinting method is easier and more versatile for production of organs-on-chips.

  12. High temperature superconductor materials and applications

    NASA Technical Reports Server (NTRS)

    Doane, George B., III.; Banks, Curtis; Golben, John

    1990-01-01

    Research was conducted on processing methods leading to a significant enhancement of the critical current density (Jc) and the critical temperature (Tc) of high temperature superconductors in bulk and thin film forms. The fabrication of important devices for NASA-unique applications (sensors) is investigated.

  13. Economic study of future aircraft fuels (1970-2000)

    NASA Technical Reports Server (NTRS)

    Alexander, A. D., III

    1972-01-01

    Future aircraft fuels are evaluated in terms of fuel resource availability and pricing, processing methods, and economic projections over the period 1970-2000. Liquefied hydrogen, methane and propane are examined as potential turbine engine aircraft fuels relative to current JP fuel.

  14. Novel hermetic packaging methods for MOEMS

    NASA Astrophysics Data System (ADS)

    Stark, David

    2003-01-01

    Hermetic packaging of micro-optoelectromechanical systems (MOEMS) is an immature technology, lacking industry-consensus methods and standards. Off-the-shelf, catalog window assemblies are not yet available. Window assemblies are in general custom designed and manufactured for each new product, resulting in longer than acceptable cycle times, high procurement costs and questionable reliability. There are currently two dominant window-manufacturing methods wherein a metal frame is attached to glass, as well as a third, less-used method. The first method creates a glass-to-metal seal by heating the glass above its Tg to fuse it to the frame. The second method involves first metallizing the glass where it is to be attached to the frame, and then soldering the glass to the frame. The third method employs solder-glass to bond the glass to the frame. A novel alternative with superior features compared to the three previously described window-manufacturing methods is proposed. The new approach lends itself to a plurality of glass-to-metal attachment techniques. Benefits include lower temperature processing than two of the current methods and potentially more cost-effective manufacturing than all three of today's attachment methods.

  15. Long-term care information systems: an overview of the selection process.

    PubMed

    Nahm, Eun-Shim; Mills, Mary Etta; Feege, Barbara

    2006-06-01

    Under the current Medicare Prospective Payment System method and the ever-changing managed care environment, the long-term care information system is vital to providing quality care and to surviving in business. The system selection process should be an interdisciplinary effort involving all necessary stakeholders for the proposed system. The system selection process can be modeled on the Systems Development Life Cycle: identifying problems, opportunities, and objectives; determining information requirements; analyzing system needs; designing the recommended system; and developing and documenting software.

  16. Multiscale Modeling of Damage Processes in fcc Aluminum: From Atoms to Grains

    NASA Technical Reports Server (NTRS)

    Glaessgen, E. H.; Saether, E.; Yamakov, V.

    2008-01-01

    Molecular dynamics (MD) methods are opening new opportunities for simulating the fundamental processes of material behavior at the atomistic level. However, current analysis is limited to small domains and increasing the size of the MD domain quickly presents intractable computational demands. A preferred approach to surmount this computational limitation has been to combine continuum mechanics-based modeling procedures, such as the finite element method (FEM), with MD analyses thereby reducing the region of atomic scale refinement. Such multiscale modeling strategies can be divided into two broad classifications: concurrent multiscale methods that directly incorporate an atomistic domain within a continuum domain and sequential multiscale methods that extract an averaged response from the atomistic simulation for later use as a constitutive model in a continuum analysis.

  17. In-line Fourier-transform infrared spectroscopy as a versatile process analytical technology for preparative protein chromatography.

    PubMed

    Großhans, Steffen; Rüdt, Matthias; Sanden, Adrian; Brestrich, Nina; Morgenstern, Josefine; Heissler, Stefan; Hubbuch, Jürgen

    2018-04-27

    Fourier-transform infrared spectroscopy (FTIR) is a well-established spectroscopic method in the analysis of small molecules and protein secondary structure. However, FTIR is not commonly applied for in-line monitoring of protein chromatography. Here, the potential of in-line FTIR as a process analytical technology (PAT) in downstream processing was investigated in three case studies addressing the limits of currently applied spectroscopic PAT methods. A first case study exploited the secondary structural differences of monoclonal antibodies (mAbs) and lysozyme to selectively quantify the two proteins with partial least squares regression (PLS), giving root mean square errors of cross validation (RMSECV) of 2.42 g/l and 1.67 g/l, respectively. The corresponding Q² values are 0.92 and 0.99, respectively, indicating robust models in the calibration range. Second, a process separating lysozyme and PEGylated lysozyme species was monitored, giving an estimate of the PEGylation degree of currently eluting species with RMSECV of 2.35 g/l for lysozyme and 1.24 g/l for PEG, with Q² of 0.96 and 0.94, respectively. Finally, Triton X-100 was added to a feed of lysozyme as a typical process-related impurity. It was shown that the species could be selectively quantified from the FTIR 3D field without PLS calibration. In summary, the proposed PAT tool has the potential to be used as a versatile option for monitoring protein chromatography. It may help to achieve a more complete implementation of the PAT initiative by mitigating limitations of currently used techniques.
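    The RMSECV figures quoted above come from cross-validating a regression of concentrations on spectra. A self-contained sketch of the metric, using plain least squares on synthetic "spectra" in place of a real PLS model (a real workflow would use, e.g., sklearn.cross_decomposition.PLSRegression; all data below are assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_channels = 60, 20
spectra = rng.normal(size=(n_samples, n_channels))          # synthetic spectra
conc = spectra @ rng.normal(size=n_channels) \
       + rng.normal(0.0, 0.1, n_samples)                    # synthetic concentrations

def rmsecv(X, y, k=5):
    """Root mean square error of k-fold cross validation:
    fit on k-1 folds, collect prediction errors on the held-out fold."""
    idx = np.arange(len(y))
    errors = []
    for fold in range(k):
        test = idx % k == fold
        w, *_ = np.linalg.lstsq(X[~test], y[~test], rcond=None)
        errors.extend(y[test] - X[test] @ w)
    return float(np.sqrt(np.mean(np.square(errors))))

score = rmsecv(spectra, conc)
```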

  18. Global GNSS processing based on the raw observation approach

    NASA Astrophysics Data System (ADS)

    Strasser, Sebastian; Zehentner, Norbert; Mayer-Gürr, Torsten

    2017-04-01

    Many global navigation satellite system (GNSS) applications, e.g. Precise Point Positioning (PPP), require high-quality GNSS products, such as precise GNSS satellite orbits and clocks. These products are routinely determined by analysis centers of the International GNSS Service (IGS). The current processing methods of the analysis centers make use of the ionosphere-free linear combination to reduce the ionospheric influence. Some of the analysis centers also form observation differences, in general double-differences, to eliminate several additional error sources. The raw observation approach is a new GNSS processing approach that was developed at Graz University of Technology for kinematic orbit determination of low Earth orbit (LEO) satellites and subsequently adapted to global GNSS processing in general. This new approach offers some benefits compared to well-established approaches, such as a straightforward incorporation of new observables due to the avoidance of observation differences and linear combinations. This becomes especially important in view of the changing GNSS landscape with two new systems, the European system Galileo and the Chinese system BeiDou, currently in deployment. GNSS products generated at Graz University of Technology using the raw observation approach currently comprise precise GNSS satellite orbits and clocks, station positions and clocks, code and phase biases, and Earth rotation parameters. To evaluate the new approach, products generated using the Global Positioning System (GPS) constellation and observations from the global IGS station network are compared to those of the IGS analysis centers. The comparisons show that the products generated at Graz University of Technology are on a similar level of quality to the products determined by the IGS analysis centers. This confirms that the raw observation approach is applicable to global GNSS processing. 
Some areas requiring further work have been identified, enabling future improvements of the method.
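    The ionosphere-free linear combination that the established processing methods rely on (and the raw observation approach avoids) can be written down directly: first-order ionospheric delay scales as 1/f², so a frequency-weighted difference of dual-frequency observations cancels it. A minimal sketch with assumed numbers:

```python
# GPS L1/L2 carrier frequencies, Hz.
F1, F2 = 1575.42e6, 1227.60e6

def ionosphere_free(p1, p2, f1=F1, f2=F2):
    """Ionosphere-free combination of two pseudoranges:
    P_IF = (f1^2 * P1 - f2^2 * P2) / (f1^2 - f2^2)."""
    return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

# Synthetic example: a true range plus a 1/f^2 ionospheric delay.
rho = 20_000_000.0              # metres (assumed)
iono_l1 = 5.0                   # metres of delay on L1 (assumed)
p1 = rho + iono_l1
p2 = rho + iono_l1 * (F1 / F2) ** 2   # same delay scaled to L2
p_if = ionosphere_free(p1, p2)        # recovers rho
```

    The price of the combination is amplified noise and the loss of the original observables, which is precisely what motivates processing the raw observations instead.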

  19. Aconitum in traditional Chinese medicine: a valuable drug or an unpredictable risk?

    PubMed

    Singhuber, Judith; Zhu, Ming; Prinz, Sonja; Kopp, Brigitte

    2009-10-29

    Aconitum species have been used in China as an essential drug in Traditional Chinese Medicine (TCM) for 2000 years. A review of the clinical application of Aconitum, its pharmacological effects, toxicity and detoxifying measures, herb-herb interactions, clinical taboos, famous herbal formulas, and traditional and current herbal processing methods, based upon a wide range of literature investigations, serves as a case study to explore the multidisciplinary implications of botanicals used in TCM. The toxicological risk of improper usage of Aconitum remains very high, especially in countries like China, India and Japan. The toxicity of Aconitum mainly derives from the diester diterpene alkaloids (DDAs), including aconitine (AC), mesaconitine (MA) and hypaconitine (HA). These can be decomposed into less toxic or non-toxic derivatives through Chinese traditional processing methods (Paozhi), which play an essential role in detoxification. Using Paozhi, the three main forms of processed aconite -- yanfuzi, heishunpian and baifupian -- can be obtained (CPCommission, 2005). Moreover, some new processing techniques have been developed in China, such as pressure-steaming. The current development of fingerprint assays, in particular HPLC, has set a good basis for appropriate quality control of TCM crude herbs and their ready-made products. A stipulation for a maximum level of DDA content in Aconitum is therefore highly desirable in order to guarantee clinical safety and low toxicity in decoctions. Newly developed HPLC methods have made the accurate and simultaneous determination and quantification of DDA content feasible.

  20. A mixed-methods study on perceptions towards use of Rapid Ethical Assessment to improve informed consent processes for health research in a low-income setting.

    PubMed

    Addissie, Adamu; Davey, Gail; Newport, Melanie J; Addissie, Thomas; MacGregor, Hayley; Feleke, Yeweyenhareg; Farsides, Bobbie

    2014-05-02

    Rapid Ethical Assessment (REA) is a form of rapid ethnographic assessment conducted at the beginning of a research project to guide the consent process, with the objective of reconciling universal ethical guidance with specific research contexts. The current study was conducted to assess the perceived relevance of introducing REA as a mainstream tool in Ethiopia. Mixed-methods research using a sequential explanatory approach was conducted from July to September 2012, including 241 cross-sectional, self-administered questionnaires and 19 qualitative, in-depth interviews among health researchers and regulators, including ethics committee members, in Ethiopian health research institutions and universities. In their evaluation of the consent process, only 40.2% thought that the consent process and information given were adequately understood by study participants; 84.6% claimed they were not satisfied with the current consent process and 85.5% thought the best interests of study participants were not adequately considered. Commonly mentioned consent-related problems included lack of clarity (48.1%), inadequate information (34%), language barriers (28.2%), cultural differences (27.4%), undue expectations (26.6%) and power imbalances (20.7%). About 95.4% believed that consent should be contextualized to the study setting and 39.4% thought REA would be an appropriate approach to improve the perceived problems. Qualitative findings helped to further explore the gaps identified in the quantitative findings and to map out concerns related to the current research consent process in Ethiopia. Suggestions included conducting REA during the pre-test (pilot) phase of studies when applicable. The need for clear guidance for researchers on issues such as when and how to apply the REA tools was stressed. The study findings clearly indicated that there are perceived to be correctable gaps in the consent process of medical research in Ethiopia.
REA is considered relevant by researchers and stakeholders to address these gaps. Exploring further the feasibility and applicability of REA is recommended.

  1. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

    Currently, near infrared spectroscopy (NIRS) has been considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of related technologies is also presented, including the establishment of monitoring methods for the elution of polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the author's experience of research and work, several issues in the application of NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are discussed. The issues include building the technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospects for the application of NIRS in the TCM industry are put forward.

  2. Kernelized Locality-Sensitive Hashing for Fast Image Landmark Association

    DTIC Science & Technology

    2011-03-24

    based Simultaneous Localization and Mapping (SLAM). The problem, however, is that vision-based navigation techniques can require excessive amounts of...up and optimizing the data association process in vision-based SLAM. Specifically, this work studies the current methods that algorithms use to...required for location identification than that of other methods. This work can then be extended into a vision-SLAM implementation to subsequently
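    The hashing idea behind the title can be illustrated with its simpler, non-kernelized cousin, sign-random-projection LSH: nearby descriptors agree on more hash bits than distant ones, which is what makes fast landmark association possible without exhaustive comparison. All dimensions and data below are assumed:

```python
import numpy as np

rng = np.random.default_rng(4)

def lsh_hash(vectors, planes):
    """Hash a descriptor (or rows of descriptors) to sign bits: one bit per
    random hyperplane, indicating which side the vector falls on."""
    return (np.asarray(vectors) @ planes.T > 0).astype(int)

planes = rng.standard_normal((16, 8))          # 16 random hyperplanes in R^8
desc = rng.standard_normal(8)                  # a landmark descriptor
near = desc + 0.05 * rng.standard_normal(8)    # the same landmark, slightly perturbed
far = rng.standard_normal(8)                   # an unrelated landmark

match_near = int(np.sum(lsh_hash(desc, planes) == lsh_hash(near, planes)))
match_far = int(np.sum(lsh_hash(desc, planes) == lsh_hash(far, planes)))
```

    Bucketing descriptors by these bit strings turns data association into a hash lookup; the kernelized variant in the paper extends the same trick to similarity measures defined only through a kernel function.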

  3. Enhancements and Algorithms for Avionic Information Processing System Design Methodology.

    DTIC Science & Technology

    1982-06-16

    programming algorithm is enhanced by incorporating task precedence constraints and hardware failures. Stochastic network methods are used to analyze...allocations in the presence of random fluctuations. Graph theoretic methods are used to analyze hardware designs, and new designs are constructed with...There, spatial dynamic programming (SDP) was used to solve a static, deterministic software allocation problem. Under the current contract the SDP

  4. The role of spatial data and geomatic approaches in treeline mapping: a review of methods and limitations

    Treesearch

    Vanina Fissore; Renzo Motta; Brian J. Palik; Enrico Borgogno Mondino

    2015-01-01

    In the debate over global warming, treeline position is considered an important ecological indicator of climate change. Currently, analysis of upward treeline shift is often based on various spatial data processed by geomatic techniques. In this work, considering a selection of 31 reference papers, we assessed how the scientific community is using different methods to...

  5. Adapting viral safety assurance strategies to continuous processing of biological products.

    PubMed

    Johnson, Sarah A; Brown, Matthew R; Lute, Scott C; Brorson, Kurt A

    2017-01-01

    There has been a recent drive in commercial large-scale production of biotechnology products to convert current batch mode processing to continuous processing manufacturing. There have been reports of model systems capable of adapting and linking upstream and downstream technologies into a continuous manufacturing pipeline. However, in many of these proposed continuous processing model systems, viral safety has not been comprehensively addressed. Viral safety and detection is a highly important and often expensive regulatory requirement for any new biological product. To ensure success in the adaptation of continuous processing to large-scale production, there is a need to consider the development of approaches that allow for seamless incorporation of viral testing and clearance/inactivation methods. In this review, we outline potential strategies to apply current viral testing and clearance/inactivation technologies to continuous processing, as well as modifications of existing unit operations to ensure the successful integration of viral clearance into the continuous processing of biological products. Biotechnol. Bioeng. 2017;114: 21-32. © 2016 Wiley Periodicals, Inc.

  6. Volumetric calibration of a plenoptic camera.

    PubMed

    Hall, Elise Munz; Fahringer, Timothy W; Guildenbecher, Daniel R; Thurow, Brian S

    2018-02-01

    The volumetric calibration of a plenoptic camera is explored to correct for inaccuracies due to real-world lens distortions and thin-lens assumptions in current processing methods. Two methods of volumetric calibration based on a polynomial mapping function that does not require knowledge of specific lens parameters are presented and compared to a calibration based on thin-lens assumptions. The first method, volumetric dewarping, is executed by creation of a volumetric representation of a scene using the thin-lens assumptions, which is then corrected in post-processing using a polynomial mapping function. The second method, direct light-field calibration, uses the polynomial mapping in creation of the initial volumetric representation to relate locations in object space directly to image sensor locations. The accuracy and feasibility of these methods are examined experimentally by capturing images of a known dot card at a variety of depths. Results suggest that use of a 3D polynomial mapping function provides a significant increase in reconstruction accuracy and that the achievable accuracy is similar using either polynomial-mapping-based method. Additionally, direct light-field calibration provides significant computational benefits by eliminating some intermediate processing steps found in other methods. Finally, the flexibility of this method is shown for a nonplanar calibration.
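    The polynomial mapping at the heart of both calibration methods can be sketched in one dimension: fit a polynomial from distorted coordinates to known dot-card positions by least squares, with no lens parameters involved. The distortion model and all numbers below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
x_true = np.linspace(-10.0, 10.0, 50)            # known dot positions, mm (assumed)
# Assumed mild distortion plus measurement noise:
x_meas = 1.05 * x_true + 0.005 * x_true**2 \
         + rng.normal(0.0, 0.01, x_true.size)

# Fit a cubic mapping measured -> true coordinates by least squares.
A = np.vander(x_meas, 4)                         # columns: x^3, x^2, x, 1
coeffs, *_ = np.linalg.lstsq(A, x_true, rcond=None)
x_corr = A @ coeffs                              # corrected positions
residual = float(np.max(np.abs(x_corr - x_true)))
```

    The paper's 3D version works the same way, with a multivariate polynomial mapping volumetric coordinates; the fit absorbs whatever the thin-lens model got wrong.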

  7. Advanced Sulfur-Silicon Full Cell Architecture for Lithium Ion Batteries.

    PubMed

    Ye, Rachel; Bell, Jeffrey; Patino, Daisy; Ahmed, Kazi; Ozkan, Mihri; Ozkan, Cengiz S

    2017-12-08

    Lithium-ion batteries are crucial to the future of energy storage. However, the energy density of current lithium-ion batteries is insufficient for future applications. Sulfur cathodes and silicon anodes have garnered a lot of attention in the field due to their high capacity potential. Although recent developments in sulfur and silicon electrodes show exciting results in half-cell formats, neither electrode can act as a lithium source when the two are put together in a full-cell format. Current methods for incorporating lithium in sulfur-silicon full cells involve prelithiating silicon or using lithium sulfide. These methods, however, complicate material processing and create safety hazards. Herein, we present a novel full-cell battery architecture that bypasses the issues associated with current methods. This battery architecture gradually integrates controlled amounts of pure lithium into the system by allowing lithium access to the external circuit. A high specific energy density of 350 Wh/kg after 250 cycles at C/10 was achieved using this method. This work should pave the way for future research into sulfur-silicon full cells.

  8. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  9. Contact resistance and normal zone formation in coated yttrium barium copper oxide superconductors

    NASA Astrophysics Data System (ADS)

    Duckworth, Robert Calvin

    2001-11-01

    This project presents a systematic study of contact resistance and normal zone formation in silver-coated YBa2Cu3Ox (YBCO) superconductors. A unique opportunity exists in YBCO superconductors because of the ability to use oxygen annealing to influence the interfacial properties and the planar geometry of this type of superconductor to characterize the contact resistance between the silver and YBCO. The interface represents a region that current must cross when normal zones form in the superconductor, and a high contact resistance could impede the current transfer or produce excess Joule heating that would result in premature quench or damage of the sample. While it has been shown for single-crystalline YBCO processing methods that the contact resistance of the silver/YBCO interface can be influenced by post-process oxygen annealing, this has not previously been confirmed for high-density films, nor for samples with complete layers of silver deposited on top of the YBCO. Both the influence of contact resistance and the knowledge of normal zone formation in conductor-sized samples are essential for their successful implementation into superconducting applications such as transmission lines and magnets. While normal zone formation and propagation have been studied in other high temperature superconductors, the amount of information with respect to YBCO has been very limited. This study establishes that the processing method for the YBCO does not affect the contact resistance and mirrors the dependence of contact resistance on oxygen annealing temperature observed in earlier work. It has also been experimentally confirmed that the current transfer length provides an effective representation of the contact resistance when compared to more direct measurements using the traditional four-wire method. 
Finally for samples with low contact resistance, a combination of experiments and modeling demonstrate an accurate understanding of the key role of silver thickness and substrate thickness on the stability of silver-coated YBCO Rolling Assisted Bi-Axially Textured Substrates conductors. Both the experimental measurements and the one-dimensional model show that increasing the silver thickness results in an increased thermal runaway current; that is, the current above which normal zones continue to grow due to insufficient local cooling.
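    The equivalence claimed above between the current transfer length and a direct four-wire contact-resistance measurement follows from the standard transmission-line model (TLM) of a planar contact. A minimal sketch of that relation only; the function names and example values are illustrative and not taken from the study:

```python
import math

# Transmission-line model of a planar metal/superconductor contact:
# the current transfer length L_T, the specific contact resistivity rho_c,
# and the sheet resistance R_sheet of the normal-metal layer are linked by
#   L_T = sqrt(rho_c / R_sheet)

def transfer_length(rho_c, r_sheet):
    """Current transfer length (m) from specific contact resistivity
    (ohm*m^2) and sheet resistance (ohm/sq)."""
    return math.sqrt(rho_c / r_sheet)

def contact_resistivity(l_t, r_sheet):
    """Invert the TLM relation: rho_c inferred from a measured L_T."""
    return l_t ** 2 * r_sheet
```

    Measuring L_T along the planar geometry thus recovers the same rho_c that a four-wire measurement probes directly, which is why the two methods can be cross-checked.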

  10. Decision Makers' Allocation of Home-Care Therapy Services: A Process Map

    PubMed Central

    Poss, Jeff; Egan, Mary; Rappolt, Susan; Berg, Katherine

    2013-01-01

    Purpose: To explore decision-making processes currently used in allocating occupational and physical therapy services in home care for complex long-stay clients in Ontario. Method: An exploratory study using key-informant interviews and client vignettes was conducted with home-care decision makers (case managers and directors) from four home-care regions in Ontario. The interview data were analyzed using the framework analysis method. Results: The decision-making process for allocating therapy services has four stages: intake, assessment, referral to service provider, and reassessment. There are variations in the management processes deployed at each stage. The major variation is in the process of determining the volume of therapy services across home-care regions, primarily as a result of financial constraints affecting the home-care programme. Government funding methods and methods of information sharing also significantly affect home-care therapy allocation. Conclusion: Financial constraints in home care are the primary contextual factor affecting allocation of therapy services across home-care regions. Given the inflation of health care costs, new models of funding and service delivery need to be developed to ensure that the right person receives the right care before deteriorating and requiring more costly long-term care. PMID:24403672

  11. Effect of Annealing Processes on Cu-Zr Alloy Film for Copper Metallization

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Li, Fu-yin; Tang, Bin-han

    2017-12-01

    The effect of two different annealing processes on the microstructure and barrier-forming ability of Cu-Zr alloy films has been investigated. Cu-Zr alloy films were deposited directly onto SiO2/Si substrates via direct-current magnetron sputtering and subsequently annealed by either the vacuum annealing process (VAP) or a rapid annealing process under argon atmosphere at temperatures of 350°C, 450°C, and 550°C. The microstructure, interface characteristics, and electrical properties of the samples were then measured. After annealing, the samples showed a preferential (111) crystal orientation, independent of the annealing process. With both annealing methods, Zr aggregated at the Cu-Zr/SiO2 interface and no serious interdiffusion occurred between Cu and Si. The leakage current measurements revealed that the samples annealed by VAP show higher reliability. According to these results, vacuum annealing provides better barrier performance than rapid annealing when used for the fabrication of Cu-based interconnects.

  12. Research on Technology Innovation Management in Big Data Environment

    NASA Astrophysics Data System (ADS)

    Ma, Yanhong

    2018-02-01

    With the continuous development of the information age, the demand for information keeps growing, and the processing and analysis of information data are moving toward ever larger scales. The increasing volume of data places higher demands on processing technology. The explosive growth of data in today's society has ushered in the era of big data. At present, great value and significance lie in producing and processing the many kinds of information and data generated in daily life. Using big data technology to process and analyze data quickly, and thereby improve the level of big data management, is an important stage in advancing the development of information and data processing technology in China. To some extent, innovative research on management methods for information technology in the era of big data can enhance overall national strength and help China remain in an unassailable position in the development of the big data era.

  13. Two-dimensional time-dependent modelling of fume formation in a pulsed gas metal arc welding process

    NASA Astrophysics Data System (ADS)

    Boselli, M.; Colombo, V.; Ghedini, E.; Gherardi, M.; Sanibondi, P.

    2013-06-01

    Fume formation in a pulsed gas metal arc welding (GMAW) process is investigated by coupling a time-dependent axisymmetric two-dimensional model, which takes into account both droplet detachment and production of metal vapour, with a model for fume formation and transport based on the method of moments for the solution of the aerosol general dynamic equation. We report simulation results for a pulsed process (peak current = 350 A, background current = 30 A, period = 9 ms) for a 1 mm diameter iron wire with Ar shielding gas. The results showed that metal vapour production occurs mainly at the wire tip, whereas fume formation is concentrated in the fringes of the arc in the spatial region close to the workpiece, where metal vapours are transported by convection. The proposed modelling approach allows time-dependent tracking of fumes even in plasma processes where temperature variations in time occur faster than nanoparticle transport from the nucleation region to the surrounding atmosphere, as is the case for most pulsed GMAW processes.

  14. Improved stability, magnetic field preservation and recovery speed in (RE)Ba2Cu3O x -based no-insulation magnets via a graded-resistance approach

    NASA Astrophysics Data System (ADS)

    Kan Chan, Wan; Schwartz, Justin

    2017-07-01

    The no-insulation (NI) approach to winding (RE)Ba2Cu3O x (REBCO) high temperature superconductor solenoids has shown significant promise for maximizing the efficient usage of conductor while providing self-protecting operation. Self-protection in a NI coil, however, does not diminish the likelihood that a recoverable quench occurs. During a disturbance resulting in a recoverable quench, owing to the low turn-to-turn contact resistance, transport current bypasses the normal zone by flowing directly from the current input lead to the output lead, leading to a near total loss of the azimuthal current responsible for magnetic field generation. The consequences are twofold. First, a long recovery process is needed to recharge the coil to full operational functionality. Second, a fast magnetic field transient is created due to the sudden drop in magnetic field in the quenching coil. The latter could induce a global inductive quench propagation in other coils of a multi-coil NI magnet, increasing the likelihood of quenching and accelerating the depletion of useful current in other coils, lengthening the post-quench recovery process. Here a novel graded-resistance method is proposed to address these problems while maintaining the superior thermal stability and self-protecting capability of NI magnets. Through computational modeling and analysis of a hybrid multiphysics model, patterned resistive-conductive layers are inserted between selected turn-to-turn contacts to contain hot-spot heat propagation while maintaining the turn-wise current sharing required for self-protection, resulting in faster post-quench recovery and reduced magnetic field transient. The effectiveness of the method is studied at 4.2 K and 77 K. Through the proposed method, REBCO magnets with high current density, high thermal stability, low likelihood of quenching, and rapid, passive recovery emerge with high operational reliability and availability.

  15. Survey of electrochemical metal winning processes. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaaler, L.E.

    1979-03-01

    The subject program was undertaken to find electrometallurgical technology that could be developed into energy-saving commercial metal winning processes. Metals whose current production processes consume significant energy (excepting copper and aluminum) are magnesium, zinc, lead, chromium, manganese, sodium, and titanium. The technology of these metals, with the exception of titanium, was reviewed. Growth of titanium demand has been too small to justify the installation of an electrolytic process that has been developed. This fact and the uncertainty of estimates of future demand dissuaded us from reviewing titanium technology. Opportunities for developing energy-saving processes were found for magnesium, zinc, lead, and sodium. Costs for R and D and demonstration plants have been estimated. It appeared that electrolytic methods for chromium and manganese cannot compete, in energy or economic terms, with the pyrometallurgical methods of producing the ferroalloys, which are satisfactory for most uses of chromium and manganese.

  16. Aspects of food processing and its effect on allergen structure.

    PubMed

    Paschke, Angelika

    2009-08-01

    The article summarizes current physical and chemical methods in food processing, such as storage, preparation, separation, isolation or purification, and thermal treatment on the one hand, as well as enzymatic treatment on the other, and their impact on the properties of food proteins. Novel methods of food processing, such as high pressure, electric field application, or irradiation, and their impact on food allergens are also presented. The EU project REDALL (Reduced Allergenicity of Processed Foods, Containing Animal Allergens: QLK1-CT-2002-02687) showed that a combination of enzyme and heat treatment decreased the allergenic potential of hen's egg about 100-fold, to the point that clinical reactions no longer appeared. An AiF-FV 12024 N project worked with fruits such as mango, lychee, and apple. Processed mango and lychee showed no change in allergenic potential during heating, e.g., in canning, while apple almost completely lost its allergenic potential after pasteurization in juice production.

  17. Methods for Dissecting Motivation and Related Psychological Processes in Rodents.

    PubMed

    Ward, Ryan D

    2016-01-01

    Motivational impairments are increasingly recognized as being critical to functional deficits and decreased quality of life in patients diagnosed with psychiatric disease. Accordingly, much preclinical research has focused on identifying psychological and neurobiological processes which underlie motivation. Inferring motivation from changes in overt behavioural responding in animal models, however, is complicated, and care must be taken to ensure that the observed change is accurately characterized as a change in motivation, and not due to some other, task-related process. This chapter discusses current methods for assessing motivation and related psychological processes in rodents. Using an example from work characterizing the motivational impairments in an animal model of the negative symptoms of schizophrenia, we highlight the importance of careful and rigorous experimental dissection of motivation and the related psychological processes when characterizing motivational deficits in rodent models. We suggest that such work is critical to the successful translation of preclinical findings into therapeutic benefits for patients.

  18. Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.

    PubMed

    Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth

    2016-05-15

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on the effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process had positive, health-promoting effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.

  19. Improvement of the System of Training of Specialists by University for Coal Mining Enterprises

    NASA Astrophysics Data System (ADS)

    Mikhalchenko, Vadim; Seredkina, Irina

    2017-11-01

    This article considers the Quality Function Deployment (QFD) technique as applied to the process of university training of specialists with higher education. The method is based on the step-by-step conversion of customer requirements into specific organizational, content-related, and functional transformations of the university's educational process. A fully deployed quality function includes four stages of tracking customer requirements while creating a product: product planning, product design, process design, and production design. Quality Function Deployment can be considered one of the methods for optimizing the technological processes of training specialists with higher education under current economic conditions. Implemented at the initial stages of the life cycle of the technological process, it ensures not only the high quality of the "product" of higher education, but also the fullest possible satisfaction of consumers' requests and expectations.

  20. High-conductance low-voltage organic thin film transistor with locally rearranged poly(3-hexylthiophene) domain by current annealing on plastic substrate

    NASA Astrophysics Data System (ADS)

    Pei, Zingway; Tsai, Hsing-Wang; Lai, Hsin-Cheng

    2016-02-01

    Organic-material-based thin film transistors (TFTs) are attractive for flexible optoelectronic applications because they permit large-area, solution-based fabrication at low temperature on plastic substrates. Recent organic TFT research has focused on low operating voltage and high output current, which enable low-power organic logic circuits for optoelectronic devices such as e-paper or OLED displays. To obtain low voltage and high output current, high gate capacitance and high channel mobility are the key factors. Rearranging the polymer chains by high-temperature post-annealing to enhance the conductivity of the polymer film is a common method. However, thermal annealing heats every device on the substrate and may not be applicable to plastic substrates. In this work, therefore, low operating voltage and high output current in polymer TFTs were demonstrated by localized electrical bias annealing. Poly(styrene-co-methyl methacrylate) (PS-r-PMMA) of ultra-thin thickness, controlled by thermal treatment after spin coating on the organic electrode, is used as the gate dielectric. In the electrical bias-annealing process, the PS-r-PMMA acts as a heating layer. After electrical bias annealing, the polymer TFTs attain high channel mobility at low voltage, which leads to high output current through localized annealing of the P3HT film. In the future, the localized electrical bias-annealing method could be applied on plastic substrates for flexible optoelectronic applications.

  1. The full-scale process and design changes for elimination of insulation edge separations and voids in tang flap area

    NASA Technical Reports Server (NTRS)

    Danforth, Richard A.

    1991-01-01

    Qualification of the full-scale process and design changes for elimination of redesigned solid rocket motor tang nitrile butadiene rubber insulation edge separations and voids was performed from 24 March to 3 December 1990. The objectives of this test were: to qualify design and process changes on flight hardware using a tie ply between the redesigned solid rocket motor steel case and the nitrile butadiene rubber insulation over the tang capture features; to qualify the use of methyl ethyl ketone in the tang flap region to reduce voids; and to determine if holes in the separator film reduce voids in the tang flap region. The tie ply is intended to aid insulation flow during the insulation cure process, and thus reduce or eliminate edge unbonds. Methyl ethyl ketone is intended to reduce voids in the tang flap area by providing better tacking characteristics. The perforated film was intended to provide possible vertical breathing paths to reduce voids in the tang area. Tang tie ply testing consisted of 270 deg of the tang circumference using a new layup method and 90 deg of the tang circumference using the current layup methods. Tie ply process success was defined as a reduction of insulation unbonds. The lack of any insulation edge unbonds on the tang area where the new process was used, and the presence of 17 unbonds with the current process, proved the test a success. Successful completion of this test has qualified the new processes.

  2. Research on fast Fourier transforms algorithm of huge remote sensing image technology with GPU and partitioning technology.

    PubMed

    Yang, Xue; Li, Xue-You; Li, Jia-Guo; Ma, Jun; Zhang, Li; Yang, Jan; Du, Quan-Ye

    2014-02-01

    The fast Fourier transform (FFT) is a basic approach to remote sensing image processing. As the capacity of remote sensing image capture grows, with hyperspectral, high-spatial-resolution, and high-temporal-resolution features, how to use FFT technology to efficiently process huge remote sensing images has become a critical step and research hot spot in current image processing technology. The FFT algorithm, one of the basic algorithms of image processing, can be used for stripe noise removal, image compression, image registration, etc. in processing remote sensing images. The CUFFT library is a GPU-based FFT algorithm library, while FFTW is an FFT library developed for the CPU on the PC platform and is currently the fastest CPU-based FFT function library. However, the two share a common problem: once the available GPU memory or host memory is smaller than the image, out-of-memory errors or memory overflow occur when using either method to perform an image FFT. To address this problem, a GPU- and partitioning-technology-based Huge Remote Fast Fourier Transform (HRFFT) algorithm is proposed in this paper. By improving the FFT algorithm in the CUFFT library, the problems of out-of-memory errors and memory overflow are solved. Moreover, this method is validated by experiments with CCD images from the HJ-1A satellite. When applied to practical image processing, it improves the quality of the results and speeds up processing, saving computation time and achieving sound results.
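    The memory obstacle described above can be illustrated in miniature: process the image strip by strip so that only one tile's spectrum is ever resident, here applied to the stripe-noise-removal use case the abstract mentions. This is a sketch of the partitioning idea only, not the HRFFT algorithm; NumPy stands in for CUFFT, and the notch filter is a deliberately simple choice:

```python
import numpy as np

def destripe_block(block):
    """Remove column-constant stripe noise from one tile with an FFT notch.

    Stripes with a fixed offset per column occupy the zero row-frequency
    line of the 2-D spectrum; zeroing that line (except the DC term)
    suppresses them while leaving row-varying content untouched.
    """
    spec = np.fft.fft2(block)
    spec[0, 1:] = 0.0  # notch the stripe frequencies, keep F[0, 0] (the mean)
    return np.real(np.fft.ifft2(spec))

def destripe_tiled(image, tile_rows=64):
    """Process the image strip by strip, so only one tile (and, on a GPU,
    only one tile's spectrum) needs to be in memory at a time."""
    out = np.empty(image.shape, dtype=float)
    for r in range(0, image.shape[0], tile_rows):
        out[r:r + tile_rows] = destripe_block(image[r:r + tile_rows].astype(float))
    return out
```

    Note that tiling is exactly equivalent to a global FFT only for per-tile operations like this notch; filters with long spatial support need overlapping tiles, which is part of what a production partitioning scheme must handle.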

  3. Matrix decomposition graphics processing unit solver for Poisson image editing

    NASA Astrophysics Data System (ADS)

    Lei, Zhao; Wei, Li

    2012-10-01

    In recent years, gradient-domain methods have been widely discussed in the image processing field, including seamless cloning and image stitching. These algorithms are commonly carried out by solving a large sparse linear system: the Poisson equation. However, solving the Poisson equation is a computation- and memory-intensive task, which makes it unsuitable for real-time image editing. A new matrix decomposition graphics processing unit (GPU) solver (MDGS) is proposed to address this problem. A matrix decomposition method is used to distribute the work among GPU threads, so that MDGS takes full advantage of the computing power of current GPUs. Additionally, MDGS is a hybrid solver (combining both direct and iterative techniques) and has a two-level architecture. These features enable MDGS to generate solutions identical to those of the common Poisson methods and to achieve a high convergence rate in most cases. This approach is advantageous in terms of parallelizability, enabling real-time image processing, low memory consumption, and broad applicability.
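    Gradient-domain editing as described above reduces to a discrete Poisson solve with the edited region's boundary pixels as Dirichlet conditions. A Jacobi sweep is the simplest (if slowly converging) baseline that a GPU solver such as MDGS competes against; the sketch below is that baseline on the CPU, not the MDGS algorithm itself, with illustrative function names:

```python
import numpy as np

def poisson_jacobi(lap, boundary, iters=2000):
    """Jacobi iteration for the discrete Poisson equation on a grid.

    Solves  u[i-1,j] + u[i+1,j] + u[i,j-1] + u[i,j+1] - 4*u[i,j] = lap[i,j]
    on the interior (5-point Laplacian equal to the guidance field `lap`),
    keeping the edge values of `boundary` fixed as Dirichlet conditions;
    the interior of `boundary` serves as the initial guess.
    """
    u = boundary.astype(float).copy()
    for _ in range(iters):
        # NumPy evaluates the right-hand side before assigning, so this
        # is a true Jacobi (not Gauss-Seidel) update.
        u[1:-1, 1:-1] = 0.25 * (
            u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
            - lap[1:-1, 1:-1]
        )
    return u
```

    For seamless cloning, `lap` is the Laplacian of the source patch and `boundary` carries the target image's pixels along the patch border; the iteration count needed for convergence is what motivates direct/iterative hybrids like MDGS.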

  4. The use of computational inspection to identify process window limiting hotspots and predict sub-15nm defects with high capture rate

    NASA Astrophysics Data System (ADS)

    Ham, Boo-Hyun; Kim, Il-Hwan; Park, Sung-Sik; Yeo, Sun-Young; Kim, Sang-Jin; Park, Dong-Woon; Park, Joon-Soo; Ryu, Chang-Hoon; Son, Bo-Kyeong; Hwang, Kyung-Bae; Shin, Jae-Min; Shin, Jangho; Park, Ki-Yeop; Park, Sean; Liu, Lei; Tien, Ming-Chun; Nachtwein, Angelique; Jochemsen, Marinus; Yan, Philip; Hu, Vincent; Jones, Christopher

    2017-03-01

    As critical dimensions for advanced two-dimensional (2D) DUV patterning continue to shrink, the exact process window becomes increasingly difficult to determine. The defect size criteria shrink with the patterning critical dimensions and are well below the resolution of current optical inspection tools. As a result, it is more challenging for traditional bright field inspection tools to accurately discover the hotspots that define the process window. In this study, we use a novel computational inspection method to identify the depth-of-focus-limiting features of a 10 nm node mask with 2D metal structures (single exposure) and compare the results to those obtained with a traditional process window qualification (PWQ) method based on a focus-modulated wafer and bright field inspection (BFI) to detect hotspot defects. The method is extended to litho-etch litho-etch (LELE) on a different test vehicle to show that overlay-related bridging hotspots can also be identified.

  5. Qumquad: a UML-based approach for remodeling of legacy systems in health care.

    PubMed

    Garde, Sebastian; Knaup, Petra; Herold, Ralf

    2003-07-01

    Health care information systems still comprise legacy systems to a certain extent. For reengineering legacy systems a thorough remodeling is indispensable. Current modeling techniques like the Unified Modeling Language (UML) do not offer a systematic and comprehensive process-oriented method for remodeling activities. We developed a systematic method for remodeling legacy systems in health care called Qumquad. Qumquad consists of three major steps: (i) modeling the actual state of the application system, (ii) systematically identifying weak points in this model, and (iii) developing a target concept for the reimplementation that addresses the identified weak points. We applied Qumquad to remodel a documentation and therapy planning system for pediatric oncology (DOSPO). As a result of our remodeling activities we regained an abstract model of the system, an analysis of the current weak points of DOSPO, and possible (partly alternative) solutions to overcome the weak points. Qumquad proved to be very helpful in the reengineering process of DOSPO, since we now have at our disposal a comprehensive model for the reimplementation of DOSPO that current users of the system agree on. Qumquad can easily be applied to other reengineering projects in health care.

  6. Quantile Regression Models for Current Status Data

    PubMed Central

    Ou, Fang-Shu; Zeng, Donglin; Cai, Jianwen

    2016-01-01

    Current status data arise frequently in demography, epidemiology, and econometrics where the exact failure time cannot be determined but is only known to have occurred before or after a known observation time. We propose a quantile regression model to analyze current status data, because it does not require distributional assumptions and the coefficients can be interpreted as direct regression effects on the distribution of failure time in the original time scale. Our model assumes that the conditional quantile of failure time is a linear function of covariates. We assume conditional independence between the failure time and observation time. An M-estimator is developed for parameter estimation which is computed using the concave-convex procedure and its confidence intervals are constructed using a subsampling method. Asymptotic properties for the estimator are derived and proven using modern empirical process theory. The small sample performance of the proposed method is demonstrated via simulation studies. Finally, we apply the proposed method to analyze data from the Mayo Clinic Study of Aging. PMID:27994307
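    The model described can be written out explicitly. In the notation below (ours, chosen for illustration: T the failure time, C the observation time, X the covariates, and δ the current-status indicator), the linear conditional-quantile assumption and the observed-data structure read:

```latex
% Linear conditional-quantile model for the failure time T:
\[
  Q_{T}(\tau \mid X) \;=\; X^{\top}\beta(\tau),
  \qquad
  Q_{T}(\tau \mid X) \;=\; \inf\{\, t : \Pr(T \le t \mid X) \ge \tau \,\},
\]
% Current status data: T itself is never seen, only whether it has
% occurred by the observation time C, under conditional independence:
\[
  \text{observed data } (C,\, \delta,\, X), \qquad
  \delta = \mathbf{1}\{T \le C\}, \qquad
  T \perp C \mid X .
\]
```

    Because β(τ) acts on the original time scale, each coefficient is directly interpretable as a covariate effect on the τ-th quantile of the failure time, which is the advantage the abstract highlights over distribution-based models.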

  7. Improved automated lumen contour detection by novel multifrequency processing algorithm with current intravascular ultrasound system.

    PubMed

    Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro

    2013-02-01

    The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings compared with those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm(2) ) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm(2) ) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus, offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.

  8. Pseudo and conditional score approach to joint analysis of current count and current status data.

    PubMed

    Wen, Chi-Chung; Chen, Yi-Hau

    2018-04-17

    We develop a joint analysis approach for recurrent and nonrecurrent event processes subject to case I interval censorship, also known in the literature as current count and current status data, respectively. We use a shared frailty to link the recurrent and nonrecurrent event processes, while leaving the distribution of the frailty fully unspecified. Conditional on the frailty, the recurrent event is assumed to follow a nonhomogeneous Poisson process, and the mean function of the recurrent event and the survival function of the nonrecurrent event are assumed to follow some general form of semiparametric transformation models. Estimation of the models is based on the pseudo-likelihood and the conditional score techniques. The resulting estimators for the regression parameters and the unspecified baseline functions are shown to be consistent with rates of square and cubic roots of the sample size, respectively. Asymptotic normality with closed-form asymptotic variance is derived for the estimator of the regression parameters. We apply the proposed method to a fracture-osteoporosis survey data set to identify risk factors jointly for fracture and osteoporosis in elders, while accounting for association between the two events within a subject. © 2018, The International Biometric Society.
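    One concrete special case of the shared-frailty structure described can be written as the pair below. The notation is ours and illustrative only (N(t) the recurrent-event count, T the nonrecurrent event time, b the shared frailty, C the common examination time); the paper's general transformation models need not take exactly this proportional form:

```latex
% Conditional on the frailty b, the recurrent events form a
% nonhomogeneous Poisson process with mean function
\[
  E\{N(t) \mid X, b\} \;=\; b \, \Lambda_0(t) \, e^{X^{\top}\beta},
\]
% while the nonrecurrent event has conditional survival function
\[
  S(t \mid X, b) \;=\; \exp\!\bigl\{ -\, b \, H_0(t) \, e^{X^{\top}\gamma} \bigr\}.
\]
% Case I interval censoring: at the single examination time C one
% observes only N(C) (current count) and 1{T <= C} (current status).
```

    The same b entering both equations is what induces, and lets the analysis account for, the within-subject association between the two events.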

  9. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. However, problems arise when trying to project these data with reasonable speed and accuracy. Current single-threaded methods can suffer from one of two problems: fast projection with poor accuracy, or accurate projection with long processing times. A possible solution is to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost, providing access to supercomputer-class capability. Combining these techniques may make it possible to use large amounts of geographic data in time-critical situations.
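    The proposal above (partition the data, project the partitions in parallel) can be sketched with a worker pool standing in for cluster nodes. The projection below is ordinary spherical Mercator, chosen purely for illustration; the thread-backed `multiprocessing.dummy` pool keeps the sketch portable, where a Beowulf-style deployment would distribute the chunks across processes or machines instead:

```python
import math
from multiprocessing.dummy import Pool  # thread-backed Pool API, portable;
                                        # a cluster would use process pools
                                        # or nodes (e.g. via MPI) instead

def mercator(point):
    """Project one (lon, lat) pair, in degrees, to spherical-Mercator metres."""
    lon, lat = point
    R = 6378137.0  # sphere radius in metres (WGS-84 semi-major axis)
    x = R * math.radians(lon)
    y = R * math.log(math.tan(math.pi / 4.0 + math.radians(lat) / 2.0))
    return (x, y)

def project_partitioned(points, workers=4, chunk=1024):
    """Partition the point list into chunks and project the chunks in parallel."""
    with Pool(workers) as pool:
        return pool.map(mercator, points, chunksize=chunk)
```

    Because each point projects independently, the work splits cleanly into chunks with no inter-node communication, which is what makes projection a good fit for cluster distribution.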

  10. Machine Learning: A Crucial Tool for Sensor Design

    PubMed Central

    Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.

    2009-01-01

    Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps (data pre-treatment; feature extraction and dimension reduction; and system modeling), this paper provides a review of the methods widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, the paper summarizes current algorithms in this field and suggests feasible directions for future studies. PMID:20191110
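    The three-step decomposition above can be sketched end to end. Each function below stands for one step, with deliberately simple choices (standardization, PCA via SVD, nearest-centroid classification) that are illustrative assumptions, not the survey's recommendations:

```python
import numpy as np

def pretreat(X):
    """Step 1, data pre-treatment: zero mean, unit variance per sensor channel."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def reduce_dims(X, k):
    """Step 2, feature extraction / dimension reduction: PCA via SVD,
    projecting onto the top-k principal components."""
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:k].T

def fit_centroids(X, y):
    """Step 3, system modeling: a nearest-centroid classifier,
    one centroid per class label."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(model, X):
    """Assign each sample to the class with the closest centroid."""
    classes = list(model)
    dists = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array([classes[i] for i in dists.argmin(axis=0)])
```

    In a real sensor system each stage would be swapped for a method suited to the signal (e.g. baseline correction, wavelet features, or a nonlinear model), but the pipeline shape stays the same.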

  11. Development of an Ointment Formulation Using Hot-Melt Extrusion Technology.

    PubMed

    Bhagurkar, Ajinkya M; Angamuthu, Muralikrishnan; Patil, Hemlata; Tiwari, Roshan V; Maurya, Abhijeet; Hashemnejad, Seyed Meysam; Kundu, Santanu; Murthy, S Narasimha; Repka, Michael A

    2016-02-01

    Ointments are generally prepared either by fusion or by levigation methods. The current study proposes the use of hot-melt extrusion (HME) processing for the preparation of a polyethylene glycol base ointment. Lidocaine was used as a model drug. A modified screw design was used in this process, and parameters such as feeding rate, barrel temperature, and screw speed were optimized to obtain a uniform product. The product characteristics were compared with an ointment of similar composition prepared by conventional fusion method. The rheological properties, drug release profile, and texture characteristics of the hot-melt extruded product were similar to the conventionally prepared product. This study demonstrates a novel application of the hot-melt extrusion process in the manufacturing of topical semi-solids.

  12. A modified low-temperature wafer bonding method using spot pressing bonding technique and water glass adhesive layer

    NASA Astrophysics Data System (ADS)

    Xu, Yang; Wang, Shengkai; Wang, Yinghui; Chen, Dapeng

    2018-02-01

    A modified low-temperature wafer bonding method using a spot pressing bonding technique and a water glass adhesive layer is proposed. The electrical properties of the water glass layer have been studied by capacitance-voltage (C-V) and current-voltage (I-V) measurements. It is found that the adhesive layer can be regarded as a good insulator in terms of leakage current density. The bonding mechanism and the motion of bubbles during the thermal treatment are investigated. The dominant factor for the bubble motion in the modified bonding process is the pressure gradient introduced by the spot pressing force. The results show that the modified method achieves low-temperature adhesive bonding, minimizes the effect of water desorption, and provides good bonding performance.

  13. Multiplexed 3D FRET imaging in deep tissue of live embryos

    PubMed Central

    Zhao, Ming; Wan, Xiaoyang; Li, Yu; Zhou, Weibin; Peng, Leilei

    2015-01-01

    Current deep tissue microscopy techniques are mostly restricted to intensity mapping of fluorophores, which significantly limits their applications in investigating biochemical processes in vivo. We present a deep tissue multiplexed functional imaging method that probes multiple Förster resonant energy transfer (FRET) sensors in live embryos with high spatial resolution. The method simultaneously images fluorescence lifetimes in 3D with multiple excitation lasers. Through quantitative analysis of triple-channel intensity and lifetime images, we demonstrated that Ca2+ and cAMP levels of live embryos expressing dual FRET sensors can be monitored simultaneously at microscopic resolution. The method is compatible with a broad range of FRET sensors currently available for probing various cellular biochemical functions. It opens the door to imaging complex cellular circuitries in whole live organisms. PMID:26387920
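    Lifetime imaging enables FRET readout because energy transfer shortens the donor fluorescence lifetime. The standard relation (not specific to this paper's sensors) converts measured lifetimes into a transfer efficiency; a minimal sketch with hypothetical lifetime values:

    ```python
    def fret_efficiency(tau_da: float, tau_d: float) -> float:
        """FRET efficiency from the donor lifetime with the acceptor
        (tau_da) and without it (tau_d): E = 1 - tau_da / tau_d."""
        if tau_d <= 0 or tau_da < 0:
            raise ValueError("lifetimes must be positive")
        return 1.0 - tau_da / tau_d

    # Hypothetical lifetimes in nanoseconds:
    e = fret_efficiency(1.5, 2.5)  # 0.4, i.e. 40% energy transfer
    ```

    In a sensor readout, changes in E are then mapped to analyte concentration (e.g. Ca2+ or cAMP) via the sensor's calibration curve.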

  14. The current role of on-line extraction approaches in clinical and forensic toxicology.

    PubMed

    Mueller, Daniel M

    2014-08-01

    In today's clinical and forensic toxicological laboratories, automation is of interest because of its ability to optimize processes, to reduce manual workload and handling errors, and to minimize exposure to potentially infectious samples. Extraction is usually the most time-consuming step; therefore, automation of this step is reasonable. Currently, from the field of clinical and forensic toxicology, methods using the following on-line extraction techniques have been published: on-line solid-phase extraction, turbulent flow chromatography, solid-phase microextraction, microextraction by packed sorbent, single-drop microextraction and on-line desorption of dried blood spots. Most of these published methods are either single-analyte or multicomponent procedures; methods intended for systematic toxicological analysis are relatively scarce. However, the use of on-line extraction will certainly increase in the near future.

  15. Processing of higher count rates in Troitsk nu-mass experiment

    NASA Astrophysics Data System (ADS)

    Nozik, Alexander; Chernov, Vaslily

    2018-04-01

    In this article we give a short outline of the current status of the search for sterile neutrinos with masses up to 4 keV in the Troitsk nu-mass experiment. We also discuss major sources of systematic uncertainties and methods to lower them.
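    Handling higher count rates typically means correcting the measured rate for detector dead time. The abstract does not specify the correction used; as a generic illustration, a sketch of the standard non-paralyzable dead-time model (the rates and dead time below are hypothetical):

    ```python
    def true_rate(measured_rate_hz: float, dead_time_s: float) -> float:
        """Non-paralyzable dead-time correction: n = m / (1 - m * tau),
        where m is the measured rate and tau the detector dead time."""
        loss = measured_rate_hz * dead_time_s
        if loss >= 1.0:
            raise ValueError("detector saturated: m * tau >= 1")
        return measured_rate_hz / (1.0 - loss)

    # Hypothetical: 9e4 counts/s measured with 1 microsecond dead time.
    n = true_rate(9e4, 1e-6)  # ~9.89e4 counts/s true rate
    ```

    At low rates the correction is negligible (m * tau << 1); it grows nonlinearly as the measured rate approaches 1/tau, which is why high-rate running demands careful systematic treatment.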

  16. Scales and erosion

    USDA-ARS?s Scientific Manuscript database

    There is a need to develop a scale-explicit understanding of erosion to overcome existing conceptual and methodological flaws in the modelling methods currently applied to understand the processes of erosion, transport and deposition at the catchment scale. These models need to be based on a sound under...

  17. Annual Review of Sociology. Volume 9, 1983.

    ERIC Educational Resources Information Center

    Turner, Ralph H., Ed.; Short, James F., Jr., Ed.

    Twenty-six essays describing current research in sociology are included in this publication. The essays fall into 10 categories: differentiation and stratification; political sociology; social processes; institutions; individual and society; formal organizations; urban sociology; theory and methods; sociology of world regions; and historical…

  18. Method and apparatus for resonant frequency waveform modulation

    DOEpatents

    Taubman, Matthew S [Richland, WA

    2011-06-07

    A resonant modulator device and process are described that provide enhanced resonant frequency waveforms to electrical devices including, e.g., laser devices. Faster, larger, and more complex modulation waveforms are obtained than can be obtained by use of conventional current controllers alone.

  19. A Proven Method for Meeting Export Control Objectives in Postal and Shipping Sectors

    DTIC Science & Technology

    2015-02-01

    months, the USPIS team developed and implemented an export screening standard operating procedure, implemented new and updated processes and systems ... support and protect the U.S. Postal Service and its employees, infrastructure, and customers; enforce the laws that defend the nation's mail system ... the incidence of mail shipments violating export control laws, regulations, and standards. • Evaluate current processes and systems and identify

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hensley, Jesse; Ruddy, Daniel A.; Schaidle, Joshua A.

    Catalysts and processes designed to convert DME and/or methanol and hydrogen (H2) to desirable liquid fuels are described. These catalysts produce the fuels efficiently and with a high selectivity and yield, and reduce the formation of aromatic hydrocarbons by incorporating H2 into the products. Also described are process methods to further upgrade these fuels to higher molecular weight liquid fuel mixtures, which have physical properties comparable with current commercially used liquid fuels.
